Sample records for image archives header

  1. Initial Experience With A Prototype Storage System At The University Of North Carolina

    NASA Astrophysics Data System (ADS)

    Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.

    1986-06-01

    A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system and UNC recognizing the need for an archiving system as an essential part of its PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short- and long-term storage of images and their associated information. The system software includes the procedures necessary to communicate with the network interface elements (NIEs) as well as those necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessment of system performance awaits the connection of compatible consoles and acquisition devices to the network.
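    The archiving flow the abstract describes — parse a small, indexable subset of header fields into a relational database while the full header stays on disk with the study — can be sketched as follows. The field names and table layout are hypothetical, not the UNC system's actual schema.

```python
import sqlite3

# Hypothetical subset of header fields worth indexing; the full header
# would be written to disk alongside the image, as the abstract describes.
INDEXED_FIELDS = ("PatientID", "StudyDate", "Modality")

def archive_study(conn, header: dict, image_path: str):
    """Store an indexable subset of a parsed header in a relational table."""
    row = tuple(header.get(f) for f in INDEXED_FIELDS) + (image_path,)
    conn.execute(
        "INSERT INTO studies (patient_id, study_date, modality, image_path) "
        "VALUES (?, ?, ?, ?)", row)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE studies (patient_id TEXT, study_date TEXT, "
             "modality TEXT, image_path TEXT)")
archive_study(conn, {"PatientID": "P001", "StudyDate": "19860601",
                     "Modality": "CT"}, "/archive/p001/study1.img")
# A radiologist-facing query then needs only the indexed subset.
hits = conn.execute(
    "SELECT image_path FROM studies WHERE modality = 'CT'").fetchall()
```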

  2. More flexibility in representing geometric distortion in astronomical images

    NASA Astrophysics Data System (ADS)

    Shupe, David L.; Laher, Russ R.; Storrie-Lombardi, Lisa; Surace, Jason; Grillmair, Carl; Levitan, David; Sesar, Branimir

    2012-09-01

    A number of popular software tools in the public domain are used by astronomers, professional and amateur alike, but some of the tools that have similar purposes cannot be easily interchanged, owing to the lack of a common standard. For the case of image distortion, SCAMP and SExtractor, available from Astromatic.net, perform astrometric calibration and source-object extraction on image data, and image-data geometric distortion is computed in celestial coordinates with polynomial coefficients stored in the FITS header with the PVi_j keywords. Another widely used astrometric-calibration service, Astrometry.net, solves for distortion in pixel coordinates using the SIP convention that was introduced by the Spitzer Science Center. Until now, due to the complexity of these distortion representations, it was very difficult to use the output of one of these packages as input to the other. New Python software, along with faster C-language translations, has been developed at the Infrared Processing and Analysis Center (IPAC) to convert FITS-image headers from PV to SIP and vice versa. It is now possible to straightforwardly use Astrometry.net for astrometric calibration and then SExtractor for source-object extraction. The new software also enables astrometric calibration by SCAMP followed by image visualization with tools that support SIP distortion, but not PV. The software has been incorporated into the image-processing pipelines of the Palomar Transient Factory (PTF), which generate FITS images with headers containing both distortion representations. The software permits the conversion of archived images, such as from the Spitzer Heritage Archive and NASA/IPAC Infrared Science Archive, from SIP to PV or vice versa. This new capability renders unnecessary any new representation, such as the proposed TPV distortion convention.
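    As a minimal illustration of why the two representations collide, a header carries PV distortion as PVi_j keywords (e.g. PV1_0, PV2_10) and SIP distortion as A_p_q/B_p_q coefficient keywords plus A_ORDER/B_ORDER. A sketch that classifies which convention a header advertises, using only the public keyword patterns (this is not IPAC's conversion code, which rewrites the coefficients themselves):

```python
import re

def distortion_conventions(keywords):
    """Classify which distortion convention(s) a FITS header advertises.

    PV polynomials use keywords like PV1_0, PV2_10 (SCAMP/SExtractor);
    SIP uses A_p_q / B_p_q coefficients plus A_ORDER / B_ORDER
    (Astrometry.net / Spitzer).
    """
    found = set()
    for key in keywords:
        if re.fullmatch(r"PV\d+_\d+", key):
            found.add("PV")
        elif key in ("A_ORDER", "B_ORDER") or re.fullmatch(r"[AB]P?_\d+_\d+", key):
            found.add("SIP")
    return found

pv_like = distortion_conventions(["PV1_0", "PV1_1", "CRVAL1"])
sip_like = distortion_conventions(["A_ORDER", "A_0_2", "B_0_2"])
```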

  3. RADIANCE: An automated, enterprise-wide solution for archiving and reporting CT radiation dose estimates.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan L; Steingall, Scott R; Maidment, Andrew D A; Kim, Woojin; Boonn, William W

    2011-01-01

    There is growing interest in the ability to monitor, track, and report exposure to radiation from medical imaging. Historically, however, dose information has been stored on an image-based dose sheet, an arrangement that precludes widespread indexing. Although scanner manufacturers are beginning to include dose-related parameters in the Digital Imaging and Communications in Medicine (DICOM) headers of imaging studies, there remains a vast repository of retrospective computed tomographic (CT) data with image-based dose sheets. Consequently, it is difficult for imaging centers to monitor their dose estimates or participate in the American College of Radiology (ACR) Dose Index Registry. An automated extraction software pipeline known as Radiation Dose Intelligent Analytics for CT Examinations (RADIANCE) has been designed that quickly and accurately parses CT dose sheets to extract and archive dose-related parameters. Optical character recognition of information in the dose sheet leads to creation of a text file, which along with the DICOM study header is parsed to extract dose-related data. The data are then stored in a relational database that can be queried for dose monitoring and report creation. RADIANCE allows efficient dose analysis of CT examinations and more effective education of technologists, radiologists, and referring physicians regarding patient exposure to radiation at CT. RADIANCE also allows compliance with the ACR's dose reporting guidelines and greater awareness of patient radiation dose, ultimately resulting in improved patient care and treatment.
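    The parsing step RADIANCE performs after optical character recognition — extracting dose-related parameters from recognized text — can be sketched with a regular expression. The dose-sheet layout below is invented for illustration; real layouts vary by scanner vendor.

```python
import re

# Hypothetical OCR output of a CT dose sheet; real layouts vary by vendor.
ocr_text = """
Series 2  CTDIvol: 12.34 mGy  DLP: 456.7 mGy-cm
Series 3  CTDIvol: 8.90 mGy   DLP: 120.0 mGy-cm
"""

# Pull (series, CTDIvol, DLP) triples out of the recognized text; these
# rows would then be inserted into the relational database for querying.
pattern = re.compile(
    r"Series\s+(\d+)\s+CTDIvol:\s*([\d.]+)\s*mGy\s+DLP:\s*([\d.]+)\s*mGy-cm")
doses = [(int(s), float(c), float(d)) for s, c, d in pattern.findall(ocr_text)]
```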

  4. The Next Generation of HLA Image Products

    NASA Astrophysics Data System (ADS)

    Gaffney, N. I.; Casertano, S.; Ferguson, B.

    2012-09-01

    We present a re-engineered pipeline, based on existing and improved algorithms, that aims to improve processing quality, cross-instrument portability, data-flow management, and software maintenance. The Hubble Legacy Archive (HLA) is a project to add value to the Hubble Space Telescope data archive by producing and delivering science-ready drizzled data products and source lists derived from those products. Initially, ACS, NICMOS, and WFPC2 data were combined using instrument-specific pipelines based on scripts developed to process the ACS GOODS data, with a separate set of scripts generating SExtractor and DAOPhot source lists. The new pipeline, initially designed for WFC3 data, isolates instrument-specific processing and is easily extendable to other instruments and to generating wide-area mosaics. Significant improvements have been made in image combination using improved alignment, source detection, and background equalization routines; the pipeline integrates improved alignment procedures, a better noise model, and source-list generation within a single code base. Wherever practical, PyRAF-based routines have been replaced with non-IRAF Python libraries (e.g., NumPy and PyFITS). The data formats have been modified to handle better and more consistent propagation of information from individual exposures to the combined products. A new exposure layer stores the effective exposure time for each pixel on the sky, which is key to properly interpreting combined images built from diverse data that were not initially planned to be mosaicked. We have also worked to improve the validity of the metadata within the FITS headers of these products relative to standard IRAF/PyRAF processing. Any keywords that pertain to individual exposures have been removed from the primary and extension headers and placed in a table extension for more direct and efficient perusal. This mechanism also allows more detailed information on the processing of individual images to be stored and propagated, providing a more hierarchical metadata storage system than key-value-pair FITS headers provide. In this poster we discuss the changes to pipeline processing and source-list generation, the lessons learned (which may be applicable to other archive projects), and our new metadata curation and preservation process.
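    The restructuring described — removing per-exposure keywords from the primary and extension headers and collecting them into one table, one row per exposure — can be sketched with plain dictionaries. The keyword names below are illustrative, not the HLA's actual set.

```python
# Keywords that describe one exposure rather than the combined product
# (illustrative names, not the HLA's actual list).
PER_EXPOSURE_KEYS = ("EXPTIME", "DATE-OBS", "EXPNAME")

def build_exposure_table(exposure_headers):
    """Return (shared_header, table_rows): per-exposure keywords are
    removed from each header and collected as one table row per exposure,
    mirroring the table-extension approach described above."""
    rows = [{k: h.pop(k) for k in PER_EXPOSURE_KEYS if k in h}
            for h in exposure_headers]
    shared = exposure_headers[0] if exposure_headers else {}
    return shared, rows

headers = [{"TELESCOP": "HST", "EXPTIME": 350.0, "EXPNAME": "ib1f01a1q"},
           {"TELESCOP": "HST", "EXPTIME": 400.0, "EXPNAME": "ib1f01a2q"}]
shared, rows = build_exposure_table(headers)
```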

  5. The Alaska Arctic Vegetation Archive (AVA-AK)

    Treesearch

    Donald A. Walker; Amy L. Breen; Lisa A. Druckenmiller; Lisa W. Wirth; Will Fisher; Martha K. Raynolds; Jozef Šibík; Marilyn D. Walker; Stephan Hennekens; Keith Boggs; Tina Boucher; Marcel Buchhorn; Helga Bültmann; David J. Cooper; Fred J.A Daniëls; Scott J. Davidson; James J. Ebersole; Sara C. Elmendorf; Howard E. Epstein; William A. Gould; Robert D. Hollister; Colleen M. Iversen; M. Torre Jorgenson; Anja Kade; Michael T. Lee; William H. MacKenzie; Robert K. Peet; Jana L. Peirce; Udo Schickhoff; Victoria L. Sloan; Stephen S. Talbot; Craig E. Tweedie; Sandra Villarreal; Patrick J. Webber; Donatella Zona

    2016-01-01

    The Alaska Arctic Vegetation Archive (AVA-AK, GIVD-ID: NA-US-014) is a free, publicly available database archive of vegetation-plot data from the Arctic tundra region of northern Alaska. The archive currently contains 24 datasets with 3,026 non-overlapping plots. Of these, 74% have geolocation data with 25-m or better precision. Species cover data and header data are...

  6. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit for DICOM IQSC has been developed to implement this SC-centered integration of quantitative analysis information for routine nuclear medicine practice. Initial experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.
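    The embedding idea — serializing quantification results into header elements of a secondary-capture object — can be sketched as below. The (group, element) tag is an arbitrary odd-group "private tag" placeholder and the JSON payload is an assumption for illustration; the paper's actual encoding is not reproduced here.

```python
import json

# Arbitrary odd-group placeholder standing in for a private DICOM tag;
# not an actual DICOM tag assignment.
PRIVATE_IQ_TAG = (0x7777, 0x0010)

def embed_iq_results(header: dict, rois, tac) -> dict:
    """Attach ROIs and a time-activity curve to a header-like dict by
    serializing them into a single private element."""
    header = dict(header)  # leave the caller's header untouched
    header[PRIVATE_IQ_TAG] = json.dumps({"rois": rois, "tac": tac})
    return header

sc = embed_iq_results({"Modality": "OT"},
                      rois=[{"name": "kidney_L", "mean": 4.2}],
                      tac=[[0, 1.0], [60, 0.7]])
recovered = json.loads(sc[PRIVATE_IQ_TAG])
```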

  7. A New Archive of UKIRT Legacy Data at CADC

    NASA Astrophysics Data System (ADS)

    Bell, G. S.; Currie, M. J.; Redman, R. O.; Purves, M.; Jenness, T.

    2014-05-01

    We describe a new archive of legacy data from the United Kingdom Infrared Telescope (UKIRT) at the Canadian Astronomy Data Centre (CADC) containing all available data from the Cassegrain instruments. The desire was to archive the raw data in as close to the original format as possible, so where the data followed our current convention of having a single data file per observation, it was archived without alteration, except for minor fixes to headers of data in FITS format to allow it to pass fitsverify and be accepted by CADC. Some of the older data comprised multiple integrations in separate files per observation, stored in either Starlink NDF or Figaro DST format. These were placed inside HDS container files, and DST files were rearranged into NDF format. The metadata describing the observations is ingested into the CAOM-2 repository via an intermediate MongoDB header database, which will also be used to guide the ORAC-DR pipeline in generating reduced data products.

  8. Development of the Subaru-Mitaka-Okayama-Kiso Archive System

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatoshi; Watanabe, Masaru; Ozawa, Tomohiko; Hamabe, Masaru

    We have developed the Subaru-Mitaka-Okayama-Kiso Archive (SMOKA) public science archive system, which provides access to the data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory/University of Tokyo. SMOKA is the successor of the MOKA3 system. The user can browse the Quick-Look Images, Header Information (HDI), and the ASCII Table Extension (ATE) of each frame from the search result table. A request for data can be submitted in a simple manner. The system is developed with Java Servlets for the back end and Java Server Pages (JSP) for content display. The advantage of JSPs is the separation of the front-end presentation from the middle- and back-end tiers, which enabled efficient development of the system. The SMOKA homepage is available online.

  9. Assessing the impact of a radiology information management system in the emergency department

    NASA Astrophysics Data System (ADS)

    Redfern, Regina O.; Langlotz, Curtis P.; Lowe, Robert A.; Horii, Steven C.; Abbuhl, Stephanie B.; Kundel, Harold L.

    1998-07-01

    To evaluate a conventional radiology image management system by investigating information accuracy and information delivery, and to discuss the customization of a picture archival and communication system (PACS), integrated radiology information system (RIS), and hospital information system (HIS) for a high-volume emergency department (ED). Materials and Methods: Two data collection periods were completed. After the first data collection period, a change in work rules was implemented to improve the quality of data in the image headers. Data from the RIS, the ED information system, and the HIS, as well as observed time-motion data, were collected for patients admitted to the ED. Data accuracy, patient waiting times, and radiology exam information delivery were compared. Results: The percentage of examinations scheduled in the RIS by the technologists increased from 0% (0 of 213) during the first period to 14% (44 of 317) during the second (p less than 0.001). The percentage of images missing identification numbers decreased from 36% (98 of 272) during the first data collection period to 10% (56 of 562) during the second period (p less than 0.001). Conclusions: Radiologic services in a high-volume ED, requiring rapid service, present important challenges to a PACS. Strategies can be implemented to improve accuracy and completeness of the data in PACS image headers in such an environment.

  10. Providing integrity, authenticity, and confidentiality for header and pixel data of DICOM images.

    PubMed

    Al-Haj, Ali

    2015-04-01

    Exchange of medical images over public networks is subject to different types of security threats. This has triggered persistent demands for secure telemedicine implementations that provide confidentiality, authenticity, and integrity for the transmitted images. The medical image exchange standard (DICOM) offers mechanisms to provide confidentiality for the header data of the image but not for the pixel data. On the other hand, it offers mechanisms to achieve authenticity and integrity for the pixel data but not for the header data. In this paper, we propose a crypto-based algorithm that provides confidentiality, authenticity, and integrity for the pixel data as well as for the header data. This is achieved by applying strong cryptographic primitives utilizing internally generated security data, such as encryption keys, hashing codes, and digital signatures. The security data are generated internally from the header and the pixel data; thus a strong bond is established between the DICOM data and the corresponding security data. The proposed algorithm has been evaluated extensively using DICOM images of different modalities. Simulation experiments show that confidentiality, authenticity, and integrity have been achieved, as reflected by the results we obtained for normalized correlation, entropy, PSNR, histogram analysis, and robustness.
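    The binding idea — security data derived internally from the header and pixel data, so a MAC protects both together — can be sketched with standard-library primitives. This is a minimal illustration of the header/pixel binding only, not the paper's algorithm, which additionally encrypts the data and applies digital signatures.

```python
import hashlib
import hmac
import os

def protect(header: bytes, pixels: bytes):
    """Derive a MAC key internally from the DICOM content itself, then
    MAC header and pixels together so tampering with either is detected.
    (Illustrative only: the paper also handles confidentiality and
    signatures, which this sketch omits.)"""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", header + pixels, salt, 100_000)
    tag = hmac.new(key, header + pixels, hashlib.sha256).digest()
    return key, tag

def verify(header: bytes, pixels: bytes, key: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, header + pixels, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key, tag = protect(b"PatientID=P001", b"\x00\x01\x02")
ok = verify(b"PatientID=P001", b"\x00\x01\x02", key, tag)
tampered = verify(b"PatientID=P002", b"\x00\x01\x02", key, tag)
```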

  11. BOREAS RSS-14 Level-1a GOES-8 Visible, IR and Water Vapor Images

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Newcomer, Jeffrey A.; Faysash, David; Cooper, Harry J.; Smith, Eric A.

    2000-01-01

    The BOREAS RSS-14 team collected and processed several GOES-7 and GOES-8 image data sets that covered the BOREAS study region. The level-1a GOES-8 images were created by BORIS personnel from the level-1 images delivered by FSU personnel. The data cover 14-Jul-1995 to 21-Sep-1995 and 12-Feb-1996 to 03-Oct-1996. The data start out as three bands with 8-bit pixel values and end up as five bands with 10-bit pixel values. No major problems with the data have been identified. The differences between the level-1 and level-1a GOES-8 data are the formatting and packaging of the data. The images missing from the temporal series of level-1 GOES-8 images were zero-filled by BORIS staff to create files consistent in size and format. In addition, BORIS staff packaged all the images of a given type from a given day into a single file, removed the header information from the individual level-1 files, and placed it into a single descriptive ASCII header file. The data are contained in binary image format files. Due to the large size of the images, the level-1a GOES-8 data are not contained on the BOREAS CD-ROM set. An inventory listing file is supplied on the CD-ROM to inform users of what data were collected. The level-1a GOES-8 image data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). See sections 15 and 16 for more information. The data files are available on a CD-ROM (see document number 20010000884).

  12. Development of public science archive system of Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatochi; Watanabe, Masaru; Okumura, Shin-Ichiro; Ozawa, Tomohiko; Yamamoto, Naotaka; Hamabe, Masaru

    2002-09-01

    We have developed a public science archive system, the Subaru-Mitaka-Okayama-Kiso Archive (SMOKA) system, as a successor of the Mitaka-Okayama-Kiso Archive (MOKA) system. SMOKA provides access to the public data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory of the University of Tokyo. Since 1997, we have worked to compile a dictionary of FITS header keywords. The completion of the dictionary enabled us to construct a unified public archive of the data obtained with various instruments at the telescopes. SMOKA has two kinds of user interfaces: Simple Search and Advanced Search. Novices can search data by simply selecting the name of the target with the Simple Search interface. Experts would prefer to set detailed constraints on the query, using the Advanced Search interface. In order to improve the efficiency of searching, several new features are implemented, such as archive status plots, calibration data search, an annotation system, and an improved Quick-Look Image browsing system. We can efficiently develop and operate SMOKA by adopting a three-tier model for the system. Java servlets and Java Server Pages (JSP) are useful to separate the front-end presentation from the middle and back-end tiers.
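    The role of the FITS keyword dictionary — mapping each instrument's native header keywords onto one unified vocabulary so a single archive query spans all telescopes — can be sketched as below. The instrument names and keyword spellings are invented for illustration, not SMOKA's actual dictionary.

```python
# Illustrative keyword dictionary: native header keyword -> unified name.
# The instrument names and keyword spellings here are invented.
KEYWORD_DICTIONARY = {
    "instrument_a": {"EXP-ID": "exposure_id", "DATE-OBS": "obs_date"},
    "instrument_b": {"EXPID": "exposure_id", "DATEOBS": "obs_date"},
}

def unify(instrument: str, header: dict) -> dict:
    """Translate an instrument-specific header into the unified vocabulary
    so one archive query works across heterogeneous instruments."""
    mapping = KEYWORD_DICTIONARY[instrument]
    return {mapping[k]: v for k, v in header.items() if k in mapping}

a = unify("instrument_a", {"EXP-ID": "A0001", "DATE-OBS": "2002-09-01"})
b = unify("instrument_b", {"EXPID": "B0042", "DATEOBS": "2002-08-15"})
```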

  13. Present status and future directions of the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Morin, Richard L.; Forbes, Glenn S.; Gehring, Dale G.; Salutz, James R.; Pavlicek, William

    1991-07-01

    This joint project began in 1988 and was motivated by the need to develop an alternative to the archival process in place at that time (magnetic tape) for magnetic resonance imaging and neurological computed tomography. In addition, this project was felt to be an important step in gaining the necessary clinical experience for the future implementation of various aspects of electronic imaging. The initial phase of the project was conceived and developed to prove the concept, test the fundamental components, and produce performance measurements for future work. The key functions of this phase centered on attachment of imaging equipment (GE Signa) and archival processes using a non-dedicated (institutionally supplied) local area network (LAN). Attachment of imaging equipment to the LAN was performed using commercially available devices (Ethernet, PS/2, Token Ring). Image data were converted to ACR/NEMA format with retention of the vendor-specific header information. Performance measurements were encouraging and led to the design of the following projects. The second phase has recently been concluded. The major features of this phase have been to greatly expand the network, put the network into clinical use, establish an efficient and useful viewing station, include diagnostic reports in the archive data, provide wide area network (WAN) capability via ISDN, and establish two-way real-time video between remote sites. This phase has heightened both departmental and institutional thought regarding various issues raised by electronic imaging. Much discussion regarding both present and future archival processes has occurred. The use of institutional LAN resources has proven adequate for the archival functions examined thus far. Experiments to date have shown that dedicated resources will be necessary for retrieval activities at even a basic level. This report presents an overview of the background, present status, and future directions of the project.

  14. Galaxy of Images

    Science.gov Websites

    This site has moved to a new Image Gallery site. The page offered a basic image search and a taxonomic (scientific) keyword search of the Galaxy of Images collection.

  15. The Hopkins Ultraviolet Telescope: The Final Archive

    NASA Astrophysics Data System (ADS)

    Dixon, William V.; Blair, William P.; Kruk, Jeffrey W.; Romelfanger, Mary L.

    2013-04-01

    The Hopkins Ultraviolet Telescope (HUT) was a 0.9 m telescope and moderate-resolution (Δλ = 3 Å) far-ultraviolet (820-1850 Å) spectrograph that flew twice on the space shuttle, in 1990 December (Astro-1, STS-35) and 1995 March (Astro-2, STS-67). The resulting spectra were originally archived in a nonstandard format that lacked important descriptive metadata. To increase their utility, we have modified the original data-reduction software to produce a new and more user-friendly data product, a time-tagged photon list similar in format to the Intermediate Data Files (IDFs) produced by the Far Ultraviolet Spectroscopic Explorer calibration pipeline. We have transferred all relevant pointing and instrument-status information from locally-archived science and engineering databases into new FITS header keywords for each data set. Using this new pipeline, we have reprocessed the entire HUT archive from both missions, producing a new set of calibrated spectral products in a modern FITS format that is fully compliant with Virtual Observatory requirements. For each exposure, we have generated quick-look plots of the fully-calibrated spectrum and associated pointing history information. Finally, we have retrieved from our archives HUT TV guider images, which provide information on aperture positioning relative to guide stars, and converted them into FITS-format image files. All of these new data products are available in the new HUT section of the Mikulski Archive for Space Telescopes (MAST), along with historical and reference documents from both missions. In this article, we document the improved data-processing steps applied to the data and show examples of the new data products.

  16. The Hopkins Ultraviolet Telescope: The Final Archive

    NASA Technical Reports Server (NTRS)

    Dixon, William V.; Blair, William P.; Kruk, Jeffrey W.; Romelfanger, Mary L.

    2013-01-01

    The Hopkins Ultraviolet Telescope (HUT) was a 0.9 m telescope and moderate-resolution (Δλ = 3 Å) far-ultraviolet (820-1850 Å) spectrograph that flew twice on the space shuttle, in 1990 December (Astro-1, STS-35) and 1995 March (Astro-2, STS-67). The resulting spectra were originally archived in a nonstandard format that lacked important descriptive metadata. To increase their utility, we have modified the original data-reduction software to produce a new and more user-friendly data product, a time-tagged photon list similar in format to the Intermediate Data Files (IDFs) produced by the Far Ultraviolet Spectroscopic Explorer calibration pipeline. We have transferred all relevant pointing and instrument-status information from locally-archived science and engineering databases into new FITS header keywords for each data set. Using this new pipeline, we have reprocessed the entire HUT archive from both missions, producing a new set of calibrated spectral products in a modern FITS format that is fully compliant with Virtual Observatory requirements. For each exposure, we have generated quick-look plots of the fully-calibrated spectrum and associated pointing history information. Finally, we have retrieved from our archives HUT TV guider images, which provide information on aperture positioning relative to guide stars, and converted them into FITS-format image files. All of these new data products are available in the new HUT section of the Mikulski Archive for Space Telescopes (MAST), along with historical and reference documents from both missions. In this article, we document the improved data-processing steps applied to the data and show examples of the new data products.

  17. Non-parametric adaptative JPEG fragments carving

    NASA Astrophysics Data System (ADS)

    Amrouche, Sabrina Cherifa; Salamani, Dalila

    2018-04-01

    The most challenging JPEG recovery tasks arise when the file header is missing. In this paper we propose a two-layer machine learning model to restore headerless JPEG images. We first build a classifier able to identify the structural properties of the images/fragments and then use an AutoEncoder (AE) to learn the fragment features for the header prediction. We define a JPEG universal header, and the remaining free image parameters (height, width) are predicted with a Gradient Boosting Classifier. Our approach resulted in 90% accuracy using the manually defined features and 78% accuracy using the AE features.
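    The "universal header" idea amounts to fixing every JPEG parameter except the image dimensions and then writing the predicted height and width into the header. A minimal sketch of the final patching step, assuming a baseline JPEG whose SOF0 segment (marker FF C0) stores height and width as big-endian 16-bit values at offsets 5 and 7 past the marker:

```python
import struct

def patch_sof0_dimensions(header: bytearray, height: int, width: int):
    """Write predicted image dimensions into the SOF0 segment of a
    'universal' JPEG header. In a baseline SOF0 segment the layout after
    the FF C0 marker is: length (2 bytes), sample precision (1 byte),
    height (2 bytes, big-endian), width (2 bytes, big-endian)."""
    i = header.find(b"\xff\xc0")
    if i < 0:
        raise ValueError("no SOF0 marker in header")
    struct.pack_into(">HH", header, i + 5, height, width)

# Minimal fake SOF0 fragment: marker, length=17, precision=8, H=0, W=0, ...
hdr = bytearray(b"\xff\xc0\x00\x11\x08\x00\x00\x00\x00\x03")
patch_sof0_dimensions(hdr, 480, 640)
```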

  18. The PDS-based Data Processing, Archiving and Management Procedures in Chang'e Mission

    NASA Astrophysics Data System (ADS)

    Zhang, Z. B.; Li, C.; Zhang, H.; Zhang, P.; Chen, W.

    2017-12-01

    PDS is adopted as the standard format for scientific data and as the foundation of all data-related procedures in the Chang'e mission. Unlike the geographically distributed nature of the Planetary Data System, all procedures of data processing, archiving, management, and distribution are carried out at the headquarters of the Ground Research and Application System of the Chang'e mission in a centralized manner. The raw data acquired by the ground stations are transmitted to and processed by the data preprocessing subsystem (DPS), which produces PDS-compliant Level 0 to Level 2 data products using established algorithms, each product file being described by an attached label. All products with the same orbit number are then grouped into a scheduled archiving task together with an XML archive list file recording each product file's properties, such as file name and file size. After receiving the archive request from the DPS, the data management subsystem (DMS) is invoked to parse the XML list file, validate all the claimed files and their PDS compliance using a prebuilt data dictionary, and then extract metadata for each data product file from its PDS label and from the fields of its normalized filename. Various requirements of data management, retrieval, distribution, and application can be well met through flexible combination of the rich metadata enabled by the PDS.
    In the forthcoming CE-5 mission, the design of data structures and procedures will be updated from PDS version 3, used in the previous CE-1, CE-2, and CE-3 missions, to the new version 4. The main changes are: 1) a dedicated detached XML label will describe the corresponding scientific data acquired by the four instruments carried; the XML parsing framework used for archive-list validation will be reused for the label after some necessary adjustments; 2) the image data acquired by the panorama camera, landing camera, and lunar mineralogical spectrometer will use an Array_2D_Image/Array_3D_Image object to store image data and a Table_Character object to store the image frame header; the tabulated data acquired by the lunar regolith penetrating radar will use a Table_Binary object to store measurements.
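    The move to PDS4's detached XML labels makes the metadata-extraction step a matter of XML parsing. A minimal sketch with a stand-in label (real PDS4 labels use the PDS4 namespace and a much richer schema; the element names File, file_name, and file_size follow PDS4 usage, but this snippet is illustrative only):

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for a PDS4 detached label; real labels are namespaced
# and validated against the PDS4 schema and data dictionary.
label = """<Product_Observational>
  <File_Area_Observational>
    <File>
      <file_name>ce5_pcam_0001.img</file_name>
      <file_size unit="byte">2097152</file_size>
    </File>
  </File_Area_Observational>
</Product_Observational>"""

root = ET.fromstring(label)
f = root.find(".//File")
# Metadata a DMS-like component would extract for management and retrieval.
meta = {"file_name": f.findtext("file_name"),
        "file_size": int(f.findtext("file_size"))}
```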

  19. Archive of Digital Chirp Subbottom Profile Data Collected During USGS Cruise 14BIM05 Offshore of Breton Island, Louisiana, August 2014

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.

    2016-03-29

    The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images processed and gained using SU software and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
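    Because this archive stores the 3,200-byte card-image header as ASCII while SEG Y rev. 0 specifies EBCDIC, reading software must cope with both. A sketch of a tolerant reader (the printable-byte heuristic is an assumption, not part of the SEG Y standard; Python's cp037 codec covers the common EBCDIC code page):

```python
import codecs

def decode_textual_header(raw: bytes) -> str:
    """Decode a SEG Y card-image textual header that may be ASCII (as in
    this archive) or EBCDIC (as SEG Y rev. 0 specifies). Heuristic: if
    every byte is printable ASCII or common whitespace, decode as ASCII;
    otherwise fall back to EBCDIC (code page 037)."""
    if all(b in (9, 10, 13) or 32 <= b < 127 for b in raw):
        return raw.decode("ascii")
    return codecs.decode(raw, "cp037")

ascii_hdr = b"C 1 CLIENT USGS  CRUISE 14BIM05".ljust(80)
ebcdic_hdr = codecs.encode("C 1 CLIENT USGS", "cp037")
```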

  20. System Integration Issues in Digital Photogrammetric Mapping

    DTIC Science & Technology

    1992-01-01

    elevation models, and/or rectified imagery/orthophotos. Imagery exported from the DSPW can be either in a tiled image format or standard raster format...data. In the near future, correlation using "window shaping" operations along with an iterative orthophoto refinement methodology (Norvelle, 1992) is...components of TIES. The IDS passes tiled image data and ASCII header data to the DSPW. The tiled image file contains only image data. The ASCII header

  1. Colorado Water Institute

    Science.gov Websites

    Colorado Water Institute, Colorado State University. The site includes a mission statement, newsletters, publications and reports, CSU water experts, funding opportunities, scholarships, employment, advisory board and staff information, and links to the Water Resources Archive and the Office of Engagement.

  2. An automatic detection method for the boiler pipe header based on real-time image acquisition

    NASA Astrophysics Data System (ADS)

    Long, Yi; Liu, YunLong; Qin, Yongliang; Yang, XiangWei; Li, DengKe; Shen, DingJie

    2017-06-01

    Generally, an endoscope is used to inspect the interior of a thermal power plant's boiler pipe header. However, because the endoscope hose is operated manually, the length and angle of the inserted probe cannot be controlled, and there is a large observation blind spot limited by the length of the endoscope wire. To solve these problems, an automatic detection method for the boiler pipe header based on real-time image acquisition and simulation comparison techniques was proposed. A magnetic crawler with permanent-magnet wheels carries the real-time image acquisition device through the header, collecting live scene images as it crawls. Using the location obtained from a positioning auxiliary device, the position of each real-time detection image is calibrated within a virtual 3-D model. By comparing the real-time detection images with computer simulation images, defects or foreign matter that has fallen in can be accurately located, facilitating repair and cleanup.

  3. International Metadata Initiatives: Lessons in Bibliographic Control.

    ERIC Educational Resources Information Center

    Caplan, Priscilla

    This paper looks at a subset of metadata schemes, including the Text Encoding Initiative (TEI) header, the Encoded Archival Description (EAD), the Dublin Core Metadata Element Set (DCMES), and the Visual Resources Association (VRA) Core Categories for visual resources. It examines why they developed as they did, major points of difference from…

  4. Automated extraction of radiation dose information for CT examinations.

    PubMed

    Cook, Tessa S; Zimmerman, Stefan; Maidment, Andrew D A; Kim, Woojin; Boonn, William W

    2010-11-01

    Exposure to radiation as a result of medical imaging is currently in the spotlight, receiving attention from Congress as well as the lay press. Although scanner manufacturers are moving toward including effective dose information in the Digital Imaging and Communications in Medicine headers of imaging studies, there is a vast repository of retrospective CT data at every imaging center that stores dose information in an image-based dose sheet. As such, it is difficult for imaging centers to participate in the ACR's Dose Index Registry. The authors have designed an automated extraction system to query their PACS archive and parse CT examinations to extract the dose information stored in each dose sheet. First, an open-source optical character recognition program processes each dose sheet and converts the information to American Standard Code for Information Interchange (ASCII) text. Each text file is parsed, and radiation dose information is extracted and stored in a database which can be queried using an existing pathology and radiology enterprise search tool. Using this automated extraction pipeline, it is possible to perform dose analysis on the >800,000 CT examinations in the PACS archive and generate dose reports for all of these patients. It is also possible to more effectively educate technologists, radiologists, and referring physicians about exposure to radiation from CT by generating report cards for interpreted and performed studies. The automated extraction pipeline enables compliance with the ACR's reporting guidelines and greater awareness of radiation dose to patients, thus resulting in improved patient care and management. Copyright © 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.
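
The parsing step described above can be sketched as a short regex pass over the OCR output; the dose-sheet layout, column names, and values below are invented for illustration, since the actual format varies by scanner vendor.

```python
import re

# Hypothetical OCR output from a CT dose sheet. The real layout is
# vendor-specific, so both the columns and the values are illustrative.
ocr_text = """
Series  Type     Scan Range      CTDIvol(mGy)  DLP(mGy-cm)
2       Helical  I10.0-S350.0    12.45         434.2
3       Helical  I10.0-S350.0    6.10          212.9
"""

def parse_dose_sheet(text):
    """Extract (series, CTDIvol, DLP) triples from OCR'd dose-sheet text."""
    row = re.compile(r"^(\d+)\s+\S+\s+\S+\s+([\d.]+)\s+([\d.]+)\s*$", re.M)
    return [(int(s), float(c), float(d)) for s, c, d in row.findall(text)]

records = parse_dose_sheet(ocr_text)
total_dlp = sum(dlp for _, _, dlp in records)  # aggregate for a dose report
```

Rows that fail the pattern (for example, OCR noise) are simply skipped, which is the usual trade-off when parsing free-form OCR text.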

  5. An optical disk archive for a data base management system

    NASA Technical Reports Server (NTRS)

    Thomas, Douglas T.

    1985-01-01

    An overview is given of a data base management system that can catalog and archive data at rates up to 50M bits/sec. Emphasis is on the laser disk system that is used for the archive. All key components in the system (3 Vax 11/780s, a SEL 32/2750, a high speed communication interface, and the optical disk) are interfaced to a 100M bits/sec 16-port fiber optic bus to achieve the high data rates. The basic data unit is an autonomous data packet. Each packet contains a primary and secondary header and can be up to a million bits in length. The data packets are recorded on the optical disk at the same time the packet headers are being used by the relational data base management software ORACLE to create a directory independent of the packet recording process. The user then interfaces to the VAX that contains the directory for a quick-look scan or retrieval of the packet(s). The total system functions are distributed between the VAX and the SEL. The optical disk unit records the data with an argon laser at 100M bits/sec from its buffer, which is interfaced to the fiber optic bus. The same laser is used in the read cycle by reducing the laser power. Additional information is given in the form of outlines, charts, and diagrams.
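
The autonomous data packet described above can be illustrated with a hypothetical fixed-size primary header; the field layout, sizes, and names below are invented, as the report does not publish the actual header format.

```python
import struct

# Hypothetical primary-header layout (the report does not publish the real
# one): packet id, source id, packet length in bits, and a timestamp.
PRIMARY_FMT = ">IHQd"  # big-endian: uint32, uint16, uint64, float64

def pack_primary(packet_id, source_id, length_bits, timestamp):
    """Serialize a primary header for recording ahead of the data payload."""
    return struct.pack(PRIMARY_FMT, packet_id, source_id, length_bits, timestamp)

def unpack_primary(blob):
    """Recover the header fields, e.g. to build a directory entry."""
    return struct.unpack(PRIMARY_FMT, blob[:struct.calcsize(PRIMARY_FMT)])

hdr = pack_primary(42, 7, 1_000_000, 168.5)
fields = unpack_primary(hdr + b"payload would follow here")
```

Because the header is fixed-size and self-describing, the directory can be built from headers alone while the full packets stream to the optical disk, mirroring the split described in the abstract.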

  6. Standardizing Documentation of FITS Headers

    NASA Astrophysics Data System (ADS)

    Hourcle, Joseph

    2014-06-01

    Although the FITS file format [1] can be self-documenting, human intervention is often needed to read the headers and write the transformations necessary to make a given instrument team's data compatible with our preferred analysis package. External documentation may be needed to determine the meanings of coded values or unfamiliar acronyms. Different communities have interpreted keywords slightly differently; this has resulted in ambiguous fields such as DATE-OBS, which could be either the start or the mid-point of an observation [2]. Conventions exist for placing units and additional information within the comments of a FITS card, but they require re-writing the FITS file. This operation can be quite costly for large archives and should not be taken lightly when dealing with issues of digital preservation. We present what we believe is needed for a machine-actionable external file describing a given collection of FITS files. We seek comments from data producers, archives, and those writing software to help develop a single, useful, implementable standard. References: [1] Pence et al. 2010, http://dx.doi.org/10.1051/0004-6361/201015362 [2] Rots et al. (in preparation), http://hea-www.cfa.harvard.edu/~arots/TimeWCS/
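
As a minimal illustration of the header cards under discussion, the sketch below splits a single 80-character FITS card into keyword, value, and comment; it ignores quoted strings containing '/', CONTINUE cards, and the other subtleties that a full parser such as astropy.io.fits handles.

```python
# Minimal sketch of splitting a FITS header card into keyword, value,
# and comment. Real FITS parsing (quoted strings containing '/',
# CONTINUE cards, HIERARCH keywords) is considerably more involved.
def parse_card(card):
    keyword = card[:8].strip()          # bytes 1-8 hold the keyword
    if card[8:10] != "= ":
        # commentary card (COMMENT/HISTORY): no value, all text
        return keyword, None, card[8:].strip()
    body = card[10:]
    value, _, comment = body.partition("/")
    return keyword, value.strip(), comment.strip()

key, val, com = parse_card(
    "DATE-OBS= '2014-06-02T12:00:00' / start of observation".ljust(80)
)
```

Note that the comment here is the only place the card itself can say whether DATE-OBS means the start or mid-point, which is exactly the ambiguity the abstract describes.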

  7. VizieR Online Data Catalog: NiI transition probability measurements (Wood+, 2014)

    NASA Astrophysics Data System (ADS)

    Wood, M. P.; Lawler, J. E.; Sneden, C.; Cowan, J. J.

    2014-04-01

    As in much of our previous branching fraction work, this NiI branching fraction study makes use of archived FTS data from both the 1.0m Fourier Transform Spectrometer (FTS) previously at the National Solar Observatory (NSO) on Kitt Peak and the Chelsea Instruments FT500 UV FTS at Lund University in Sweden. Table 1 lists the 37 FTS spectra used in our NiI branching fraction study. All NSO spectra, raw interferograms, and header files are available in the NSO electronic archives. The 80 CCD frames of echelle-spectrograph spectra from commercial Ni HCD lamps are listed in Table 2. (6 data files).

  8. BIRAM: a content-based image retrieval framework for medical images

    NASA Astrophysics Data System (ADS)

    Moreno, Ramon A.; Furuie, Sergio S.

    2006-03-01

    In the medical field, digital images are becoming more and more important for the diagnosis and therapy of patients. At the same time, new technologies have increased the amount of image data produced in a hospital, creating a demand for access methods that offer more than text-based queries. In this paper, a framework is proposed for the retrieval of medical images that allows different algorithms to be used for similarity search. The framework also enables searching textual information from the associated medical report and from DICOM header information. The proposed system can be used to support clinical decision making and is intended to be integrated with an open-source picture archiving and communication system (PACS). BIRAM has the following advantages: (i) it can accommodate several types of image-similarity search algorithms; (ii) it allows the report to be coded according to a medical dictionary, improving the indexing and retrieval of information; (iii) algorithms can be selectively applied to images with the appropriate characteristics, for instance only to magnetic resonance images. The framework was implemented in the Java language using a MS Access 97 database. It could still be improved through the use of regions of interest (ROIs), indexing with slim-trees, and integration with a PACS server.
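
The two-stage retrieval idea (filter on DICOM header metadata, then rank by image similarity) can be sketched as follows; the field names, feature vectors, and distance measure are illustrative stand-ins, not BIRAM's actual algorithms.

```python
import math

# Toy stand-in for metadata-plus-similarity retrieval: filter on a header
# field (e.g. Modality from the DICOM header), then rank by feature
# distance. The records and 2-D feature vectors are invented.
images = [
    {"id": "a", "modality": "MR", "features": (0.2, 0.9)},
    {"id": "b", "modality": "CT", "features": (0.3, 0.8)},
    {"id": "c", "modality": "MR", "features": (0.8, 0.1)},
]

def query(images, modality, probe, k=2):
    """Return the k images of the given modality closest to the probe."""
    candidates = [im for im in images if im["modality"] == modality]
    return sorted(candidates, key=lambda im: math.dist(im["features"], probe))[:k]

hits = query(images, "MR", probe=(0.25, 0.85))
```

Restricting the similarity search to one modality first is what lets a specialized algorithm be "selectively applied to images with the appropriate characteristics," as the abstract puts it.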

  9. Enterprise-scale image distribution with a Web PACS.

    PubMed

    Gropper, A; Doyle, S; Dreyer, K

    1998-08-01

    The integration of images with existing and new health care information systems poses a number of challenges in a multi-facility network: image distribution to clinicians; making DICOM image headers consistent across information systems; and integration of teleradiology into PACS. A novel, Web-based enterprise PACS architecture introduced at Massachusetts General Hospital provides a solution. Four AMICAS Web/Intranet Image Servers were installed as the default DICOM destination of 10 digital modalities. A fifth AMICAS receives teleradiology studies via the Internet. Each AMICAS includes: a Java-based interface to the IDXrad radiology information system (RIS), a DICOM autorouter to tape-library archives and to the Agfa PACS, a wavelet image compressor/decompressor that preserves compatibility with DICOM workstations, a Web server to distribute images throughout the enterprise, and an extensible interface which permits links between other HIS and AMICAS. Using wavelet compression and Internet standards as its native formats, AMICAS creates a bridge to the DICOM networks of remote imaging centers via the Internet. This teleradiology capability is integrated into the DICOM network and the PACS thereby eliminating the need for special teleradiology workstations. AMICAS has been installed at MGH since March of 1997. During that time, it has been a reliable component of the evolving digital image distribution system. As a result, the recently renovated neurosurgical ICU will be filmless and use only AMICAS workstations for mission-critical patient care.

  10. Digital image archiving: challenges and choices.

    PubMed

    Dumery, Barbara

    2002-01-01

    In the last five years, imaging exam volume has grown rapidly. In addition to increased image acquisition, there is more patient information per study. RIS-PACS integration and information-rich DICOM headers now provide more patient information for each study. The volume of archived digital images is increasing and will continue to rise at a steeper incline than the film-based storage of the past. Many filmless facilities have been caught off guard by this increase, which has been stimulated by many factors. The most significant factor is investment in new digital and DICOM-compliant modalities. A huge volume driver is the increase in images per study from multi-slice technology. Storage requirements are also affected by disaster-recovery initiatives and state retention mandates. This burgeoning rate of imaging data volume presents many challenges: cost of ownership, data accessibility, storage-media obsolescence, database considerations, physical limitations, reliability, and redundancy. There are two basic approaches to archiving--single tier and multi-tier. Each has benefits. With a single-tier approach, all the data is stored on a single medium that can be accessed very quickly. A redundant copy of the data is then stored on another, less expensive medium, usually removable. In this approach, on-line storage is increased incrementally as volume grows. In a multi-tier approach, storage levels are set up based on access speed and cost. In other words, all images are stored at the deepest archiving level, which is also the least expensive. Images are stored on, or moved back to, the intermediate and on-line levels if they need to be accessed more quickly. It can be difficult to decide which approach is best for your organization.
The options include RAIDs (redundant arrays of independent disks), direct-attached RAID storage (DAS), network storage using RAIDs (NAS and SAN), removable media such as different types of tape, compact disks (CDs and DVDs), and magneto-optical disks (MODs). As you evaluate the various options for storage, it is important to consider both performance and cost. For most imaging enterprises, a single-tier archiving approach is the best solution. With the cost of hard drives declining, NAS is a very feasible solution today. It is highly reliable, offers immediate access to all exams, and easily scales as imaging volume grows. Best of all, media-obsolescence challenges need not be a concern. For back-up storage, removable media can be implemented, with a smaller investment needed as it will only be used for a redundant copy of the data. There is no need to keep it online and available. If further system redundancy is desired, multiple servers should be considered. The multi-tier approach still has its merits for smaller enterprises, but with a detailed long-term cost-of-ownership analysis, NAS will probably still come out on top as the solution of choice for many imaging facilities.

  11. Archive of digital boomer seismic reflection data collected during USGS cruises 94GFP01, 95GFP01, 96GFP01, 97GFP01, and 98GFP02 in Lakes Pontchartrain, Borgne, and Maurepas, Louisiana, 1994-1998

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Williams, S. Jeffress; Flocks, James G.; Penland, Shea; Wiese, Dana S.

    2003-01-01

    The U.S. Geological Survey, in cooperation with the University of New Orleans, the Lake Pontchartrain Basin Foundation, the National Oceanic and Atmospheric Administration, the Coalition to Restore Coastal Louisiana, the U.S. Army Corps of Engineers, the Environmental Protection Agency, and the University of Georgia, conducted five geophysical surveys of Lakes Pontchartrain, Borgne, and Maurepas in Louisiana from 1994 to 1998. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained digital GIF image of each seismic profile is provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y headers (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser, and scanned handwritten logbooks may be viewed with Adobe Reader. To access the information contained on these discs, open the file 'index.htm' located at the top level of the discs using a web browser. This report also contains hyperlinks to USGS collaborators and other agencies. These links are only accessible if access to the Internet is available while viewing these documents.

  12. Archive of digital boomer subbottom data collected during USGS cruise 05FGS01 offshore east-central Florida, July 17-29, 2005

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2012-01-01

    In July of 2005, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey (FGS), conducted a geophysical survey of the Atlantic Ocean offshore of Florida's east coast from Flagler Beach to Daytona Beach. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. The USGS Saint Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 05FGS01 tells us the data were collected in 2005 for cooperative work with the FGS and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The boomer subbottom profiling system consists of an acoustic energy source made up of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and, when discharged, emits a short acoustic pulse, or shot, which propagates through the water column and the shallow stratigraphy below. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver (a hydrophone streamer), and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). 
In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y format (Barry and others, 1975), except an ASCII format is used for the first 3,200 bytes of the card image header instead of the standard EBCDIC format. For a detailed description about the recorded trace headers, refer to the SEG Y Format page. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts. The processed SEG Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce a geospatially interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position; this information is displayed in the status bar of the browser. Please note that clicking on the profile image switches it to "Expanded View" (a compressed image of the entire line) and cursor tracking is not available in this mode.
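
Reading the 3,200-byte textual header mentioned above can be sketched as follows; the auto-detection heuristic (checking whether the first card starts with an ASCII 'C') is a common convention rather than part of the SEG-Y standard, and the sample header contents are invented.

```python
# The first 3,200 bytes of a SEG-Y file hold 40 "card images" of 80
# characters each -- traditionally EBCDIC, but plain ASCII in this archive.
def read_textual_header(raw3200):
    """Decode a SEG-Y textual header, guessing ASCII vs EBCDIC."""
    # Card images conventionally start with the letter 'C': 0x43 in ASCII,
    # 0xC3 in EBCDIC. cp037 is Python's codec for EBCDIC (US/Canada).
    encoding = "ascii" if raw3200[:1] == b"C" else "cp037"
    text = raw3200.decode(encoding)
    return [text[i:i + 80] for i in range(0, 3200, 80)]

# Build a fake ASCII header of 40 cards for demonstration purposes.
cards_in = [("C%2d client: USGS cruise 05FGS01" % (i + 1)).ljust(80)
            for i in range(40)]
cards_out = read_textual_header("".join(cards_in).encode("ascii"))
```

A reader that assumes EBCDIC unconditionally would turn this archive's ASCII headers into gibberish, which is why the report flags the deviation explicitly.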

  13. The HST/WFC3 Quicklook Project: A User Interface to Hubble Space Telescope Wide Field Camera 3 Data

    NASA Astrophysics Data System (ADS)

    Bourque, Matthew; Bajaj, Varun; Bowers, Ariel; Dulude, Michael; Durbin, Meredith; Gosmeyer, Catherine; Gunning, Heather; Khandrika, Harish; Martlin, Catherine; Sunnquist, Ben; Viana, Alex

    2017-06-01

    The Hubble Space Telescope's Wide Field Camera 3 (WFC3) instrument, comprised of two detectors, UVIS (Ultraviolet-Visible) and IR (Infrared), has been acquiring ~ 50-100 images daily since its installation in 2009. The WFC3 Quicklook project provides a means for instrument analysts to store, calibrate, monitor, and interact with these data through the various Quicklook systems: (1) a ~ 175 TB filesystem, which stores the entire WFC3 archive on disk, (2) a MySQL database, which stores image header data, (3) a Python-based automation platform, which currently executes 22 unique calibration/monitoring scripts, (4) a Python-based code library, which provides system functionality such as logging, downloading tools, database connection objects, and filesystem management, and (5) a Python/Flask-based web interface to the Quicklook system. The Quicklook project has enabled large-scale WFC3 analyses and calibrations, such as the monitoring of the health and stability of the WFC3 instrument, the measurement of ~ 20 million WFC3/UVIS Point Spread Functions (PSFs), the creation of WFC3/IR persistence calibration products, and many others.
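
The header-database component can be sketched with Python's built-in sqlite3 standing in for the project's MySQL server; the table layout, filenames, and keyword columns below are illustrative, not the actual Quicklook schema.

```python
import sqlite3

# Sketch of a header database: one row per image, with a few header
# keywords promoted to columns so monitoring scripts can query them.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE uvis_headers (filename TEXT PRIMARY KEY, "
    "dateobs TEXT, exptime REAL, filt TEXT)"
)
rows = [
    ("icab01a1q_raw.fits", "2016-01-01", 350.0, "F606W"),
    ("icab01a2q_raw.fits", "2016-01-02", 700.0, "F814W"),
]
conn.executemany("INSERT INTO uvis_headers VALUES (?, ?, ?, ?)", rows)

# A calibration/monitoring script can now select images by header keyword
# without opening any FITS files:
long_f814w = conn.execute(
    "SELECT filename FROM uvis_headers WHERE filt = ? AND exptime > ?",
    ("F814W", 500.0),
).fetchall()
```

Promoting header keywords into an indexed database is what makes archive-wide selections (tens of thousands of images) fast enough for daily automated monitoring.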

  14. The Alaska Arctic Vegetation Archive (AVA-AK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Donald; Breen, Amy; Druckenmiller, Lisa

    The Alaska Arctic Vegetation Archive (AVA-AK, GIVD-ID: NA-US-014) is a free, publicly available database archive of vegetation-plot data from the Arctic tundra region of northern Alaska. The archive currently contains 24 datasets with 3,026 non-overlapping plots. Of these, 74% have geolocation data with 25-m or better precision. Species cover data and header data are stored in a Turboveg database. A standardized Pan Arctic Species List provides a consistent nomenclature for vascular plants, bryophytes, and lichens in the archive. A web-based online Alaska Arctic Geoecological Atlas (AGA-AK) allows viewing and downloading the species data in a variety of formats, and provides access to a wide variety of ancillary data. We conducted a preliminary cluster analysis of the first 16 datasets (1,613 plots) to examine how the spectrum of derived clusters is related to the suite of datasets, habitat types, and environmental gradients. Here, we present the contents of the archive, assess its strengths and weaknesses, and provide three supplementary files that include the data dictionary, a list of habitat types, an overview of the datasets, and details of the cluster analysis.

  15. The Alaska Arctic Vegetation Archive (AVA-AK)

    DOE PAGES

    Walker, Donald; Breen, Amy; Druckenmiller, Lisa; ...

    2016-05-17

    The Alaska Arctic Vegetation Archive (AVA-AK, GIVD-ID: NA-US-014) is a free, publicly available database archive of vegetation-plot data from the Arctic tundra region of northern Alaska. The archive currently contains 24 datasets with 3,026 non-overlapping plots. Of these, 74% have geolocation data with 25-m or better precision. Species cover data and header data are stored in a Turboveg database. A standardized Pan Arctic Species List provides a consistent nomenclature for vascular plants, bryophytes, and lichens in the archive. A web-based online Alaska Arctic Geoecological Atlas (AGA-AK) allows viewing and downloading the species data in a variety of formats, and provides access to a wide variety of ancillary data. We conducted a preliminary cluster analysis of the first 16 datasets (1,613 plots) to examine how the spectrum of derived clusters is related to the suite of datasets, habitat types, and environmental gradients. Here, we present the contents of the archive, assess its strengths and weaknesses, and provide three supplementary files that include the data dictionary, a list of habitat types, an overview of the datasets, and details of the cluster analysis.

  20. The NOAO NEWFIRM Data Handling System

    NASA Astrophysics Data System (ADS)

    Zárate, N.; Fitzpatrick, M.

    2008-08-01

    The NOAO Extremely Wide-Field IR Mosaic (NEWFIRM) is a new 1-2.4 micron IR camera now being commissioned for the 4m Mayall telescope at Kitt Peak. The focal plane consists of a 2x2 mosaic of 2048x2048 arrays, offering a field of view 27.6' on a side. The use of dual MONSOON array controllers permits very fast readout, and a scripting interface allows for highly efficient observing modes. We describe the Data Handling System (DHS) for the NEWFIRM camera, which is designed to meet the performance requirements of the instrument as well as the observing environment in which it operates. It is responsible for receiving the data stream from the detector and instrument software, rectifying the image geometry, presenting a real-time display of the image to the user, assembling the final science-grade image with complete headers, and triggering automated pipeline and archival functions. The DHS uses an event-based messaging system to control multiple processes on a distributed network of machines. The asynchronous nature of this processing means the DHS operates independently of the camera readout, and the design of the system is inherently scalable to larger focal planes that use a greater number of array controllers. Current status and future plans for the DHS are also discussed.

  1. Radiation dose and image quality for paediatric interventional cardiology

    NASA Astrophysics Data System (ADS)

    Vano, E.; Ubeda, C.; Leyton, F.; Miranda, P.

    2008-08-01

    Radiation dose and image quality for paediatric protocols in a biplane x-ray system used for interventional cardiology have been evaluated. Entrance surface air kerma (ESAK) and image quality were measured using a test object and polymethyl methacrylate (PMMA) phantoms spanning typical paediatric patient thicknesses (4-20 cm of PMMA). Images from fluoroscopy (low, medium and high) and cine modes were archived in Digital Imaging and Communications in Medicine (DICOM) format. Signal-to-noise ratio (SNR), figure of merit (FOM), contrast (CO), contrast-to-noise ratio (CNR) and high-contrast spatial resolution (HCSR) were computed from the images. Dose data transferred to the DICOM header were used to test the values of the dosimetric display at the interventional reference point. ESAK for fluoroscopy modes ranges from 0.15 to 36.60 µGy/frame when moving from 4 to 20 cm of PMMA. For cine, these values range from 2.80 to 161.10 µGy/frame. SNR, FOM, CO, CNR and HCSR are improved in the high fluoroscopy and cine modes and remain roughly constant across the different thicknesses. The cumulative dose at the interventional reference point was 25-45% higher than the skin dose for the vertical C-arm (depending on the phantom thickness). ESAK and numerical image-quality parameters allow verification of the proper setting of the x-ray system. Knowing the increase in dose per frame with increasing phantom thickness, together with the image-quality parameters, will help cardiologists manage patient dose and select the best acquisition mode during clinical procedures.

  2. Review on the Celestial Sphere Positioning of FITS Format Image Based on WCS and Research on General Visualization

    NASA Astrophysics Data System (ADS)

    Song, W. M.; Fan, D. W.; Su, L. Y.; Cui, C. Z.

    2017-11-01

    Calculating the coordinate parameters recorded as key/value pairs in the FITS (Flexible Image Transport System) header is the key to determining a FITS image's position in the celestial coordinate system, so a general procedure for calculating these parameters is of great value. The parameters can be calculated effectively by combining the CCD-related parameters of the astronomical telescope (such as field of view, focal length, and the celestial coordinates of the optical axis), a star-pattern recognition algorithm, and WCS (World Coordinate System) theory. The CCD parameters determine the scope of the star catalogue, so they can be used to build a reference catalogue covering the celestial region of the astronomical image. Star-pattern recognition then matches the astronomical image against the reference catalogue and produces a table relating the CCD plane coordinates of a number of stars to their celestial coordinates. Based on the chosen projection of the sphere onto the plane, WCS builds the transfer functions between these two coordinate systems, and the astronomical position of any image pixel can then be determined from the matched table. FITS is a mainstream format for transmitting and analyzing scientific data, but FITS images can only be viewed, edited, and analyzed in professional astronomy software, which limits their use in popular science education; realizing a general visualization method is therefore significant. The FITS file is first converted to a PNG or JPEG image; the coordinate parameters in the FITS header are converted to metadata in the form of AVM (Astronomy Visualization Metadata), and the metadata is then added to the PNG or JPEG header. This method meets amateur astronomers' general needs for viewing and analyzing astronomical images on non-astronomical software platforms. 
The overall design flow is implemented in Java and was tested with SExtractor, WorldWide Telescope, an ordinary picture viewer, and other software.
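
The pixel-to-sky step that such WCS keywords encode can be sketched, in its linear approximation, as follows; the keyword values are invented, and the nonlinear projection step (e.g. the TAN deprojection) is deliberately omitted, so the result is only valid near the reference pixel.

```python
# Linear part of a FITS WCS: pixel -> sky using CRPIX/CRVAL and the CD
# matrix. The keyword values below are invented for illustration.
CRPIX1, CRPIX2 = 512.0, 512.0          # reference pixel (1-based in FITS)
CRVAL1, CRVAL2 = 150.0, 2.0            # its celestial coordinates (deg)
CD = [[-0.0002, 0.0], [0.0, 0.0002]]   # deg/pixel transformation matrix

def pixel_to_sky(x, y):
    """Small-offset approximation, ignoring the spherical projection."""
    dx, dy = x - CRPIX1, y - CRPIX2
    ra = CRVAL1 + CD[0][0] * dx + CD[0][1] * dy
    dec = CRVAL2 + CD[1][0] * dx + CD[1][1] * dy
    return ra, dec

ra, dec = pixel_to_sky(612.0, 412.0)
```

AVM carries essentially these same quantities (reference pixel, reference coordinates, scale/rotation) in the PNG or JPEG header, which is what lets non-astronomical viewers place the image on the sky.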

  3. The Small Bodies Imager Browser --- finding asteroid and comet images without pain

    NASA Astrophysics Data System (ADS)

    Palmer, E.; Sykes, M.; Davis, D.; Neese, C.

    2014-07-01

    To facilitate accessing and downloading spatially resolved imagery of asteroids and comets in the NASA Planetary Data System (PDS), we have created the Small Bodies Image Browser, an HTML5 web page that runs inside a standard web browser with no installation required (http://sbn.psi.edu/sbib/). The volume of data returned by spacecraft missions has grown substantially over the last decade. While this wealth of data provides scientists with ample support for research, it has greatly increased the difficulty of managing, accessing and processing the data. Further, the complexity necessary for a long-term archive results in an architecture that is efficient for computers, but not user friendly. The Small Bodies Image Browser (SBIB) is tied into the PDS archive of the Small Bodies Asteroid Subnode hosted at the Planetary Science Institute [1]. Currently, the tool contains the entire repository of the Dawn mission's encounter with Vesta [2], and we will be adding other datasets in the future. For Vesta, this includes both the level 1A and 1B images from the Framing Camera (FC) and the level 1B spectral cubes from the Visual and Infrared (VIR) spectrometer, providing over 30,000 individual images. A key strength of the tool is quick and easy access to these data. The tool allows searches based on clicking on a map or typing in coordinates. The SBIB can show an entire mission phase (such as cycle 7 of the Low Altitude Mapping Orbit) with the associated footprints, and can also search by image name. Searches can be narrowed by mission phase, resolution or instrument. Imagery archived in the PDS is generally provided by missions in a single format or a narrow range of formats. To enhance the value and usability of these data to researchers, SBIB makes them available in the original formats as well as PNG, JPEG and ArcGIS-compatible ISIS cubes [3]. Additionally, we provide header files for the VIR cubes so they can be read into ENVI without additional processing. 
Finally, we also provide both camera-based and map-projected products with geometric data embedded for use within ArcGIS and ISIS. We use the Gaskell shape model for terrain projections [4]. Several other outstanding data-analysis tools have access to asteroid and comet data: JAsteroid (a derivative of JMARS [5]) and the Applied Physics Laboratory's Small Body Mapping Tool [6]. The SBIB has specifically focused on providing data in the easiest manner possible rather than trying to be an analytical tool.

  4. Determining approximate age of digital images using sensor defects

    NASA Astrophysics Data System (ADS)

    Fridrich, Jessica; Goljan, Miroslav

    2011-02-01

    The goal of temporal forensics is to establish temporal relationships among two or more pieces of evidence. In this paper, we focus on digital images and describe a method with which an analyst can estimate the acquisition time of an image, given a set of other images from the same camera whose time ordering is known. This is achieved by first estimating the parameters of pixel defects, including their onsets, and then detecting their presence in the image under investigation. Both estimators are constructed using the maximum-likelihood principle. The accuracy and limitations of this approach are illustrated in experiments with three cameras. Forensic and law-enforcement analysts are expected to benefit from this technique in situations when the temporal data stored in the EXIF header are lost due to processing or editing images off-line, or when the header cannot be trusted. Reliable methods for establishing temporal order between individual pieces of evidence can help reveal deception attempts by an adversary or a criminal. The causal relationship may also provide information about the whereabouts of the photographer.
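The onset-estimation idea described above can be sketched as a small maximum-likelihood search. This is an illustrative toy, not the authors' estimator: it assumes each image yields a binary defect detection with an assumed false-alarm rate before onset and miss rate after onset, and picks the onset index that maximizes the log-likelihood.

```python
import math

def estimate_onset(detections, p_fa=0.01, p_miss=0.1):
    """Maximum-likelihood estimate of the index at which a pixel defect
    first appears, given time-ordered binary detections (1 = defect seen).

    Assumed model: before the onset a detection occurs with probability
    p_fa (false alarm); from the onset on, it is missed with probability
    p_miss.  Returning len(detections) means "never defective"."""
    n = len(detections)
    best_onset, best_ll = None, -math.inf
    for onset in range(n + 1):
        ll = 0.0
        for i, d in enumerate(detections):
            if i < onset:
                p = p_fa if d else 1 - p_fa
            else:
                p = 1 - p_miss if d else p_miss
            ll += math.log(p)
        if ll > best_ll:
            best_onset, best_ll = onset, ll
    return best_onset
```

For a defect that appears at image index 4 (with one missed detection afterwards), `estimate_onset([0, 0, 0, 0, 1, 1, 0, 1])` recovers onset 4; an all-zero history returns the "never defective" value.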

  5. Medical image security in a HIPAA mandated PACS environment.

    PubMed

    Cao, F; Huang, H K; Zhou, X Q

    2003-01-01

    Medical image security is an important issue when digital images and their pertinent patient information are transmitted across public networks. Mandates for ensuring health data security have been issued by the federal government, such as the Health Insurance Portability and Accountability Act (HIPAA), under which healthcare institutions are obliged to take appropriate measures to ensure that patient information is provided only to people who have a professional need. Guidelines, such as the Digital Imaging and Communications in Medicine (DICOM) standards that deal with security issues, continue to be published by organizing bodies in healthcare. However, there are many differences in implementation, especially for an integrated system like a picture archiving and communication system (PACS), and the infrastructure to deploy these security standards is often lacking. Over the past 6 years, members of the Image Processing and Informatics Laboratory, Children's Hospital, Los Angeles/University of Southern California, have actively researched image security issues related to PACS and teleradiology. The paper summarizes our previous work and presents an approach to further research on the digital envelope (DE) concept, which provides image integrity and security assurance in addition to conventional network security protection. The DE, including the digital signature (DS) of the image as well as encrypted patient information from the DICOM image header, can be embedded in the background area of the image as an invisible permanent watermark. The paper outlines the systematic development, evaluation and deployment of the DE method in a PACS environment. We have also proposed a dedicated PACS security server that will act as an image authority to check and certify the image origin and integrity upon request by a user, and meanwhile also act as a secure DICOM gateway for outside connections and a PACS operation monitor for HIPAA-supporting information.
Copyright 2002 Elsevier Science Ltd.

  6. Visualization of JPEG Metadata

    NASA Astrophysics Data System (ADS)

    Malik Mohamad, Kamaruddin; Deris, Mustafa Mat

    A JPEG image embeds more information than just its graphics. Visualization of this metadata benefits digital forensic investigators, allowing them to view embedded data, even in corrupted images where no graphics can be displayed, to assist evidence collection in cases such as child pornography or steganography. Tools such as metadata readers, editors and extraction tools already exist, but they mostly focus on visualizing the attribute information of JPEG Exif; none consolidates a marker summary, the header structure, the Huffman tables and the quantization tables in a single program. In this paper, metadata visualization is accomplished by developing a program that summarizes all existing markers, the header structure, the Huffman tables and the quantization tables in a JPEG file. The results show that this visualization makes the hidden information within a JPEG easier to view.
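A marker summary of the kind described can be produced by walking the JPEG segment structure directly. The sketch below is a minimal top-level scanner (not the paper's program): every JPEG segment starts with 0xFF followed by a marker byte, and most segments carry a big-endian two-byte length that includes itself.

```python
def scan_jpeg_markers(data: bytes):
    """Walk the top-level segment markers of a JPEG byte stream and return
    a list of (marker, segment_length) pairs.  A minimal sketch: it stops
    at SOS (0xFFDA), since entropy-coded data follows with no length
    fields, and at EOI (0xFFD9)."""
    markers = []
    assert data[0:2] == b"\xff\xd8"        # SOI starts every JPEG file
    markers.append(("SOI", 0))
    i = 2
    while i < len(data):
        assert data[i] == 0xFF             # every marker begins with 0xFF
        marker = data[i + 1]
        if marker == 0xD9:                 # EOI carries no payload
            markers.append(("EOI", 0))
            break
        # Segment length is big-endian and includes its own two bytes
        length = int.from_bytes(data[i + 2:i + 4], "big")
        markers.append((f"0x{marker:02X}", length))
        if marker == 0xDA:                 # SOS: compressed scan follows
            break
        i += 2 + length
    return markers
```

On a synthetic stream containing only SOI, an APP0 segment and EOI, this yields the expected three-entry summary; a real visualizer would additionally decode DHT (Huffman) and DQT (quantization) payloads.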

  7. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  8. 36 CFR § 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  9. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  10. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  11. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  12. Incorporating the APS Catalog of the POSS I and Image Archive in ADS

    NASA Technical Reports Server (NTRS)

    Humphreys, Roberta M.

    1998-01-01

    The primary purpose of this contract was to develop the software to both create and access an on-line database of images from digital scans of the Palomar Sky Survey. This required modifying our DBMS (called Star Base) to create an image database from the actual raw pixel data from the scans. The digitized images are processed into a set of coordinate-reference index and pixel files that are stored in run-length files, thus achieving an efficient lossless compression. For efficiency and ease of referencing, each digitized POSS I plate is then divided into 900 subplates. Our custom DBMS maps each query into the corresponding POSS plate(s) and subplate(s). All images from the appropriate subplates are retrieved from disk with byte-offsets taken from the index files. These are assembled on-the-fly into a GIF image file for browser display, and a FITS format image file for retrieval. The FITS images have a pixel size of 0.33 arcseconds. The FITS header contains astrometric and photometric information. This method keeps the disk requirements manageable while allowing for future improvements. When complete, the APS Image Database will contain over 130 GB of data. A set of web-based query forms is available on-line, as well as an on-line tutorial and documentation. The database is distributed to the Internet by a high-speed SGI server and a high-bandwidth disk system. URL is http://aps.umn.edu/IDB/. The image database software is written in Perl and C and has been compiled on SGI computers with MIX5.3. A copy of the written documentation is included and the software is on the accompanying exabyte tape.
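The run-length storage mentioned above can be illustrated with a few lines of code. This is a generic sketch of lossless run-length encoding, not the Star Base implementation: sky-survey scans compress well this way because large background regions form long runs of identical pixel values.

```python
def rle_encode(pixels):
    """Losslessly encode a pixel sequence as (value, run_length) pairs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([p, 1])       # start a new run
    return runs

def rle_decode(runs):
    """Exact inverse of rle_encode: expand each run back to pixels."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out
```

The round trip is exact, which is what makes the compression lossless; the index files described in the abstract would then record byte offsets into such run-length streams for each subplate.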

  13. 10 CFR 2.1013 - Use of the electronic docket during the proceeding.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... searchable full text, by header and image, as appropriate. (b) Absent good cause, all exhibits tendered... circumstances where submitters may need to use an image scanned before January 1, 2004, in a document created after January 1, 2004, or the scanning process for a large, one-page image may not successfully complete...

  14. 10 CFR 2.1013 - Use of the electronic docket during the proceeding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... header and image, as appropriate. (b) Absent good cause, all exhibits tendered during the hearing must... may need to use an image scanned before January 1, 2004, in a document created after January 1, 2004, or the scanning process for a large, one-page image may not successfully complete at the 300 dpi...

  15. Current Document Handling Procedures at Defense Technical Information Center

    DTIC Science & Technology

    1985-11-01

    File so that a microfiche header can be made for the document. Headers are generated on magnetic tape, and a paper copy of each header is printed for...review. If a header contains an error, corrections are made to the initial tape and a printout of the corrected header is reviewed before approval is...made and the header released. The final tape is sent to Micrographics for inclusion in the microfiche copy. The header tape usually reaches

  16. Teleradiology mobile internet system with a new information security solution

    NASA Astrophysics Data System (ADS)

    Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kusumoto, Masahiko; Kaneko, Masahiro; Moriyama, Noriyuki

    2014-03-01

    We have developed an external storage system using a secret sharing scheme and tokenization for regional medical cooperation, PHR services and information preservation. As the use of mobile devices such as smartphones and tablets accelerates for PHR services, confidential medical information is exposed to the risk of damage and interception. In this work we measured the transfer rates for sending data to and receiving data from the external storage system, which is connected to PACS over the Internet. The external storage consists of data centers in Okinawa, Osaka, Sapporo and Tokyo that use the secret sharing scheme. PACS continuously transmitted 382 CT images, totalling about 200 MB, to the external data centers; the transmission took about 250 seconds. Preservation by secret sharing provides strong security, but it also makes the transfer time of this system excessive. Therefore, in our method the DICOM data are anonymized by masking the header information. The anonymized DICOM data are preserved in the database in the hospital. The header information, which includes personal information, is divided into two or more shares by the secret sharing scheme and preserved at two or more external data centers. The token that relates the anonymized DICOM data to the externally preserved header information is strictly guarded in a token server. The header information containing the patient's personal information is only about 2% of the entire DICOM data, and its transmission took only about 5 seconds. Other common solutions that protect computer communication networks from attack are classified as cryptographic techniques or authentication techniques. An individual-number IC card is connected to the electronic certification authority of the web medical image conference system, and is issued only to persons authorized to operate that system.
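The header-splitting step can be illustrated with the simplest secret-sharing case. This sketch uses 2-of-2 XOR splitting, a special case of the general schemes the paper applies across several data centers: neither share alone reveals anything about the header, but both together reconstruct it exactly.

```python
import os

def split_header(header: bytes):
    """2-of-2 XOR secret split of a (notionally DICOM) header.

    share1 is uniformly random, so each share alone is statistically
    independent of the header; XORing the shares restores it."""
    share1 = os.urandom(len(header))
    share2 = bytes(a ^ b for a, b in zip(header, share1))
    return share1, share2

def join_header(share1: bytes, share2: bytes) -> bytes:
    """Recombine the two shares held at separate data centers."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

In the described architecture, the two shares would live at different external data centers, while a token server holds the mapping between the anonymized in-hospital DICOM data and its externally stored header shares.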

  17. A New Compression Method for FITS Tables

    NASA Technical Reports Server (NTRS)

    Pence, William; Seaman, Rob; White, Richard L.

    2010-01-01

    As the size and number of FITS binary tables generated by astronomical observatories increase, so does the need for a more efficient compression method to reduce the amount of disk space and network bandwidth required to archive and download the data tables. We have developed a new compression method for FITS binary tables that is modeled after the FITS tiled-image compression convention that has been in use for the past decade. Tests of this new method on a sample of FITS binary tables from a variety of current missions show that on average this new compression technique saves about 50% more disk space than simply compressing the whole FITS file with gzip. Other advantages of this method are (1) the compressed FITS table is itself a valid FITS table, (2) the FITS headers remain uncompressed, thus allowing rapid read and write access to the keyword values, and (3) in the common case where the FITS file contains multiple tables, each table is compressed separately and may be accessed without having to uncompress the whole file.
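The design advantages (2) and (3) above can be demonstrated with a toy model. This is not the FITS tiled-table convention itself, just a sketch of the principle: compress each table's data block separately while leaving its header untouched, so keywords stay readable without any decompression.

```python
import gzip

def compress_tables(tables):
    """Compress each (header, data) table separately; headers (modelled
    here as plain dicts) are left uncompressed."""
    return [(header, gzip.compress(data)) for header, data in tables]

def read_keyword(compressed, index, key):
    """Read a header keyword from a compressed archive.  Because headers
    are never compressed, no decompression is needed."""
    header, _ = compressed[index]
    return header[key]
```

Reading a keyword touches no compressed bytes, and decompressing one table (`gzip.decompress(compressed[i][1])`) does not require unpacking any other table in the file.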

  18. HRSCview: a web-based data exploration system for the Mars Express HRSC instrument

    NASA Astrophysics Data System (ADS)

    Michael, G.; Walter, S.; Neukum, G.

    2007-08-01

    The High Resolution Stereo Camera (HRSC) on the ESA Mars Express spacecraft has been orbiting Mars since January 2004. By spring 2007 it had returned around 2 terabytes of image data, covering around 35% of the Martian surface in stereo and colour at a resolution of 10-20 m/pixel. HRSCview provides a rapid means to explore these images up to their full resolution with the data-subsetting, sub-sampling, stretching and compositing being carried out on-the-fly by the image server. It is a joint website of the Free University of Berlin and the German Aerospace Center (DLR). The system operates by on-the-fly processing of the six HRSC level-4 image products: the map-projected ortho-rectified nadir panchromatic and four colour channels, and the stereo-derived DTM (digital terrain model). The user generates a request via the web-page for an image with several parameters: the centre of the view in surface coordinates, the image resolution in metres/pixel, the image dimensions, and one of several colour modes. If there is HRSC coverage at the given location, the necessary segments are extracted from the full orbit images, resampled to the required resolution, and composited according to the user's choice. In all modes the nadir channel, which has the highest resolution, is included in the composite so that the maximum detail is always retained. The images are stretched according to the current view: this applies to the elevation colour scale, as well as the nadir brightness and the colour channels. There are modes for raw colour, stretched colour, enhanced colour (exaggerated colour differences), and a synthetic 'Mars-like' colour stretch. A colour ratio mode is given as an alternative way to examine colour differences (R=IR/R, G=R/G and B=G/B). The final image is packaged as a JPEG file and returned to the user over the web. Each request requires approximately 1 second to process.
    A link is provided from each view to a data product page, where header items describing the full map-projected science data product are displayed, and a direct link to the archived data products on the ESA Planetary Science Archive (PSA) is provided. At present the majority of the elevation composites are derived from the HRSC Preliminary 200m DTMs generated at the German Aerospace Center (DLR), which will not be available as separately downloadable data products. These DTMs are being progressively superseded by systematically generated higher resolution archival DTMs, also from DLR, which will become available for download through the PSA, and be similarly accessible via HRSCview. At the time of writing this abstract (May 2007), four such high resolution DTMs are available for download via the HRSCview data product pages (for images from orbits 0572, 0905, 1004, and 2039).
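The colour-ratio mode quoted above (R=IR/R, G=R/G, B=G/B) is simple enough to state directly in code. This sketch applies the ratios per pixel to flat channel lists; the real server would operate on image arrays and then stretch the result.

```python
def colour_ratio(ir, r, g, b):
    """Per-pixel colour-ratio composite in the HRSCview sense:
    output R = IR/R, G = R/G, B = G/B.  Channels are equal-length
    sequences of positive floats."""
    out_r = [i / x for i, x in zip(ir, r)]
    out_g = [x / y for x, y in zip(r, g)]
    out_b = [y / z for y, z in zip(g, b)]
    return out_r, out_g, out_b
```

Because each output channel is a ratio of adjacent bands, flat-spectrum surfaces map to uniform values and only spectral slope differences produce colour contrast, which is why this mode is useful for examining colour differences.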

  19. miniSEED: The Backbone Data Format for Seismological Time Series

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Benson, R. B.; Trabant, C. M.

    2017-12-01

    In 1987, the International Federation of Digital Seismograph Networks (FDSN) adopted the Standard for the Exchange of Earthquake Data (SEED) format for data archiving and exchange of seismological time series data. Since that time, the format has evolved to accommodate new capabilities and features. For example, a notable change in 1992 allowed the format, which includes both comprehensive metadata and the time series samples, to be used in two additional forms: 1) a container for metadata only, called "dataless SEED", and 2) a stand-alone structure for time series, called "miniSEED". While specifically designed for seismological data and related metadata, this format has proven useful for a wide variety of geophysical time series data. Many FDSN data centers now store temperature, pressure, infrasound, tilt and other time series measurements in this internationally used format. Since April 2016, members of the FDSN have been in discussions to design a next-generation miniSEED format to accommodate current and future needs, to further generalize the format, and to address a number of historical problems and limitations. We believe the correct approach is to simplify the header, allow for arbitrary header additions, expand the current identifiers, and allow for anticipated future identifiers which are currently unknown. We also believe the primary goal of the format is efficient archiving, selection and exchange of time series data. By focusing on these goals we avoid generalizing the format too broadly into specialized areas such as efficient low-latency delivery, or including unbounded non-time-series data. Our presentation will provide an overview of this format and highlight its most valuable characteristics for time series data from any geophysical domain or beyond.
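The "simple fixed header plus arbitrary header additions" design can be sketched as a record layout. The field layout below is purely illustrative and is not the miniSEED specification: a small fixed binary part carries the essential identifiers, and a variable-length JSON block holds arbitrary extra headers.

```python
import json
import struct

# Hypothetical simplified record header: 8-byte identifier, sample count,
# sample rate, and the byte length of a JSON extra-header block.
FIXED = struct.Struct("<8sIdI")

def pack_record(ident, nsamp, rate, extra):
    """Serialize one record header; 'extra' is any JSON-serializable dict,
    standing in for the format's arbitrary header additions."""
    blob = json.dumps(extra).encode()
    return FIXED.pack(ident.ljust(8)[:8].encode(), nsamp, rate, len(blob)) + blob

def unpack_record(buf):
    """Inverse of pack_record."""
    ident, nsamp, rate, xlen = FIXED.unpack_from(buf, 0)
    extra = json.loads(buf[FIXED.size:FIXED.size + xlen])
    return ident.decode().strip(), nsamp, rate, extra
```

The appeal of this shape is that archive tooling can select records on the fixed fields alone, while future, currently unknown identifiers go into the extensible block without a format revision.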

  20. Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques

    DTIC Science & Technology

    2002-08-01

    compressing the bit-planes. The algorithm always starts with inspecting the 5th LSB plane. For color images, all three color channels are compressed...use classical encryption engines, such as IDEA or DES. These algorithms have a fixed encryption block size, and, depending on the image dimensions, we...information can be stored either in a separate file, in the image header, or embedded in the image itself utilizing the modern concepts of steganography

  1. Archive of Boomer and Chirp Seismic Reflection Data Collected During USGS Cruise 01RCE02, Southern Louisiana, April and May 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2003-01-01

    In April and May of 2001, the U.S. Geological Survey conducted a geophysical study of the Mississippi River Delta, Atchafalaya River Delta, and Shell Island Pass in southern Louisiana. This study was part of a larger USGS River Contaminant Evaluation (RCE) Project. This disc serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, shotpoint navigation maps, observers' logbooks, GIS information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a filtered and gained digital GIF-formatted image of each seismic profile is provided. For your convenience, a list of acronyms and abbreviations frequently used in this report has also been provided. This DVD (Digital Versatile Disc) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information with a web browser (e.g., Netscape or Internet Explorer). To access the information contained on this disc, open the file 'index.htm' located at the top level of the disc using your web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the internet is available while viewing the DVD. The archived boomer and chirp seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes. Examples of SU processing scripts are provided in the boom.tar and chirp.tar files located in the SU subfolder of the SOFTWARE folder located at the top level of this DVD.
In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zilhman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, shotpoint navigation maps, logbooks, and formal metadata may be viewed with your web browser.
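Viewing SEG-Y headers, as DUMPSEGY.EXE does, mostly means decoding fixed-offset binary fields. The sketch below is an independent illustration (not the USGS tool): it reads three commonly used fields of the 400-byte binary file header that follows the 3200-byte textual header, using the standard SEG-Y big-endian 16-bit layout.

```python
def read_segy_binary_header(buf: bytes):
    """Extract a few fields from a SEG-Y binary file header.

    Offsets are 0-based byte positions into the whole file, per the
    standard SEG-Y layout (textual header occupies bytes 0-3199)."""
    def int16(off):
        return int.from_bytes(buf[off:off + 2], "big")
    return {
        "sample_interval_us": int16(3216),   # microseconds per sample
        "samples_per_trace": int16(3220),
        "data_format_code": int16(3224),     # e.g. 1 = 4-byte IBM float
    }
```

A header dumper would print these alongside the EBCDIC textual header and per-trace headers; processing packages like Seismic Unix read the same fields to interpret the trace data that follows.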

  3. TAPAS, a VO archive at the IRAM 30-m telescope

    NASA Astrophysics Data System (ADS)

    Leon, Stephane; Espigares, Victor; Ruíz, José Enrique; Verdes-Montenegro, Lourdes; Mauersberger, Rainer; Brunswig, Walter; Kramer, Carsten; Santander-Vela, Juan de Dios; Wiesemeyer, Helmut

    2012-07-01

    Astronomical observatories today generate increasingly large volumes of data. To use these data efficiently, databases have been built following the standards proposed by the International Virtual Observatory Alliance (IVOA), providing a common protocol to query them and make them interoperable. The IRAM 30-m radio telescope, located in Sierra Nevada (Granada, Spain), is a millimeter-wavelength telescope with a constantly renewed, extensive choice of instruments, capable of covering the frequency range between 80 and 370 GHz. It continuously produces a large amount of data thanks to the more than 200 scientific projects observed each year. The TAPAS archive at the IRAM 30-m telescope aims to provide public access to the headers describing the observations performed with the telescope, according to a defined data policy, while also making the technical data available to IRAM staff members. Special emphasis has been placed on making it Virtual Observatory (VO) compliant and on offering a VO-compliant web interface that makes the information available to the scientific community. TAPAS is built using the Django Python framework on top of a relational MySQL database, and is fully integrated with the telescope control system. The TAPAS data model (DM) is based on the Radio Astronomical DAta Model for Single dish radio telescopes (RADAMS), to allow for easy integration into the VO infrastructure. A metadata modeling layer is used by the data-filler to allow an implementation free from assumptions about the control system and the underlying database. TAPAS and its public web interface ( http://tapas.iram.es ) provide a scalable system that can evolve with new instruments and observing modes. A meta description of the DM has been introduced in TAPAS both to avoid undesired coupling between the code and the DM and to provide better management of the archive. A subset of the header data stored in TAPAS will be made available at the CDS.

  4. VO-Compatible Architecture for Managing and Processing Images of Moving Celestial Bodies : Application to the Gaia-GBOT Project

    NASA Astrophysics Data System (ADS)

    Barache, C.; Bouquillon, S.; Carlucci, T.; Taris, F.; Michel, L.; Altmann, M.

    2013-10-01

    The Ground Based Optical Tracking (GBOT) group is part of the Data Processing and Analysis Consortium, the large consortium of over 400 scientists from many European countries charged with the scientific conduct of the Gaia mission for ESA. The GBOT group is in charge of the optical tracking of the Gaia satellite. This optical tracking is necessary for the Gaia mission to fully reach its goal in terms of astrometric precision. The observations will be made daily, during the 5 years of the mission, using optical CCD frames taken by a small network of 1-2 m class telescopes located all over the world. The requirement on the accuracy of the satellite position determination, with respect to the stars in the field of view, is 20 mas. These optical satellite positions will be sent weekly by GBOT to the SOC at ESAC and used together with other kinds of observations (radio ranging and Doppler) by the MOC at ESOC to improve the Gaia ephemeris. For this purpose, we developed a set of accurate astrometric reduction programs specially adapted for tracking moving objects. The inputs of these programs for each tracked target are an ephemeris and a set of FITS images. The outputs for each image are: a file containing all information about the detected objects, a catalogue file used for calibration, a TIFF file for visual explanation of the reduction result, and an improved FITS image header. The final result is an overview file containing only the data related to the target, extracted from all the images. These programs are written in GNU Fortran 95 and provide results in VOTable format (supported by Virtual Observatory protocols). All these results are sent automatically into the GBOT database, which is built with the SAADA freeware. The user of this database can archive and query the data but also, thanks to the delegate option provided by SAADA, select a set of images and directly run the GBOT reduction programs with a dedicated web interface. For more information about SAADA (an Automatic System for Astronomy Data Archive, under GPL license and VO-compatible), see the related paper Michel et al. (2013).

  5. Optical Circuit Switched Protocol

    NASA Technical Reports Server (NTRS)

    Monacos, Steve P. (Inventor)

    2000-01-01

    The present invention is a system and method embodied in an optical circuit switched protocol for the transmission of data through a network. The optical circuit switched protocol is an all-optical circuit switched network and includes novel optical switching nodes for transmitting optical data packets within a network. Each optical switching node comprises a detector for receiving the header, header detection logic for translating the header into routing information and eliminating the header, and a controller for receiving the routing information and configuring an all-optical path within the node. The all-optical path located within the node is solely an optical path, without electronic storage of the data and without optical delay of the data. Since electronic storage of the header is not necessary and the initial header is eliminated by the detector of the first switching node, multiple identical headers are sent throughout the network so that subsequent switching nodes can receive and read the header for setting up an optical data path.
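The consequence of each node absorbing one header copy can be made concrete with a tiny simulation. This is an illustrative model only (names and structure are invented, not from the patent): the sender transmits as many identical header copies as there are hops, each node consumes one copy to configure its switch, and the payload passes through untouched.

```python
def route_packet(headers, payload, nodes):
    """Simulate header consumption along an all-optical path.

    headers: identical header dicts, one per hop (each node's detector
    absorbs exactly one copy); payload: the optical data, never stored
    electronically in this model.  Returns the configured path and the
    unchanged payload."""
    assert len(headers) >= len(nodes), "need one header copy per hop"
    path = []
    for node, header in zip(nodes, headers):
        # The node's controller uses the routing info, then discards it.
        path.append((node, header["dest"]))
    return path, payload
```

Sending two identical headers through a two-node path configures both switches from their own header copy, which is exactly why the protocol duplicates the header rather than regenerating it at each hop.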

  6. SPINS: standardized protein NMR storage. A data dictionary and object-oriented relational database for archiving protein NMR spectra.

    PubMed

    Baran, Michael C; Moseley, Hunter N B; Sahota, Gurmukh; Montelione, Gaetano T

    2002-10-01

    Modern protein NMR spectroscopy laboratories have a rapidly growing need for an easily queried local archival system of raw experimental NMR datasets. SPINS (Standardized ProteIn Nmr Storage) is an object-oriented relational database that provides facilities for high-volume NMR data archival, organization of analyses, and dissemination of results to the public domain by automatic preparation of the header files required for submission of data to the BioMagResBank (BMRB). The current version of SPINS coordinates the process from data collection to BMRB deposition of raw NMR data by standardizing and integrating the storage and retrieval of these data in a local laboratory file system. Additional facilities include a data mining query tool, graphical database administration tools, and an NMRStar v2.1.1 file generator. SPINS also includes a user-friendly internet-based graphical user interface, which is optionally integrated with Varian VNMR NMR data collection software. This paper provides an overview of the data model underlying the SPINS database system, a description of its implementation in Oracle, and an outline of future plans for the SPINS project.

  7. A case study in adaptable and reusable infrastructure at the Keck Observatory Archive: VO interfaces, moving targets, and more

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce; Cohen, Richard W.; Colson, Andrew; Gelino, Christopher R.; Good, John C.; Kong, Mihseh; Laity, Anastasia C.; Mader, Jeffrey A.; Swain, Melanie A.; Tran, Hien D.; Wang, Shin-Ywan

    2016-08-01

    The Keck Observatory Archive (KOA) (https://koa.ipac.caltech.edu) curates all observations acquired at the W. M. Keck Observatory (WMKO) since it began operations in 1994, including data from eight active instruments and two decommissioned instruments. The archive is a collaboration between WMKO and the NASA Exoplanet Science Institute (NExScI). Since its inception in 2004, the science information system used at KOA has adopted an architectural approach that emphasizes software re-use and adaptability. This paper describes how KOA is currently leveraging and extending open source software components to develop new services and to support delivery of a complete set of instrument metadata, which will enable more sophisticated and extensive queries than currently possible. In August 2015, KOA deployed a program interface to discover public data from all instruments equipped with an imaging mode. The interface complies with version 2 of the Simple Image Access Protocol (SIAP), under development by the International Virtual Observatory Alliance (IVOA), which defines a standard mechanism for discovering images through spatial queries. The heart of the KOA service is an R-tree-based, database-indexing mechanism prototyped by the Virtual Astronomical Observatory (VAO) and further developed by the Montage Image Mosaic project, designed to provide fast access to large imaging data sets as a first step in creating wide-area image mosaics (such as mosaics of subsets of the 4.7 million images of the SDSS DR9 release). The KOA service uses the results of the spatial R-tree search to create an SQLite database for further relational filtering. The service uses a JSON configuration file to describe the association between instrument parameters and the service query parameters, and to make it applicable beyond the Keck instruments. The images generated at the Keck telescope usually do not encode the image footprints as WCS fields in the FITS file headers.
Because SIAP searches are spatial, much of the effort in developing the program interface involved processing the instrument and telescope parameters to understand how accurately we can derive the WCS information for each instrument. This knowledge is now being fed back into the KOA databases as part of a program to include complete metadata information for all imaging observations. The R-tree program was itself extended to support temporal (in addition to spatial) indexing, in response to requests from the planetary science community for a search engine to discover observations of Solar System objects. With this 3D-indexing scheme, the service performs very fast time and spatial matches between the target ephemerides, obtained from the JPL SPICE service. Our experiments indicate these matches can be more than 100 times faster than when separating temporal and spatial searches. Images of the tracks of the moving targets, overlaid with the image footprints, are computed with a new command-line visualization tool, mViewer, released with the Montage distribution. The service is currently in test and will be released in late summer 2016.

  8. Digital image envelope: method and evaluation

    NASA Astrophysics Data System (ADS)

    Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang

    2003-05-01

    Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines addressing data security (such as the DICOM standard) continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
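    A minimal sketch of the digital-envelope flow described above may help. It is not the authors' implementation: the XOR keystream below stands in for a real cipher such as AES, the hypothetical envelope is simply appended rather than steganographically embedded in the pixel data, and SHA-256 is used for the image signature.

```python
# Toy digital envelope: image signature + "encrypted" header, with an
# integrity check at the receiving site.
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream 'cipher': XOR with a SHA-256-derived keystream. Illustrative only."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def make_envelope(pixels: bytes, header: bytes, key: bytes) -> bytes:
    signature = hashlib.sha256(pixels).digest()   # image signature (32 bytes)
    sealed_header = keystream_xor(key, header)    # "encrypted" DICOM header
    return signature + sealed_header              # the digital envelope

def verify(pixels: bytes, envelope: bytes) -> bool:
    # Recompute the signature at the receiving site and compare.
    return hashlib.sha256(pixels).digest() == envelope[:32]

pixels = bytes(range(256)) * 4                    # hypothetical image pixel data
header = b"PatientID=12345;Modality=CR"           # hypothetical header fields
env = make_envelope(pixels, header, key=b"site-secret")
print(verify(pixels, env))                        # True: image unaltered
print(verify(pixels[:-1] + b"\x00", env))         # False: any alteration detected
```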

  9. Heat pipe radiators for space

    NASA Technical Reports Server (NTRS)

    Sellers, J. P.

    1976-01-01

    Analysis of the data from heat pipe radiator systems tested in both vacuum and ambient environments was continued. The systems included (1) a feasibility VCHP header heat-pipe panel, (2) the same panel reworked to eliminate the VCHP feature and referred to as the feasibility fluid header panel, and (3) an optimized flight-weight fluid header panel termed the 'prototype.' A description of freeze-thaw thermal vacuum tests conducted on the feasibility VCHP panel is included. In addition, the results of ambient tests made on the feasibility fluid header are presented, including a comparison with analytical results. A thermal model of a fluid header heat pipe radiator was constructed and a computer program written. The program was used to compare the VCHP and fluid-header concepts for both single and multiple panel applications. The computer program was also employed for a parametric study, including optimum feeder heat pipe spacing, of the prototype fluid header.

  10. Recognition of VLSI Module Isomorphism

    DTIC Science & Technology

    1990-03-01

    forthforth->next; 6.5 else{ prev4=prev4->next; forth=forth->next; if (header-. nenI ->tai==third){ header-.nevrI->tail=prev3; prev3->next=NULL; end...end=TRUE; if (header-. nenI ->head=third){ header-.newn->head=third->next; I if((third!=prev3)&&(finished!=TRUE)){ prev3->next=prev3->next->next; third

  11. Aviation and Airports, Transportation & Public Facilities, State of Alaska

    Science.gov Websites

    State Employees Alaska Department of Transportation & Public Facilities header image Alaska Department of Transportation & Public Facilities / Aviation and Airports Search DOT&PF State of Stevens Anchorage International Airport Link to List of Alaska Public Airports Ketchikan International

  12. Design, Implementation and Evaluation of an Operating System for a Network of Transputers.

    DTIC Science & Technology

    1987-06-01

    WHILE TRUE -- listen to linki SEQ receiving the header BYTE.SLICE.INPUT (linkl,headerl,1,header.size) -- decoding the block size block.sizelLO] z...I’m done BYTE.SLICE.OUTPUT (screen[0] ,header0,3,1) WHILE TRUE -- listen to linki SEQ- rec eiving the header BYTE.SLICE. IPUT (linkl,headerl,1

  13. 78 FR 55251 - Southeast Supply Header, LLC; Notice of Request Under Blanket Authorization

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-10

    ... Supply Header, LLC; Notice of Request Under Blanket Authorization Take notice that on August 23, 2013, Southeast Supply Header, LLC (SESH), P.O. Box 1642, Houston, Texas 77251-1642, filed in Docket No. CP13-537... Southeast Supply Header, LLC et al., 119 FERC ¶ 61,153 (2007). SESH proposes to offset and replace...

  14. Browns Ferry Nuclear Plant Unit 2: Control rod drive scram discharge headers decontamination effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Traynor, J.C.

    1983-08-01

    The control rod drive (CRD) scram discharge headers were decontaminated during the Browns Ferry unit 2, cycle 4 refueling outage (August 2-5, 1982). Hydrolasing (high-pressure water blasting) was used as the method of decontamination to remove fixed and loose radioactive contaminants from the headers. It was found that hydrolasing of the west scram discharge headers resulted in approximate maximum and average decontamination factors (DFs) on contact of 13 and 5, respectively. For the east scram discharge headers, hydrolasing resulted in a maximum and average DF on contact of approximately 3. The maximum and average DFs on contact for the individual headers ranged from 1 to 33 and 1 to 10, respectively, while the walkway (head-level) DFs were in the range of 3 to 4. Higher DFs were impeded by inadequate drainage and backwashing of fluid. This led to increased radiation levels in some areas and recontamination of adjacent headers.

  15. MO-E-17A-01: BEST IN PHYSICS (IMAGING) - Calculating SSDE From CT Exams Using Size Data Available in the DICOM Header of CT Localizer Radiographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMillan, K; Bostani, M; McNitt-Gray, M

    2014-06-15

    Purpose: To demonstrate the feasibility of using existing data stored within the DICOM header of certain CT localizer radiographs as a patient size metric for calculating CT size-specific dose estimates (SSDE). Methods: For most Siemens CT scanners, the CT localizer radiograph (topogram) contains a private DICOM field that stores an array of numbers describing AP and LAT attenuation-based measures of patient dimension. The square root of the product of the AP and LAT size data, which provides an estimate of water-equivalent-diameter (WED), was calculated retrospectively from topogram data of 20 patients who received clinically-indicated abdomen/pelvis (n=10) and chest (n=10) scans (WED-topo). In addition, slice-by-slice water-equivalent-diameter (WED-image) and effective diameter (ED-image) values were calculated from the respective image data. Using TG-204 lookup tables, size-dependent conversion factors were determined based upon WED-topo, WED-image and ED-image values. These conversion factors were used with the reported CTDIvol to calculate slice-by-slice SSDE for each method. Averaging over all slices, a single SSDE value was determined for each patient and size metric. Patient-specific SSDE and CTDIvol values were then compared with patient-specific organ doses derived from detailed Monte Carlo simulations of fixed tube current scans. Results: For abdomen/pelvis scans, the average difference between liver dose and CTDIvol, SSDE(WED-topo), SSDE(WED-image), and SSDE(ED-image) was 18.70%, 8.17%, 6.84%, and 7.58%, respectively. For chest scans, the average difference between lung dose and CTDIvol, SSDE(WED-topo), SSDE(WED-image), and SSDE(ED-image) was 25.80%, 3.33%, 4.11%, and 7.66%, respectively. 
Conclusion: SSDE calculated using WED derived from data in the DICOM header of the topogram was comparable to SSDE calculated using WED and ED derived from axial images; each of these estimated organ dose to within 10% for both abdomen/pelvis and chest CT examinations. The topogram-based method has the advantage that the WED data are already provided and are therefore available without additional post-processing of the image data. Funding Support: NIH Grant R01-EB017095; Disclosures - Michael McNitt-Gray: Institutional Research Agreement, Siemens AG; Research Support, Siemens AG; Consultant, Flaherty Sensabaugh Bonasso PLLC; Consultant, Fulbright and Jaworski; Disclosures - Cynthia McCollough: Research Grant, Siemens Healthcare.
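    The size-metric arithmetic described above can be sketched as follows. The exponential conversion-factor fit is an approximation to the AAPM TG-204 lookup table for the 32 cm body phantom, and the AP/LAT topogram values and CTDIvol are hypothetical.

```python
# Hedged sketch of the SSDE calculation: WED from topogram AP/LAT data,
# then a size-dependent conversion factor applied to CTDIvol.
import math

def wed_from_topogram(ap_cm: float, lat_cm: float) -> float:
    """Water-equivalent diameter estimate: sqrt(AP x LAT)."""
    return math.sqrt(ap_cm * lat_cm)

def tg204_conversion_factor(diameter_cm: float) -> float:
    # Approximate exponential fit to the TG-204 32 cm body-phantom table.
    return 3.704369 * math.exp(-0.03671937 * diameter_cm)

def ssde(ctdi_vol_mgy: float, ap_cm: float, lat_cm: float) -> float:
    return tg204_conversion_factor(wed_from_topogram(ap_cm, lat_cm)) * ctdi_vol_mgy

# Hypothetical abdomen/pelvis exam: CTDIvol 10 mGy, AP 22 cm, LAT 30 cm.
d = wed_from_topogram(22.0, 30.0)
print(round(d, 1))                       # 25.7 cm water-equivalent diameter
print(round(ssde(10.0, 22.0, 30.0), 1))  # SSDE in mGy; exceeds CTDIvol for
                                         # this smaller-than-reference patient
```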

  16. Survey of Header Compression Techniques

    NASA Technical Reports Server (NTRS)

    Ishac, Joseph

    2001-01-01

    This report provides a summary of several different header compression techniques. The different techniques included are: (1) Van Jacobson's header compression (RFC 1144); (2) SCPS (Space Communications Protocol Standards) header compression (SCPS-TP, SCPS-NP); (3) Robust header compression (ROHC); and (4) the header compression techniques in RFC 2507 and RFC 2508. The methodology for compression and error correction in these schemes is described in the remainder of this document. All of the header compression schemes support compression over simplex links, provided that the end receiver has some means of sending data back to the sender. However, if that return path does not exist, then neither Van Jacobson's nor SCPS can be used, since both rely on TCP (Transmission Control Protocol). In addition, under link conditions of low delay and low error, all of the schemes perform as expected. However, based on the methodology of the schemes, each scheme is likely to behave differently as conditions degrade. Van Jacobson's header compression relies heavily on the TCP retransmission timer and would suffer an increase in loss propagation should the link possess a high delay and/or bit error rate (BER). The SCPS header compression scheme protects against high-delay environments by avoiding delta encoding between packets. Thus, loss propagation is avoided. However, SCPS is still affected by an increased BER, since the lack of delta encoding results in larger header sizes. Next, the schemes found in RFC 2507 and RFC 2508 perform well for non-TCP connections in poor conditions. RFC 2507 performance with TCP connections is improved by various techniques over Van Jacobson's, but still suffers a performance hit with poor link properties. Also, RFC 2507 offers the ability to send TCP data without delta encoding, similar to what SCPS offers. 
ROHC is similar to the previous two schemes, but adds additional CRCs (cyclic redundancy checks) to its headers and improves the compression schemes, providing better tolerance of conditions with a high BER.

  17. Enabling IP Header Compression in COTS Routers via Frame Relay on a Simplex Link

    NASA Technical Reports Server (NTRS)

    Nguyen, Sam P.; Pang, Jackson; Clare, Loren P.; Cheng, Michael K.

    2010-01-01

    NASA is moving toward a network-centric communications architecture and, in particular, is building toward use of Internet Protocol (IP) in space. The use of IP is motivated by its ubiquitous application in many communications networks and in available commercial off-the-shelf (COTS) technology. The Constellation Program intends to fit two or more voice (over IP) channels on both the forward link to, and the return link from, the Orion Crew Exploration Vehicle (CEV) during all mission phases. Efficient bandwidth utilization of the links is key for voice applications. In Voice over IP (VoIP), the IP packets are limited to small sizes to keep voice latency at a minimum. The common voice codec used in VoIP is G.729, which produces voice audio at 8 kbps in packets of 10-millisecond duration. Constellation has designed the VoIP communications stack to use the combination of IP/UDP/RTP protocols, where IP carries a 20-byte header, UDP (User Datagram Protocol) carries an 8-byte header, and RTP (Real Time Transport Protocol) carries a 12-byte header. The protocol headers total 40 bytes and are equal in length to a 40-byte G.729 payload, doubling the VoIP latency. Since much of the IP/UDP/RTP header information does not change from IP packet to IP packet, IP/UDP/RTP header compression can avoid transmission of much redundant data as well as reduce VoIP latency. The benefits of IP header compression are more pronounced on low-data-rate links such as the forward and return links during CEV launch. IP/UDP/RTP header compression codecs are well supported by many COTS routers. A common interface to the COTS routers is through frame relay. However, enabling IP header compression over frame relay, according to the industry standard (Frame Relay IP Header Compression Agreement FRF.20), requires a duplex link and negotiations between the compressor router and the decompressor router. 
In Constellation, each forward link to and return link from the CEV in space is treated independently as a simplex link. Without negotiation, the COTS routers are prevented from entering the IP header compression mode, and no IP header compression would be performed. An algorithm is proposed to enable IP header compression in COTS routers on a simplex link with no negotiation or with one-way messaging. In doing so, COTS routers can enter IP header compression mode without the need to handshake through a bidirectional link as required by FRF.20. This technique spoofs the routers locally and thereby allows the routers to enter IP header compression mode without the negotiations between routers actually occurring. The spoofing function is conducted by a frame relay adapter (also COTS) with the capability to generate control messages according to the FRF.20 descriptions. Therefore, negotiation is actually performed locally between the FRF.20 adapter and the connecting COTS router and never occurs over the space link. Through understanding of the handshaking protocol described by FRF.20, the necessary FRF.20 negotiation messages can be generated to control the connecting router, not only to turn on IP header compression but also to adjust the compression parameters. The FRF.20 negotiation (or control) message is composed in the FRF.20 adapter by interpreting the incoming router request message. Many of the fields are simply transcribed from request to response, while the control fields indicating response and type are modified.
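    The bandwidth arithmetic quoted above is easy to check. The 40-byte payload and the 20+8+12-byte headers come from the text; the compressed header size of a few bytes is an assumption for illustration, since the exact figure depends on the compression scheme and context state.

```python
# Back-of-the-envelope check of the VoIP overhead figures in the text.
PAYLOAD = 40                 # bytes of G.729 audio per packet (from the text)
FULL_HEADER = 20 + 8 + 12    # IP + UDP + RTP = 40 bytes (from the text)
COMPRESSED_HEADER = 4        # assumed typical compressed header size

full_packet = PAYLOAD + FULL_HEADER
compressed_packet = PAYLOAD + COMPRESSED_HEADER
print(full_packet)           # 80 bytes: headers double the packet size
print(compressed_packet)     # 44 bytes with compressed headers
print(round(full_packet / compressed_packet, 2))  # 1.82: uncompressed packets
                                                  # are ~1.8x larger on the link
```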

  18. Proposed U.S. Geological Survey standard for digital orthophotos

    USGS Publications Warehouse

    Hooper, David; Caruso, Vincent

    1991-01-01

    The U.S. Geological Survey has added the new category of digital orthophotos to the National Digital Cartographic Data Base. This differentially rectified digital image product enables users to take advantage of the properties of current photoimagery as a source of geographic information. The product and accompanying standard were implemented in spring 1991. The digital orthophotos will be quadrangle based and cast on the Universal Transverse Mercator projection and will extend beyond the 3.75-minute or 7.5-minute quadrangle area at least 300 meters to form a rectangle. The overedge may be used for mosaicking with adjacent digital orthophotos. To provide maximum information content and utility to the user, metadata (header) records exist at the beginning of the digital orthophoto file. Header information includes the photographic source type, date, instrumentation used to create the digital orthophoto, and information relating to the DEM that was used in the rectification process. Additional header information is included on transformation constants from the 1927 and 1983 North American Datums to the orthophoto internal file coordinates to enable the user to register overlays on either datum. The quadrangle corners in both datums are also imprinted on the image. Flexibility has been built into the digital orthophoto format for future enhancements, such as the provision to include the corresponding digital elevation model elevations used to rectify the orthophoto. The digital orthophoto conforms to National Map Accuracy Standards and provides valuable mapping data that can be used as a tool for timely revision of standard map products, for land use and land cover studies, and as a digital layer in a geographic information system.

  19. The Hopkins Ultraviolet Telescope Data Archive: Old Data in a New Format

    NASA Astrophysics Data System (ADS)

    Blair, William P.; Dixon, V.; Kruk, J.; Romelfanger, M.

    2011-05-01

    The Hopkins Ultraviolet Telescope (HUT) was a key component of the Astro Observatory, a package of telescopes that flew on the space shuttle as part of two dedicated astronomy missions, Astro-1 in December 1990 (STS-35), and Astro-2 in March 1995 (STS-67). HUT was a 0.9m telescope and prime-focus spectrograph operating primarily in the far-ultraviolet 900 - 1800 Angstrom spectral region, returning spectra with about 3 Angstrom resolution. Over 330 objects were observed during the two shuttle missions, and the data were originally archived at the NSSDC (NASA/GSFC), before moving to MAST, the Multimission Archive at Space Telescope. As part of a NASA Astrophysics Data Program grant, we are reprocessing and re-archiving this unique data set in a modern and more user-friendly format. Additional file-header keywords include the RA and Dec in J2000 coordinates, the aperture position angle, and target-magnitude and color information. A new data product, similar to the Intermediate Data Files developed for the FUSE mission, provides a flux- and wavelength-calibrated photon-event list with two-second time resolution. These files will allow users to customize their data extractions (e.g., to search for temporal variations in flux or exclude times of bad pointing). The reprocessed data are fully compliant with NVO specifications. They will be available from MAST starting in late 2011. We acknowledge support from NASA ADP grant NNX09AC70G to the Johns Hopkins University.

  20. Improved Air-Treatment Canister

    NASA Technical Reports Server (NTRS)

    Boehm, A. M.

    1982-01-01

    Proposed air-treatment canister integrates a heater-in-tube water evaporator into canister header. Improved design prevents water from condensing and contaminating chemicals that regenerate the air. Heater is evenly spiraled about the inlet header on the canister. Evaporator is brazed to the header.

  1. Chapter 35: Describing Data and Data Collections in the VO

    NASA Astrophysics Data System (ADS)

    Kent, B. R.; Hanisch, R. J.; Williams, R. D.

    The list of numbers: 19.22, 17.23, 18.11, 16.98, and 15.11, is of little intrinsic interest without information about the context in which they appear. For instance, are these daily closing stock prices for your favorite investment, or are they hourly photometric measurements of an increasingly bright quasar? The information needed to define this context is called metadata. Metadata are data about data. Astronomers are familiar with metadata through the headers of FITS files and the names and units associated with columns in a table or database. In the VO, metadata describe the contents of tables, images, and spectra, as well as aggregate collections of data (archives, surveys) and computational services. Moreover, VO metadata are constructed according to rules that avoid ambiguity and make it clear whether, in the example above, the stock prices are in dollars or euros, or the photometry is Johnson V or Sloan g. Organization of data is important in any scientific discipline. Equally crucial are the descriptions of that data: the organization publishing the data, its creator or the person making it available, what instruments were used, units assigned to measurement, calibration status, and data quality assessment. The Virtual Observatory metadata scheme not only applies to datasets, but to resources as well, including data archive facilities, searchable web forms, and online analysis and display tools. Since the scientific output flowing from large datasets depends greatly on how well the data are described, it is important for users to understand the basics of the metadata scheme in order to locate the data that they want and use it correctly. Metadata are the key to data discovery and data and service interoperability in the Virtual Observatory.
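    As a concrete example of the FITS-header metadata mentioned above, each header "card" is an 80-character record: an 8-character keyword, the value indicator "= ", a value, and an optional "/ comment". The toy parser below handles only simple numeric and string cards; production readers (e.g. astropy.io.fits) cover many more cases.

```python
# Toy parser for simple FITS header cards (metadata records).

def parse_card(card: str):
    keyword = card[:8].strip()
    if card[8:10] != "= ":               # e.g. COMMENT / HISTORY / END cards
        return keyword, None, card[8:].strip()
    value_part, _, comment = card[10:].partition("/")
    value_part = value_part.strip()
    if value_part.startswith("'"):       # FITS string value
        value = value_part.strip("'").rstrip()
    else:
        try:
            value = int(value_part)
        except ValueError:
            value = float(value_part)
    return keyword, value, comment.strip()

card = "NAXIS1  =                 2048 / length of data axis 1".ljust(80)
print(parse_card(card))  # ('NAXIS1', 2048, 'length of data axis 1')
```

The keyword, value, and comment recovered here are exactly the kind of metadata triple (name, measurement, human-readable context) that the VO metadata scheme generalizes beyond FITS.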

  2. DDN (Defense Data Network) Protocol Handbook. Volume 2. DARPA Internet Protocols

    DTIC Science & Technology

    1985-12-01

    header padding is used to ensure that the internet header ends on a 32 bit boundary. The padding is zero . 3.2. Discussion The implementation of a... zeros . The first of these would be interpreted as the end-of-options option, and the remainder as internet header padding , Every internet module must...several octets in length. The internet header Padding field is used to ensure that the data begins on 32 bit word boundary. The padding is zero

  3. Clustering header categories extracted from web tables

    NASA Astrophysics Data System (ADS)

    Nagy, George; Embley, David W.; Krishnamoorthy, Mukkai; Seth, Sharad

    2015-01-01

    Revealing related content among heterogeneous web tables is part of our long term objective of formulating queries over multiple sources of information. Two hundred HTML tables from institutional web sites are segmented and each table cell is classified according to the fundamental indexing property of row and column headers. The categories that correspond to the multi-dimensional data cube view of a table are extracted by factoring the (often multi-row/column) headers. To reveal commonalities between tables from diverse sources, the Jaccard distances between pairs of category headers (and also table titles) are computed. We show how about one third of our heterogeneous collection can be clustered into a dozen groups that exhibit table-title and header similarities that can be exploited for queries.
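    The Jaccard distance used above is simple to state in code. The tokenization of multi-row/column category headers into sets follows the paper's general idea; the example header sets here are hypothetical.

```python
# Jaccard distance between headers treated as sets of tokens.

def jaccard_distance(a: set, b: set) -> float:
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Hypothetical column headers from two institutional tables.
h1 = {"year", "enrollment", "undergraduate", "graduate"}
h2 = {"year", "enrollment", "faculty"}
h3 = {"rainfall", "month"}

print(jaccard_distance(h1, h2))  # 0.6: share 2 of 5 distinct tokens
print(jaccard_distance(h1, h3))  # 1.0: nothing in common, different topic
```

Clustering then groups tables whose pairwise header (and title) distances fall below a threshold, which is how the paper finds its dozen groups.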

  4. ONE MILLION GALLON WATER TANK, PUMP HEADER PIPE (AT LEFT), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ONE MILLION GALLON WATER TANK, PUMP HEADER PIPE (AT LEFT), HEADER BYPASS PIPE (AT RIGHT), AND PUMPHOUSE FOUNDATIONS. Looking northeast - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Flame Deflector Water System, Test Area 1-120, north end of Jupiter Boulevard, Boron, Kern County, CA

  5. Non-vascular interventional procedures: effective dose to patient and equivalent dose to abdominal organs by means of DICOM images and Monte Carlo simulation.

    PubMed

    Longo, Mariaconcetta; Marchioni, Chiara; Insero, Teresa; Donnarumma, Raffaella; D'Adamo, Alessandro; Lucatelli, Pierleone; Fanelli, Fabrizio; Salvatori, Filippo Maria; Cannavale, Alessandro; Di Castro, Elisabetta

    2016-03-01

    This study evaluates X-ray exposure in patients undergoing abdominal extra-vascular interventional procedures by means of Digital Imaging and COmmunications in Medicine (DICOM) image headers and Monte Carlo simulation. The main aim was to assess the effective and equivalent doses, under the hypothesis of their correlation with the dose area product (DAP) measured during each examination. This makes it possible to collect dosimetric information about each patient and to evaluate the associated risks without resorting to in vivo dosimetry. The dose calculation was performed in 79 procedures through the Monte Carlo simulator PCXMC (a PC-based Monte Carlo program for calculating patient doses in medical X-ray examinations), using the real geometrical and dosimetric irradiation conditions, automatically extracted from the DICOM headers. The DAP measurements were also validated by using thermoluminescent dosemeters on an anthropomorphic phantom. The expected linear correlation between effective doses and DAP was confirmed with an R² of 0.974. Moreover, in order to easily calculate patient doses, conversion coefficients that relate equivalent doses to measurable quantities, such as DAP, were obtained. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
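    The correlation analysis described above amounts to fitting effective dose E = k·DAP through the origin and checking R². A sketch with hypothetical (DAP, dose) pairs, not the study's data:

```python
# Least-squares fit of dose = k * DAP through the origin, plus R^2.

def fit_through_origin(x, y):
    k = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
    mean_y = sum(y) / len(y)
    ss_res = sum((yi - k * xi) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return k, 1.0 - ss_res / ss_tot

dap = [10.0, 25.0, 40.0, 55.0, 80.0]   # Gy*cm^2 (hypothetical)
dose = [2.1, 5.0, 8.3, 10.8, 16.1]     # mSv (hypothetical)
k, r2 = fit_through_origin(dap, dose)
print(round(k, 3))   # 0.201: the mSv-per-Gy*cm^2 conversion coefficient
print(r2 > 0.97)     # True: strong linear correlation, as the study found
```

Once such a coefficient is established, each patient's effective dose can be estimated from the DAP recorded in the DICOM header alone, without in vivo dosimetry.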

  6. High pressure ceramic heat exchanger

    DOEpatents

    Harkins, Bruce D.; Ward, Michael E.

    1998-01-01

    Many recuperators have components which react to corrosive gases and are used in applications where the donor fluid includes highly corrosive gases. These recuperators have suffered reduced life, increased service or maintenance, and resulted in increased cost. The present header assembly when used with recuperators reduces the brittle effect of a portion of the ceramic components. Thus, the present header assembly used with the present recuperator increases the life, reduces the service and maintenance, and reduces the increased cost associated with corrosive action of components used to manufacture recuperators. The present header assembly is comprised of a first ceramic member, a second ceramic member, a strengthening reinforcing member being in spaced relationship to the first ceramic member and the second ceramic member. The header assembly is further comprised of a refractory material disposed in contacting relationship with the first ceramic member, the second ceramic member and the strengthening reinforcing member. The present header assembly provides a high strength load bearing header assembly having good thermal cycling characteristics, good resistance to a corrosive environment and good steady state strength at elevated temperatures.

  7. High pressure ceramic heat exchanger

    DOEpatents

    Harkins, Bruce D.; Ward, Michael E.

    1999-01-01

    Many recuperators have components which react to corrosive gases and are used in applications where the donor fluid includes highly corrosive gases. These recuperators have suffered reduced life, increased service or maintenance, and resulted in increased cost. The present header assembly when used with recuperators reduces the brittle effect of a portion of the ceramic components. Thus, the present header assembly used with the present recuperator increases the life, reduces the service and maintenance, and reduces the increased cost associated with corrosive action of components used to manufacture recuperators. The present header assembly is comprised of a first ceramic member, a second ceramic member, a reinforcing member being in spaced relationship to the first ceramic member and the second ceramic member. The header assembly is further comprised of a refractory material disposed in contacting relationship with the first ceramic member, the second ceramic member and the reinforcing member and having a strengthening member wrapped around the refractory material. The present header assembly provides a high strength load bearing header assembly having good thermal cycling characteristics, good resistance to a corrosive environment and good steady state strength at elevated temperatures.

  8. High pressure ceramic heat exchanger

    DOEpatents

    Harkins, B.D.; Ward, M.E.

    1998-09-22

    Many recuperators have components which react to corrosive gases and are used in applications where the donor fluid includes highly corrosive gases. These recuperators have suffered reduced life, increased service or maintenance, and resulted in increased cost. The present header assembly when used with recuperators reduces the brittle effect of a portion of the ceramic components. Thus, the present header assembly used with the present recuperator increases the life, reduces the service and maintenance, and reduces the increased cost associated with corrosive action of components used to manufacture recuperators. The present header assembly is comprised of a first ceramic member, a second ceramic member, a strengthening reinforcing member being in spaced relationship to the first ceramic member and the second ceramic member. The header assembly is further comprised of a refractory material disposed in contacting relationship with the first ceramic member, the second ceramic member and the strengthening reinforcing member. The present header assembly provides a high strength load bearing header assembly having good thermal cycling characteristics, good resistance to a corrosive environment and good steady state strength at elevated temperatures. 5 figs.

  9. Notes on Operations. The Documentation of Electronic Texts Using Text Encoding Initiative Headers: An Introduction.

    ERIC Educational Resources Information Center

    Giordano, Richard

    1994-01-01

    Describes the Text Encoding Initiative (TEI) project and the TEI header, which documents electronic text in a standard interchange format understandable to both librarian catalogers and nonlibrarian text encoders. The form and function of the TEI header is introduced, and its relationship to the MARC record is explained. (10 references) (KRN)

  11. Data rescue of NASA First ISLSCP (International Satellite Land Surface Climatology Project) Field Experiment (FIFE) aerial observations

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Boyer, A.; Deb, D.; Beaty, T.; Wei, Y.; Wei, Z.

    2017-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics is one of the NASA Earth Observing System Data and Information System (EOSDIS) data centers. ORNL DAAC (https://daac.ornl.gov) is responsible for data archival, product development and distribution, and user support for biogeochemical and ecological data and models. In particular, ORNL DAAC has provided data management support for NASA's terrestrial ecology field campaign programs for the last several decades. Field campaigns combine ground, aircraft, and satellite-based measurements in specific ecosystems over multi-year time periods. The data collected during NASA field campaigns are archived at the ORNL DAAC (https://daac.ornl.gov/get_data/). This paper describes the effort of the ORNL DAAC team to rescue a First ISLSCP Field Experiment (FIFE) dataset containing airborne and satellite observations from the 1980s. The data collected during the FIFE campaign include high-resolution aerial imagery collected over Kansas. The data rescue workflow was designed to recover the data from a CD-ROM and to ensure that the data are usable and preserved for the future. The imagery contains spectral reflectance data that can be used as a historical benchmark to examine climatological and ecological changes in the Kansas region since the 1980s. The key steps taken to convert the files to modern standards were as follows. The images were decompressed using the custom compression software provided with the data; the compression algorithm, created for MS-DOS in the 1980s, had to be set up to run on modern computer systems. The decompressed files were geo-referenced using metadata stored in separate compressed header files. Standardized file names were applied (file names and details are described in separate readme documents). Finally, the image files were converted to GeoTIFF format with embedded georeferencing information. 
In addition, Open Geospatial Consortium (OGC) Web services are leveraged to provide dynamic data transformation and visualization. We will describe the steps in detail and share lessons learned during the AGU session.
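    The georeferencing step above boils down to applying an affine transform, the same six numbers a GeoTIFF or world file carries, that maps pixel (column, row) to map coordinates. The transform values below are hypothetical, not the FIFE metadata.

```python
# Affine pixel-to-map transform of the kind embedded in a GeoTIFF.

def pixel_to_map(col, row, transform):
    """transform = (x_origin, x_pixel_size, x_rot, y_origin, y_rot, y_pixel_size)."""
    x0, dx, rx, y0, ry, dy = transform
    x = x0 + col * dx + row * rx
    y = y0 + col * ry + row * dy
    return x, y

# Hypothetical north-up transform: 30 m pixels, UTM-like origin over Kansas.
transform = (700000.0, 30.0, 0.0, 4330000.0, 0.0, -30.0)
print(pixel_to_map(0, 0, transform))      # (700000.0, 4330000.0): upper-left corner
print(pixel_to_map(100, 50, transform))   # (703000.0, 4328500.0): 3 km east,
                                          # 1.5 km south of the origin
```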

  12. Phishtest: Measuring the Impact of Email Headers on the Predictive Accuracy of Machine Learning Techniques

    ERIC Educational Resources Information Center

    Tout, Hicham

    2013-01-01

    The majority of documented phishing attacks have been carried by email, yet few studies have measured the impact of email headers on the predictive accuracy of machine learning techniques in detecting email phishing attacks. Research has shown that the inclusion of a limited subset of email headers as features in training machine learning…

  13. Excalibur Strategic Configured Load (SCL) for the Heavy Expanded Mobility Tactical Truck (HEMTT). Testing IAW TP-94-01, Revision 2, June 2004, Transportability Testing Procedures

    DTIC Science & Technology

    2008-06-01

    PART 3 - TEST EQUIPMENT: Semitrailer, flatbed, breakbulk/container transporter, 34-ton, Model M872A1, manufactured by Heller Truck Body. [The remainder of this excerpt, covering laminated dunnage and the header nailing pattern for stacked DA39 pallet units, is garbled in the source scan.]

  14. Secured Hash Based Burst Header Authentication Design for Optical Burst Switched Networks

    NASA Astrophysics Data System (ADS)

    Balamurugan, A. M.; Sivasubramanian, A.; Parvathavarthini, B.

    2017-12-01

    Optical burst switching (OBS) is a promising technology that could meet fast-growing network demand, offering the bandwidth required by bandwidth-intensive applications. OBS tackles huge bandwidth constraints satisfactorily but suffers from security vulnerabilities. The objective of the proposed work is to design a faster and more efficient burst header authentication algorithm for core nodes. The work has two key features, viz., header encryption and authentication. Since the burst header is an essential component of an optical burst switched network, it has to be encrypted; otherwise it is prone to attack. The proposed MD5&RC4-4S based burst header authentication algorithm runs 20.75 ns faster than conventional algorithms. The modification suggested in the proposed RC4-4S algorithm gives better security and solves the correlation problems between the publicly known outputs during the key generation phase. The modified MD5 recommended in this work provides a 7.81% better avalanche effect than the conventional algorithm. The device utilization results also show the suitability of the proposed algorithm for header authentication in real-time applications.
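The avalanche effect cited above is the fraction of output bits that flip when a single input bit flips. The paper's modified MD5 is not reproduced here, so this sketch measures the metric for the standard MD5 from Python's hashlib:

```python
import hashlib

def avalanche(msg: bytes, bit: int) -> float:
    """Fraction of the 128 MD5 output bits that flip when one input
    bit of msg is flipped."""
    flipped = bytearray(msg)
    flipped[bit // 8] ^= 1 << (bit % 8)          # flip one input bit
    h1 = int.from_bytes(hashlib.md5(msg).digest(), "big")
    h2 = int.from_bytes(hashlib.md5(bytes(flipped)).digest(), "big")
    return bin(h1 ^ h2).count("1") / 128         # differing output bits
```

For a well-behaved hash the result clusters around 0.5; comparing this statistic between two hash variants is how a "7.81% better avalanche effect" claim would be quantified.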

  15. Browsing the PDS Image Archive with the Imaging Atlas and Apache Solr

    NASA Astrophysics Data System (ADS)

    Grimes, K. M.; Padams, J. H.; Stanboli, A.; Wagstaff, K. L.

    2018-04-01

    The PDS Image Archive is home to tens of millions of images, nearly 30 million of which are associated with rich metadata. By leveraging the Solr indexing technology and the Imaging Atlas interactive frontend, we enable intuitive archive browsing.
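A frontend backed by Solr typically issues parameterized select queries against the index. This is a hedged sketch only; the endpoint URL and the field names (instrument, start_time) are hypothetical, not the actual PDS Imaging Atlas schema:

```python
from urllib.parse import urlencode

SOLR = "https://pds-imaging.example/solr/atlas/select"  # hypothetical endpoint

def atlas_query(instrument: str, start: str, stop: str, rows: int = 25) -> str:
    """Build a Solr select URL filtering by instrument and a time range."""
    params = {
        "q": f"instrument:{instrument}",          # main query clause
        "fq": f"start_time:[{start} TO {stop}]",  # filter query (range)
        "rows": rows,                             # page size
        "wt": "json",                             # response format
    }
    return SOLR + "?" + urlencode(params)
```

The `q`/`fq`/`rows`/`wt` parameters are standard Solr query syntax; only the metadata field names would change for a real deployment.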

  16. Software for Managing an Archive of Images

    NASA Technical Reports Server (NTRS)

    Hallai, Charles; Jones, Helene; Callac, Chris

    2003-01-01

    This is a revised draft by the innovators of the report on Software for Managing an Archive of Images. The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional film-based camera, along with metadata about each image.

  17. Archive of Boomer Seismic Reflection Data Collected During USGS Cruises 01SCC01 and 01SCC02, Timbalier Bay and Offshore East Timbalier Island, Louisiana, June-August, 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Kindinger, Jack G.; Wiese, Dana S.

    2003-01-01

    In June, July, and August of 2001, the U.S. Geological Survey (USGS), in cooperation with the University of New Orleans (UNO), the U.S. Army Corps of Engineers, and the Louisiana Department of Natural Resources, conducted a shallow geophysical and sediment core survey of Timbalier Bay and the Gulf of Mexico offshore East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, trackline navigation maps, observers' logbooks, Geographic Information Systems (GIS) information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a filtered and gained digital Graphics Interchange Format (GIF) image of each seismic profile is provided. Please see Kulp and others (2002), Flocks and others (2003), and Kulp and others (in prep.) for further information about the sediment cores collected and the geophysical results. For convenience, a list of acronyms and abbreviations frequently used in this report is also included. This Digital Versatile Disc (DVD) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information using a web browser (e.g., Netscape, Internet Explorer). To access the information contained on this disc, open the file 'index.htm' located at the top level of the disc using a web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the Internet is available while viewing this DVD. The archived boomer seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes/index.html. 
Examples of SU processing scripts are provided in the BOOM.tar file located in the SU subfolder of the SOFTWARE folder located at the top level of this disc. In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zihlman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, trackline navigation maps, logbooks, and formal metadata may be viewed with a web browser.
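A SEG-Y header dumper in the spirit of DUMPSEGY reads fixed-offset fields from the 400-byte binary header that follows the 3,200-byte card-image header. A minimal sketch using only the standard library (byte offsets per the SEG-Y rev 0 standard):

```python
import struct

def read_binary_header(buf: bytes) -> dict:
    """Decode a few fields of the SEG-Y binary header, which occupies
    bytes 3200-3599 of the file; all values are big-endian."""
    bh = buf[3200:3600]
    interval, = struct.unpack(">H", bh[16:18])   # sample interval, microseconds
    nsamples, = struct.unpack(">H", bh[20:22])   # samples per data trace
    fmt, = struct.unpack(">h", bh[24:26])        # data sample format code
    return {"sample_interval_us": interval,
            "samples_per_trace": nsamples,
            "format_code": fmt}
```

In practice `buf` would be the first 3,600 bytes read from a .sgy file; inspecting these fields before full processing catches byte-order and format-code surprises early.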

  18. Archive of Chirp Seismic Reflection Data Collected During USGS Cruises 01SCC01 and 01SCC02, Timbalier Bay and Offshore East Timbalier Island, Louisiana, June 30 - July 9 and August 1 - 12, 2001

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.

    2003-01-01

    In June, July, and August of 2001, the U.S. Geological Survey (USGS), in cooperation with the University of New Orleans, the U.S. Army Corps of Engineers, and the Louisiana Department of Natural Resources, conducted a shallow geophysical and sediment core survey of Timbalier Bay and the Gulf of Mexico offshore East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital seismic reflection data, trackline navigation files, trackline navigation maps, observers' logbooks, Geographic Information Systems (GIS) information, and formal Federal Geographic Data Committee (FGDC) metadata. In addition, a gained digital Graphics Interchange Format (GIF) image of each seismic profile is provided. Please see Kulp and others (2002), Flocks and others (2003), and Kulp and others (in prep.) for further information about the sediment cores collected and the geophysical results. For convenience, a list of acronyms and abbreviations frequently used in this report is also included. This Digital Versatile Disc (DVD) document is readable on any computing platform that has standard DVD driver software installed. Documentation on this DVD was produced using Hyper Text Markup Language (HTML) utilized by the World Wide Web (WWW) and allows the user to access the information using a web browser (e.g., Netscape, Internet Explorer). To access the information contained on these discs, open the file 'index.htm' located at the top level of each disc using a web browser. This report also contains WWW links to USGS collaborators and other agencies. These links are only accessible if access to the Internet is available while viewing these DVDs. The archived chirp seismic reflection data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry et al., 1975) and may be downloaded for processing with public domain software such as Seismic Unix (SU), currently located at http://www.cwp.mines.edu/cwpcodes/index.html. 
Examples of SU processing scripts are provided in the CHIRP.tar file located in the SU subfolder of the SOFTWARE folder located at the top level of each disc. In-house (USGS) DOS and Microsoft Windows compatible software for viewing SEG-Y headers - DUMPSEGY.EXE (Zihlman, 1992) - is provided in the USGS subfolder of the SOFTWARE folder. Processed profile images, trackline navigation maps, logbooks, and formal metadata may be viewed with a web browser.

  19. Simple online recognition of optical data strings based on conservative optical logic

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John; Shamir, Joseph; Zavalin, Andrey I.; Silberman, Enrique; Qian, Lei; Vikram, Chandra S.

    2006-06-01

    Optical packet switching relies on the ability of a system to recognize header information on an optical signal. Unless the headers are very short with large Hamming distances, optical correlation fails, and optical logic becomes attractive because it can handle long headers with Hamming distances as low as 1. Unfortunately, the only optical logic gates fast enough to keep up with current communication speeds involve semiconductor optical amplifiers, do not lend themselves to incorporating the large numbers of elements needed for header recognition, and would also consume considerable power. The ideal system would operate at any bandwidth with no power consumption. We describe how to design and build such a system by using passive optical logic. This too leads to practical problems that we discuss. We show theoretically various ways to use optical interferometric logic for reliable recognition of long data streams such as headers in optical communication. In addition, we demonstrate one particularly simple experimental approach using interferometric coincidence gates.
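The contrast drawn above between correlation and logic can be seen in a toy model: two long headers at Hamming distance 1 produce nearly identical correlation scores, while an exact (logic-style) comparison still separates them. The bit-string representation and the intensity-correlation stand-in below are illustrative assumptions, not the optical implementation:

```python
def hamming(a: str, b: str) -> int:
    """Number of bit positions at which two headers differ."""
    return sum(x != y for x, y in zip(a, b))

def correlation(a: str, b: str) -> int:
    """Toy intensity correlation: count of positions where both are '1'."""
    return sum(x == y == "1" for x, y in zip(a, b))
```

For a 32-bit header and its distance-1 neighbor, the correlation scores against a matched template differ by at most one count (hard to threshold in noise), whereas `hamming` distinguishes the two exactly.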

  20. Thermal shock testing for assuring reliability of glass-sealed microelectronic packages

    NASA Technical Reports Server (NTRS)

    Thomas, Walter B., III; Lewis, Michael D.

    1991-01-01

    Tests were performed to determine whether thermal shocking is destructive to glass-to-metal-seal microelectronic packages and whether thermal shock step stressing can compare package reliabilities. Thermal shocking was shown not to be destructive to highly reliable glass seals. Pin-pull tests used to compare interfacial pin-glass strengths showed no differences between thermally shocked and unshocked headers. A 'critical stress resistance temperature' was not exhibited by the 14-pin Dual In-line Package (DIP) headers evaluated. Headers manufactured in cryogenic nitrogen-based and exothermically generated atmospheres showed differences in as-received leak rates, residual oxide depths, and pin-glass interfacial strengths; these were caused by the different manufacturing methods, in particular the chemically etched pins used by one manufacturer. Both header types passed thermal shock tests to temperature differentials of 646 C. The sensitivity of helium leak-rate measurements was improved by up to 70 percent by baking headers for two hours at 200 C after thermal shocking.

  1. Recognition of the optical packet header for two channels utilizing the parallel reservoir computing based on a semiconductor ring laser

    NASA Astrophysics Data System (ADS)

    Bao, Xiurong; Zhao, Qingchun; Yin, Hongxi; Qin, Jie

    2018-05-01

    In this paper, an all-optical parallel reservoir computing (RC) system with two channels for optical packet header recognition is proposed and simulated, based on a semiconductor ring laser (SRL) with bidirectional light paths. The parallel optical loops are built through cross-feedback of the bidirectional light paths, where each optical loop can independently recognize an injected optical packet header. Two input signals are mapped and recognized simultaneously by training the all-optical parallel reservoir, exploiting the nonlinear states in the laser. The recognition of optical packet headers from 4 bits to 32 bits for both channels is implemented in simulation by optimizing the system parameters, achieving an optimal recognition error ratio of 0. Since this structure can be combined with a wavelength division multiplexing (WDM) optical packet switching network, the wavelength of the optical packet headers in each channel can differ, and a better recognition result can be obtained.

  2. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.

    PubMed

    Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred

    2013-12-01

    The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.

  3. Estimating pediatric entrance skin dose from digital radiography examination using DICOM metadata: A quality assurance tool.

    PubMed

    Brady, S L; Kaufman, R A

    2015-05-01

    To develop an automated methodology to estimate patient examination dose in digital radiography (DR) imaging using DICOM metadata as a quality assurance (QA) tool. Patient examination and demographical information were gathered from metadata analysis of DICOM header data. The x-ray system radiation output (i.e., air KERMA) was characterized for all filter combinations used for patient examinations. Average patient thicknesses were measured for head, chest, abdomen, knees, and hands using volumetric images from CT. Backscatter factors (BSFs) were calculated from examination kVp. Patient entrance skin air KERMA (ESAK) was calculated by (1) looking up examination technique factors taken from DICOM header metadata (i.e., kVp and mA s) to derive an air KERMA (k_air) value based on an x-ray characteristic radiation output curve; (2) scaling k_air with a BSF value; and (3) correcting k_air for patient thickness. Finally, patient entrance skin dose (ESD) was calculated by multiplying a mass-energy attenuation coefficient ratio by ESAK. Patient ESD calculations were computed for common DR examinations at our institution: dual view chest, anteroposterior (AP) abdomen, lateral (LAT) skull, dual view knee, and bone age (left hand only) examinations. ESD was calculated for a total of 3794 patients; mean age was 11 ± 8 yr (range: 2 months to 55 yr). The mean ESD range was 0.19-0.42 mGy for dual view chest, 0.28-1.2 mGy for AP abdomen, 0.18-0.65 mGy for LAT view skull, 0.15-0.63 mGy for dual view knee, and 0.10-0.12 mGy for bone age (left hand) examinations. A methodology combining DICOM header metadata and basic x-ray tube characterization curves was demonstrated. In a regulatory era where patient dose reporting is increasingly in demand, this methodology will allow a knowledgeable user the means to establish an automatable dose-reporting program for DR and perform patient dose-related QA testing for digital x-ray imaging.
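The three-step ESAK/ESD chain described above can be sketched as follows. Every calibration constant here (the output curve, backscatter factor, and mass-energy coefficient ratio) is a made-up placeholder standing in for the site-specific characterization data, not a value from the paper:

```python
def entrance_skin_dose(kvp: float, mas: float, thickness_cm: float,
                       sid_cm: float = 100.0) -> float:
    """Hedged sketch of the three steps: (1) air kerma from a
    characteristic output curve, (2) backscatter scaling, (3) an
    inverse-square correction to the entrance skin surface."""
    output_mgy_per_mas = 0.05 * (kvp / 80.0) ** 2   # hypothetical curve at 100 cm
    k_air = output_mgy_per_mas * mas                # step 1: tube output lookup
    bsf = 1.35                                      # hypothetical backscatter factor
    ssd = sid_cm - thickness_cm                     # source-to-skin distance
    esak = k_air * bsf * (sid_cm / ssd) ** 2        # steps 2-3: ESAK at the skin
    mu_en_ratio = 1.06                              # hypothetical tissue/air ratio
    return esak * mu_en_ratio                       # ESD, mGy
```

In an automated QA tool, `kvp` and `mas` would come from the DICOM header of each examination, so the whole chain runs without operator input.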

  4. Estimating pediatric entrance skin dose from digital radiography examination using DICOM metadata: A quality assurance tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brady, S. L., E-mail: samuel.brady@stjude.org; Kaufman, R. A., E-mail: robert.kaufman@stjude.org

    Purpose: To develop an automated methodology to estimate patient examination dose in digital radiography (DR) imaging using DICOM metadata as a quality assurance (QA) tool. Methods: Patient examination and demographical information were gathered from metadata analysis of DICOM header data. The x-ray system radiation output (i.e., air KERMA) was characterized for all filter combinations used for patient examinations. Average patient thicknesses were measured for head, chest, abdomen, knees, and hands using volumetric images from CT. Backscatter factors (BSFs) were calculated from examination kVp. Patient entrance skin air KERMA (ESAK) was calculated by (1) looking up examination technique factors taken from DICOM header metadata (i.e., kVp and mA s) to derive an air KERMA (k_air) value based on an x-ray characteristic radiation output curve; (2) scaling k_air with a BSF value; and (3) correcting k_air for patient thickness. Finally, patient entrance skin dose (ESD) was calculated by multiplying a mass-energy attenuation coefficient ratio by ESAK. Patient ESD calculations were computed for common DR examinations at our institution: dual view chest, anteroposterior (AP) abdomen, lateral (LAT) skull, dual view knee, and bone age (left hand only) examinations. Results: ESD was calculated for a total of 3794 patients; mean age was 11 ± 8 yr (range: 2 months to 55 yr). The mean ESD range was 0.19-0.42 mGy for dual view chest, 0.28-1.2 mGy for AP abdomen, 0.18-0.65 mGy for LAT view skull, 0.15-0.63 mGy for dual view knee, and 0.10-0.12 mGy for bone age (left hand) examinations. Conclusions: A methodology combining DICOM header metadata and basic x-ray tube characterization curves was demonstrated. In a regulatory era where patient dose reporting has become increasingly in demand, this methodology will allow a knowledgeable user the means to establish an automatable dose reporting program for DR and perform patient dose related QA testing for digital x-ray imaging.

  5. Digital Archival Image Collections: Who Are the Users?

    ERIC Educational Resources Information Center

    Herold, Irene M. H.

    2010-01-01

    Archival digital image collections are a relatively new phenomenon in college library archives. Digitizing archival image collections may make them accessible to users worldwide. There has been no study to explore whether collections on the Internet lead to users who are beyond the institution or a comparison of users to a national or…

  6. Production of Previews and Advanced Data Products for the ESO Science Archive

    NASA Astrophysics Data System (ADS)

    Rité, C.; Slijkhuis, R.; Rosati, P.; Delmotte, N.; Rino, B.; Chéreau, F.; Malapert, J.-C.

    2008-08-01

    We present a project being carried out by the Virtual Observatory Systems Department/Advanced Data Products group in order to populate the ESO Science Archive Facility with image previews and advanced data products. The main goal is to provide users of the ESO Science Archive Facility with the possibility of viewing pre-processed images associated with instruments like WFI, ISAAC and SOFI before actually retrieving the data for full processing. The image processing is done by using the ESO/MVM image reduction software developed at ESO, to produce astrometrically calibrated FITS images, ranging from simple previews of single archive images, to fully stacked mosaics. These data products can be accessed via the ESO Science Archive Query Form and also be viewed with the browser VirGO {http://archive.eso.org/cms/virgo}.

  7. Image dissemination and archiving.

    PubMed

    Robertson, Ian

    2007-08-01

    Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and COmmunications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in a DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archiving and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated system, allowing for efficient electronic distribution and storage of medical images and access to medical record data.

  8. Community archiving of imaging studies

    NASA Astrophysics Data System (ADS)

    Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita

    1996-05-01

    The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician, and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.

  9. Selective encryption for H.264/AVC video coding

    NASA Astrophysics Data System (ADS)

    Shi, Tuo; King, Brian; Salama, Paul

    2006-02-01

    Due to the ease with which digital data can be manipulated and due to the ongoing advancements that have brought us closer to pervasive computing, the secure delivery of video and images has become a challenging problem. Despite the advantages and opportunities that digital video provides, illegal copying and distribution as well as plagiarism of digital audio, images, and video is still ongoing. In this paper we describe two techniques for securing H.264 coded video streams. The first technique, SEH264Algorithm1, groups the data into the following blocks: (1) a block that contains the sequence parameter set and the picture parameter set; (2) a block containing a compressed intra-coded frame; (3) a block containing the slice header of a P slice, all the headers of the macroblocks within the same P slice, and all the luma and chroma DC coefficients belonging to all the macroblocks within the same slice; (4) a block containing all the AC coefficients; and (5) a block containing all the motion vectors. The first three are encrypted whereas the last two are not. The second method, SEH264Algorithm2, relies on the use of multiple slices per coded frame. The algorithm searches the compressed video sequence for start codes (0x000001) and then encrypts the next N bits of data.
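The start-code scan at the heart of the second scheme can be sketched as follows. The simple additive keystream is a placeholder for a real stream cipher, and for clarity N is counted in bytes rather than bits; neither detail is from the paper:

```python
START = b"\x00\x00\x01"  # H.264 Annex B start-code prefix

def keystream(key: bytes, n: int) -> bytes:
    # Placeholder keystream -- NOT secure; stands in for a real cipher.
    return bytes((key[i % len(key)] + i) & 0xFF for i in range(n))

def selective_encrypt(stream: bytes, key: bytes, n: int = 8) -> bytes:
    """XOR the n bytes following each start code, leaving the start
    codes themselves (and all other bytes) in the clear."""
    out = bytearray(stream)
    ks = keystream(key, n)
    i = stream.find(START)
    while i != -1:
        for j in range(n):
            p = i + len(START) + j
            if p < len(out):
                out[p] ^= ks[j]        # encrypt the N bytes after the code
        i = stream.find(START, i + len(START))
    return bytes(out)
```

Because the start codes are preserved, a decoder can still locate NAL-unit boundaries, but the scrambled leading bytes of each unit render the slices undecodable without the key; applying the same XOR again restores the original stream.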

  10. Managing an Archive of Images

    NASA Technical Reports Server (NTRS)

    Andres, Vince; Walter, David; Hallal, Charles; Jones, Helene; Callac, Chris

    2004-01-01

    The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional, film-based camera, along with metadata about each image. After a "shoot," a photographer downloads the images into the database. Members of the PAO can use a Web-based application to search, view, and retrieve images; approve images for publication; and view and edit metadata associated with the images. Approved images are archived and cross-referenced with appropriate descriptions and information. Security is provided by allowing administrators to explicitly grant personnel access privileges only to the components of the system that they need (e.g., only photographers may upload images, and only designated PAO employees may approve images).

  11. Archive of digital Chirp sub-bottom profile data collected during USGS Cruise 07SCC01 offshore of the Chandeleur Islands, Louisiana, June 2007

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2010-01-01

    In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 tells us the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study and the data were collected during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high resolution, shallow penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface. 
The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. 
(CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
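Reading the 3,200-byte card-image header mentioned above means accepting either the SEG-Y rev 1 ASCII encoding or the rev 0 EBCDIC encoding. A stdlib sketch, in which the single-byte EBCDIC-detection heuristic (rev 0 cards begin with EBCDIC 'C', 0xC3) is an assumption rather than part of the standard:

```python
def decode_text_header(raw: bytes) -> str:
    """Decode the 3,200-byte SEG-Y card-image header as 40 cards of
    80 characters, handling both ASCII (rev 1) and EBCDIC (rev 0)."""
    block = raw[:3200]
    if block[:1] == b"\xc3":                 # EBCDIC 'C' -> rev 0 header
        text = block.decode("cp037")
    else:
        text = block.decode("ascii", "replace")
    # Re-flow into the 40 x 80 card layout for display.
    return "\n".join(text[i:i + 80] for i in range(0, 3200, 80))
```

This is essentially what a header viewer like DUMPSEGY prints before the binary header and traces are examined.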

  12. VizieR Online Data Catalog: HD61005 SPHERE H and Ks images (Olofsson+, 2016)

    NASA Astrophysics Data System (ADS)

    Olofsson, J.; Samland, M.; Avenhaus, H.; Caceres, C.; Henning, T.; Moor, A.; Milli, J.; Canovas, H.; Quanz, S. P.; Schreiber, M. R.; Augereau, J.-C.; Bayo, A.; Bazzon, A.; Beuzit, J.-L.; Boccaletti, A.; Buenzli, E.; Casassus, S.; Chauvin, G.; Dominik, C.; Desidera, S.; Feldt, M.; Gratton, R.; Janson, M.; Lagrange, A.-M.; Langlois, M.; Lannier, J.; Maire, A.-L.; Mesa, D.; Pinte, C.; Rouan, D.; Salter, G.; Thalmann, C.; Vigan, A.

    2016-05-01

    The FITS files contain the reduced ADI and DPI SPHERE observations used to produce Fig. 1 of the paper. Besides the primary card, the files consist of 6 additional ImageHDUs. The first and second contain the SPHERE IRDIS ADI H band observations and the noise map. The third and fourth contain the SPHERE IRDIS ADI Ks band observations and the corresponding noise map. Finally, the fifth and sixth ImageHDUs contain the SPHERE IRDIS DPI H band data as well as the noise map. Each ADI image has 1024x1024 pixels, while the DPI images have 1800x1800 pixels. The header of the primary card contains the pixel sizes for each dataset and the wavelengths of the H and Ks band observations. (2 data files).
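FITS headers such as the primary card described here are sequences of 80-character keyword cards packed into 2,880-byte blocks. A minimal stdlib parser (string values, comments, and continuation cards deliberately simplified; a real reader would use astropy):

```python
def fits_header_cards(block: bytes) -> dict:
    """Extract keyword = value pairs from a FITS header block of
    80-character ASCII cards, stopping at the END card."""
    cards = {}
    for i in range(0, len(block), 80):
        card = block[i:i + 80].decode("ascii")
        key = card[:8].strip()
        if key == "END":
            break
        if card[8:10] == "= ":                      # value indicator
            cards[key] = card[10:].split("/")[0].strip()  # drop comment
    return cards
```

Keywords like NAXIS1/NAXIS2 recovered this way are how a reader would confirm the 1024x1024 versus 1800x1800 image dimensions quoted above.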

  13. Mega-precovery and data mining of near-Earth asteroids and other Solar System objects

    NASA Astrophysics Data System (ADS)

    Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team

    2014-07-01

    The vast collection of CCD images and photographic plate archives available from telescopes and archives world-wide is still insufficiently exploited. Within the EURONEAR project we designed two data-mining software tools to search very large collections of archives for images that serendipitously include known asteroids or comets in their fields, with the main aims of extending the observed arcs and improving the orbits. In this sense, "Precovery" (published in 2008, aiming to search all known NEAs in a few archives via IMCCE's SkyBoT server) and "Mega-Precovery" (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery searches for one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including the VLT), NVO (8 instruments served by the U.S. NVO Archive), and CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam, and AAT-WFI, adding up to 39 instruments and 4.3 million images (Mar 2014), and our Mega-Archive is growing. Here we present some of the most important results obtained with our data-mining software and some newly planned search options for Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included, and new search options will be made available (such as query by orbital elements and by observations) to be able to target new Solar System objects such as Virtual Impactors, bolides, planetary satellites, and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the "AOmegaA" factor (archival etendue), proportional to the AOmega (etendue) and the number of images in an archive. 
To enlarge the Mega-Archive database, we invite observatories (particularly those storing their images online, and also those that hold plate archives which could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. In the future we intend to synchronise our service with the Virtual Observatory.
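    The core of a precovery search is matching an object's predicted position against archived telescope pointings. As a toy illustration only (not the EURONEAR implementation), a flat-sky check against a list of pointings with square fields of view might look like this:

```python
import math

# Toy precovery match (my illustration, not the EURONEAR code): flag
# archive pointings whose field of view contains a predicted position.
# Flat-sky approximation, valid for small fields away from the poles.
def in_field(ra_obj, dec_obj, ra_ptg, dec_ptg, fov_deg):
    """True if (ra_obj, dec_obj) lies within a square field fov_deg
    across, centered on (ra_ptg, dec_ptg). All angles in degrees."""
    half = fov_deg / 2.0
    dra = (ra_obj - ra_ptg + 180.0) % 360.0 - 180.0  # wrap RA difference
    dra *= math.cos(math.radians(dec_ptg))           # shrink RA at high dec
    ddec = dec_obj - dec_ptg
    return abs(dra) <= half and abs(ddec) <= half

# Pointing list as (ra, dec, fov) in degrees, like a simple ASCII log.
pointings = [(150.10, 2.20, 0.5), (210.00, -5.00, 0.5)]
hits = [p for p in pointings if in_field(150.15, 2.30, p[0], p[1], p[2])]
```

    A production service additionally needs the observation epoch, an ephemeris service such as SkyBoT or Miriade for the predicted position, and real (often non-square) instrument footprints.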

  14. Sizing a PACS

    NASA Astrophysics Data System (ADS)

    Wilson, Dennis L.; Glicksman, Robert A.

    1994-05-01

    A Picture Archiving and Communications System (PACS) must be able to support the image rate of the medical treatment facility. In addition, the PACS must have adequate working-storage and archive-storage capacity. The calculation of the number of images per minute and of the capacity of working storage and of archive storage is discussed. The calculation takes into account the distribution of images over the different sizes of radiological images, the distribution between inpatients and outpatients, and the distribution over plain-film CR images and other modality images. The indirect clinical image load is difficult to estimate and is considered in some detail. The result of the exercise for a particular hospital is an estimate of the average size of the images and exams on the system, of the number of gigabytes of working storage, of the number of images moved per minute, of the size of the archive in gigabytes, and of the number of images that are to be moved by the archive per minute. The types of storage required to support these image rates and capacities are discussed.
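    The sizing calculation described above is essentially arithmetic over an exam mix. A minimal sketch, with all numbers invented placeholders rather than the paper's hospital figures:

```python
# Back-of-envelope PACS sizing in the spirit of the abstract; every
# number below is a made-up placeholder, not the hospital's actual data.
exam_mix = {                   # exams/day, avg images/exam, MB/image
    "CR plain film": (400, 4, 8.0),
    "CT":            (60, 120, 0.5),
    "MRI":           (40, 200, 0.13),
}

daily_gb = sum(n * imgs * mb for n, imgs, mb in exam_mix.values()) / 1024.0
images_per_min = sum(n * imgs for n, imgs, _ in exam_mix.values()) / (8 * 60)
working_storage_gb = daily_gb * 14        # e.g. keep two weeks online
archive_gb_per_year = daily_gb * 260      # working days per year
```

    The same structure extends naturally to the inpatient/outpatient split and the indirect clinical load the paper discusses, by adding rows or multipliers to the mix.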

  15. Multi-protocol header generation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, David A.; Ignatowski, Michael; Jayasena, Nuwan

    A communication device includes a data source that generates data for transmission over a bus, and a data encoder that receives and encodes the outgoing data. An encoder system receives outgoing data from a data source and stores it in a first queue. An encoder encodes the outgoing data with a header type that is based upon a header-type indication from a controller and stores the encoded data, which may be a packet or a data word with at least one layered header, in a second queue for transmission. The device is configured to receive, at a payload extractor, a packet protocol change command from the controller, to remove the encoded data, and to re-encode the data to create a re-encoded data packet, placing the re-encoded data packet in the second queue for transmission.
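    The two-queue flow described in the abstract can be sketched in software (this is an illustration of the described data flow, not the patented hardware): payloads wait in a first queue, an encoder prepends the controller-selected header, and a payload extractor can later strip and replace that header on a protocol change:

```python
from collections import deque

# Illustrative software sketch of the abstract's two-queue flow (not
# the patented hardware). A "|" separator stands in for real framing.
class HeaderEncoder:
    def __init__(self):
        self.raw = deque()      # first queue: unencoded payloads
        self.encoded = deque()  # second queue: packets ready to send

    def submit(self, payload: bytes):
        self.raw.append(payload)

    def encode_next(self, header_type: str):
        payload = self.raw.popleft()
        self.encoded.append(header_type.encode() + b"|" + payload)

    def change_protocol(self, new_header_type: str):
        # "Payload extractor": strip the old header and re-encode.
        old = self.encoded.popleft()
        payload = old.split(b"|", 1)[1]
        self.encoded.append(new_header_type.encode() + b"|" + payload)

enc = HeaderEncoder()
enc.submit(b"data-word")
enc.encode_next("PROTO-A")
enc.change_protocol("PROTO-B")
```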

  16. Codestream-Based Identification of JPEG 2000 Images with Different Coding Parameters

    NASA Astrophysics Data System (ADS)

    Watanabe, Osamu; Fukuhara, Takahiro; Kiya, Hitoshi

    A method of identifying JPEG 2000 images with different coding parameters, such as code-block sizes, quantization-step sizes, and resolution levels, is presented. It does not produce false-negative matches regardless of differences in coding parameters (compression rate, code-block size, and discrete wavelet transform (DWT) resolution levels) or quantization-step sizes. This feature is not provided by conventional methods. Moreover, the proposed approach is fast because it uses the number of zero bit-planes, which can be extracted from the JPEG 2000 codestream by parsing only the header information, without embedded block coding with optimized truncation (EBCOT) decoding. The experimental results revealed the effectiveness of image identification based on the new method.
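    To make the idea concrete, here is a toy comparison of header-derived features, not the paper's exact criterion: treat each codestream as a map from code-block index to its zero-bit-plane count, and compare only the blocks both codestreams share, with a small tolerance:

```python
# Toy illustration (not the paper's exact matching rule): identify
# images by per-code-block zero-bit-plane counts parsed from headers,
# compared over the code-blocks the two codestreams have in common.
def same_image(zbp_a: dict, zbp_b: dict, tolerance: int = 1) -> bool:
    """zbp_* map code-block index -> number of zero bit-planes."""
    common = zbp_a.keys() & zbp_b.keys()
    if not common:
        return False
    return all(abs(zbp_a[k] - zbp_b[k]) <= tolerance for k in common)

# Same picture coded at two rates: counts agree on shared blocks.
query = {0: 3, 1: 5, 2: 0, 3: 2}
candidate = {0: 3, 1: 5, 2: 1}   # fewer blocks, slight difference
other = {0: 7, 1: 0, 2: 4}       # a different picture
```

    The paper's key point is that these counts come from the codestream headers alone, so no EBCOT decoding is needed to compute them.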

  17. Efficient image data distribution and management with application to web caching architectures

    NASA Astrophysics Data System (ADS)

    Han, Keesook J.; Suter, Bruce W.

    2003-03-01

    We present compact image data structures and associated packet delivery techniques for effective Web caching architectures. Presently, images on a web page are inefficiently stored, using a single image per file. Our approach is to use clustering to merge similar images into a single file in order to exploit the redundancy between images. Our studies indicate that a 30-50% image data size reduction can be achieved by eliminating the redundancies of color indexes. Attached to this file is new metadata that permits easy extraction of the individual images. This approach permits a more efficient use of the cache, since a shorter list of cache references is required. Packet and transmission delays can be reduced by 50% by eliminating redundant TCP/IP headers and connection time. Thus, this paradigm for the elimination of redundancy may provide valuable benefits for optimizing packet delivery in IP networks by reducing latency and minimizing bandwidth requirements.
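    The header-overhead part of the claim is easy to illustrate with a rough model (all numbers are placeholders, not the paper's measurements): fetching N small images as one merged file amortizes the per-request header and connection overhead across the whole bundle.

```python
# Rough illustration of the savings mechanism; 560 B is an invented
# stand-in for per-request TCP/IP + HTTP header overhead.
def transfer_bytes(n_images, image_bytes, header_bytes=560, merged=False):
    if merged:
        return header_bytes + n_images * image_bytes  # one request total
    return n_images * (header_bytes + image_bytes)    # one request each

separate = transfer_bytes(20, 1500)
bundled = transfer_bytes(20, 1500, merged=True)
saving = 1 - bundled / separate
```

    The smaller the individual images relative to the header overhead, the larger the fraction saved, which is why the technique targets pages with many small images.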

  18. An Image Archive With The ACR/NEMA Message Formats

    NASA Astrophysics Data System (ADS)

    Seshadri, Sridhar B.; Khalsa, Satjeet; Arenson, Ronald L.; Brikman, Inna; Davey, Michael J.

    1988-06-01

    An image archive has been designed to manage and store radiologic images received from within the main Hospital and from a suburban orthopedic clinic. Images are stored on both magnetic and optical media. Prior comparison examinations are combined with the current examination to generate a 'viewing folder' that is sent to the display station for primary diagnosis. An 'archive-manager' controls the database management, periodic optical-disk backup, and 'viewing-folder' generation. Images are converted into the ACR/NEMA message format before being written to the optical disk. The software design of the 'archive-manager' and its associated modules is presented. Enhancements to the system are discussed.

  19. Development and Evaluation of a Clinical Note Section Header Terminology

    PubMed Central

    Denny, Joshua C.; Miller, Randolph A.; Johnson, Kevin B.; Spickard, Anderson

    2008-01-01

    Clinical documentation is often expressed in natural language text, yet providers often use common organizations that segment these notes into sections, such as “history of present illness” or “physical examination.” We developed a hierarchical section header terminology, supporting mappings to LOINC and other vocabularies; it contained 1109 concepts and 4332 synonyms. Physicians evaluated it against LOINC and the Evaluation and Management billing schema using a randomly selected corpus of history and physical notes. Evaluated documents contained a median of 54 sections and 27 “major sections.” There were 16,196 total sections in the evaluation note corpus. The terminology contained 99.9% of the clinical sections; LOINC matched 77% of section header concepts and 20% of section header strings in those documents. The section terminology may enable better clinical note understanding and interoperability. Future development and integration into natural language processing systems are needed. PMID:18999303

  20. Method and apparatus for eliminating unsuccessful tries in a search tree

    NASA Technical Reports Server (NTRS)

    Peterson, John C. (Inventor); Chow, Edward (Inventor); Madan, Herb S. (Inventor)

    1991-01-01

    A circuit switching system in an M-ary, n-cube connected network completes a best-first path from an originating node to a destination node by latching valid legs of the path as the path is being sought out. Each network node is provided with a routing hyperswitch sub-network, (HSN) connected between that node and bidirectional high capacity communication channels of the n-cube network. The sub-networks are all controlled by routing algorithms which respond to message identification headings (headers) on messages to be routed along one or more routing legs. The header includes information embedded therein which is interpreted by each sub-network to route and historically update the header. A logic circuit, available at every node, implements the algorithm and automatically forwards or back-tracks the header in the network legs of various paths until a completed path is latched.

  1. Automated Content Detection for Cassini Images

    NASA Astrophysics Data System (ADS)

    Stanboli, A.; Bue, B.; Wagstaff, K.; Altinok, A.

    2017-06-01

    NASA missions generate numerous images that are organized into increasingly large archives. These archives are currently not searchable by image content. We present an automated content detection prototype that can enable content search.

  2. Image acquisition context: procedure description attributes for clinically relevant indexing and selective retrieval of biomedical images.

    PubMed

    Bidgood, W D; Bray, B; Brown, N; Mori, A R; Spackman, K A; Golichowski, A; Jones, R H; Korman, L; Dove, B; Hildebrand, L; Berg, M

    1999-01-01

    To support clinically relevant indexing of biomedical images and image-related information based on the attributes of image acquisition procedures and the judgments (observations) expressed by observers in the process of image interpretation. The authors introduce the notion of "image acquisition context," the set of attributes that describe image acquisition procedures, and present a standards-based strategy for utilizing the attributes of image acquisition context as indexing and retrieval keys for digital image libraries. The authors' indexing strategy is based on an interdependent message/terminology architecture that combines the Digital Imaging and Communication in Medicine (DICOM) standard, the SNOMED (Systematized Nomenclature of Human and Veterinary Medicine) vocabulary, and the SNOMED DICOM microglossary. The SNOMED DICOM microglossary provides context-dependent mapping of terminology to DICOM data elements. The capability of embedding standard coded descriptors in DICOM image headers and image-interpretation reports improves the potential for selective retrieval of image-related information. This favorably affects information management in digital libraries.

  3. Clinical experiences with an ASP model backup archive for PACS images

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Cao, Fei; Documet, Luis; Huang, H. K.; Muldoon, Jean

    2003-05-01

    Last year we presented a Fault-Tolerant Backup Archive using an Application Service Provider (ASP) model for disaster recovery. The purpose of this paper is to provide an update and clinical experiences related to implementing the ASP model archive solution for short-term backup of clinical PACS image data, as well as possible applications other than disaster recovery. The ASP backup archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost and with little human intervention. This solution can be used for a variety of scheduled and unscheduled downtimes that occur on the main PACS archive. A backup archive server with hierarchical storage was implemented offsite from the main PACS archive location. Clinical data from a hospital PACS are sent to this ASP storage server in parallel to the exams being archived in the main server. Initially, connectivity between the main archive and the ASP storage server was established via a T-1 connection. In the future, other more cost-effective means of connectivity, such as Internet2, will be researched. We have integrated the ASP model backup archive with a clinical PACS at Saint John's Health Center, and it has been operational for over 6 months. Pitfalls encountered during integration with a live clinical PACS and the impact on clinical workflow will be discussed. In addition, estimates of the cost of establishing such a solution as well as the cost charged to the users will be included. Clinical downtime scenarios, such as a scheduled mandatory downtime and an unscheduled downtime due to a disaster event at the main archive, were simulated, and the PACS exams were sent successfully from the offsite ASP storage server back to the hospital PACS in less than 1 day. The ASP backup archive was able to recover PACS image data for comparison studies with no complex operational procedures. 
Furthermore, no image data loss was encountered during the recovery. During any clinical downtime scenario, the ASP backup archive server can repopulate a clinical PACS quickly with the majority of studies available for comparison during the interim until the main PACS archive is fully recovered.

  4. THE PANCHROMATIC STARBURST IRREGULAR DWARF SURVEY (STARBIRDS): OBSERVATIONS AND DATA ARCHIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D., E-mail: kmcquinn@astro.umn.edu

    2015-06-22

    Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images are available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging.

  5. Project MICAS: a multivendor open-system incremental approach to implementing an integrated enterprise-wide PACS: works in progress

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.

    1998-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-vendor incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID) with future 'hooks' to other hospital databases. Even though this approach to PACS is more risky than a single-vendor turn-key approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc. and Digital Equipment Corp (DEC). The network architecture operates at 100 MBits per sec except between the modalities and the stackable intelligent switch which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT and a Howtek film digitizer with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations which use ImageLabs Shared Vision viewer software located in MRI, CT, Nuclear Medicine, musculoskeletal reading areas and two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation and the archive vendor should have been responsible for image acquisition rather than the RIS vendor. 
Currently an archive vendor is being selected who will be responsible for the management of the archive plus the HIS/RIS interface, image acquisition, modality work list manager and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside of the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside of the University of Rochester Medical Center (URMC) campus and the teaching archive.

  6. Image Acquisition Context

    PubMed Central

    Bidgood, W. Dean; Bray, Bruce; Brown, Nicolas; Mori, Angelo Rossi; Spackman, Kent A.; Golichowski, Alan; Jones, Robert H.; Korman, Louis; Dove, Brent; Hildebrand, Lloyd; Berg, Michael

    1999-01-01

    Objective: To support clinically relevant indexing of biomedical images and image-related information based on the attributes of image acquisition procedures and the judgments (observations) expressed by observers in the process of image interpretation. Design: The authors introduce the notion of “image acquisition context,” the set of attributes that describe image acquisition procedures, and present a standards-based strategy for utilizing the attributes of image acquisition context as indexing and retrieval keys for digital image libraries. Methods: The authors' indexing strategy is based on an interdependent message/terminology architecture that combines the Digital Imaging and Communication in Medicine (DICOM) standard, the SNOMED (Systematized Nomenclature of Human and Veterinary Medicine) vocabulary, and the SNOMED DICOM microglossary. The SNOMED DICOM microglossary provides context-dependent mapping of terminology to DICOM data elements. Results: The capability of embedding standard coded descriptors in DICOM image headers and image-interpretation reports improves the potential for selective retrieval of image-related information. This favorably affects information management in digital libraries. PMID:9925229
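    The "context-dependent mapping of terminology to DICOM data elements" that the microglossary provides can be pictured as a lookup keyed by acquisition context and concept. This is a hedged sketch of the idea only; the concept names are illustrative and not taken from the actual SNOMED DICOM microglossary, though the two DICOM tags shown are real data elements:

```python
# Illustrative sketch of context-dependent term-to-element mapping.
# The (context, concept) keys are invented; the DICOM tags are real:
# (0018,0015) Body Part Examined and (0018,5101) View Position.
MICROGLOSSARY = {
    ("endoscopy", "anatomic site"):   (0x0018, 0x0015),
    ("radiography", "view position"): (0x0018, 0x5101),
}

def map_term(context: str, concept: str):
    """Return the DICOM tag that should carry a coded concept, or None."""
    return MICROGLOSSARY.get((context, concept))

tag = map_term("endoscopy", "anatomic site")
```

    Indexing then consists of reading these elements from image headers and using their coded values as retrieval keys.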

  7. LAMP Educational Site

    Science.gov Websites

    GSFC · NASA · SwRI · Denver Museum of Nature and Science. Are we ready to go back to the Moon? NASA took the first step in that direction in 2009 with the launch of

  8. Image acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Reardon, Frank J.; Salutz, James R.

    1991-07-01

    The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate with the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.

  9. a Geographic Data Gathering System for Image Geolocalization Refining

    NASA Astrophysics Data System (ADS)

    Semaan, B.; Servières, M.; Moreau, G.; Chebaro, B.

    2017-09-01

    Image geolocalization has become an important research field during the last decade. This field is divided into two main sections. The first is image geolocalization, which is used to find out which country, region, or city an image belongs to. The second is refining image localization for uses that require more accuracy, such as augmented reality and three-dimensional environment reconstruction from images. In this paper we present a processing chain that gathers geographic data from several sources in order to deliver a geolocalization more accurate than the GPS one of an image, along with precise camera pose parameters. To do so, we use multiple types of data. Some of this information is visible in the image and is extracted using image processing; other data can be extracted from image file headers or from related information on online image-sharing platforms. Extracted information elements will not be expressive enough if they remain disconnected. We show that grouping these information elements helps to find the best geolocalization of the image.
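    One concrete example of "data extracted from image file headers" is the EXIF GPS tag, which stores latitude and longitude as degree/minute/second rationals plus a hemisphere letter. A real pipeline would read the tags with a library such as Pillow; this stdlib-only helper (my sketch, not the paper's code) just shows the conversion to signed decimal degrees:

```python
# Convert EXIF-style GPS coordinates (DMS rationals + hemisphere ref)
# to signed decimal degrees. Sketch only; tag reading is left to a
# library such as Pillow in a real pipeline.
def dms_to_decimal(dms, ref):
    """dms: ((deg_num, deg_den), (min_num, min_den), (sec_num, sec_den));
    ref: 'N'/'S'/'E'/'W' hemisphere letter from GPSLatitudeRef etc."""
    deg, minutes, seconds = (n / d for n, d in dms)
    value = deg + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

# 47 deg 12' 18" N  ->  about 47.205 degrees
lat = dms_to_decimal(((47, 1), (12, 1), (18, 1)), "N")
lon = dms_to_decimal(((1, 1), (33, 1), (0, 1)), "W")
```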

  10. Open-Source Radiation Exposure Extraction Engine (RE3) with Patient-Specific Outlier Detection.

    PubMed

    Weisenthal, Samuel J; Folio, Les; Kovacs, William; Seff, Ari; Derderian, Vana; Summers, Ronald M; Yao, Jianhua

    2016-08-01

    We present an open-source, picture archiving and communication system (PACS)-integrated radiation exposure extraction engine (RE3) that provides study-, series-, and slice-specific data for automated monitoring of computed tomography (CT) radiation exposure. RE3 was built using open-source components and seamlessly integrates with the PACS. RE3 calculations of dose length product (DLP) from the Digital Imaging and Communications in Medicine (DICOM) headers showed high agreement (R² = 0.99) with the vendor dose pages. For study-specific outlier detection, RE3 constructs robust, automatically updating multivariable regression models to predict DLP in the context of patient gender and age, scan length, water-equivalent diameter (Dw), and scanned body volume (SBV). As proof of concept, the model was trained on 811 CT chest, abdomen + pelvis (CAP) exams and 29 outliers were detected. The continuous variables used in the outlier detection model were scan length (R² = 0.45), Dw (R² = 0.70), SBV (R² = 0.80), and age (R² = 0.01). The categorical variables were gender (male average 1182.7 ± 26.3 and female 1047.1 ± 26.9 mGy cm) and pediatric status (pediatric average 710.7 ± 73.6 mGy cm and adult 1134.5 ± 19.3 mGy cm).
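    The outlier-detection idea can be sketched in miniature (this is not RE3's actual multivariable model, and the numbers are made up): fit DLP against a predictor by least squares and flag exams whose residual exceeds a few standard deviations.

```python
import statistics

# Single-predictor sketch of residual-based outlier flagging; RE3
# itself uses a multivariable model. All data values are invented.
def flag_outliers(x, y, n_sigma=2.0):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) \
        / sum((a - mx) ** 2 for a in x)
    intercept = my - slope * mx
    residuals = [b - (intercept + slope * a) for a, b in zip(x, y)]
    sd = statistics.pstdev(residuals)
    return [i for i, r in enumerate(residuals) if abs(r) > n_sigma * sd]

scan_len = [30, 32, 35, 40, 42, 45, 50, 33]           # cm, made-up
dlp = [900, 960, 1050, 1200, 1260, 1350, 1500, 2600]  # mGy*cm, last is odd
outliers = flag_outliers(scan_len, dlp)
```

    Extending this to several predictors (age, Dw, SBV) and robust fitting gives the shape of the approach the abstract describes.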

  11. Archive of digital chirp subbottom profile data collected during USGS cruise 12BIM03 offshore of the Chandeleur Islands, Louisiana, July 2012

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.

    2014-01-01

    From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. 
As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. 
Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.

  12. ASTROPOP: ASTROnomical Polarimetry and Photometry pipeline

    NASA Astrophysics Data System (ADS)

    Campagnolo, Julio C. N.

    2018-05-01

    AstroPOP reduces almost any CCD photometry and image polarimetry data. For photometry reduction, the code performs source finding, aperture and PSF photometry, astrometry calibration using different automated and non-automated methods, and automated source identification and magnitude calibration based on online and local catalogs. For polarimetry, the code resolves linear and circular Stokes parameters produced by image-beam-splitter or polarizer polarimeters. In addition to the modular functions, ready-to-use pipelines based on configuration files and header keys are also provided with the code. AstroPOP was initially developed to reduce data from the IAGPOL polarimeter installed at Observatório Pico dos Dias (Brazil).
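    Aperture photometry, one of the reduction steps listed above, reduces to summing pixels in a circular aperture and subtracting a local sky level estimated from a surrounding annulus. A stdlib-only sketch of that principle (AstroPOP itself builds on astronomy libraries and does considerably more):

```python
import statistics

# Minimal aperture-photometry sketch: flux inside a circular aperture
# minus the annulus-median sky, on a plain nested-list "image".
def aperture_flux(img, cx, cy, r_ap, r_in, r_out):
    flux, sky_pixels, n_ap = 0.0, [], 0
    for y, row in enumerate(img):
        for x, value in enumerate(row):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if d2 <= r_ap ** 2:
                flux += value
                n_ap += 1
            elif r_in ** 2 <= d2 <= r_out ** 2:
                sky_pixels.append(value)
    return flux - statistics.median(sky_pixels) * n_ap

# Flat sky of 10 counts with a "star" adding 100 counts in one pixel.
img = [[10.0] * 21 for _ in range(21)]
img[10][10] += 100.0
flux = aperture_flux(img, 10, 10, 2, 5, 8)
```

    Real pipelines add sub-pixel aperture edges, PSF fitting, and error propagation on top of this basic measurement.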

  13. The Panchromatic STARBurst IRregular Dwarf Survey (STARBIRDS): Observations and Data Archive

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D.

    2015-06-01

    Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images are available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).

  14. CD-based image archival and management on a hybrid radiology intranet.

    PubMed

    Cox, R D; Henri, C J; Bret, P M

    1997-08-01

    This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY), and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scalable, and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage, and retrieval. Image data are made available on soft-copy reporting workstations through a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT), and ultrasonography.

  15. Medical image digital archive: a comparison of storage technologies

    NASA Astrophysics Data System (ADS)

    Chunn, Timothy; Hutchings, Matt

    1998-07-01

    A cost-effective, high-capacity digital archive system is one of the remaining key factors that will enable a radiology department to eliminate film as an archive medium. The ever-increasing amount of digital image data is creating the need for huge archive systems that can reliably store and retrieve millions of images and hold from a few terabytes of data to possibly hundreds of terabytes. Selecting the right archive solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, conformance to open standards, archive availability and reliability, security, cost, achievable benefits and cost savings, investment protection, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. New technologies will be discussed, such as DVD and high-performance tape. Price and performance comparisons will be made at different archive capacities, and the effect of file size on random and pre-fetch retrieval time will be analyzed. The concept of automated migration of images from high-performance RAID disk storage devices to high-capacity Nearline® storage devices will be introduced as a viable way to minimize overall storage costs for an archive.

  16. Reversible watermarking for knowledge digest embedding and reliability control in medical images.

    PubMed

    Coatrieux, Gouenou; Le Guillou, Clara; Cauvin, Jean-Michel; Roux, Christian

    2009-03-01

    To improve medical image sharing in applications such as e-learning or remote diagnosis aid, we propose to make the image more usable by watermarking it with a digest of its associated knowledge. The aim of such a knowledge digest (KD) is for it to be used for retrieving similar images with either the same findings or differential diagnoses. It summarizes the symbolic descriptions of the image, the symbolic descriptions of the findings semiology, and the similarity rules that contribute to balancing the importance of previous descriptors when comparing images. Instead of modifying the image file format by adding some extra header information, watermarking is used to embed the KD in the pixel gray-level values of the corresponding images. When shared through open networks, watermarking also helps to convey reliability proofs (integrity and authenticity) of an image and its KD. The interest of these new image functionalities is illustrated in the updating of the distributed users' databases within the framework of an e-learning application demonstrator of endoscopic semiology.
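
    As a toy illustration of the embedding idea only (this sketch uses plain, non-reversible LSB substitution, not the reversible watermarking scheme of the paper), a knowledge digest can be carried in the pixel gray levels instead of a file header:

```python
# Toy illustration: hide a knowledge-digest (KD) string in the
# least-significant bits of pixel gray levels, so the metadata travels
# inside the pixel data rather than in extra header fields. This simple
# LSB substitution is NOT the paper's reversible scheme.

def embed(pixels, message):
    """Write the bits of `message` (LSB-first per byte) into pixel LSBs."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for this digest")
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b      # overwrite the LSB only
    return out

def extract(pixels, n_bytes):
    """Read `n_bytes` back out of the pixel LSBs."""
    msg = bytearray()
    for j in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[j * 8 + i] & 1) << i
        msg.append(byte)
    return bytes(msg)

px = [100, 101, 102, 103] * 8           # 32 gray-level pixels
wm = embed(px, b"KD")                   # 2 bytes = 16 bits of digest
assert extract(wm, 2) == b"KD"
```

    A reversible scheme such as the one described above would additionally record enough side information to restore the original LSBs exactly after the digest is extracted.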

  17. Automated measurements of metabolic tumor volume and metabolic parameters in lung PET/CT imaging

    NASA Astrophysics Data System (ADS)

    Orologas, F.; Saitis, P.; Kallergi, M.

    2017-11-01

    Patients with lung tumors or inflammatory lung disease could benefit greatly, in terms of treatment and follow-up, from quantitative PET/CT imaging, namely measurements of metabolic tumor volume (MTV), standardized uptake values (SUVs), and total lesion glycolysis (TLG). The purpose of this study was the development of an unsupervised or partially supervised algorithm, using standard image processing tools, for measuring MTV, SUV, and TLG from lung PET/CT scans. Automated metabolic lesion volume and metabolic parameter measurements were achieved through a five-step algorithm: (i) segmentation of the lung areas on the CT slices, (ii) registration of the CT-segmented lung regions on the PET images to define the anatomical boundaries of the lungs on the functional data, (iii) segmentation of the regions of interest (ROIs) on the PET images based on adaptive thresholding and clinical criteria, (iv) estimation of the number of pixels and pixel intensities in the PET slices of the segmented ROIs, and (v) estimation of MTV, SUVs, and TLG from the previous step and DICOM header data. Whole-body PET/CT scans of patients with sarcoidosis were used for training and testing the algorithm. Lung area segmentation on the CT slices was better achieved with semi-supervised techniques, which reduced false positive detections significantly. Lung segmentation results agreed with the lung volumes published in the literature, while the agreement between experts and the algorithm in the segmentation of the lesions was around 88%. Segmentation results depended on the image resolution selected for processing. The clinical parameters, SUV (mean, max, or peak) and TLG, estimated from the segmented ROIs and DICOM header data, provided a way to correlate imaging data with clinical and demographic data. In conclusion, automated MTV, SUV, and TLG measurements offer powerful analysis tools in PET/CT imaging of the lungs. Custom-made algorithms are often a better approach than the manufacturer's general analysis software, at much lower cost. Relatively simple processing techniques can lead to customized, unsupervised or partially supervised methods that successfully perform the desired analysis and adapt to the specific disease requirements.
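
    Steps (iv)-(v) above can be sketched as follows; the function names and the simplified body-weight SUV formula (no decay correction) are assumptions for illustration, not the authors' implementation:

```python
# Hypothetical sketch of deriving SUV, MTV, and TLG from segmented PET ROI
# voxels plus values normally read from the DICOM header (injected dose,
# patient weight, voxel volume). Simplified: no radioactive-decay correction.

def suv(activity_bq_per_ml, injected_dose_bq, weight_g):
    """Body-weight SUV: tissue activity / (injected dose / body weight)."""
    return activity_bq_per_ml / (injected_dose_bq / weight_g)

def roi_metrics(voxels_bq_per_ml, voxel_volume_ml, injected_dose_bq, weight_g):
    suvs = [suv(v, injected_dose_bq, weight_g) for v in voxels_bq_per_ml]
    mtv_ml = len(suvs) * voxel_volume_ml          # metabolic tumor volume
    suv_mean = sum(suvs) / len(suvs)
    tlg = suv_mean * mtv_ml                       # total lesion glycolysis
    return {"MTV_ml": mtv_ml, "SUV_mean": suv_mean,
            "SUV_max": max(suvs), "TLG": tlg}

# Example: 4 ROI voxels of 0.2 ml each, 185 MBq injected, 70 kg patient
m = roi_metrics([9000.0, 12000.0, 15000.0, 10000.0], 0.2, 185e6, 70000.0)
```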

  18. Radiologic image communication and archive service: a secure, scalable, shared approach

    NASA Astrophysics Data System (ADS)

    Fellingham, Linda L.; Kohli, Jagdish C.

    1995-11-01

    The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.

  19. Fallon FORGE Well Lithologies

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    An x,y,z text file of the downhole lithologic interpretations in the wells in and around the Fallon FORGE site. All the relevant information is in the file header (the spatial reference, the projection, etc.). In addition, all the fields in the data file are identified in the header.

  20. Monolithic exploding foil initiator

    DOEpatents

    Welle, Eric J; Vianco, Paul T; Headley, Paul S; Jarrell, Jason A; Garrity, J. Emmett; Shelton, Keegan P; Marley, Stephen K

    2012-10-23

    A monolithic exploding foil initiator (EFI), or slapper detonator, and the method for making the monolithic EFI, wherein the exploding bridge and the dielectric from which the flyer will be generated are integrated directly onto the header. In some embodiments, the barrel is also integrated directly onto the header.

  1. Asynchronous broadcast for ordered delivery between compute nodes in a parallel computing system where packet header space is limited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Sameer

    Disclosed is a mechanism on receiving processors in a parallel computing system for providing order to data packets received from a broadcast call and for distinguishing data packets received at nodes from several incoming asynchronous broadcast messages where header space is limited. In the present invention, processors at lower leaves of a tree do not need to obtain a broadcast message by directly accessing the data in a root processor's buffer. Instead, each subsequent intermediate node's rank id information is squeezed into the software header of the packet headers. In turn, the entire broadcast message is not transferred from the root processor to each processor in a communicator, but instead is replicated on several intermediate nodes, which then replicate the message to nodes in lower leaves. Hence, the intermediate compute nodes become "virtual root compute nodes" for the purpose of replicating the broadcast message to lower levels of a tree.
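
    The idea of squeezing a chain of intermediate-node rank ids into a limited software header can be sketched as follows; the 4-slot, 16-bit layout and sentinel value are invented for illustration and are not the layout of the patent:

```python
# Illustrative sketch: pack up to 4 intermediate "virtual root" rank ids into
# a fixed 8-byte software header. 0xFFFF marks an unused slot (assumed layout).
import struct

HEADER_SLOTS = 4
UNUSED = 0xFFFF

def pack_ranks(ranks):
    """Pack the chain of rank ids into a fixed-size header, or fail loudly."""
    if len(ranks) > HEADER_SLOTS:
        raise ValueError("header space exhausted")
    padded = list(ranks) + [UNUSED] * (HEADER_SLOTS - len(ranks))
    return struct.pack("!4H", *padded)   # network byte order, 4 x uint16

def unpack_ranks(header):
    """Recover the rank-id chain, dropping unused slots."""
    return [r for r in struct.unpack("!4H", header) if r != UNUSED]

hdr = pack_ranks([12, 7, 3])   # root plus two intermediate virtual roots
assert len(hdr) == 8
assert unpack_ranks(hdr) == [12, 7, 3]
```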

  2. NIMBUS 7 Earth Radiation Budget (ERB) Matrix User's Guide. Volume 2: Tape Specifications

    NASA Technical Reports Server (NTRS)

    Ray, S. N.; Vasanth, K. L.

    1984-01-01

    The ERB MATRIX tape is generated by an IBM 3081 computer program and is a 9-track, 1600 BPI tape. The gross format of the tape, given on page 1, shows an initial standard header file followed by data files. The standard header file contains two standard header records. A trailing documentation file (TDF) is the last file on the tape. Pages 9 through 17 describe, in detail, the standard header file and the TDF. The data files contain data for 37 different ERB parameters. Each file has data based on either a daily, 6-day cyclic, or monthly time interval. There are three types of physical records in the data files: the world grid physical record, the documentation mercator/polar map projection physical record, and the monthly calibration physical record. The manner in which the data for the 37 ERB parameters are stored in the physical records comprising the data files is given in the gross format section.

  3. Design of Boiler Welding for Improvement of Lifetime and Cost Control.

    PubMed

    Thong-On, Atcharawadi; Boonruang, Chatdanai

    2016-11-03

    Fe-2.25Cr-1Mo is a widely used material for headers and steam tubes of boilers. Welding of the steam tube to the header is required for production of a boiler. The heat-affected zone of the weld can have poor mechanical properties and poor corrosion behavior, leading to weld failure. The cost of material used for the steam tube and header of a boiler should be controlled. This study proposes a new materials design for boiler welding to improve lifetime and cost control, using tungsten inert gas (TIG) welding of Fe-2.25Cr-1Mo tube to carbon steel pipe with a chromium-containing filler. The cost of production could be reduced by the use of low-cost material such as carbon steel pipe for the boiler header. The effect of chromium content on the corrosion behavior of the weld was greater than that of the microstructure. The lifetime of the welded boiler can be increased by improving the mechanical properties and corrosion behavior of the heat-affected zone.

  4. Design of Boiler Welding for Improvement of Lifetime and Cost Control

    PubMed Central

    Thong-On, Atcharawadi; Boonruang, Chatdanai

    2016-01-01

    Fe-2.25Cr-1Mo is a widely used material for headers and steam tubes of boilers. Welding of the steam tube to the header is required for production of a boiler. The heat-affected zone of the weld can have poor mechanical properties and poor corrosion behavior, leading to weld failure. The cost of material used for the steam tube and header of a boiler should be controlled. This study proposes a new materials design for boiler welding to improve lifetime and cost control, using tungsten inert gas (TIG) welding of Fe-2.25Cr-1Mo tube to carbon steel pipe with a chromium-containing filler. The cost of production could be reduced by the use of low-cost material such as carbon steel pipe for the boiler header. The effect of chromium content on the corrosion behavior of the weld was greater than that of the microstructure. The lifetime of the welded boiler can be increased by improving the mechanical properties and corrosion behavior of the heat-affected zone. PMID:28774014

  5. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  6. The Image and Data Archive at the Laboratory of Neuro Imaging.

    PubMed

    Crawford, Karen L; Neu, Scott C; Toga, Arthur W

    2016-01-01

    The LONI Image and Data Archive (IDA) is a repository for sharing and long-term preservation of neuroimaging and biomedical research data. Originally designed strictly to archive medical image files, the IDA has evolved over the last ten years and now encompasses the storage and dissemination of neuroimaging, clinical, biospecimen, and genetic data. In this article, we report upon the genesis of the IDA and how it currently securely manages data and protects data ownership. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Malfunction of subpectorally implanted cardiac resynchronization therapy defibrillators due to weakened header bond.

    PubMed

    Hayat, Sajad A; Kojodjojo, Pipin; Mason, Anthony; Benfield, Ann; Wright, Ian; Whinnett, Zachary; Lim, Phang Boon; Davies, D Wyn; Lefroy, David; Peters, Nicholas S; Kanagaratnam, Prapa

    2013-03-01

    Implantable cardioverter defibrillator (ICD) implantation has increased significantly over the last 10 years. Concerns about the safety and reliability of ICD systems have been raised, with premature lead failure and battery malfunctions accounting for the majority of reported adverse events. We describe the unique mode of presentation, diagnosis, and management of cardiac resynchronization therapy defibrillator (CRT-D) malfunctions that were caused by weakened bonding between the generator and header. Between June 2008 and December 2009, 22 Teligen™ ICDs and 24 Cognis™ CRT-Ds were implanted subpectorally at our institution, until a product advisory was issued. Of 24 Cognis™ CRT-D implants, 3 patients presented with CRT-D malfunctions. All our cases presented with initially intermittent and then persisting increases in shock lead impedance, associated with nonphysiological noise in the shock electrogram channels. These issues were rectified by generator change. Postexplant laboratory analysis confirmed inadequate bonding between device header and titanium casing in all cases, resulting in loosening and rocking of the header followed by fatigue-induced fracture of the shock circuitry. Weakened bonding between the header and generator casing of subpectorally implanted CRT-Ds can result in fractures and malfunction of the HV circuit. Physicians monitoring patients with devices affected by the product advisory should remain vigilant in order to diagnose and manage similar device malfunctions expeditiously. © 2012 Wiley Periodicals, Inc.

  8. Defense RDT&E Online System (DROLS) Handbook

    DTIC Science & Technology

    1993-07-01

    ... of the descriptor TROPICAL DISEASES hierarchically will produce the same results as a cumulated search of the following terms: CHOLERA, DENGUE, ... The Source Header List is a two-volume listing of all source names arranged in alphabetical order. Each entry consists of: Source Name ... [country-code table fragment: ... BB; Belgium = BE; Belize ...]

  9. 40 CFR 205.165 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... respect to the parameters listed in § 205.168 of this subpart. (2) Exhaust header pipe means any tube of... be “exhaust header pipes.” (3) Failing exhaust system means that, when installed on any Federally... EQUIPMENT NOISE EMISSION CONTROLS Motorcycle Exhaust Systems § 205.165 Definitions. (a) As used in this...

  10. Large variable conductance heat pipe. Transverse header

    NASA Technical Reports Server (NTRS)

    Edelstein, F.

    1975-01-01

    The characteristics of gas-loaded, variable conductance heat pipes (VCHP) are discussed. The difficulties involved in developing a large VCHP header are analyzed. The construction of the large capacity VCHP is described. A research project to eliminate some of the problems involved in large capacity VCHP operation is explained.

  11. Manifold to uniformly distribute a solid-liquid slurry

    DOEpatents

    Kern, Kenneth C.

    1983-01-01

    This invention features a manifold that divides a stream of coal particles and liquid into several smaller streams maintaining equal or nearly equal mass compositions. The manifold consists of a horizontal, variable area header having sharp-edged, right-angled take-offs which are oriented on the bottom of the header.

  13. Fault-tolerant back-up archive using an ASP model for disaster recovery

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Huang, H. K.; Cao, Fei; Documet, Luis; Sarti, Dennis A.

    2002-05-01

    A single point of failure in PACS during a disaster scenario is the main archive storage and server. When a major disaster occurs, it is possible to lose an entire hospital's PACS data. Few current PACS archives feature disaster recovery, and where they do, the design is limited at best. The drawbacks include the frequency with which the back-up is physically removed to an offsite facility, the operational costs associated with maintaining the back-up, the ease-of-use of performing the back-up consistently and efficiently, and the ease-of-use of performing the PACS image data recovery. This paper describes a novel approach towards a fault-tolerant solution for disaster recovery of short-term PACS image data using an Application Service Provider (ASP) model for service. The ASP back-up archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost. A back-up archive server and RAID storage device is implemented offsite from the main PACS archive location. In the example of this particular hospital, it was determined that at least 2 months' worth of PACS image exams were needed for back-up. Clinical data from a hospital PACS is sent to this ASP storage server in parallel to the exams being archived in the main server. A disaster scenario was simulated and the PACS exams were sent from the offsite ASP storage server back to the hospital PACS. Initially, connectivity between the main archive and the ASP storage server is established via a T-1 connection. In the future, other more cost-effective means of connectivity will be researched, such as the Internet 2. A disaster scenario was initiated, and the disaster recovery process using the ASP back-up archive server was successful in repopulating the clinical PACS within a short period of time. The ASP back-up archive was able to recover two months of PACS image data for comparison studies with no complex operational procedures. Furthermore, no image data loss was encountered during the recovery.

  14. Optimisation of solar synoptic observations

    NASA Astrophysics Data System (ADS)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is connected with steadily increasing needs for archiving large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by optimising the archiving, that is, by selecting data without losing the useful information. We describe a method of optimised archiving of solar images based on the selection of images that contain new information. The new information content is evaluated by analysing changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not conserve a uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
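
    The selection loop can be sketched as follows; the per-pixel tolerance and per-image change threshold are invented stand-ins for the change-detection sensitivity discussed above:

```python
# Illustrative sketch of change-based image selection for archiving: keep a
# frame only if enough of its pixels differ from the last archived frame.
# Thresholds are assumptions, not the authors' detector.

def changed_fraction(prev, curr, pixel_tol):
    """Fraction of pixels whose change exceeds a per-pixel tolerance."""
    diff = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_tol)
    return diff / len(curr)

def select_for_archive(frames, pixel_tol=5, change_threshold=0.02):
    archived = [frames[0]]                 # always archive the first frame
    for frame in frames[1:]:
        if changed_fraction(archived[-1], frame, pixel_tol) > change_threshold:
            archived.append(frame)         # new information: archive it
    return archived

flat  = [100] * 100
noisy = [100] * 99 + [120]                 # 1% changed: below threshold, skip
event = [100] * 90 + [200] * 10            # 10% changed: archive
kept = select_for_archive([flat, noisy, event])
assert kept == [flat, event]
```

    A real detector would also have to suppress fictitious changes (clouds, seeing) before this comparison, which is where the sensitivity tuning described above comes in.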

  15. [A new concept for integration of image databanks into a comprehensive patient documentation].

    PubMed

    Schöll, E; Holm, J; Eggli, S

    2001-05-01

    Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.

  16. XML at the ADC: Steps to a Next Generation Data Archive

    NASA Astrophysics Data System (ADS)

    Shaya, E.; Blackwell, J.; Gass, J.; Oliversen, N.; Schneider, G.; Thomas, B.; Cheung, C.; White, R. A.

    1999-05-01

    The eXtensible Markup Language (XML) is a document markup language that allows users to specify their own tags, to create hierarchical structures to qualify their data, and to support automatic checking of documents for structural validity. It is being intensively supported by nearly every major corporate software developer. With funding from a NASA AISRP proposal, the Astronomical Data Center (ADC, http://adc.gsfc.nasa.gov) is developing an infrastructure for importation, enhancement, and distribution of data and metadata using XML as the document markup language. We discuss the preliminary Document Type Definition (DTD, at http://adc.gsfc.nasa.gov/xml), which specifies the elements and their attributes in our metadata documents. This attempts to define both the metadata of an astronomical catalog and the `header' information of an astronomical table. In addition, we give an overview of the planned flow of data through automated pipelines from authors and journal presses into our XML archive, and retrieval through the web via the XML-QL Query Language and eXtensible Style Language (XSL) scripts. When completed, the catalogs and journal tables at the ADC will be tightly hyperlinked to enhance data discovery. One will also be able to search on fragmentary information. For instance, one could query for a table by entering that the second author is so-and-so or that the third author is at such-and-such institution.
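
    The kind of hierarchical, attribute-qualified query described above (e.g., "the second author is so-and-so") can be sketched against a toy metadata document; the element and attribute names here are invented for illustration and are not the ADC DTD:

```python
# Sketch of querying hierarchical XML catalog metadata. The document shape
# (catalog/authors/author with a "pos" attribute) is a made-up example.
import xml.etree.ElementTree as ET

doc = """<catalog>
  <title>Sample Survey Photometry</title>
  <authors>
    <author pos="1"><name>Smith</name><affiliation>GSFC</affiliation></author>
    <author pos="2"><name>Jones</name><affiliation>STScI</affiliation></author>
  </authors>
</catalog>"""

root = ET.fromstring(doc)

# Fragmentary query: who is the second author?
second = root.find(".//author[@pos='2']/name")
assert second.text == "Jones"

# Or: any author at such-and-such institution?
affiliations = [a.text for a in root.findall(".//affiliation")]
assert "STScI" in affiliations
```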

  17. Picture archiving and communication in radiology.

    PubMed

    Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale

    2003-01-01

    After over 80 years of exclusive archiving of radiologic films, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and space savings, but most importantly, immediate or remote consultation of all examinations and reports in the hospital clinical wards is feasible. The RIS system, in this case, is the starting point of the process of electronic archiving, which however is the task of PACS. The latter can be used as a radiologic archive in accordance with the law, provided that it conforms to certain specifications, such as the use of optical long-term storage media or media with an electronic record of changes. PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician of the ward. Modern PACS, owing to the WEB server, allow greatly simplified remote access to images and data while ensuring the required regulatory compliance and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should be able to minimize the risk of accidental data destruction, and should prevent unauthorized access to the archive with security measures that are adequate in relation to acquired knowledge and based on technological advances. Archiving of the data produced by modern digital imaging is a problem now present also in small radiology services. The technology can now readily solve problems that were extremely complex up to a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize the problems of remote transmission of data and images within the healthcare enterprise as well as over the territory. However, new problems are appearing, such as that of digital data security profiles and of the different systems that should ensure them. Among these, algorithms for electronic signatures should be mentioned. In Italy they are validated by law and can therefore be used in digital archives in accordance with the law.

  18. Microchannel laminated mass exchanger and method of making

    DOEpatents

    Martin, Peter M [Kennewick, WA; Bennett, Wendy D [Kennewick, WA; Matson, Dean W [Kennewick, WA; Stewart, Donald C [Richland, WA; Drost, Monte K [Pasco, WA; Wegeng, Robert S [Richland, WA; Perez, Joseph M [Richland, WA; Feng, Xiangdong [West Richland, WA; Liu, Jun [West Richland, WA

    2003-03-18

    The present invention is a microchannel mass exchanger having a first plurality of inner thin sheets and a second plurality of outer thin sheets. The inner thin sheets each have a solid margin around a circumference, the solid margin defining a slot through the inner thin sheet thickness. The outer thin sheets each have at least two header holes on opposite ends and sandwich an inner thin sheet between them. The outer thin sheets further have a mass exchange medium. The assembly forms a closed flow channel wherein fluid enters through one of the header holes into the slot and exits through another of the header holes after contacting the mass exchange medium.

  19. Microchannel laminated mass exchanger and method of making

    DOEpatents

    Martin, Peter M.; Bennett, Wendy D.; Matson, Dean W.; Stewart, Donald C.; Drost, Monte K.; Wegeng, Robert S.; Perez, Joseph M.; Feng, Xiangdong; Liu, Jun

    2000-01-01

    The present invention is a microchannel mass exchanger having a first plurality of inner thin sheets and a second plurality of outer thin sheets. The inner thin sheets each have a solid margin around a circumference, the solid margin defining a slot through the inner thin sheet thickness. The outer thin sheets each have at least two header holes on opposite ends and sandwich an inner thin sheet between them. The outer thin sheets further have a mass exchange medium. The assembly forms a closed flow channel wherein fluid enters through one of the header holes into the slot and exits through another of the header holes after contacting the mass exchange medium.

  20. Microchannel laminated mass exchanger and method of making

    DOEpatents

    Martin, Peter M [Kennewick, WA; Bennett, Wendy D [Kennewick, WA; Matson, Dean W [Kennewick, WA; Stewart, Donald C [Richland, WA; Drost, Monte K [Pasco, WA; Wegeng, Robert S [Richland, WA; Perez, Joseph M [Richland, WA; Feng, Xiangdong [West Richland, WA; Liu, Jun [West Richland, WA

    2002-03-05

    The present invention is a microchannel mass exchanger having a first plurality of inner thin sheets and a second plurality of outer thin sheets. The inner thin sheets each have a solid margin around a circumference, the solid margin defining a slot through the inner thin sheet thickness. The outer thin sheets each have at least two header holes on opposite ends and sandwich an inner thin sheet between them. The outer thin sheets further have a mass exchange medium. The assembly forms a closed flow channel wherein fluid enters through one of the header holes into the slot and exits through another of the header holes after contacting the mass exchange medium.

  1. IMAGE EXPLORER: Astronomical Image Analysis on an HTML5-based Web Application

    NASA Astrophysics Data System (ADS)

    Gopu, A.; Hayashi, S.; Young, M. D.

    2014-05-01

Large datasets produced by recent astronomical imagers mean that the traditional paradigm for basic visual analysis - typically downloading one's entire image dataset and using desktop clients like DS9, Aladin, etc. - no longer scales, despite advances in desktop computing power and storage. This paper describes Image Explorer, a web framework that offers much of the basic visualization and analysis functionality commonly provided by tools like DS9, on any HTML5-capable web browser on various platforms. It uses a combination of the modern HTML5 canvas, JavaScript, and several layers of lossless PNG tiles produced from the FITS image data. Astronomers are able to rapidly and simultaneously open several images in their web browser; adjust the intensity min/max cutoffs, scaling function, and zoom level; apply color maps; view position and FITS header information; execute commonly used data reduction codes on the corresponding FITS data using the FRIAA framework; and overlay tiles for source catalog objects.
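The tile-and-scale step such a viewer performs can be sketched as follows. This is an illustrative example only, not the Image Explorer source; the function names and tile layout are assumptions. It shows the intensity min/max cutoff mapping and the fixed-size tiling applied to image data before each tile is encoded as a PNG.

```python
# Hedged sketch: min/max cutoff scaling and tiling for a browser-based viewer.

def scale_pixel(value, vmin, vmax):
    """Linearly map a pixel value into the 0-255 display range,
    clipping to the [vmin, vmax] cutoff interval."""
    if vmax <= vmin:
        return 0
    v = min(max(value, vmin), vmax)
    return int(round(255 * (v - vmin) / (vmax - vmin)))

def make_tiles(image, tile_size):
    """Split a 2-D list of pixel rows into tile_size x tile_size sub-images
    (edge tiles may be smaller), keyed by (tile_row, tile_col)."""
    tiles = {}
    nrows, ncols = len(image), len(image[0])
    for r0 in range(0, nrows, tile_size):
        for c0 in range(0, ncols, tile_size):
            tiles[(r0 // tile_size, c0 // tile_size)] = [
                row[c0:c0 + tile_size] for row in image[r0:r0 + tile_size]
            ]
    return tiles
```

Serving pre-scaled tiles lets the browser fetch only the tiles covering the current viewport and zoom level, which is what makes the approach scale to large images.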

  2. 47 CFR 11.61 - Tests of EAS procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... EAS header codes, Attention Signal, Test Script and EOM code. (i) Tests in odd numbered months shall... substitute for a monthly test, activation must include transmission of the EAS header codes, Attention Signal, emergency message and EOM code and comply with the visual message requirements in § 11.51. To substitute for...

  3. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is delivered to customers under various environments. An important objective of such tests is to check whether network nodes meet their packet-processing duration requirements, because long processing durations cause performance degradation. This requires testers (the persons who perform the tests) to know precisely how long a packet is held by various network nodes. Without a tool's help, this task is time-consuming and error-prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. The recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
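The holding-duration calculation the tool enables can be illustrated with a minimal sketch (hypothetical data structures, not the authors' implementation): with synchronized clocks at two observation points, a node's holding time for each packet is simply the difference between its egress and ingress capture timestamps.

```python
# Illustrative sketch: per-packet holding time from two capture points.

def holding_durations(ingress, egress):
    """ingress/egress: dicts mapping packet id -> capture timestamp (seconds),
    recorded at the node's input and output observation points.
    Returns packet id -> holding duration for packets seen at both points."""
    return {pid: egress[pid] - t_in
            for pid, t_in in ingress.items() if pid in egress}
```

Packets missing from the egress capture (dropped, or lost by the capture itself) are simply excluded, which is why the paper stresses extraction without drops.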

  4. Astronomical Archive at Tartu Observatory

    NASA Astrophysics Data System (ADS)

    Annuk, K.

    2007-10-01

Archiving astronomical data is an important task not only at large observatories but also at small ones. Here we describe the astronomical archive at Tartu Observatory. The archive consists of old photographic plate images, photographic spectrograms, CCD direct images, and CCD spectroscopic data. The photographic plate digitizing project was started in 2005. An on-line database (based on MySQL) was created; it includes CCD data as well as photographic data. A PHP-MySQL interface was written for access to all data.
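A database holding both CCD and digitized photographic observations might look like the following sketch. The schema, table name, and example rows are assumptions for illustration; the observatory used MySQL with a PHP front end, while this self-contained example uses Python's built-in SQLite.

```python
import sqlite3

# Hypothetical unified observations table covering plates, spectrograms,
# CCD images, and CCD spectra (SQLite stand-in for the MySQL database).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE observations (
    obs_id   INTEGER PRIMARY KEY,
    object   TEXT,
    obs_date TEXT,
    kind     TEXT CHECK (kind IN ('plate', 'spectrogram',
                                  'ccd_image', 'ccd_spectrum')),
    filename TEXT)""")
conn.execute("INSERT INTO observations VALUES "
             "(1, 'CH Cyg', '1978-08-12', 'plate', 'p1978_0812.fits')")
conn.execute("INSERT INTO observations VALUES "
             "(2, 'CH Cyg', '2006-03-02', 'ccd_spectrum', 's2006_0302.fits')")

# A typical archive query: everything observed for one object, in date order.
rows = conn.execute("SELECT kind, filename FROM observations "
                    "WHERE object = 'CH Cyg' ORDER BY obs_date").fetchall()
```

Keeping a `kind` column lets one table serve both the digitized historical material and the ongoing CCD data, which matches the abstract's description of a single database for both.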

  5. Clinical experiences utilizing wireless remote control and an ASP model backup archive for a disaster recovery event

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

An Application Service Provider (ASP) archive model for disaster recovery for Saint John's Health Center (SJHC) clinical PACS data has been implemented using a Fault-Tolerant Archive Server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to provide clinical experiences with the implementation of an ASP model backup archive in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily using a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test the network connectivity and data integrity on a regular basis. In a given disaster scenario where the local PACS archive has been destroyed and the patients need to be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be utilized to route images to the second hospital's PACS, where they can be reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment at all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments as well as the wireless performance are discussed.
A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to successfully query PACS images utilizing a wireless handheld device from the ASP backup archive at IPIL and route the PACS images directly to a second clinical site at UCLA where they and the patients are located at that time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows Radiologists to use a wireless handheld device to control the image workflow and to review PACS images during a major disaster event where patients must be moved to a secondary site.

  6. PACS archive upgrade and data migration: clinical experiences

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Sarti, Dennis A.; Huang, H. K.; Donnelly, John

    2002-05-01

Saint John's Health Center PACS data volumes have increased dramatically since the hospital became filmless in April of 1999. This is due in part to continuous image accumulation and the integration of a new multi-slice detector CT scanner into the PACS. The original PACS archive would not be able to handle the distribution and archiving load and capacity in the near future. Furthermore, there was no secondary copy backup of the archived PACS image data for disaster recovery purposes. The purpose of this paper is to present a clinical and technical process template to upgrade and expand the PACS archive, migrate existing PACS image data to the new archive, and provide a backup and disaster recovery function not previously available. Discussion of the technical and clinical pitfalls and challenges involved in this process is presented as well. The server hardware configuration was upgraded and a secondary backup implemented for disaster recovery. The upgrade includes new software versions, database reconfiguration, and installation of a new tape jukebox to replace the current MOD jukebox. Upon completion, all PACS image data from the original MOD jukebox was migrated to the new tape jukebox and verified. The migration was performed continuously in the background during clinical operation. Once the data migration was completed, the MOD jukebox was removed. All newly acquired PACS exams are now archived to the new tape jukebox. All PACS image data residing on the original MOD jukebox have been successfully migrated into the new archive. In addition, a secondary backup of all PACS image data has been implemented for disaster recovery and has been verified using disaster scenario testing. No PACS image data was lost during the entire process and there was very little clinical impact during the entire upgrade and data migration.
Some of the pitfalls and challenges during this upgrade process included hardware reconfiguration for the original archive server, clinical downtime involved with the upgrade, and data migration planning to minimize impact on clinical workflow. The impact was minimized with a downtime contingency plan.
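The verify-after-copy discipline described above (migrate in the background, verify, only then retire the old jukebox) can be sketched as follows. This is a hypothetical illustration, not the site's migration software; the function names and storage callables are assumptions.

```python
import hashlib

# Hedged sketch of one migration-with-verification step: copy a study to the
# new archive and confirm the stored copy is byte-identical before the source
# is considered migrated.

def migrate_study(read_old, write_new, read_new, study_id):
    """read_old/read_new: callables returning the study's bytes from the old
    and new archives; write_new stores bytes in the new archive.
    Returns True only if the read-back copy matches the original."""
    data = read_old(study_id)
    write_new(study_id, data)
    copied = read_new(study_id)
    return hashlib.sha256(data).digest() == hashlib.sha256(copied).digest()
```

Running such a loop study-by-study in the background is what allows the migration to proceed during clinical operation without risking silent data loss.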

  7. Chapter 7:I-joists and headers

    Treesearch

    Brian K. Brashaw; Robert J. Ross

    2005-01-01

Prefabricated wood I-joists and headers are widely used in wood construction throughout the world. They are used in roof and floor systems in both residential and commercial applications. These structural members consist of flanges, made from either solid-sawn or laminated veneer lumber, that are adhesively bonded to a web made of plywood or oriented...

  8. 46 CFR 52.05-45 - Circumferential joints in pipes, tubes and headers (modifies PW-41).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Circumferential joints in pipes, tubes and headers (modifies PW-41). 52.05-45 Section 52.05-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING POWER BOILERS Requirements for Boilers Fabricated by Welding § 52.05-45...

  9. 46 CFR 52.05-45 - Circumferential joints in pipes, tubes and headers (modifies PW-41).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Circumferential joints in pipes, tubes and headers (modifies PW-41). 52.05-45 Section 52.05-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING POWER BOILERS Requirements for Boilers Fabricated by Welding § 52.05-45...

  10. 46 CFR 52.05-45 - Circumferential joints in pipes, tubes and headers (modifies PW-41).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Circumferential joints in pipes, tubes and headers (modifies PW-41). 52.05-45 Section 52.05-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING POWER BOILERS Requirements for Boilers Fabricated by Welding § 52.05-45...

  11. 46 CFR 52.05-45 - Circumferential joints in pipes, tubes and headers (modifies PW-41).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Circumferential joints in pipes, tubes and headers (modifies PW-41). 52.05-45 Section 52.05-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING POWER BOILERS Requirements for Boilers Fabricated by Welding § 52.05-45...

  12. 46 CFR 52.05-45 - Circumferential joints in pipes, tubes and headers (modifies PW-41).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Circumferential joints in pipes, tubes and headers (modifies PW-41). 52.05-45 Section 52.05-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING POWER BOILERS Requirements for Boilers Fabricated by Welding § 52.05-45...

  14. The Keck keyword layer

    NASA Technical Reports Server (NTRS)

    Conrad, A. R.; Lupton, W. F.

    1992-01-01

Each Keck instrument presents a consistent software view to the user interface programmer. The view consists of a small library of functions, which are identical for all instruments, and a large set of keywords that vary from instrument to instrument. All knowledge of the underlying task structure is hidden from the application programmer by the keyword layer. Image capture software uses the same function library to collect data for the image header. Because the image capture software and the instrument control software are built on top of the same keyword layer, a given observation can be 'replayed' by extracting keyword-value pairs from the image header and passing them back to the control system. The keyword layer features non-blocking as well as blocking I/O. A non-blocking keyword write operation (such as setting a filter position) specifies a callback to be invoked when the operation is complete. A non-blocking keyword read operation specifies a callback to be invoked whenever the keyword changes state. The keyword-callback style meshes well with the widget-callback style commonly used in X window programs. The first keyword library was built for the two Keck optical instruments. More recently, keyword libraries have been developed for the infrared instruments and for telescope control. Although the underlying mechanisms used for inter-process communication by each of these systems vary widely (Lick MUSIC, Sun RPC, and direct socket I/O, respectively), a basic user interface has been written that can be used with any of these systems. Since the keyword libraries are bound to user interface programs dynamically at run time, only a single set of user interface executables is needed. For example, the same program, 'xshow', can be used to continuously display the telescope's position, the time left in an instrument's exposure, or both values simultaneously.
Less generic tools that operate on specific keywords, for example an X display that controls optical instrument exposures, have also been written using the keyword layer.
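The keyword-callback style the abstract describes can be sketched minimally as follows. This is not the Keck library; class and method names are assumptions chosen to mirror the described semantics: blocking reads, non-blocking writes with a completion callback, and change-notification callbacks for non-blocking reads.

```python
# Hedged sketch of a keyword layer with blocking and callback-driven I/O.

class KeywordLayer:
    def __init__(self):
        self._values = {}
        self._watchers = {}            # keyword -> list of change callbacks

    def read(self, keyword):
        """Blocking read: return the keyword's current value."""
        return self._values.get(keyword)

    def watch(self, keyword, callback):
        """Non-blocking read: invoke callback(keyword, value) on every
        state change of the keyword."""
        self._watchers.setdefault(keyword, []).append(callback)

    def write(self, keyword, value, on_complete=None):
        """Write a keyword; notify watchers, then fire the completion
        callback (the non-blocking write pattern)."""
        self._values[keyword] = value
        for cb in self._watchers.get(keyword, []):
            cb(keyword, value)
        if on_complete:
            on_complete(keyword)
```

Because both instrument control and image capture talk to the same layer, replaying an observation reduces to feeding archived keyword-value pairs back through `write`.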

  15. Detailed description of the Mayo/IBM PACS

    NASA Astrophysics Data System (ADS)

    Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.

    1991-07-01

    The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging system's vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PAC system operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management and multiple PS/2-based image display systems and their image servers.

  16. Fallon FORGE Well Temp data

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

x, y, z downhole temperature data for wells in and around the Fallon FORGE site. Data for the following wells are included: 82-36, 82-19, 84.31, 61-36, 88-24, FOH-3D, FDU-1, and FDU-2. Data are provided in plain-text (txt) format, arranged in columns for importing into EarthVision software. Column headers and coordinate-system information are stored in the file header.

  17. Integral collector storage system with heat exchange apparatus

    DOEpatents

    Rhodes, Richard O.

    2004-04-20

The present invention relates to an integral solar energy collector storage system. Generally, an integral collector storage system includes a tank system, a plurality of heat exchange tubes with at least some of the heat exchange tubes arranged within the tank system, a first glazing layer positioned over the tank system, and a base plate positioned under the tank system. In one aspect of the invention, the tank system, the first glazing layer, and the base plate each include protrusions, and a clip is provided to hold the layers together. In another aspect of the invention, the first glazing layer and the base plate are ribbed to provide structural support. This arrangement is particularly useful when these components are formed from plastic. In yet another aspect of the invention, the tank system has a plurality of interconnected tank chambers formed from tubes. In this aspect, a supply header pipe and a fluid return header pipe are provided at a first end of the tank system. The heat exchange tubes have inlets coupled to the supply header pipe and outlets coupled to the return header pipe. With this arrangement, the heat exchange tubes may be inserted into the tank chambers from the first end of the tank system.

  18. Integration, acceptance testing, and clinical operation of the Medical Information, Communication and Archive System, phase II.

    PubMed

    Smith, E M; Wandtke, J; Robinson, A

    1999-05-01

The Medical Information, Communication and Archive System (MICAS) is a multivendor incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives in phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor open-system incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  19. SETI-EC: SETI Encryption Code

    NASA Astrophysics Data System (ADS)

    Heller, René

    2018-03-01

    The SETI Encryption code, written in Python, creates a message for use in testing the decryptability of a simulated incoming interstellar message. The code uses images in a portable bit map (PBM) format, then writes the corresponding bits into the message, and finally returns both a PBM image and a text (TXT) file of the entire message. The natural constants (c, G, h) and the wavelength of the message are defined in the first few lines of the code, followed by the reading of the input files and their conversion into 757 strings of 359 bits to give one page. Each header of a page, i.e. the little-endian binary code translation of the tempo-spatial yardstick, is calculated and written on-the-fly for each page.
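The page layout described above (the message is divided into pages of 757 strings of 359 bits each, with a per-page header) can be sketched as a pagination step. This is an illustrative reconstruction, not the SETI-EC source; the padding behavior is an assumption.

```python
# Hedged sketch: pack a flat bit string into pages of `lines` rows of
# `width` bits each, zero-padding the final page.

def paginate_bits(bits, width=359, lines=757):
    """bits: a string of '0'/'1' characters.
    Returns a list of pages; each page is a list of `lines` strings of
    `width` bits."""
    page_len = width * lines
    pages = []
    for start in range(0, len(bits), page_len):
        chunk = bits[start:start + page_len].ljust(page_len, "0")
        pages.append([chunk[i:i + width]
                      for i in range(0, page_len, width)])
    return pages
```

In the actual code, each page is then prefixed with its header, the little-endian binary translation of the tempo-spatial yardstick, computed on the fly.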

  20. PDS Archive Release of Apollo 11, Apollo 12, and Apollo 17 Lunar Rock Sample Images

    NASA Technical Reports Server (NTRS)

    Garcia, P. A.; Stefanov, W. L.; Lofgren, G. E.; Todd, N. S.; Gaddis, L. R.

    2013-01-01

    Scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory, Information Resources Directorate, and Image Science & Analysis Laboratory have been working to digitize (scan) the original film negatives of Apollo Lunar Rock Sample photographs [1, 2]. The rock samples, and associated regolith and lunar core samples, were obtained during the Apollo 11, 12, 14, 15, 16 and 17 missions. The images allow scientists to view the individual rock samples in their original or subdivided state prior to requesting physical samples for their research. In cases where access to the actual physical samples is not practical, the images provide an alternate mechanism for study of the subject samples. As the negatives are being scanned, they have been formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The Astromaterials Research and Exploration Science Directorate (which includes the Lunar Sample Laboratory and Image Science & Analysis Laboratory) at JSC is working collaboratively with the Imaging Node of the PDS on the archiving of these valuable data. The PDS Imaging Node is now pleased to announce the release of the image archives for Apollo missions 11, 12, and 17.

  1. Recovering Seasat SAR Data

    NASA Astrophysics Data System (ADS)

    Logan, T. A.; Arko, S. A.; Rosen, P. A.

    2013-12-01

To demonstrate the feasibility of orbital remote sensing for global ocean observations, NASA launched Seasat on June 27th, 1978. As the first spaceborne SAR mission, Seasat produced the most detailed SAR images of Earth from space seen to that point in time. While much of the data collected in the USA was processed optically, a mere 150 scenes had been digitally processed by March 1980. In fact, only an estimated 3% of Seasat data was ever digitally processed. Thus, for over three decades, the majority of the SAR data from this historic mission has been dormant, virtually unavailable to scientists in the 21st century. Over the last year, researchers at the Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC) have processed the Seasat SAR archives into imagery products. A telemetry decoding system was created and the data were filtered into readily processable signal files. Due to nearly 35 years of bit rot, the bit error rate (BER) for the ASF DAAC Seasat archives was on the order of 1 out of 100 to 1 out of 100,000. This extremely high BER initially seemed to make much of the data undecodable: because the minor frame numbers are only 7 bits and no range-line numbers exist in the telemetry, even the 'simple' tasks of tracking the minor frame number or locating the start of each range line proved difficult. Eventually, using 5 frame numbers in sequence and a handful of heuristics, the data were successfully decoded into full range lines. Concurrently, all metadata were stored into external files. Recovery of this metadata was also problematic, the BER making the information highly suspect and, initially at least, unusable in any automated fashion. Because of the BER, all of the single-bit metadata fields proved unreliable. Even fields that should be constant for a data take (e.g. receiving station, day of the year) showed high variability, each requiring a median filter to be usable.
The most challenging, however, were the supposedly 'steadily' changing millisecond (MSEC) timing values. The elevated BER made even a basic linear fit difficult. In addition, the MSEC field often shows a 'stair step' function, assumed to be a spacecraft clock malfunction. To fix these issues, three separate levels of time filtering were applied. After the initial three-pass time filter, a fourth procedure located and removed discontinuities - missing data sections that occurred randomly throughout the data takes - by inserting random-valued lines into the affected data file and repeated-value lines into the corresponding header file. Finally, a fifth pass through the metadata was required to fix remaining start time anomalies. After the data were filtered, all times were linearly increasing, and all discontinuities filled, images could finally be formed. ASF DAAC utilized a custom version of ROI, the Repeat Orbit Interferometric SAR processor, to focus the data. Special focusing tasks for Seasat included dealing with Doppler ambiguity issues and filtering out 'spikes' in the power spectra. Once these obstacles were overcome via additional pre-processing software developed in house, well-focused SAR imagery was obtained from approximately 80% of the ASF DAAC archives. These focused products, packaged in either HDF5 or GeoTIFF formats with XML metadata, are downloadable from ASF DAAC free of charge.
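The median-filter cleanup of nominally constant metadata fields (receiving station, day of year) can be illustrated with a short sketch. This is not the ASF DAAC code; the window size is an assumption. The idea is that isolated bit-error outliers vanish when each value is replaced by the median of its neighborhood.

```python
from statistics import median

# Hedged sketch: sliding-window median filter over a metadata field that
# should be constant across a data take but is corrupted by bit errors.

def median_filter(values, window=5):
    """Replace each value with the median of a window centered on it
    (the window is truncated at the ends of the sequence)."""
    half = window // 2
    return [median(values[max(0, i - half):i + half + 1])
            for i in range(len(values))]
```

A similar robust-fit philosophy, applied in multiple passes, underlies the MSEC time filtering described above.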

  2. Content-based retrieval of historical Ottoman documents stored as textual images.

    PubMed

    Saykol, Ediz; Sinop, Ali Kemal; Güdükbay, Ugur; Ulusoy, Ozgür; Cetin, A Enis

    2004-03-01

There is an accelerating demand to access the visual content of documents stored in historical and cultural archives. The availability of electronic imaging tools and effective image processing techniques makes it feasible to process the multimedia data in large databases. In this paper, a framework for content-based retrieval of historical documents in the Ottoman Empire archives is presented. The documents are stored as textual images, which are compressed by constructing a library of symbols occurring in a document; the symbols in the original image are then replaced with pointers into the codebook to obtain a compressed representation of the image. Features in the wavelet and spatial domains, based on the angular and distance spans of shapes, are used to extract the symbols. To support content-based retrieval in historical archives, a query is specified as a rectangular region in an input image and the same symbol-extraction process is applied to the query region. The queries are processed on the codebook of documents and the query images are identified in the resulting documents using the pointers in the textual images. The querying process does not require decompression of images. The new content-based retrieval framework is also applicable to many other document archives using different scripts.
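The codebook idea at the core of this representation can be sketched briefly. This is an illustrative reconstruction, not the authors' code: each distinct symbol shape is stored once in a library, and the page becomes a list of pointers (symbol id plus position), so queries can run against the codebook without decompressing pages.

```python
# Hedged sketch: codebook compression of a textual image's symbols.

def compress_symbols(symbols):
    """symbols: list of (shape, x, y) tuples, where `shape` is any hashable
    description of an extracted symbol.
    Returns (codebook, pointers): each distinct shape is stored once in the
    codebook, and pointers reference it by index along with the position."""
    codebook, index, pointers = [], {}, []
    for shape, x, y in symbols:
        if shape not in index:
            index[shape] = len(codebook)
            codebook.append(shape)
        pointers.append((index[shape], x, y))
    return codebook, pointers
```

A query then only needs to match its extracted symbols against the codebook; matching pointer entries locate the occurrences in the compressed documents.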

  3. Block selective redaction for minimizing loss during de-identification of burned in text in irreversibly compressed JPEG medical images.

    PubMed

    Clunie, David A; Gebow, Dan

    2015-01-01

Deidentification of medical images requires attention both to header information and to the pixel data itself, in which burned-in text may be present. If the pixel data to be deidentified are stored in a compressed form, traditionally they are decompressed, identifying text is redacted, and, if necessary, the pixel data are recompressed. Decompression without recompression may result in images of excessive or intractable size. Recompression with an irreversible scheme is undesirable because it may cause additional loss in the diagnostically relevant regions of the images. The irreversible (lossy) JPEG compression scheme works on small blocks of the image independently; hence, redaction can be selectively confined to only those blocks containing identifying text, leaving all other blocks unchanged. An open source implementation of selective redaction and a demonstration of its applicability to multiframe color ultrasound images is described. The process can be applied either to standalone JPEG images or to JPEG bit streams encapsulated in other formats, which in the case of medical images is usually DICOM.
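The block-confinement property can be made concrete with a small sketch. This is a conceptual illustration, not the paper's open-source implementation: because baseline JPEG codes the image in independent 8x8 blocks, a pixel rectangle containing burned-in text maps to a bounded set of block indices, and only those blocks need to be redacted and re-encoded.

```python
# Hedged sketch: which 8x8 JPEG blocks does a redaction rectangle touch?

BLOCK = 8

def blocks_to_redact(x0, y0, x1, y1):
    """Return the set of (block_row, block_col) indices of 8x8 blocks that
    intersect the half-open pixel rectangle [x0, x1) x [y0, y1)."""
    return {(by, bx)
            for by in range(y0 // BLOCK, (y1 - 1) // BLOCK + 1)
            for bx in range(x0 // BLOCK, (x1 - 1) // BLOCK + 1)}
```

Every block outside this set can be copied into the output bit stream untouched, which is what avoids additional loss in the diagnostically relevant regions.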

  4. Imaged Document Optical Correlation and Conversion System (IDOCCS)

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-03-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). In addition, many organizations are converting their paper archives to electronic images, which are stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources. The Imaged Document Optical Correlation and Conversion System (IDOCCS) provides a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval capability of document images. The IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and can even determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo, or documents with a particular individual's signature block, can be singled out. With this dual capability, IDOCCS outperforms systems that rely on optical character recognition as a basis for indexing and storing only the textual content of documents for later retrieval.

  5. TCIA: An information resource to enable open science.

    PubMed

    Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo

    2013-01-01

    Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).

  6. A hierarchical storage management (HSM) scheme for cost-effective on-line archival using lossy compression.

    PubMed

    Avrin, D E; Andriole, K P; Yin, L; Gould, R G; Arenson, R L

    2001-03-01

    A hierarchical storage management (HSM) scheme for cost-effective on-line archival of image data using lossy compression is described. This HSM scheme also provides an off-site tape backup mechanism and disaster recovery. The full-resolution image data are viewed originally for primary diagnosis, then losslessly compressed and sent off site to a tape backup archive. In addition, the original data are wavelet lossy compressed (at approximately 25:1 for computed radiography, 10:1 for computed tomography, and 5:1 for magnetic resonance) and stored on a large RAID device for maximum cost-effective, on-line storage and immediate retrieval of images for review and comparison. This HSM scheme provides a solution to 4 problems in image archiving, namely cost-effective on-line storage, disaster recovery of data, off-site tape backup for the legal record, and maximum intermediate storage and retrieval through the use of on-site lossy compression.
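The modality-dependent ratios quoted above can be expressed as a simple lookup used when estimating the on-line footprint of a study. The modality codes and helper function are assumptions for illustration; only the approximate ratios come from the abstract.

```python
# Hedged sketch: approximate lossy compression ratios per modality
# (CR = computed radiography, CT = computed tomography, MR = magnetic
# resonance), as quoted in the abstract.
LOSSY_RATIO = {"CR": 25, "CT": 10, "MR": 5}

def online_size_mb(modality, size_mb):
    """Approximate on-line RAID footprint of a study after lossy
    compression at the modality's ratio."""
    return size_mb / LOSSY_RATIO[modality]
```

The losslessly compressed copy sent to off-site tape is unaffected by these ratios; only the on-line review tier uses the lossy form.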

  7. Taking digital imaging to the next level: challenges and opportunities.

    PubMed

    Hobbs, W Cecyl

    2004-01-01

New medical imaging technologies, such as multi-detector computed tomography (CT) scanners and positron emission tomography (PET) scanners, are creating new possibilities for non-invasive diagnosis that are leading providers to invest heavily in them. The volume of data produced by such technology is so large that it cannot be "read" using traditional film-based methods, and once in digital form, it creates a massive data integration and archiving challenge. Despite the benefits of digital imaging and archiving, there are several key challenges that healthcare organizations should consider in planning, selecting, and implementing the information technology (IT) infrastructure to support digital imaging. Decisions about storage and image distribution are essentially questions of "where" and "how fast." When planning the digital archiving infrastructure, organizations should think about where they want to store and distribute their images. This is similar to the decisions organizations have to make regarding physical film storage and distribution, except that the portability of images is even greater in a digital environment. The principle of "network effects" seems like a simple concept, yet the effect is not always considered when implementing a technology plan. To fully realize the benefits of digital imaging, the radiology department must integrate the archiving solutions throughout the department and, ultimately, with applications across other departments and enterprises. Medical institutions can derive a number of benefits from implementing digital imaging and archiving solutions like PACS. Hospitals and imaging centers can use the transition from film-based imaging as a foundational opportunity to reduce costs, increase competitive advantage, attract talent, and improve service to patients. The key factors in achieving these goals include attention to the means of data storage, distribution, and protection.

  8. Method and system for the diagnosis of disease using retinal image content and an archive of diagnosed human patient data

    DOEpatents

    Tobin, Kenneth W; Karnowski, Thomas P; Chaum, Edward

    2013-08-06

    A method for diagnosing diseases having retinal manifestations including retinal pathologies includes the steps of providing a CBIR system including an archive of stored digital retinal photography images and diagnosed patient data corresponding to the retinal photography images, the stored images each indexed in a CBIR database using a plurality of feature vectors, the feature vectors corresponding to distinct descriptive characteristics of the stored images. A query image of the retina of a patient is obtained. Using image processing, regions or structures in the query image are identified. The regions or structures are then described using the plurality of feature vectors. At least one relevant stored image from the archive based on similarity to the regions or structures is retrieved, and an eye disease or a disease having retinal manifestations in the patient is diagnosed based on the diagnosed patient data associated with the relevant stored image(s).
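    The retrieval step can be sketched as a nearest-neighbor ranking over stored feature vectors; every record, vector, and diagnosis below is hypothetical, and a real CBIR system would use far richer descriptors:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_similar(query_vec, archive, k=3):
    # Rank archived records (feature vector + diagnosed patient data)
    # by similarity to the query image's feature vector.
    ranked = sorted(archive,
                    key=lambda rec: cosine_similarity(query_vec, rec["features"]),
                    reverse=True)
    return ranked[:k]

# Hypothetical archive pairing feature vectors with diagnosed patient data.
archive = [
    {"id": "img1", "features": [0.9, 0.1, 0.0], "diagnosis": "diabetic retinopathy"},
    {"id": "img2", "features": [0.1, 0.8, 0.1], "diagnosis": "normal"},
    {"id": "img3", "features": [0.85, 0.2, 0.05], "diagnosis": "diabetic retinopathy"},
]

# Feature vector extracted from the query retinal image (invented values).
matches = retrieve_similar([0.88, 0.15, 0.02], archive, k=2)
```

    A diagnosis would then be suggested from the diagnosed patient data attached to the retrieved records.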

  9. WFC3/UVIS Updated 2017 Chip-Dependent Inverse Sensitivity Values

    NASA Astrophysics Data System (ADS)

    Deustua, S. E.; Mack, J.; Bajaj, V.; Khandrika, H.

    2017-06-01

We present chip-dependent inverse sensitivity values recomputed for the 42 full frame filters based on the analysis of standard star observations with the WFC3/UVIS imager obtained between 2009 and 2015. Chip-dependent inverse sensitivities reported in the image header are now for the 'infinite' aperture, which is defined to have a radius of 6 arcseconds (151 pixels), and supersede the 2016 photometry header keyword values (PHOTFLAM, PHTFLAM1, PHTFLAM2), which correspond to a 0.3962 arcsecond (10 pixel) aperture. These new values are implemented in the June 2017 IMPHTTAB delivery and are concordant with the current synthetic photometry tables in the reference file database (CRDS). Since approximately 90% of the light is enclosed within 10 pixels, the new keyword values are 10% smaller. We also compute inverse sensitivities for an aperture with radius of 0.3962 arcseconds. Compared to the 2016 implementation, these new inverse sensitivity values differ by less than 0.5%, on average, for the same aperture. Values for the filters F200LP, F350LP, F600LP and F487N changed by more than 1% for UVIS1. UVIS2 values that changed by more than 1% are for the filters F350LP, F600LP, F850LP, F487N, and F814W. The 2017 VEGAmag zeropoint values in the UV change by up to 0.1 mag compared to 2016 and are calculated using the CALSPEC STIS spectrum for Vega. In 2016, the zeropoints were calculated with the CALSPEC Vega model.
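    The relationship between the two apertures can be illustrated with a line of arithmetic; the 90% encircled-energy fraction is the approximate figure quoted above, and the starting keyword value is invented:

```python
# Illustrative arithmetic only; the encircled-energy fraction is the
# approximate value quoted in the abstract, not a measured per-filter number.
ee_fraction_10px = 0.90     # fraction of light enclosed within a 10-pixel aperture

photflam_10px = 1.0e-19     # hypothetical inverse sensitivity, 10-pixel aperture

# A source's count rate in the 'infinite' aperture is larger by 1/ee_fraction,
# so the inverse sensitivity (flux per unit count rate) is smaller by that
# same fraction.
photflam_inf = photflam_10px * ee_fraction_10px

fractional_change = 1 - photflam_inf / photflam_10px   # ~10% smaller
```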

  10. Imaged document information location and extraction using an optical correlator

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-12-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). Many of these organizations are converting their paper archives to electronic images, which are then stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources and provide for rapid access to the information contained within these imaged documents. To meet this need, Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides a means for the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and has the potential to determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.

  11. 40 CFR 63.7325 - What test methods and other procedures must I use to demonstrate initial compliance with the TDS...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery... applied to the coke (e.g., from the header that feeds water to the quench tower reservoirs). Conduct... sample of the quench water as applied to the coke (e.g., from the header that feeds water to the quench...

  12. 40 CFR 63.7325 - What test methods and other procedures must I use to demonstrate initial compliance with the TDS...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery... applied to the coke (e.g., from the header that feeds water to the quench tower reservoirs). Conduct... sample of the quench water as applied to the coke (e.g., from the header that feeds water to the quench...

  13. 40 CFR 63.7325 - What test methods and other procedures must I use to demonstrate initial compliance with the TDS...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery... applied to the coke (e.g., from the header that feeds water to the quench tower reservoirs). Conduct... sample of the quench water as applied to the coke (e.g., from the header that feeds water to the quench...

  14. 40 CFR 63.7325 - What test methods and other procedures must I use to demonstrate initial compliance with the TDS...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) National Emission Standards for Hazardous Air Pollutants for Coke Ovens: Pushing, Quenching, and Battery... applied to the coke (e.g., from the header that feeds water to the quench tower reservoirs). Conduct... sample of the quench water as applied to the coke (e.g., from the header that feeds water to the quench...

  15. Writing user selectable data on the extended header of seismic recordings made on the Texas Instruments DFS-V

    USGS Publications Warehouse

    Robinson, W.C.

    1996-01-01

    A circuit has been developed to allow the writing of up to 192 digits of user-selectable data on a portion of tape called extended header, which is always available for use before each DFS-V seismic record is written. Such data could include navigation information, air gun and streamer depth and shot times.

  16. Expanding the PACS archive to support clinical review, research, and education missions

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Drane, Walter E.

    1999-07-01

    Designing an image archive and retrieval system that supports multiple users with many different requirements and patterns of use without compromising the performance and functionality required by diagnostic radiology is an intellectual and technical challenge. A diagnostic archive, optimized for performance when retrieving diagnostic images for radiologists, needed to be expanded to support a growing clinical review network, the University of Florida Brain Institute's demands for neuro-imaging, Biomedical Engineering's imaging sciences, and an electronic teaching file. Each of the groups presented a different set of problems for the designers of the system. In addition, the radiologists did not want to see any loss of performance as new users were added.

  17. Automated search and retrieval of information from imaged documents using optical correlation techniques

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-10-01

    Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.

  18. Autonomous grain combine control system

    DOEpatents

    Hoskinson, Reed L.; Kenney, Kevin L.; Lucas, James R.; Prickel, Marvin A.

    2013-06-25

    A system for controlling a grain combine having a rotor/cylinder, a sieve, a fan, a concave, a feeder, a header, an engine, and a control system. The feeder of the grain combine is engaged and the header is lowered. A separator loss target, engine load target, and a sieve loss target are selected. Grain is harvested with the lowered header passing the grain through the engaged feeder. Separator loss, sieve loss, engine load and ground speed of the grain combine are continuously monitored during the harvesting. If the monitored separator loss exceeds the selected separator loss target, the speed of the rotor/cylinder, the concave setting, the engine load target, or a combination thereof is adjusted. If the monitored sieve loss exceeds the selected sieve loss target, the speed of the fan, the size of the sieve openings, or the engine load target is adjusted.
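    One iteration of this control logic can be sketched as follows; the targets, settings, choice of which permitted adjustment to apply, and step sizes are all hypothetical, since the patent does not specify magnitudes:

```python
def adjust_combine(separator_loss, sieve_loss, targets, settings):
    """One pass of the monitoring loop described above (hypothetical values).

    If separator loss exceeds its target, adjust the rotor/cylinder speed
    (one of the responses named in the patent); if sieve loss exceeds its
    target, adjust the fan speed (likewise one of several options).
    """
    settings = dict(settings)  # do not mutate the caller's settings
    if separator_loss > targets["separator_loss"]:
        settings["rotor_speed_rpm"] -= 25   # illustrative step size
    if sieve_loss > targets["sieve_loss"]:
        settings["fan_speed_rpm"] += 20     # illustrative step size
    return settings

# Hypothetical loss targets (percent) and machine settings.
targets = {"separator_loss": 1.5, "sieve_loss": 1.0}
settings = {"rotor_speed_rpm": 900, "fan_speed_rpm": 1100}

# Monitored separator loss exceeds its target; sieve loss does not.
new_settings = adjust_combine(separator_loss=2.0, sieve_loss=0.8,
                              targets=targets, settings=settings)
```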

  19. Ignitor with stable low-energy thermite igniting system

    DOEpatents

    Kelly, Michael D.; Munger, Alan C.

    1991-02-05

    A stable compact low-energy igniting system in an ignitor utilizes two components, an initiating charge and an output charge. The initiating charge is a thermite in ultra-fine powder form compacted to 50-70% of theoretical maximum density and disposed in a cavity of a header of the ignitor adjacent to an electrical ignition device, or bridgewire, mounted in the header cavity. The initiating charge is ignitable by operation of the ignition device in a hot-wire mode. The output charge is a thermite in high-density consolidated form compacted to 90-99% of theoretical maximum density and disposed adjacent to the initiating charge on an opposite end thereof from the electrical ignition device and ignitable by the initiating charge. A sleeve is provided for mounting the output charge to the ignitor header with the initiating charge confined therebetween in the cavity.

  20. Secure content objects

    DOEpatents

    Evans, William D [Cupertino, CA

    2009-02-24

    A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption key table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents or a document and annotations may be protected by the secure content object.
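    The encryption-marker check can be illustrated with a deliberately toy cipher (XOR) and an assumed "derivable variation" (the bitwise complement); the patent specifies neither, so everything below is a stand-in for the real multi-key scheme:

```python
import os

def make_marker(rand: int) -> bytes:
    # Marker = a random number followed by a derivable variation of it.
    # The variation used here (bitwise complement) is an assumption; the
    # patent only requires that it be derivable from the random number.
    comp = rand ^ 0xFFFFFFFF
    return rand.to_bytes(4, "big") + comp.to_bytes(4, "big")

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for the multi-key cipher; XOR is for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def marker_decrypts(header_ct: bytes, key: bytes) -> bool:
    # A candidate key is accepted only if the decrypted marker is
    # self-consistent: the second field must be derivable from the first.
    pt = xor_crypt(header_ct, key)
    rand = int.from_bytes(pt[:4], "big")
    var = int.from_bytes(pt[4:8], "big")
    return var == rand ^ 0xFFFFFFFF

rand = int.from_bytes(os.urandom(4), "big")
right_key, wrong_key = b"correct-user-key", b"some-other-guess"
header_ct = xor_crypt(make_marker(rand), right_key)
```

    Only a key that reproduces a self-consistent marker unlocks the document key, so an attacker cannot tell a wrong guess apart from a corrupt header.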

  1. Records & Information Management Services | Alaska State Archives

    Science.gov Websites

    Records and Information Management Services (RIMS) of the Alaska State Archives provides records and information management for the State of Alaska.

  2. Cities at Night: Citizens science to rescue an archive for the science

    NASA Astrophysics Data System (ADS)

    Sánchez de Miguel, Alejandro; Gomez Castaño, José; Lombraña, Daniel; Zamorano, Jaime; Gallego, Jesús

    2015-08-01

    Since 2003, astronauts have been taking photos from the International Space Station. Many of these images have been published on the websites of participating agencies or the Twitter accounts of the astronauts. However, most of the images taken by astronauts have not been published, remaining in an archive without being shown to the world. This ISS archive of nighttime images is not being used for scientific projects because of the difficulty of cataloging it. The Cities at Night project has managed to prepare these images for scientific use. The main goal of the project is to characterize light pollution in color, which is fundamental to tracking the impact of new LED lighting on light pollution. However, other science can benefit from the project as well, such as the study of meteors and auroras and general knowledge of these images. The current status of the project, its methodology, and ideas for exploiting the same platform for other projects are presented. The current results of the project include the complete documentation of the entire high-resolution image archive in just one month. Until now, more than 132,000 images have been catalogued (30,000 of those are of cities), more than 2,800 images have been located, and 1,000 have been georeferenced. Several meteors have also been detected in non-dedicated pictures. More than 16,000 people have participated.

  3. Archive of digital chirp subbottom profile data collected during USGS cruise 10BIM04 offshore Cat Island, Mississippi, September 2010

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Miselis, Jennifer L.; Wiese, Dana S.; Buster, Noreen A.

    2012-01-01

    In September of 2010, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE), conducted a geophysical survey to investigate the geologic controls on barrier island framework of Cat Island, Miss., as part of a broader USGS study on Barrier Island Mapping (BIM). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Coastal Evolution of the Mississippi-Alabama Region Subtask. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. The USGS Saint Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 10BIM04 tells us the data were collected in 2010 during the fourth field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-512i system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy; it was housed within a float system (built at the SPCMSC), which allows the towfish to be towed at a constant depth of 1.07 meters (m) below the sea surface. 
As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, some energy is reflected back toward the transducer, received, and recorded by a Personal Computer (PC)-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)), and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are GIF images that were processed and gained using SU software, and they can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software.
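    The ASCII-versus-EBCDIC distinction in the 3,200-byte textual header can be sketched with a small heuristic; the convention of 40 'C'-prefixed 80-byte card images is standard SEG Y, but the detection rule itself is only an illustration:

```python
def textual_header_encoding(first_3200: bytes) -> str:
    """Guess whether a SEG Y textual header is ASCII or EBCDIC.

    Heuristic sketch: the textual header is 40 'card images' of 80 bytes,
    traditionally each beginning with the letter 'C', which is 0x43 in
    ASCII but 0xC3 in EBCDIC.
    """
    first_bytes = [first_3200[i * 80] for i in range(40)]
    ascii_c = sum(1 for b in first_bytes if b == 0x43)
    ebcdic_c = sum(1 for b in first_bytes if b == 0xC3)
    return "ASCII" if ascii_c >= ebcdic_c else "EBCDIC"

# Build a synthetic ASCII-style header: 40 card images of 80 bytes each.
header = b"".join((b"C" + b" " * 79) for _ in range(40))
```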

  4. Multi-provider architecture for cloud outsourcing of medical imaging repositories.

    PubMed

    Godinho, Tiago Marques; Bastião Silva, Luís A; Costa, Carlos; Oliveira, José Luís

    2014-01-01

    Over the last few years, the extended usage of medical imaging procedures has drawn the medical community's attention towards the optimization of their workflows. More recently, the federation of multiple institutions into a seamless distribution network has brought hope of higher-quality healthcare services along with more efficient resource management. As a result, medical institutions are constantly looking for the best infrastructure to deploy their imaging archives. In this scenario, public cloud infrastructures arise as major candidates, as they offer elastic storage space and optimal data availability without large maintenance or IT personnel costs, in a pay-as-you-go model. However, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. This document proposes a multi-provider architecture for integration of outsourced archives with in-house PACS resources, taking advantage of foreign providers to store medical imaging studies, without disregarding security. It enables the retrieval of images from multiple archives simultaneously, improving performance and data availability while avoiding the vendor lock-in problem. Moreover, it enables load balancing and caching techniques.
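    Simultaneous retrieval from several archives can be sketched with a thread pool; the provider names and the stub fetcher below are hypothetical stand-ins for real DICOM/web queries against each endpoint:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_from(provider, study_id):
    # Stub standing in for a real query to one archive provider; a real
    # deployment would issue DICOM or web-service requests per endpoint.
    return {"provider": provider, "study": study_id, "frames": 120}

def retrieve_study(study_id, providers):
    # Query all archives simultaneously rather than serially, so the
    # slowest provider no longer dominates total retrieval time.
    with ThreadPoolExecutor(max_workers=len(providers)) as pool:
        futures = [pool.submit(fetch_from, p, study_id) for p in providers]
        return [f.result() for f in futures]

results = retrieve_study("1.2.840.9999",  # hypothetical study identifier
                         ["in-house-pacs", "cloud-a", "cloud-b"])
```

    A cache layer or load balancer would then pick among (or merge) the replies.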

  5. AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.

    2017-02-01

    ImageJ is a graphical user interface (GUI) driven, public domain, Java-based software package for general image processing, traditionally used mainly in the life sciences. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy-specific image display environment and tools for astronomy-specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI-driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of an undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.
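    The fixed 80-byte "card" layout that makes FITS headers straightforward to view and edit can be sketched with a stdlib-only parser; this is a toy, and real readers such as AIJ or astropy handle quoted strings, comments, and continuation cards:

```python
def parse_fits_cards(header_bytes: bytes) -> dict:
    """Minimal FITS header reader: 80-byte 'cards' in 'KEYWORD = value' form.

    Toy sketch only; it ignores string quoting, COMMENT/HISTORY cards,
    and the rest of the full FITS grammar.
    """
    cards = {}
    for i in range(0, len(header_bytes), 80):
        card = header_bytes[i:i + 80].decode("ascii")
        key = card[:8].strip()          # keyword occupies bytes 1-8
        if key == "END":                # END card terminates the header
            break
        if card[8:10] == "= ":          # value indicator in bytes 9-10
            value = card[10:].split("/")[0].strip()  # drop inline comment
            cards[key] = value
    return cards

def card(text):
    # Pad a card to the fixed 80-byte FITS record length.
    return text.ljust(80).encode("ascii")

header = (card("SIMPLE  =                    T") +
          card("BITPIX  =                  -32") +
          card("EXPTIME =                120.0 / exposure time (s)") +
          card("END"))

parsed = parse_fits_cards(header)
```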

  6. The NOAO NVO Portal

    NASA Astrophysics Data System (ADS)

    Miller, C. J.; Gasson, D.; Fuentes, E.

    2007-10-01

    The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).

  7. MINC 2.0: A Flexible Format for Multi-Modal Images.

    PubMed

    Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C

    2016-01-01

    It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000s the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.

  9. The effect of travel speed on thermal response in CO2 laser welding of small electronic components

    NASA Astrophysics Data System (ADS)

    Gianoulakis, S. E.; Burchett, S. N.; Fuerschbach, P. W.; Knorovsky, G. A.

    A comprehensive three-dimensional numerical investigation of the effect of heat source travel speed on temperatures and resulting thermal stresses was performed for CO2-laser welding. The test specimen was a small thermal battery header containing several stress-sensitive glass-to-metal seals surrounding the electrical connections and a temperature-sensitive ignitor located under the header near the center. Predictions of the thermal stresses and temperatures in the battery header were made for several travel speeds of the laser. The travel speeds examined ranged from 10 mm/s to 50 mm/s. The results indicate that faster weld speeds result in lower temperatures and stresses for the same size weld. This is because the higher speed welds are more efficient, requiring less energy to produce a given weld. Less energy absorbed by the workpiece results in lower temperatures, which results in lower stresses.

  10. Dual circuit embossed sheet heat transfer panel

    DOEpatents

    Morgan, G.D.

    1984-02-21

    A heat transfer panel provides redundant cooling for fusion reactors or the like environment requiring low-mass construction. Redundant cooling is provided by two independent cooling circuits, each circuit consisting of a series of channels joined to inlet and outlet headers. The panel comprises a welded joinder of two full-size and two much smaller partial-size sheets. The first full-size sheet is embossed to form first portions of channels for the first and second circuits, as well as a header for the first circuit. The second full-sized sheet is then laid over and welded to the first full-size sheet. The first and second partial-size sheets are then overlaid on separate portions of the second full-sized sheet, and are welded thereto. The first and second partial-sized sheets are embossed to form inlet and outlet headers, which communicate with channels of the second circuit through apertures formed in the second full-sized sheet. 6 figs.

  11. Dual-circuit embossed-sheet heat-transfer panel

    DOEpatents

    Morgan, G.D.

    1982-08-23

    A heat transfer panel provides redundant cooling for fusion reactors or the like environment requiring low-mass construction. Redundant cooling is provided by two independent cooling circuits, each circuit consisting of a series of channels joined to inlet and outlet headers. The panel comprises a welded joinder of two full-size and two much smaller partial-size sheets. The first full-size sheet is embossed to form first portions of channels for the first and second circuits, as well as a header for the first circuit. The second full-sized sheet is then laid over and welded to the first full-size sheet. The first and second partial-size sheets are then overlaid on separate portions of the second full-sized sheet, and are welded thereto. The first and second partial-sized sheets are embossed to form inlet and outlet headers, which communicate with channels of the second circuit through apertures formed in the second full-sized sheet.

  12. Dual circuit embossed sheet heat transfer panel

    DOEpatents

    Morgan, Grover D.

    1984-01-01

    A heat transfer panel provides redundant cooling for fusion reactors or the like environment requiring low-mass construction. Redundant cooling is provided by two independent cooling circuits, each circuit consisting of a series of channels joined to inlet and outlet headers. The panel comprises a welded joinder of two full-size and two much smaller partial-size sheets. The first full-size sheet is embossed to form first portions of channels for the first and second circuits, as well as a header for the first circuit. The second full-sized sheet is then laid over and welded to the first full-size sheet. The first and second partial-size sheets are then overlaid on separate portions of the second full-sized sheet, and are welded thereto. The first and second partial-sized sheets are embossed to form inlet and outlet headers, which communicate with channels of the second circuit through apertures formed in the second full-sized sheet.

  13. AASG Wells Data for the EGS Test Site Planning and Analysis Task

    DOE Data Explorer

    Augustine, Chad

    2013-10-09

    AASG Wells Data for the EGS Test Site Planning and Analysis Task: temperature measurement data obtained from boreholes for the Association of American State Geologists (AASG) geothermal data project. Typically, bottomhole temperatures are recorded from log headers, and this information is provided through a borehole temperature observation service for each state. The service includes header records, well logs, temperature measurements, and other information for each borehole. Information presented in Geothermal Prospector was derived from data aggregated from the borehole temperature observations for all states. For each observation, the given well location was recorded and the best available well identifier (name), temperature, and depth were chosen. The "Well Name Source," "Temp. Type," and "Depth Type" attributes indicate the field used from the original service. The data were then cleaned and converted to consistent units. The accuracy of an observation's location, name, temperature, or depth was not assessed beyond that originally provided by the service. AASG bottomhole temperature datasets were downloaded from repository.usgin.org between May 16 and May 24, 2013. Datasets were cleaned to remove "null" and non-real entries, and data were converted into consistent units across all datasets. The methodology for selecting the "best" attributes from the column headers in the AASG BHT datasets is, in order of preference: temperature from CorrectedTemperature, then MeasuredTemperature; depth from DepthOfMeasurement, then TrueVerticalDepth, then DrillerTotalDepth; well name/identifier from APINo, then WellName, then ObservationURI.
The column headers are as follows: gid = internal unique ID; src_state = the state from which the well was downloaded (note: the low-temperature wells in Idaho are coded as "ID_LowTemp", while all other wells are simply the two-character state abbreviation); source_url = the URL for the source WFS service or Excel file; temp_c = "best" temperature in Celsius; temp_type = indicates whether temp_c comes from the corrected or measured temperature header column in the source document; depth_m = "best" depth in meters; depth_type = indicates whether depth_m comes from the measured, true vertical, or driller total depth header column in the source document; well_name = "best" well name or ID; name_src = indicates whether well_name came from the apino, wellname, or observationuri header column in the source document; lat_wgs84 = latitude in WGS84; lon_wgs84 = longitude in WGS84; state = state in which the point is located; county = county in which the point is located.
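    The preference-order selection of "best" attributes described above can be sketched directly; the sample record is invented:

```python
def pick_best(record, preference):
    # Return the first non-null attribute in preference order, plus the
    # name of the source column it came from (for the *_type / *_src fields).
    for field in preference:
        value = record.get(field)
        if value not in (None, "", "null"):
            return value, field
    return None, None

# Preference orders as described in the methodology above.
TEMP_PREF = ["CorrectedTemperature", "MeasuredTemperature"]
DEPTH_PREF = ["DepthOfMeasurement", "TrueVerticalDepth", "DrillerTotalDepth"]
NAME_PREF = ["APINo", "WellName", "ObservationURI"]

# Invented borehole record missing its first-choice columns.
record = {"MeasuredTemperature": 87.2, "TrueVerticalDepth": 1520.0,
          "WellName": "Example Well 1"}

temp_c, temp_type = pick_best(record, TEMP_PREF)
depth_m, depth_type = pick_best(record, DEPTH_PREF)
well_name, name_src = pick_best(record, NAME_PREF)
```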

  14. Data grid: a distributed solution to PACS

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyan; Zhang, Jianguo

    2004-04-01

    In a hospital, various kinds of medical images acquired from different modalities are generally used and stored in different department and each modality usually attaches several workstations to display or process images. To do better diagnosis, radiologists or physicians often need to retrieve other kinds of images for reference. The traditional image storage solution is to buildup a large-scale PACS archive server. However, the disadvantages of pure centralized management of PACS archive server are obvious. Besides high costs, any failure of PACS archive server would cripple the entire PACS operation. Here we present a new approach to develop the storage grid in PACS, which can provide more reliable image storage and more efficient query/retrieval for the whole hospital applications. In this paper, we also give the performance evaluation by comparing the three popular technologies mirror, cluster and grid.

  15. Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys

    NASA Astrophysics Data System (ADS)

    Giordano, S.; Le Bris, A.; Mallet, C.

    2018-05-01

    Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time-series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is the fine georeferencing step. No fully automatic method has been proposed yet, and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes a photogrammetric approach and argues that the 3D information that can be computed is the key to full automation. Its originality lies in a two-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It relies only on a recent orthoimage+DSM, used as master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references, and the coarse DSM provides their position in the archival images. Results on two areas and five dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.

  16. Fermilab History and Archives Project | Golden Books - The Early History of

    Science.gov Websites

    Fermilab History and Archives Project, Golden Book Collection: The Early History of URA and Fermilab, Viewpoint of a URA President (1966-1981), Norman F

  17. The imaging node for the Planetary Data System

    USGS Publications Warehouse

    Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.

    1996-01-01

    The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and on-line remote-access tools by way of Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies for virtually all the planetary systems in our solar system (except for Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.

  18. A novel fuzzy logic-based image steganography method to ensure medical data security.

    PubMed

    Karakış, R; Güler, I; Çapraz, I; Bilir, E

    2015-12-01

    This study aims to secure medical data by combining them into one file format using steganographic methods. The electroencephalogram (EEG) is selected as the hidden data, and magnetic resonance (MR) images are used as the cover image. In addition to the EEG, the message is composed of the doctor's comments and patient information in the file header of the images. Two new image steganography methods based on fuzzy logic and similarity are proposed to select the non-sequential least significant bits (LSB) of image pixels. The similarity values of the gray levels in the pixels are used to hide the message. The message is secured against attacks by using lossless compression and symmetric encryption algorithms. The quality of the stego image is measured by mean squared error (MSE), peak signal-to-noise ratio (PSNR), structural similarity measure (SSIM), universal quality index (UQI), and correlation coefficient (R). According to the obtained results, the proposed method ensures the confidentiality of the patient information and increases the storage and transmission capacity of both MR images and EEG signals. Copyright © 2015 Elsevier Ltd. All rights reserved.
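    The paper's contribution is the fuzzy-logic, similarity-based selection of non-sequential bit positions, which is not reproduced here; the sketch below shows only the underlying LSB mechanics on a byte array standing in for image pixels. Each hidden message bit changes a cover value by at most one gray level.

```python
def embed_lsb(pixels, message):
    """Write message bits, MSB first, into the LSBs of successive pixels."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract_lsb(pixels, n_bytes):
    """Read n_bytes back out of the pixel LSBs."""
    bits = [p & 1 for p in pixels[: n_bytes * 8]]
    return bytes(
        sum(b << (7 - j) for j, b in enumerate(bits[i : i + 8]))
        for i in range(0, len(bits), 8)
    )

cover = bytes(range(64))          # stand-in for MR image pixel values
stego = embed_lsb(cover, b"EEG")  # hide three bytes of "EEG" data
assert extract_lsb(stego, 3) == b"EEG"
```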

  19. DoD Electronic Data Interchange (EDI) Convention: ASC X12 Transaction Set 832 Price Sales Catalog (Version 003030)

    DTIC Science & Technology

    1992-12-01

    Only fragments of the scanned text are legible. They cover data-element definitions for the ANSI ASC X12 832 Price/Sales Catalog, version/release 003030 (e.g., TD401 Special Handling Code, a code specifying special transportation handling instructions) and the interchange structure: communications transport protocol, ISA Interchange Control Header, and GS Functional Group Header.

  20. Long-Life Thermal Battery for Sonobuoy

    DTIC Science & Technology

    2000-04-20

    This Phase I project provided a cost-effective prototype development for fully meeting size 'A' sonobuoy performance objectives. Legible fragments of the scanned figures describe a cross-section of the thermal battery inside a V/M-insulated battery case with a spun inner wall and the standard battery can, with Microtherm insulation added at the header end, an extra 0.250-inch Fiberfrax wrap, and a 0.075-inch vacuum annulus (Fig. 1, drawing of a prototype, with a detail of the header end).

  1. Fault tolerance techniques to assure data integrity in high-volume PACS image archives

    NASA Astrophysics Data System (ADS)

    He, Yutao; Huang, Lu J.; Valentino, Daniel J.; Wingate, W. Keith; Avizienis, Algirdas

    1995-05-01

    Picture archiving and communication systems (PACS) perform the systematic acquisition, archiving, and presentation of large quantities of radiological image and text data. In the UCLA Radiology PACS, for example, the volume of image data archived currently exceeds 2500 gigabytes. Furthermore, the distributed heterogeneous PACS is expected to have near real-time response, be continuously available, and assure the integrity and privacy of patient data. The off-the-shelf subsystems that compose the current PACS cannot meet these expectations; therefore fault tolerance techniques had to be incorporated into the system. This paper reports our first-step efforts toward that goal and is organized as follows: first, we discuss data integrity and identify fault classes in the PACS operational environment; then we describe auditing and accounting schemes developed for error detection and analyze the operational data collected. Finally, we outline plans for future research.

  2. Data management and digital delivery of analog data

    USGS Publications Warehouse

    Miller, W.A.; Longhenry, Ryan; Smith, T.

    2008-01-01

    The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in preserving data, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that access to these collections is available. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high-performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium-resolution image files. Single-frame metadata records are also created to improve access, which otherwise requires interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.

  3. The Planetary Archive

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César

    2014-11-01

    We are building the first system that will allow efficient data mining in the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built around sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations in which they could have been recorded, providing colors for these newly-found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these three archives.
Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
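    The intersection test described above can be sketched as follows. The circular footprint model, the function names, and the small-angle separation formula are illustrative simplifications, not the HTM-based machinery the archive actually uses.

```python
import math

def angsep_deg(ra1, dec1, ra2, dec2):
    """Approximate angular separation in degrees, adequate for small fields."""
    dra = (ra1 - ra2) * math.cos(math.radians((dec1 + dec2) / 2))
    return math.hypot(dra, dec1 - dec2)

def hits_image(track, image):
    """track: [(t, ra, dec)] samples along the integrated orbit.
    image: sky circle (center, radius_deg) plus time interval [t0, t1]."""
    return any(
        image["t0"] <= t <= image["t1"]
        and angsep_deg(ra, dec, *image["center"]) <= image["radius_deg"]
        for t, ra, dec in track
    )

# Invented example: one archived image and a three-sample asteroid track.
image = {"center": (150.0, 2.0), "radius_deg": 0.5, "t0": 100.0, "t1": 101.0}
track = [(99.5, 149.0, 2.0), (100.5, 150.1, 2.1), (101.5, 151.0, 2.2)]
print(hits_image(track, image))  # the t=100.5 sample falls inside the footprint
```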

  4. The GIK-Archive of sediment core radiographs with documentation

    NASA Astrophysics Data System (ADS)

    Grobe, Hannes; Winn, Kyaw; Werner, Friedrich; Driemel, Amelie; Schumacher, Stefanie; Sieger, Rainer

    2017-12-01

    The GIK-Archive of radiographs is a collection of X-ray negative and photographic images of sediment cores based on exposures taken since the early 1960s. During four decades of marine geological work at the University of Kiel, Germany, several thousand hours of sampling, careful preparation and X-raying were spent on producing a unique archive of sediment radiographs from several parts of the World Ocean. The archive consists of more than 18 500 exposures on chemical film that were digitized, geo-referenced, supplemented with metadata and archived in the data library PANGAEA®. With this publication, the images have become available open-access for use by the scientific community at https://doi.org/10.1594/PANGAEA.854841.

  5. A case for automated tape in clinical imaging.

    PubMed

    Bookman, G; Baune, D

    1998-08-01

    Electronic archiving of radiology images over many years will require many terabytes of storage, with a need for rapid retrieval of these images. As more large PACS installations are implemented, a data crisis occurs. Storing this large amount of data using the traditional method of optical jukeboxes or online disk alone becomes unworkable: the amount of floor space, the number of optical jukeboxes, and the off-line shelf storage required to store the images become unmanageable. With the recent advances in tape and tape drives, the use of tape for long-term storage of PACS data has become the preferred alternative. A PACS system consisting of a centrally managed combination of RAID disk, software, and, at the heart of the system, tape presents a solution that for the first time solves the problems of multi-modality high-end PACS, non-DICOM image, electronic medical record, and ADT data storage. This paper will examine the installation of the University of Utah Department of Radiology PACS system and the integration of an automated tape archive. The tape archive is also capable of storing data other than traditional PACS data. The implementation of an automated data archive to serve the many other needs of a large hospital will also be discussed, including the integration of a filmless cardiology department and the backup/archival needs of a traditional MIS department. The need for high bandwidth to tape with a large RAID cache will be examined, and how, with an interface to a RIS pre-fetch engine, tape can be a superior solution to optical platters or other archival solutions. The data management software will be discussed in detail. The performance and cost of RAID disk cache and automated tape will be compared to a solution that includes optical storage.

  6. Metadata requirements for results of diagnostic imaging procedures: a BIIF profile to support user applications

    NASA Astrophysics Data System (ADS)

    Brown, Nicholas J.; Lloyd, David S.; Reynolds, Melvin I.; Plummer, David L.

    2002-05-01

    A visible digital image is rendered from a set of digital image data. Medical digital image data can be stored in either: (a) a pre-rendered format, corresponding to a photographic print, or (b) an un-rendered format, corresponding to a photographic negative. The appropriate image data storage format and the associated header data (metadata) required by a user of the results of an electronically recorded diagnostic procedure depend on the task(s) to be performed. The DICOM standard provides a rich set of metadata that supports the needs of complex applications. Many end-user applications, such as simple report text viewing and display of a selected image, are not so demanding, and generic image formats such as JPEG are sometimes used. However, these lack some basic identification requirements. In this paper we make specific proposals for minimal extensions to generic image metadata, of value in various domains, which enable safe use in two simple healthcare end-user scenarios: (a) viewing of text and a selected JPEG image activated by a hyperlink, and (b) viewing of one or more JPEG images together with superimposed text and graphics annotation using a file specified by a profile of the ISO/IEC Basic Image Interchange Format (BIIF).
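    As an illustration of the gap described above, a generic JPEG can at least carry an identification string in a COM (comment, marker 0xFFFE) segment. The byte strings below are synthetic (not a decodable image), and the key=value layout is invented for illustration; it is not the BIIF profile itself, and a real system would use DICOM or the proposed profile.

```python
import struct

def add_jpeg_comment(jpeg_bytes, text):
    """Insert a COM segment immediately after the SOI marker (0xFFD8)."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI)"
    payload = text.encode("utf-8")
    # Segment length is big-endian and includes the two length bytes.
    segment = b"\xff\xfe" + struct.pack(">H", len(payload) + 2) + payload
    return jpeg_bytes[:2] + segment + jpeg_bytes[2:]

# Synthetic stand-in: SOI, filler bytes, EOI.
fake_jpeg = b"\xff\xd8" + b"\x00" * 8 + b"\xff\xd9"
tagged = add_jpeg_comment(fake_jpeg, "study=US-123;patient=ANON")
assert tagged[2:4] == b"\xff\xfe"  # COM segment now follows SOI
```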

  7. Status of the Landsat thematic mapper and multispectral scanner archive conversion system

    USGS Publications Warehouse

    Werner, Darla J.

    1993-01-01

    The U.S. Geological Survey's EROS Data Center (EDC) manages the National Satellite Land Remote Sensing Data Archive. This archive includes Landsat thematic mapper (TM) and multispectral scanner (MSS) data acquired since 1972. The Landsat archive is an important resource for global change research. To ensure long-term availability of Landsat data from the archive, the EDC specified requirements for a Thematic Mapper and Multispectral Scanner Archive Conversion System (TMACS) that would preserve the data by transcribing it to a more durable medium. The media conversion hardware and software were installed at EDC in July 1992. In December 1992, the EDC began converting Landsat MSS data from high-density, open-reel instrumentation tapes to digital cassette tapes. Almost 320,000 MSS images acquired since 1979 and more than 200,000 TM images acquired since 1982 will be converted to the new medium during the next 3 years. During the media conversion process, several high-density tapes have exhibited severe binder degradation. Even though these tapes have been stored in environmentally controlled conditions, hydrolysis has occurred, resulting in "sticky oxide shed". Using a thermostatically controlled oven built at EDC, tape "baking" has been 100 percent successful and actually improves the quality of some images.

  8. Implementation of a filmless mini picture archiving and communication system in ultrasonography: experience after one year of use.

    PubMed

    Henri, C J; Cox, R D; Bret, P M

    1997-08-01

    This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co., Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has now been filmless for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data is now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).

  9. High speed imager test station

    DOEpatents

    Yates, George J.; Albright, Kevin L.; Turko, Bojan T.

    1995-01-01

    A test station enables the performance of a solid state imager (herein called a focal plane array or FPA) to be determined at high image frame rates. A programmable waveform generator is adapted to generate clock pulses at determinable rates for clocking light-induced charges from an FPA. The FPA is mounted on an imager header board for placing the imager in operable proximity to level shifters for receiving the clock pulses and outputting pulses effective to clock charge from the pixels forming the FPA. Each of the clock level shifters is driven by leading and trailing edge portions of the clock pulses to reduce power dissipation in the FPA. Analog circuits receive output charge pulses clocked from the FPA pixels. The analog circuits condition the charge pulses to cancel noise in the pulses and to determine and hold a peak value of the charge for digitizing. A high speed digitizer receives the peak signal value and outputs a digital representation of each one of the charge pulses. A video system then displays an image associated with the digital representation of the output charge pulses clocked from the FPA. In one embodiment, the FPA image is formatted to a standard video format for display on conventional video equipment.

  10. High speed imager test station

    DOEpatents

    Yates, G.J.; Albright, K.L.; Turko, B.T.

    1995-11-14

    A test station enables the performance of a solid state imager (herein called a focal plane array or FPA) to be determined at high image frame rates. A programmable waveform generator is adapted to generate clock pulses at determinable rates for clocking light-induced charges from an FPA. The FPA is mounted on an imager header board for placing the imager in operable proximity to level shifters for receiving the clock pulses and outputting pulses effective to clock charge from the pixels forming the FPA. Each of the clock level shifters is driven by leading and trailing edge portions of the clock pulses to reduce power dissipation in the FPA. Analog circuits receive output charge pulses clocked from the FPA pixels. The analog circuits condition the charge pulses to cancel noise in the pulses and to determine and hold a peak value of the charge for digitizing. A high speed digitizer receives the peak signal value and outputs a digital representation of each one of the charge pulses. A video system then displays an image associated with the digital representation of the output charge pulses clocked from the FPA. In one embodiment, the FPA image is formatted to a standard video format for display on conventional video equipment. 12 figs.

  11. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  12. Complementary concept for an image archive and communication system in a cardiological department based on CD-medical, an online archive, and networking facilities

    NASA Astrophysics Data System (ADS)

    Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart

    1998-07-01

    The developments in information technologies -- computer hardware, networking, and storage media -- have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can fully meet the requirements for replacing cine film. One of the major drawbacks of cine film is single access in time and location. For the four catheter laboratories in our institution we have designed a complementary concept combining the CD-R, also called CD-medical, as a single-patient storage and exchange medium, with a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents relating to a patient, are part of an electronic patient record. The access, processing, and display of documents are supported by an integrated medical application.

  13. Hyperswitch Network For Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Madan, Herbert; Peterson, John

    1989-01-01

    Data-driven dynamic switching enables high speed data transfer. Proposed hyperswitch network based on mixed static and dynamic topologies. Routing header modified in response to congestion or faults encountered as path established. Static topology meets requirement if nodes have switching elements that perform necessary routing header revisions dynamically. Hypercube topology now being implemented with switching element in each computer node aimed at designing very-richly-interconnected multicomputer system. Interconnection network connects great number of small computer nodes, using fixed hypercube topology, characterized by point-to-point links between nodes.
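    The per-node routing-header revision described above can be illustrated with a plain e-cube step on a hypercube: flip the lowest-order bit in which the current and destination node addresses differ. This minimal sketch ignores the congestion and fault adaptation that distinguishes the hyperswitch; function names are illustrative.

```python
def next_hop(current, dest):
    """Flip the lowest-order differing address bit to pick the next node."""
    diff = current ^ dest
    if diff == 0:
        return current            # already at the destination
    return current ^ (diff & -diff)

def route(src, dest):
    """Follow next_hop until arrival; each step traverses one cube link."""
    path = [src]
    while path[-1] != dest:
        path.append(next_hop(path[-1], dest))
    return path

print(route(0b000, 0b101))  # [0, 1, 5]: two link traversals in a 3-cube
```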

  14. A 2.2 sq m /24 sq ft/ self-controlled deployable heat pipe radiator - Design and test

    NASA Technical Reports Server (NTRS)

    Edelstein, F.

    1975-01-01

    An all-heat-pipe deployable radiator has been developed which can effectively control pumped-fluid-loop temperatures under varying loads using variable conductance panel heat pipes. The 2.2 sq m (24 sq ft) aluminum panel can be coupled to either a fluid header or a flexible heat pipe header capable of transporting 850 watts in a 90-deg bent configuration. Test results support the feasibility of using this system to passively control Freon-21 loop temperatures.

  15. A Document-Based EHR System That Controls the Disclosure of Clinical Documents Using an Access Control List File Based on the HL7 CDA Header.

    PubMed

    Takeda, Toshihiro; Ueda, Kanayo; Nakagawa, Akito; Manabe, Shirou; Okada, Katsuki; Mihara, Naoki; Matsumura, Yasushi

    2017-01-01

    Electronic health record (EHR) systems are necessary for the sharing of medical information between care delivery organizations (CDOs). We developed a document-based EHR system in which all of the PDF documents that are stored in our electronic medical record system can be disclosed to selected target CDOs. An access control list (ACL) file was designed based on the HL7 CDA header to manage the information that is disclosed.
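    The abstract does not give the ACL file layout, so the structure below is invented for illustration: per-document entries keyed by fields one would take from the CDA header (document ID, patient ID, allowed CDO identifiers), checked at disclosure time.

```python
# Hypothetical ACL entries; field names are assumptions, not the paper's format.
acl = [
    {"doc_id": "DOC-001", "patient_id": "P-42", "allowed_cdos": {"CDO-A", "CDO-B"}},
    {"doc_id": "DOC-002", "patient_id": "P-42", "allowed_cdos": {"CDO-A"}},
]

def may_disclose(acl, doc_id, requesting_cdo):
    """True if the requesting care delivery organization is listed for the document."""
    return any(
        e["doc_id"] == doc_id and requesting_cdo in e["allowed_cdos"] for e in acl
    )

print(may_disclose(acl, "DOC-002", "CDO-B"))  # False: CDO-B is not listed
```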

  16. Degassifying and mixing apparatus for liquids. [potable water for spacecraft

    NASA Technical Reports Server (NTRS)

    Yamauchi, S. T. (Inventor)

    1983-01-01

    An apparatus for degassing a liquid comprises a containment vessel, a liquid pump, and a header assembly within the containment vessel in a volume above the reservoir of the liquid. The pump draws from this reservoir and outputs to the header assembly, the latter being constructed to return the liquid to the reservoir in the form of a number of stacked, vertically spaced, concentric, conical cascades via orifices. A vacuum source provides a partial vacuum in the containment vessel to enhance the degassing process.

  17. Picture archiving and communication system--Part one: Filmless radiology and distance radiology.

    PubMed

    De Backer, A I; Mortelé, K J; De Keulenaer, B L

    2004-01-01

    Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted to other and possibly remote locations over networks, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS)--radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.

  18. VizieR Online Data Catalog: GTC spectra of z~2.3 quasars (Sulentic+, 2014)

    NASA Astrophysics Data System (ADS)

    Sulentic, J. W.; Marziani, P.; Del Olmo, A.; Dultzin, D.; Perea, J.; Negrete, C. A.

    2014-09-01

    Spectroscopic data for 22 intermediate-redshift quasars are identified in Table 1. The actual data files are in FITS format in the spectra sub-directory. Each individual spectrum covers the spectral range 360-770 nm. Wavelengths are in Angstrom, and specific fluxes in erg/s/cm2/Angstrom (pW/m3), in the observed frame (i.e., before redshift correction). The full object name (OBJECT), total exposure time (EXPTIME), number of coadded individual spectra (NUM_IMAG), and observation date (DATE-OBS) are reported as records in the FITS header of each spectrum (as in Table 2 of the paper). (2 data files).
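    The header records listed above (OBJECT, EXPTIME, NUM_IMAG, DATE-OBS) live in 80-character FITS cards. A stdlib-only sketch of pulling them out follows; real work should use astropy.io.fits, and the sample card values below are invented.

```python
def parse_cards(header_bytes):
    """Parse a FITS header: 80-char 'KEYWORD = value / comment' cards."""
    cards = {}
    for i in range(0, len(header_bytes), 80):
        card = header_bytes[i : i + 80].decode("ascii")
        if card.startswith("END"):
            break
        if "=" not in card[:10]:
            continue                       # skip COMMENT/HISTORY-style cards
        key = card[:8].strip()
        value = card[10:].split("/")[0].strip().strip("'").strip()
        cards[key] = value
    return cards

# Invented sample cards, padded to the mandatory 80 characters each.
hdr = b"".join(
    c.ljust(80).encode("ascii")
    for c in [
        "OBJECT  = 'J0903+0708'         / full object name",
        "EXPTIME =                1800. / total exposure time (s)",
        "NUM_IMAG=                    2 / number of coadded spectra",
        "DATE-OBS= '2010-03-12'         / observation date",
        "END",
    ]
)
info = parse_cards(hdr)
print(info["OBJECT"], info["EXPTIME"])
```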

  19. Design and implementation of GRID-based PACS in a hospital with multiple imaging departments

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Jin, Jin; Sun, Jianyong; Zhang, Jianguo

    2008-03-01

    Usually, multiple clinical departments provide imaging-enabled healthcare services in an enterprise healthcare environment, such as radiology, oncology, pathology, and cardiology. The picture archiving and communication system (PACS) is now required not only to support radiology-based image display and workflow and data-flow management, but also to provide more specialized image-processing and management tools for other departments offering imaging-guided diagnosis and therapy, and there is an urgent demand to integrate the multiple PACSs together to provide patient-oriented imaging services for enterprise collaborative healthcare. In this paper, we give the design method and implementation strategy for developing a grid-based PACS (Grid-PACS) for a hospital with multiple imaging departments or centers. The Grid-PACS functions as middleware between the traditional PACS archiving servers and the workstations or image-viewing clients, and provides DICOM image communication and WADO services to the end users. The images can be stored in multiple distributed archiving servers but managed centrally. The grid-based PACS has automatic image backup and disaster recovery services and can provide the best image retrieval path to image requesters based on optimal algorithms. The designed grid-based PACS has been implemented in Shanghai Huadong Hospital and has been running smoothly for two years.

  20. CMOS: Efficient Clustered Data Monitoring in Sensor Networks

    PubMed Central

    2013-01-01

    Tiny and smart sensors enable applications that access a network of hundreds or thousands of sensors. Thus, recently, many researchers have paid attention to wireless sensor networks (WSNs). The limitation of energy is critical since most sensors are battery-powered and it is very difficult to replace batteries in cases that sensor networks are utilized outdoors. Data transmission between sensor nodes needs more energy than computation in a sensor node. In order to reduce the energy consumption of sensors, we present an approximate data gathering technique, called CMOS, based on the Kalman filter. The goal of CMOS is to efficiently obtain the sensor readings within a certain error bound. In our approach, spatially close sensors are grouped as a cluster. Since a cluster header generates approximate readings of member nodes, a user query can be answered efficiently using the cluster headers. In addition, we suggest an energy efficient clustering method to distribute the energy consumption of cluster headers. Our simulation results with synthetic data demonstrate the efficiency and accuracy of our proposed technique. PMID:24459444

  1. CMOS: efficient clustered data monitoring in sensor networks.

    PubMed

    Min, Jun-Ki

    2013-01-01

    Tiny and smart sensors enable applications that access a network of hundreds or thousands of sensors. Thus, recently, many researchers have paid attention to wireless sensor networks (WSNs). The limitation of energy is critical since most sensors are battery-powered and it is very difficult to replace batteries in cases that sensor networks are utilized outdoors. Data transmission between sensor nodes needs more energy than computation in a sensor node. In order to reduce the energy consumption of sensors, we present an approximate data gathering technique, called CMOS, based on the Kalman filter. The goal of CMOS is to efficiently obtain the sensor readings within a certain error bound. In our approach, spatially close sensors are grouped as a cluster. Since a cluster header generates approximate readings of member nodes, a user query can be answered efficiently using the cluster headers. In addition, we suggest an energy efficient clustering method to distribute the energy consumption of cluster headers. Our simulation results with synthetic data demonstrate the efficiency and accuracy of our proposed technique.
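    The bounded-error protocol described above can be sketched as follows. CMOS proper drives the prediction with a Kalman filter; this simplified version uses a last-transmitted-value predictor instead, which keeps the same guarantee: the cluster header's answer is always within the error bound of the true reading, while suppressed transmissions save energy.

```python
class Member:
    """A member node that transmits only when its reading drifts too far."""

    def __init__(self, bound):
        self.bound = bound
        self.last_sent = None

    def sense(self, reading):
        """Return the reading if it must be transmitted, else None."""
        if self.last_sent is None or abs(reading - self.last_sent) > self.bound:
            self.last_sent = reading
            return reading            # transmit: header's copy drifted too far
        return None                   # suppress: header's copy is close enough

member = Member(bound=0.5)
header_view = None
transmissions = 0
for true_reading in [20.0, 20.1, 20.3, 21.2, 21.4, 19.9]:
    sent = member.sense(true_reading)
    if sent is not None:
        header_view = sent
        transmissions += 1
    assert abs(header_view - true_reading) <= 0.5   # bounded error at the header
print(transmissions)  # 3 of 6 readings transmitted
```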

  2. Pacemaker syndrome with sub-acute left ventricular systolic dysfunction in a patient with a dual-chamber pacemaker: consequence of lead switch at the header.

    PubMed

    Khurwolah, Mohammad Reeaze; Vezi, Brian Zwelethini

    In the daily practice of pacemaker insertion, the occurrence of atrial and ventricular lead switch at the pacemaker box header is a rare and unintentional phenomenon, with fewer than five cases reported in the literature. The lead switch may have dire consequences, depending on the indication for the pacemaker. One of these consequences is pacemaker syndrome, in which the normal sequence of atrial and ventricular activation is impaired, leading to sub-optimal ventricular filling and cardiac output. It is important for the attending physician to recognise any worsening of symptoms in a patient who has recently had a permanent pacemaker inserted. In the case of a dual-chamber pacemaker, switching of the atrial and ventricular leads at the pacemaker box header should be strongly suspected. We present an unusual case of pacemaker syndrome and right ventricular-only pacing-induced left ventricular systolic dysfunction in a patient with a dual-chamber pacemaker.

  3. Ampule tests to simulate glass corrosion in ambient temperature lithium batteries. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas, S.C.; Bunker, B.C.; Crafts, C.C.

    1984-06-01

    Glass corrosion in battery headers has been found to limit the shelf life of ambient temperature lithium batteries. Glass corrosion can lead to loss of battery electrolytes or to shorts across the conductive corrosion product. Tests have been conducted which simulate the corrosive environment in a battery by sealing headers attached to lithium metal into Pyrex ampules containing battery electrolyte. Using the ampule test, glass corrosion kinetics have been determined at 70 °C for the Li/SO₂, Li/SOCl₂, and Li/SOCl₂ + BrCl battery systems. Test results indicate that corrosion of commercial glass compositions is extensive in all electrolytes tested, resulting in predicted battery failures after several months. Sandia's TA-23 glass corrodes at a much slower rate, indicating a projected battery lifetime of over five years in the Li/SO₂ system. Test results reveal that corrosion kinetics are sensitive to header polarization, stress, and configuration as well as glass composition.

  4. ISDEC-2 and ISDEC-3 controllers for HAWAII detectors

    NASA Astrophysics Data System (ADS)

    Burse, Mahesh; Ramaprakash, A. N.; Chordia, Pravinkumar; Punnadi, Sujit; Chillal, Kalpesh; Mestri, Vilas; Bharti, Rupali; Sinha, Sakya; Kohok, Abhay

    2016-07-01

    ISDEC-2, the IUCAA SIDECAR Drive Electronics Controller, is an alternative to Teledyne's JADE2-based controller for HAWAII detectors. It is a ready-to-use, complete package developed with general astronomical requirements and widely used observatory set-ups in mind: a preferred OS (Linux), multi-extension FITS output with fully populated headers (carrying detector as well as telescope- and observation-specific information), etc. The actual exposure time is measured for each frame to an accuracy of a few tens of microseconds and recorded in the FITS header. It also caters to several application-specific requirements such as fast resets, strip mode, and multiple-region readout with on-board co-adding. ISDEC-2 is designed to work at -40 deg. and is already in use at observatories worldwide. ISDEC-3 is an Artix-7 FPGA based SIDECAR Drive Electronics Controller currently being developed at IUCAA. It will retain all the functionality supported by ISDEC-2 and will also support operation of the H2RG in continuous, fast (32-output, 5 MSPS, 12-bit) mode. It will have a 5 Gbps USB 3.0 PC interface and a 1 Gbps Ethernet interface for image data transfer from the SIDECAR to the host PC. Additionally, the board will have DDR-3 memory for on-board storage and processing. ISDEC-3 will be capable of handling two SIDECARs simultaneously (in sync) for H2RG slow modes.
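
    As a flavour of what "fully populated headers" means at the byte level, here is a hand-rolled formatter for a single FITS header card following the fixed-format layout of the FITS standard (keyword in bytes 1-8, "= " in bytes 9-10, numeric value right-justified to end at byte 30, optional "/ comment"). The EXPMEAS keyword is hypothetical, not ISDEC's actual naming:

```python
def fits_card(keyword, value, comment=""):
    """Format one 80-character FITS header card (fixed format, per the
    FITS standard): keyword in bytes 1-8, '= ' in bytes 9-10, numeric
    value right-justified so it ends at byte 30, then ' / comment'."""
    if isinstance(value, str):
        val = "'%-8s'" % value          # strings are quoted, min 8 chars
        body = "%-20s" % val            # left-justified for strings
    else:
        body = "%20s" % repr(value)     # numbers end at byte 30
    card = "%-8s= %s" % (keyword[:8].upper(), body)
    if comment:
        card += " / " + comment
    return ("%-80s" % card)[:80]        # pad/trim to exactly 80 bytes

# e.g. the measured exposure time described above (EXPMEAS is a
# hypothetical keyword name chosen for illustration):
card = fits_card("EXPMEAS", 9.999973, "measured exposure time [s]")
```

    In practice one would use astropy.io.fits rather than formatting cards by hand; the sketch only shows why header layout is rigid enough to machine-populate reliably.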

  5. Possibilities of Processing Archival Photogrammetric Images Captured by Rollei 6006 Metric Camera Using Current Method

    NASA Astrophysics Data System (ADS)

    Dlesk, A.; Raeva, P.; Vach, K.

    2018-05-01

    Processing analog photogrammetric negatives with current methods brings new challenges and possibilities, for example the creation of a 3D model from archival images, which enables comparison of the historical and current states of cultural heritage objects. The main purpose of this paper is to present possibilities for processing archival analog images captured by the Rollei 6006 metric photogrammetric camera. In 1994, the Czech company EuroGV s.r.o. carried out photogrammetric measurements of the former limestone quarry Great America, located in the Central Bohemian Region of the Czech Republic. All the negatives of the photogrammetric images, complete documentation, coordinates of geodetically measured ground control points, calibration reports, and the exterior orientation of the images calculated in the Combined Adjustment Program are preserved and were available for the current processing. The negatives were scanned and processed using the structure-from-motion (SfM) method. The result of the research is a statement of what accuracy one can expect when processing Rollei metric images, originally obtained for terrestrial intersection photogrammetry, while adhering to the proposed methodology.

  6. Diagnostic report acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Brooks, Everett G.; Rothman, Melvyn L.

    1991-07-01

    The Mayo Clinic and IBM Rochester have jointly developed a picture archiving and communication system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. One of the challenges of developing a useful PACS involves integrating the diagnostic reports with the electronic images so they can be displayed simultaneously. By the time a diagnostic report is generated for a particular case, its images have already been captured and archived by the PACS. To integrate the report with the images, the authors have developed an IBM Personal System/2 computer (PS/2) based diagnostic report acquisition unit (RAU). A typed copy of the report is transmitted via facsimile to the RAU, where it is stacked electronically with other reports that have been sent previously but not yet processed. By processing these reports at the RAU, the information they contain is integrated with the image database and a copy of the report is archived electronically on an IBM Application System/400 computer (AS/400). When a user requests a set of images for viewing, the report is automatically integrated with the image data. Using a hot key, the user can toggle the report on and off the display screen. This report describes the process, hardware, and software employed to integrate the diagnostic report information into the PACS, including how the report images are captured, transmitted, and entered into the AS/400 database. Also described is how the archived reports and their associated medical images are located and merged for retrieval and display. The methods used to detect and process error conditions are also discussed.

  7. Visual analytics for semantic queries of TerraSAR-X image content

    NASA Astrophysics Data System (ADS)

    Espinoza-Molina, Daniela; Alonso, Kevin; Datcu, Mihai

    2015-10-01

    With the continuous acquisition of image products by satellite missions, the size of image archives is increasing considerably every day, as are the variety and complexity of their content, surpassing the end-user's capacity to analyse and exploit them. Advances in the image retrieval field have contributed to the development of tools for interactive exploration and extraction of images from huge archives using different parameters such as metadata, keywords, and basic image descriptors. Even though we count on more powerful tools for automated image retrieval and data analysis, we still face the problem of understanding and analyzing the results. Thus, a systematic computational analysis of these results is required in order to provide the end-user with a summary of the archive content in comprehensible terms. In this context, visual analytics combines automated analysis with interactive visualization techniques for effective understanding, reasoning, and decision making on the basis of very large and complex datasets. Moreover, several current research efforts focus on associating the content of images with semantic definitions that describe the data in a format easily understood by the end-user. In this paper, we present our approach for computing visual analytics and semantically querying the TerraSAR-X archive. Our approach is composed of four main steps: 1) generation of a data model that explains the information contained in a TerraSAR-X product, formed by primitive descriptors and metadata entries; 2) storage of this model in a database system; 3) semantic definition of the image content based on machine learning algorithms and relevance feedback; and 4) querying of the image archive using semantic descriptors as query parameters, with statistical analysis of the query results. The experimental results show that, with the help of visual analytics and semantic definitions, we are able to explain the image content using semantic terms and the relations between them, answering questions such as "What is the percentage of urban area in a region?" or "What is the distribution of water bodies in a city?"
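
    Step 4 (semantic queries plus statistics over the results) can be illustrated with a toy relational store; the `patch` table, its labels, and the product ID below are invented for the sketch and are not the actual TerraSAR-X data model:

```python
import sqlite3

# Hypothetical data model: each ingested product is tiled into patches,
# and each patch carries a semantic label assigned in step 3.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patch (product TEXT, row INT, col INT, label TEXT)")
con.executemany("INSERT INTO patch VALUES (?,?,?,?)", [
    ("TSX-001", 0, 0, "urban"), ("TSX-001", 0, 1, "urban"),
    ("TSX-001", 1, 0, "water"), ("TSX-001", 1, 1, "forest"),
])

def class_percentage(con, product, label):
    """Answer 'what is the percentage of <label> area in <product>?'"""
    total = con.execute("SELECT COUNT(*) FROM patch WHERE product=?",
                        (product,)).fetchone()[0]
    hits = con.execute("SELECT COUNT(*) FROM patch WHERE product=? AND label=?",
                       (product, label)).fetchone()[0]
    return 100.0 * hits / total

pct = class_percentage(con, "TSX-001", "urban")  # → 50.0
```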

  8. Use of archive aerial photography for monitoring black mangrove populations

    USDA-ARS's Scientific Manuscript database

    A study was conducted on the south Texas Gulf Coast to evaluate archive aerial color-infrared (CIR) photography combined with supervised image analysis techniques to quantify changes in black mangrove [Avicennia germinans (L.) L.] populations over a 26-year period. Archive CIR film from two study si...

  9. The Archival Photograph and Its Meaning: Formalisms for Modeling Images

    ERIC Educational Resources Information Center

    Benson, Allen C.

    2009-01-01

    This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…

  10. Performance of the Mayo-IBM PAC system

    NASA Astrophysics Data System (ADS)

    Persons, Kenneth R.; Reardon, Frank J.; Gehring, Dale G.; Hangiandreou, Nicholas J.

    1994-05-01

    The Mayo Clinic and IBM (at Rochester, Minnesota) have jointly developed a picture archiving and communication system (PACS) for use with Mayo's MRI and CT imaging modalities. This PACS is made up of over 50 computers that work cooperatively to provide archival, retrieval, and image distribution services for Mayo's Department of Radiology. This paper will examine the performance characteristics of the system.

  11. Optimization of Single- and Dual-Color Immunofluorescence Protocols for Formalin-Fixed, Paraffin-Embedded Archival Tissues.

    PubMed

    Kajimura, Junko; Ito, Reiko; Manley, Nancy R; Hale, Laura P

    2016-02-01

    Performance of immunofluorescence staining on archival formalin-fixed paraffin-embedded human tissues is generally not considered to be feasible, primarily due to problems with tissue quality and autofluorescence. We report the development and application of procedures that allowed for the study of a unique archive of thymus tissues derived from autopsies of individuals exposed to atomic bomb radiation in Hiroshima, Japan in 1945. Multiple independent treatments were used to minimize autofluorescence and maximize fluorescent antibody signals. Treatments with NH3/EtOH and Sudan Black B were particularly useful in decreasing autofluorescent moieties present in the tissue. Deconvolution microscopy was used to further enhance the signal-to-noise ratios. Together, these techniques provide high-quality single- and dual-color fluorescent images with low background and high contrast from paraffin blocks of thymus tissue that were prepared up to 60 years ago. The resulting high-quality images allow the application of a variety of image analyses to thymus tissues that previously were not accessible. Whereas the procedures presented remain to be tested for other tissue types and archival conditions, the approach described may facilitate greater utilization of older paraffin block archives for modern immunofluorescence studies. © 2016 The Histochemical Society.

  12. ACR/NEMA Digital Image Interface Standard (An Illustrated Protocol Overview)

    NASA Astrophysics Data System (ADS)

    Lawrence, G. Robert

    1985-09-01

    The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) have sponsored a joint standards committee mandated to develop a universal interface standard for the transfer of radiology images among a variety of PACS imaging devices. The resulting standard interface conforms to the ISO/OSI standard reference model for network protocol layering. The standard interface specifies the lower layers of the reference model (Physical, Data Link, Transport, and Session) and implies a requirement for the Network layer should a network be required. The message content has also been considered, and a flexible message and image format specified. The following imaging modalities are supported by the standard interface: CT (Computed Tomography), DS (Digital Subtraction), NM (Nuclear Medicine), US (Ultrasound), MR (Magnetic Resonance), and DR (Digital Radiology). The following data types are standardized over the transmission interface media: image data, digitized voice, header data, raw data, text reports, graphics, and others. This paper consists of text supporting the illustrated protocol data flow. Each layer will be individually treated. Particular emphasis will be given to the Data Link layer (frames) and the Transport layer (packets). The discussion utilizes a finite-state sequential machine model for the protocol layers.
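
    The frame- and packet-level state machines are too long to show here, but the flat group/element/length encoding of ACR-NEMA data elements (the precursor of DICOM's implicit-VR encoding) can be sketched in a few lines; the element tags and values below are illustrative, not a conformant data set:

```python
import struct

def parse_elements(buf):
    """Parse a flat ACR-NEMA-style data stream: each element is a 2-byte
    group, 2-byte element, and 4-byte length (little-endian), followed by
    `length` bytes of value. A simplified sketch, not a full codec."""
    out, off = [], 0
    while off < len(buf):
        group, elem, length = struct.unpack_from("<HHI", buf, off)
        off += 8
        value = buf[off:off + length]
        off += length
        out.append(((group, elem), value))
    return out

# Two illustrative elements: patient name (0010,0010) and rows (0028,0010).
stream = (struct.pack("<HHI", 0x0010, 0x0010, 4) + b"DOE " +
          struct.pack("<HHI", 0x0028, 0x0010, 2) + struct.pack("<H", 512))
elements = parse_elements(stream)
```

    Parsing a subset of such elements into a relational database is exactly what the UNC prototype described earlier in this collection does with the ACR-NEMA header block.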

  13. Determining the Completeness of the Nimbus Meteorological Data Archive

    NASA Technical Reports Server (NTRS)

    Johnson, James; Moses, John; Kempler, Steven; Zamkoff, Emily; Al-Jazrawi, Atheer; Gerasimov, Irina; Trivedi, Bhagirath

    2011-01-01

    NASA launched the Nimbus series of meteorological satellites in the 1960s and 70s. These satellites carried instruments for making observations of the Earth at visible, infrared, ultraviolet, and microwave wavelengths. The original data archive consisted of a combination of digital data written to 7-track computer tapes and images recorded on various film media. Many of these data sets are now being migrated from the old media to the GES DISC modern online archive. The process involves recovering the digital data files from tape as well as scanning images of the data from film strips. One of the challenges of archiving the Nimbus data is the lack of any metadata for these old data sets: metadata standards and self-describing data files did not exist at the time, and files were written on now-obsolete hardware systems in outdated file formats. This requires creating metadata by reading the contents of the old data files. Some digital data files were corrupted over time, or were possibly improperly copied at the time of creation, so there are data gaps in the collections. The film strips were stored in boxes and are now being scanned as JPEG-2000 images. The only information describing these images is what was written on them when they were originally created, and sometimes this information is incomplete or missing. We have the ability to cross-reference the scanned images against the digital data files to determine which of these best represents the data set from the various missions, or to see how complete the data sets are. In this presentation, we compare data files and scanned images from the Nimbus-2 High-Resolution Infrared Radiometer (HRIR) for September 1966 to determine whether the data and images are properly archived with correct metadata.
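
    The cross-referencing idea (checking tape-recovered files against scanned film to find coverage gaps) reduces to set arithmetic over observation days; the dates below are invented for illustration, not actual Nimbus-2 holdings:

```python
from datetime import date, timedelta

def coverage_gaps(tape_days, film_days, start, end):
    """Return the days in [start, end] missing from BOTH collections,
    i.e. dates for which neither a tape file nor a film scan survives."""
    span = {start + timedelta(n) for n in range((end - start).days + 1)}
    return sorted(span - set(tape_days) - set(film_days))

# Hypothetical holdings for one week of a mission:
tape = [date(1966, 9, d) for d in (1, 2, 3, 5)]   # recovered from 7-track tape
film = [date(1966, 9, d) for d in (2, 4)]         # recovered from film scans
gaps = coverage_gaps(tape, film, date(1966, 9, 1), date(1966, 9, 6))
# days covered by neither tape nor film
```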

  14. User Driven Image Stacking for ODI Data and Beyond via a Highly Customizable Web Interface

    NASA Astrophysics Data System (ADS)

    Hayashi, S.; Gopu, A.; Young, M. D.; Kotulla, R.

    2015-09-01

    While some astronomical archives have begun serving standard calibrated data products, the process of producing stacked images remains a challenge left to the end-user. The benefits of astronomical image stacking are well established, and dither patterns are recommended for almost all observing targets. Some archives automatically produce stacks of limited scientific usefulness without any fine-grained user or operator configurability. In this paper, we present PPA Stack, a web based stacking framework within the ODI - Portal, Pipeline, and Archive system. PPA Stack offers a web user interface with built-in heuristics (based on pointing, filter, and other metadata information) to pre-sort images into a set of likely stacks while still allowing the user or operator complete control over the images and parameters for each of the stacks they wish to produce. The user interface, designed using AngularJS, provides multiple views of the input dataset and parameters, all of which are synchronized in real time. A backend consisting of a Python application optimized for ODI data, wrapped around the SWarp software, handles the execution of stacking workflow jobs on Indiana University's Big Red II supercomputer, and the subsequent ingestion of the combined images back into the PPA archive. PPA Stack is designed to enable seamless integration of other stacking applications in the future, so users can select the most appropriate option for their science.
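
    The pre-sorting heuristic based on pointing and filter metadata might look like the sketch below; the quantized-bin rule and tolerance are assumptions for illustration, not PPA Stack's actual logic:

```python
from collections import defaultdict

def propose_stacks(exposures, tol_deg=0.1):
    """Group exposures that share a filter and whose pointings fall in the
    same quantized sky bin; each group is a candidate stack the operator
    can then adjust by hand.  exposures: dicts with 'ra', 'dec', 'filter'."""
    groups = defaultdict(list)
    for exp in exposures:
        key = (exp["filter"],
               round(exp["ra"] / tol_deg),   # quantized pointing bins
               round(exp["dec"] / tol_deg))
        groups[key].append(exp)
    return list(groups.values())

stacks = propose_stacks([
    {"ra": 150.01, "dec": 2.20, "filter": "g"},
    {"ra": 150.02, "dec": 2.21, "filter": "g"},   # same field, same filter
    {"ra": 150.01, "dec": 2.20, "filter": "r"},   # same field, other filter
])
```

    A quantized-bin key is cheap but splits dither patterns that straddle a bin edge, which is one reason an interactive interface that lets the user override the pre-sort is valuable.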

  15. Restoration and PDS Archive of Apollo Lunar Rock Sample Data

    NASA Technical Reports Server (NTRS)

    Garcia, P. A.; Todd, N. S.; Lofgren, G. E.; Stefanov, W. L.; Runco, S. K.; LaBasse, D.; Gaddis, L. R.

    2011-01-01

    In 2008, scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory and Image Science & Analysis Laboratory (under the auspices of the Astromaterials Research and Exploration Science Directorate or ARES) began work on a 4-year project to digitize the original film negatives of Apollo Lunar Rock Sample photographs. These rock samples together with lunar regolith and core samples were collected as part of the lander missions for Apollos 11, 12, 14, 15, 16 and 17. The original film negatives are stored at JSC under cryogenic conditions. This effort is data restoration in the truest sense. The images represent the only record available to scientists which allows them to view the rock samples when making a sample request. As the negatives are being scanned, they are also being formatted and documented for permanent archive in the NASA Planetary Data System (PDS) archive. The ARES group is working collaboratively with the Imaging Node of the PDS on the archiving.

  16. ALICE Data Release: A Revaluation of HST-NICMOS Coronagraphic Images

    NASA Astrophysics Data System (ADS)

    Hagan, J. Brendan; Choquet, Élodie; Soummer, Rémi; Vigan, Arthur

    2018-04-01

    The Hubble Space Telescope NICMOS instrument was used from 1997 to 2008 to perform coronagraphic observations of about 400 targets. Most of them were part of surveys looking for substellar companions or resolved circumstellar disks around young nearby stars, making the NICMOS coronagraphic archive a valuable database for exoplanet and disk studies. As part of the Archival Legacy Investigations of Circumstellar Environments (ALICE) program, we have consistently reprocessed a large fraction of the NICMOS coronagraphic archive using advanced starlight subtraction methods. We present here the high-level science products of these re-analyzed data, which we delivered back to the community through the Mikulski Archive for Space Telescopes: doi:10.17909/T9W89V. We also present the second version of the HCI-FITS format (High-Contrast Imaging FITS format), which we developed as a standard format for the exchange of reduced imaging science products. These re-analyzed products are openly available for population statistics studies, characterization of specific targets, or identification of detected point sources.
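
    ALICE's actual reprocessing uses PCA-based (KLIP-style) starlight subtraction; as a deliberately simpler stand-in, the sketch below subtracts a least-squares-scaled mean of reference star images, which is already enough to make an injected faint companion stand out. All pixel values are made up:

```python
# Simplified stand-in for "advanced starlight subtraction" (NOT ALICE's
# method): subtract a scaled mean of reference star images so residual
# structure, e.g. a faint companion, survives the subtraction.
def subtract_starlight(science, references):
    """science: flat list of pixels; references: list of such lists."""
    n = len(references)
    mean_ref = [sum(ref[i] for ref in references) / n
                for i in range(len(science))]
    # least-squares scale factor minimizing |science - a*mean_ref|^2
    a = (sum(s * m for s, m in zip(science, mean_ref)) /
         sum(m * m for m in mean_ref))
    return [s - a * m for s, m in zip(science, mean_ref)]

psf = [10.0, 5.0, 2.0, 1.0]                 # shared stellar halo shape
science = [v * 2.0 for v in psf]            # same halo, twice as bright...
science[2] += 3.0                           # ...plus a faint companion
residual = subtract_starlight(science, [psf, psf])
```

    The companion pixel dominates the residual while the halo largely cancels; PCA methods generalize this by fitting several reference modes instead of one mean.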

  17. Development of ultrasound/endoscopy PACS (picture archiving and communication system) and investigation of compression method for cine images

    NASA Astrophysics Data System (ADS)

    Osada, Masakazu; Tsukui, Hideki

    2002-09-01

    Picture Archiving and Communication System (PACS) is a system which connects imaging modalities, image archives, and image workstations to reduce film-handling cost and improve hospital workflow. Handling diagnostic ultrasound and endoscopy images is challenging because they produce large amounts of data, such as motion (cine) images of 30 frames per second at 640 x 480 resolution with 24-bit color, while still requiring sufficient image quality for clinical review. We have developed a PACS that is able to manage ultrasound and endoscopy cine images at the above resolution and frame rate, and we investigate a suitable compression method and compression rate for clinical image review. Results show that clinicians require the capability for frame-by-frame forward and backward review of cine images, because they carefully look through motion images to find certain color patterns that may appear in only one frame. In order to satisfy this requirement, we chose Motion JPEG, installed it, and confirmed that we could capture this specific pattern. As for the acceptable image compression rate, we performed a subjective evaluation. No subjects could tell the difference between the original uncompressed images and 1:10 lossy-compressed JPEG images. One subject could tell the difference between the originals and 1:20 lossy-compressed JPEG images, although the quality remained acceptable. Thus, ratios of 1:10 to 1:20 are acceptable to reduce data volume and cost while maintaining quality for clinical review.
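
    The data volumes involved are easy to check. At 640 x 480 pixels, 24-bit colour, and 30 frames per second, the raw cine stream runs at about 27.6 MB/s, which the acceptable 1:10 to 1:20 ratios bring down to roughly 2.8-1.4 MB/s:

```python
# Back-of-the-envelope rates for the cine format described above.
def cine_rate_mb_per_s(width, height, bits, fps, ratio=1):
    """Data rate in MB/s (decimal megabytes) after a given compression ratio."""
    return width * height * (bits / 8) * fps / ratio / 1e6

raw = cine_rate_mb_per_s(640, 480, 24, 30)        # uncompressed stream
lo  = cine_rate_mb_per_s(640, 480, 24, 30, 10)    # 1:10 Motion JPEG
hi  = cine_rate_mb_per_s(640, 480, 24, 30, 20)    # 1:20 Motion JPEG
# raw ≈ 27.648 MB/s, compressed ≈ 2.765 to 1.382 MB/s
```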

  18. The ISO Data Archive and Interoperability with Other Archives

    NASA Astrophysics Data System (ADS)

    Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro

    ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive (IDA), which comprises about 150,000 observations, including parallel- and serendipity-mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year, development has focused on improving the IDA's interoperability with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct queries to the IDA via a Java Server Page that returns quick-look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/VizieR Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions over other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.

  19. A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.

    PubMed

    Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos

    2016-01-01

    Web-based technologies have been increasingly used in picture archiving and communication systems (PACS), in services related to the storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging, due to the complexity of dealing with huge volumes of data and the associated bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained proved that it is better than conventional approaches, as it reduces remote access latency and also the required cache storage space.
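
    The paper's exact cache design is not given here; the sketch below illustrates the general idea of splitting large objects into fragments managed with a least-recently-used policy, so hot fragments of many studies can coexist in a small cache. Fragment size, capacity, and the UID are invented for the example:

```python
from collections import OrderedDict

class FragmentCache:
    """Toy cache that splits each stored object into fixed-size fragments
    and evicts least-recently-used fragments first (an assumed policy, not
    the paper's actual design)."""
    def __init__(self, capacity_bytes, fragment_size=4):
        self.cap, self.frag = capacity_bytes, fragment_size
        self.store = OrderedDict()          # (uid, index) -> bytes

    def put(self, uid, blob):
        for i in range(0, len(blob), self.frag):
            self.store[(uid, i // self.frag)] = blob[i:i + self.frag]
            self.store.move_to_end((uid, i // self.frag))
            while sum(len(v) for v in self.store.values()) > self.cap:
                self.store.popitem(last=False)   # evict LRU fragment

    def get(self, uid, index):
        frag = self.store.get((uid, index))
        if frag is not None:
            self.store.move_to_end((uid, index))  # refresh recency
        return frag                               # None -> fetch remotely

cache = FragmentCache(capacity_bytes=8)
cache.put("1.2.840.1", b"ABCDEFGHIJ")   # 10 bytes: oldest fragment evicted
```

    A miss (`None`) is what triggers the routing mechanism to fetch the fragment from the cloud archive instead.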

  20. NASA/IPAC Infrared Archive's General Image Cutouts Service

    NASA Astrophysics Data System (ADS)

    Alexov, A.; Good, J. C.

    2006-07-01

    The NASA/IPAC Infrared Science Archive (IRSA) "Cutouts" service (http://irsa.ipac.caltech.edu/applications/Cutouts) is a general tool for creating small "cutout" FITS images and JPEGs from collections of data archived at IRSA. This service is a companion to IRSA's Atlas tool (http://irsa.ipac.caltech.edu/applications/Atlas/), which currently serves over 25 data collections of various sizes and complexity and returns entire images for a user-defined region of the sky. The Cutouts service sits on top of Atlas and extends its functionality by generating subimages, at locations and sizes requested by the user, from images already identified by Atlas. These results can be downloaded individually, in batch mode (using the program wget), or as a tar file. Cutouts re-uses IRSA's software architecture along with the publicly available Montage mosaicking tools. The advantages and disadvantages of this approach to generic cutout serving will be discussed.
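
    Stripped of the archive plumbing, the core cutout operation is an edge-aware crop of a pixel grid around a requested centre; a minimal sketch (not IRSA's implementation, which also handles WCS and FITS headers):

```python
def cutout(image, cx, cy, width, height):
    """Clip a width x height window centred on (cx, cy) out of a 2-D
    pixel grid (list of rows), truncating at the image edges the way a
    cutout service must for requests near the border."""
    x0, y0 = max(cx - width // 2, 0), max(cy - height // 2, 0)
    x1, y1 = min(x0 + width, len(image[0])), min(y0 + height, len(image))
    return [row[x0:x1] for row in image[y0:y1]]

image = [[10 * r + c for c in range(6)] for r in range(6)]  # 6x6 test grid
sub = cutout(image, cx=2, cy=2, width=3, height=3)
# → [[11, 12, 13], [21, 22, 23], [31, 32, 33]]
```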

  1. Development of public science archive system of Subaru Telescope. 2

    NASA Astrophysics Data System (ADS)

    Yamamoto, Naotaka; Noda, Sachiyo; Taga, Masatoshi; Ozawa, Tomohiko; Horaguchi, Toshihiro; Okumura, Shin-Ichiro; Furusho, Reiko; Baba, Hajime; Yagi, Masafumi; Yasuda, Naoki; Takata, Tadafumi; Ichikawa, Shin-Ichi

    2003-09-01

    We report various improvements in the public science archive system SMOKA (Subaru-Mitaka-Okayama-Kiso Archive system). We have developed a new interface to search observational data on minor bodies in the solar system. In addition, other improvements are also summarized: (1) searching frames by specifying wavelength directly, (2) finding calibration data sets automatically, (3) browsing data on weather, humidity, and temperature, which provide information on image quality, (4) providing quick-look images from OHS/CISCO and IRCS, and (5) including the data from the OAO HIDES (HIgh Dispersion Echelle Spectrograph).

  2. Clinical experience with a high-performance ATM-connected DICOM archive for cardiology

    NASA Astrophysics Data System (ADS)

    Solomon, Harry P.

    1997-05-01

    A system to archive large image sets, such as cardiac cine runs, with near-realtime response must address several functional and performance issues, including efficient use of a high-performance network connection with standard protocols, an architecture which effectively integrates both short- and long-term mass storage devices, and a flexible data management policy which allows optimization of image distribution and retrieval strategies based on modality and site-specific operational use. Clinical experience with such an archive has allowed evaluation of these systems issues and refinement of a traffic model for cardiac angiography.

  3. The global Landsat archive: Status, consolidation, and direction

    USGS Publications Warehouse

    Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.

    2016-01-01

    New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility, as with Landsat-8, a network of receiving stations was utilized: one operated by the U.S. government, the others operated by a community of International Cooperators (ICs). ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data was held outside the archive of the United States Geological Survey (USGS) than was held inside it, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground-station processing to generate an L1T analysis-ready product. As of January 1, 2015 there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to that of having had two additional Landsat-5 missions.
    As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.

  4. The state of the art of medical imaging technology: from creation to archive and back.

    PubMed

    Gao, Xiaohong W; Qian, Yu; Hui, Rui

    2011-01-01

    Medical imaging has lent itself well to modern medicine and has revolutionized the medical industry in the last 30 years. Stemming from the discovery of the X-ray by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise not be possible to obtain, medical images pose great challenges: archiving them safe from corruption, loss, and misuse; retrieving them from databases of huge size with varying forms of metadata; and reusing them when new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems, and the innovation from the existing acquired image data pools. The focus of this paper is content-based image retrieval (CBIR), in particular for 3D images, exemplified by our online e-learning system, MIRAGE, home to a repository of medical images of varied domains and different dimensions. In terms of novelties, the facilities of CBIR for 3D images, coupled with fully automatic image annotation, have been developed and implemented in the system, resonating with future versatile, flexible and sustainable medical image databases that can reap new innovations.

  5. The State of the Art of Medical Imaging Technology: from Creation to Archive and Back

    PubMed Central

    Gao, Xiaohong W; Qian, Yu; Hui, Rui

    2011-01-01

Medical imaging has lent itself well to modern medicine and has revolutionized the medical industry over the last 30 years. Stemming from the discovery of X-rays by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise be impossible to obtain, medical images pose great challenges in archiving: they must be kept safe from corruption, loss, and misuse; remain retrievable from databases of huge size with varying forms of metadata; and stay reusable as new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems, and the innovations arising from existing pools of acquired image data. The focus of this paper is content-based image retrieval (CBIR), in particular for 3D images, exemplified by our online e-learning system, MIRAGE, home to a repository of medical images spanning a variety of domains and dimensions. In terms of novelty, CBIR facilities for 3D images, coupled with fully automatic image annotation, have been developed and implemented in the system, pointing toward versatile, flexible, and sustainable medical image databases that can support new innovations. PMID:21915232

  6. Visual information mining in remote sensing image archives

    NASA Astrophysics Data System (ADS)

    Pelizzari, Andrea; Descargues, Vincent; Datcu, Mihai P.

    2002-01-01

The present article focuses on the development of interactive exploratory tools for visually mining the image content of large remote sensing archives. Two aspects are treated: the iconic visualization of the global information in the archive and the progressive visualization of image details. The proposed methods are integrated in the Image Information Mining (I2M) system. The images and image structures in the I2M system are indexed based on a probabilistic approach. The resulting links are managed by a relational database. Both the intrinsic complexity of the observed images and the diversity of user requests result in a great number of associations in the database. Thus, new tools have been designed to visualize, in iconic representation, the relationships created during a query or information mining operation: visualization of query results positioned on the geographical map, a quick-looks gallery, visualization of the measure of goodness of the query, and visualization of the image space for statistical evaluation purposes. Additionally, the I2M system is enhanced with progressive detail visualization to allow better access for operator inspection. I2M is a three-tier Java architecture and is optimized for the Internet.

  7. Error mitigation for CCSDS compressed imager data

    NASA Astrophysics Data System (ADS)

    Gladkova, Irina; Grossberg, Michael; Gottipati, Srikanth; Shahriar, Fazlul; Bonev, George

    2009-08-01

    To efficiently use the limited bandwidth available on the downlink from satellite to ground station, imager data is usually compressed before transmission. Transmission introduces unavoidable errors, which are only partially removed by forward error correction and packetization. With the commonly used CCSDS Rice-based compression, the residual errors manifest as a contiguous sequence of dummy values along scan lines in a band of the imager data. We have developed a method capable of using the image statistics to provide a principled estimate of the missing data. Our method outperforms interpolation yet can be performed fast enough to provide uninterrupted data flow. The estimation of the lost data provides significant value to end users who may use only part of the data, may not have statistical tools, or lack the expertise to mitigate the impact of the lost data. Since the locations of the lost data will be clearly marked as metadata in the HDF or NetCDF header, experts who prefer to handle error mitigation themselves will be free to use or ignore our estimates as they see fit.
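The paper does not specify its estimator, but the basic idea of drawing on image statistics across scan lines, rather than interpolating one-dimensionally along the damaged line, can be sketched. The DUMMY sentinel and the vertical-neighbor mean below are assumptions for illustration only.

```python
import numpy as np

DUMMY = -999.0  # assumed sentinel for lost samples; the real marker is format-specific

def fill_gap(band):
    """Fill dummy runs using the samples directly above and below in
    neighboring scan lines -- a crude use of image statistics, unlike
    1-D interpolation along the damaged line itself."""
    out = band.astype(float).copy()
    rows, cols = out.shape
    for r in range(rows):
        for c in range(cols):
            if out[r, c] == DUMMY:
                neighbors = [out[rr, c] for rr in (r - 1, r + 1)
                             if 0 <= rr < rows and out[rr, c] != DUMMY]
                out[r, c] = np.mean(neighbors) if neighbors else np.nan
    return out

# A whole scan line lost in a 3x3 band (invented values)
band = np.array([[1.0, 2.0, 3.0],
                 [DUMMY, DUMMY, DUMMY],
                 [3.0, 4.0, 5.0]])
print(fill_gap(band)[1])  # -> [2. 3. 4.]
```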

  8. Semiconductor bridge (SCB) detonator

    DOEpatents

    Bickes, Jr., Robert W.; Grubelich, Mark C.

    1999-01-01

    The present invention is a low-energy detonator for high-density secondary-explosive materials initiated by a semiconductor bridge igniter that comprises a pair of electrically conductive lands connected by a semiconductor bridge. The semiconductor bridge is in operational or direct contact with the explosive material, whereby current flowing through the semiconductor bridge causes initiation of the explosive material. Header wires connected to the electrically-conductive lands and electrical feed-throughs of the header posts of explosive devices, are substantially coaxial to the direction of current flow through the SCB, i.e., substantially coaxial to the SCB length.

  9. Proven and Robust Ground Support Systems - GSFC Success and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Donohue, John; Lui, Ben; Greer, Greg; Green, Tom

    2008-01-01

    Over the past fifteen years, Goddard Space Flight Center has developed several successful science missions in-house: the Wilkinson Microwave Anisotropy Probe (WMAP), the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE), the Earth Observing 1 (EO-1) [1] and Space Technology 5 (ST-5) [2] missions, several Small Explorers, and several balloon missions. Currently in development are the Solar Dynamics Observatory (SDO) [3] and the Lunar Reconnaissance Orbiter (LRO) [4]. What is not well known is that these missions have been supported during spacecraft and/or instrument integration and test, flight software development, and mission operations by two in-house satellite Telemetry and Command (T&C) systems: the Integrated Test and Operations System (ITOS) and the Advanced Spacecraft Integration and System Test (ASIST). The advantages of an in-house satellite Telemetry and Command system lie primarily in the flexibility of management and maintenance: the developers are considered part of the mission team, get involved early in the development of the spacecraft and mission operations control center, and provide on-site, on-call support that goes beyond a help desk and simple software fixes. On the other hand, care must be taken to ensure that the system remains generic enough for cost-effective re-use from one mission to the next. The software is designed such that many features are user-configurable. Where user-configurable options were impractical, features were designed to be easy for the development team to modify. Adding support for a new ground message header, for example, is a one-day effort because of the software framework on which that code rests. This paper will discuss the many features of the Goddard satellite Telemetry and Command systems that have contributed to the success of the missions listed above.
These features include flexible user interfaces, distributed parallel commanding and telemetry decommutation, a procedure language, the interfaces and tools needed for a high degree of automation, and instantly accessible archives of spacecraft telemetry. It will discuss some of the problems overcome during development, including secure commanding over networks or the Internet, constellation support for the three satellites that comprise the ST-5 mission, and geographically distributed telemetry end users.

  10. Reiterating "Asylum Archive": Documenting Direct Provision in Ireland

    ERIC Educational Resources Information Center

    Nedeljkovic, Vukasin

    2018-01-01

    Originally a coping mechanism for an artist housed in a Direct Provision Centre while seeking asylum in Ireland, "Asylum Archive" has become much more than that. In 2018, it is a collaborative archive, an interactive and intermedial online document, and a scholarly research project. This iteration includes five new images of Railway…

  11. Interpretation of ANA Indirect Immunofluorescence Test Outside the Darkroom Using NOVA View Compared to Manual Microscopy

    PubMed Central

    Copple, Susan S.; Jaskowski, Troy D.; Giles, Rashelle; Hill, Harry R.

    2014-01-01

    Objective. To evaluate NOVA View, with a focus on reading archived images versus microscope-based manual interpretation of ANA HEp-2 slides by an experienced, certified medical technologist. Methods. 369 well-defined sera from 44 rheumatoid arthritis, 50 systemic lupus erythematosus, 35 scleroderma, 19 Sjögren's syndrome, and 10 polymyositis patients, as well as 99 healthy controls, were examined. In addition, 12 defined sera from the Centers for Disease Control and 100 random patient sera sent to ARUP Laboratories for ANA HEp-2 IIF testing were included. Samples were read using the archived images on NOVA View and compared to results obtained from manual reading. Results. At a 1:40/1:80 dilution, the comparison demonstrated 94.8%/92.9% positive, 97.4%/97.4% negative, and 96.5%/96.2% total agreement between manual IIF and NOVA View archived images. Agreement of identifiable patterns between methods was 97%, with PCNA and mixed patterns undetermined. Conclusion. Excellent agreement was obtained between reading archived images on NOVA View and manual reading on a fluorescent microscope. In addition, workflow benefits were observed which need to be analyzed in future studies. PMID:24741573
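The positive, negative, and total agreement figures quoted above follow standard definitions, which a short sketch makes concrete. The reader data below are invented, not the study's.

```python
def agreements(manual, automated):
    """Positive, negative, and total percent agreement between two
    sets of positive/negative calls (True = positive result)."""
    pairs = list(zip(manual, automated))
    pos = [a for m, a in pairs if m]      # automated calls where manual was positive
    neg = [a for m, a in pairs if not m]  # automated calls where manual was negative
    pos_agree = 100.0 * sum(pos) / len(pos)
    neg_agree = 100.0 * sum(not a for a in neg) / len(neg)
    total = 100.0 * sum(m == a for m, a in pairs) / len(pairs)
    return pos_agree, neg_agree, total

manual = [True, True, True, False, False]       # invented example calls
nova_view = [True, True, False, False, False]
p, n, t = agreements(manual, nova_view)
print(round(p, 1), n, t)  # -> 66.7 100.0 80.0
```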

  12. Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.

    PubMed

    Lear, J L; Pratt, J P; Trujillo, N

    1996-02-01

    While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer, and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
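The two search keys the abstract mentions, patient name/number and study date, suggest a simple lookup structure. The class below is an illustrative toy with invented entries, not the authors' software.

```python
from collections import defaultdict

class StudyIndex:
    """Toy in-memory index over an archive, searchable by the two
    keys the abstract mentions: patient identifier and study date."""

    def __init__(self):
        self._by_patient = defaultdict(list)

    def add(self, patient_id, study_date, location):
        self._by_patient[patient_id].append((study_date, location))

    def find(self, patient_id, study_date=None):
        """All of a patient's studies, or just those on one date."""
        studies = self._by_patient.get(patient_id, [])
        if study_date is None:
            return [loc for _, loc in studies]
        return [loc for d, loc in studies if d == study_date]

idx = StudyIndex()
idx.add("12345", "1995-06-01", "/raid/part3/study_0001")  # invented entries
idx.add("12345", "1996-01-15", "/raid/part7/study_0002")
print(idx.find("12345", "1996-01-15"))  # -> ['/raid/part7/study_0002']
```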

  13. infoRAD: computers for clinical practice and education in radiology. Teleradiology, information transfer, and PACS: implications for diagnostic imaging in the 1990s.

    PubMed

    Schilling, R B

    1993-05-01

    Picture archiving and communication systems (PACS) provide image viewing at diagnostic, reporting, consultation, and remote workstations; archival on magnetic or optical media by means of short- or long-term storage devices; communications by means of local or wide area networks or public communication services; and integrated systems with modality interfaces and gateways to health care facilities and departmental information systems. Research indicates three basic needs for image and report management: (a) improved communication and turnaround time between radiologists and other imaging specialists and referring physicians, (b) fast reliable access to both current and previously obtained images and reports, and (c) space-efficient archival support. Although PACS considerations are much more complex than those associated with single modalities, the same basic purchase criteria apply. These criteria include technical leadership, image quality, throughput, life cost (eg, initial cost, maintenance, upgrades, and depreciation), and total service. Because a PACS takes much longer to implement than a single modality, the customer and manufacturer must develop a closer working relationship than has been necessary in the past.

  14. Contrast in Terahertz Images of Archival Documents—Part II: Influence of Topographic Features

    NASA Astrophysics Data System (ADS)

    Bardon, Tiphaine; May, Robert K.; Taday, Philip F.; Strlič, Matija

    2017-04-01

    We investigate the potential of terahertz time-domain imaging in reflection mode to reveal archival information in documents in a non-invasive way. In particular, this study explores the parameters and signal processing tools that can be used to produce well-contrasted terahertz images of topographic features commonly found in archival documents, such as indentations left by a writing tool, as well as sieve lines. While the amplitude of the waveforms at a specific time delay can provide the most contrasted and legible images of topographic features on flat paper or parchment sheets, this parameter may not be suitable for documents that have a highly irregular surface, such as water- or fire-damaged documents. For analysis of such documents, cross-correlation of the time-domain signals can instead yield images with good contrast. Analysis of the frequency-domain representation of terahertz waveforms can also provide well-contrasted images of topographic features, with improved spatial resolution when utilising high-frequency content. Finally, we point out some of the limitations of these means of analysis for extracting information relating to topographic features of interest from documents.

  15. Implementation of an ASP model offsite backup archive for clinical images utilizing Internet 2

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Chao, Sander S.; Documet, Jorge; Lee, Jasper; Lee, Michael; Topic, Ian; Williams, Lanita

    2005-04-01

    With the development of PACS technology and an increasing demand by medical facilities to become filmless, there is a need for a fast and efficient method of providing data backup for disaster recovery and downtime scenarios. At the Image Processing Informatics Lab (IPI), an ASP Backup Archive was developed using a fault-tolerant server with a T1 connection to serve the PACS at the St. John's Health Center (SJHC) Santa Monica, California. The ASP archive server has been in clinical operation for more than 18 months, and its performance was presented at this SPIE Conference last year. This paper extends the ASP Backup Archive to serve the PACS at the USC Healthcare Consultation Center II (HCC2) utilizing an Internet2 connection. HCC2 is a new outpatient facility that recently opened in April 2004. The Internet2 connectivity between USC's HCC2 and IPI has been established for over one year. There are two novelties of the current ASP model: 1) Use of Internet2 for daily clinical operation, and 2) Modifying the existing backup archive to handle two sites in the ASP model. This paper presents the evaluation of the ASP Backup Archive based on the following two criteria: 1) Reliability and performance of the Internet2 connection between HCC2 and IPI using DICOM image transfer in a clinical environment, and 2) Ability of the ASP Fault-Tolerant backup archive to support two separate clinical PACS sites simultaneously. The performances of using T1 and Internet2 at the two different sites are also compared.

  16. J-Plus Web Portal

    NASA Astrophysics Data System (ADS)

    Civera Lorenzo, Tamara

    2017-10-01

    Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), where the different services available to retrieve images and catalogue data are presented. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data with parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several online data access tools and services, each suited to a particular need: a coverage map, a sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
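The Simple Cone Search service listed above follows the IVOA convention: a GET request with RA, DEC, and SR (search radius) parameters in decimal degrees. The base URL below is a placeholder, not the actual J-PLUS endpoint, which is listed on the portal.

```python
from urllib.parse import urlencode

# Placeholder endpoint -- only the parameter convention is standard.
BASE = "http://archive.cefca.es/catalogues/vo/cone/jplus-edr"

def cone_search_url(ra_deg, dec_deg, radius_deg):
    """Build an IVOA Simple Cone Search request: RA, DEC, and SR,
    all in decimal degrees; the service replies with a VOTable."""
    return BASE + "?" + urlencode({"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg})

print(cone_search_url(150.1, 2.2, 0.05))
```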

  17. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    PubMed

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
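The DICOM "scrubbing" described above amounts to blanking patient-identifying attributes before images leave the clinical environment. A real workflow would use a DICOM library; the dict below merely stands in for a parsed header, and the tag list is a small illustrative subset of the identifying attributes.

```python
# Illustrative subset of patient-identifying DICOM attribute names.
PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate", "PatientAddress"}

def scrub(header):
    """Return a copy of the header with identifying attributes blanked."""
    return {tag: ("" if tag in PHI_TAGS else value)
            for tag, value in header.items()}

header = {"PatientName": "DOE^JANE", "Modality": "CT", "StudyDate": "20040102"}
print(scrub(header))  # -> {'PatientName': '', 'Modality': 'CT', 'StudyDate': '20040102'}
```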

  18. The Starchive: An open access, open source archive of nearby and young stars and their planets

    NASA Astrophysics Data System (ADS)

    Tanner, Angelle; Gelino, Chris; Elfeki, Mario

    2015-12-01

    Historically, astronomers have utilized a piecemeal set of archives such as SIMBAD, the Washington Double Star Catalog, various exoplanet encyclopedias, and electronic tables from the literature to cobble together stellar and exoplanetary parameters in the absence of corresponding images and spectra. As the search for planets around young stars through direct imaging, transits, and infrared/optical radial velocity surveys blossoms, there is a void in the resources available for creating comprehensive lists of the stellar parameters of nearby stars, especially for important parameters such as metallicity and stellar activity indicators. For direct imaging surveys, we need better resources for downloading existing high-contrast images to help confirm new discoveries and find ideal target stars. Once we have discovered new planets, we need a uniform database of stellar and planetary parameters from which to look for correlations, to better understand the formation and evolution of these systems. As a solution to these issues, we are developing the Starchive, an open access stellar archive in the spirit of the Open Exoplanet Catalogue, the Kepler Community Follow-up Program, and many others. The archive will allow users to download various datasets, upload new images, spectra, and metadata, and will contain multiple plotting tools for use in presentations and data interpretation. While we will highly regulate and constantly validate the data placed into our archive, the open nature of its design is intended to allow the database to be expanded efficiently and to have the level of versatility necessary in today's fast-moving, big-data community. Finally, the front-end scripts will be placed on GitHub and users will be encouraged to contribute new plotting tools. Here, I introduce the community to the content and expected capabilities of the archive and query the audience for feedback.

  19. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

    Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage is digitized, scientific and experimental data stream from high-speed imaging sensors, governments collect national-defense satellite images, hospitals generate medical and healthcare imaging records, and digital cameras yield personal photo collections. With these massive amounts of precious and irreplaceable data and knowledge, what standard technologies can be applied to preserve them while providing an interoperable framework for accessing the data across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international MPEG standards to preserve digital image content.

  20. Conversion of a traditional image archive into an image resource on compact disc.

    PubMed Central

    Andrew, S M; Benbow, E W

    1997-01-01

    A traditional archive of pathology images organised on 35 mm slides was converted into a database of images stored on compact disc (CD-ROM), with textual descriptions added to each image record. Students on a didactic pathology course found this resource useful as an aid to revision, despite relative computer illiteracy, and it is anticipated that students on a new problem-based learning course, which incorporates experience with information technology, will benefit even more readily when they use the database as an educational resource. A text and image database on CD-ROM can be updated repeatedly, and its content manipulated to reflect the content and style of the courses it supports. PMID:9306931

  1. Influence of imaging resolution on color fidelity in digital archiving.

    PubMed

    Zhang, Pengchang; Toque, Jay Arre; Ide-Ektessabi, Ari

    2015-11-01

    Color fidelity is of paramount importance in digital archiving. In this paper, the relationship between color fidelity and imaging resolution was explored by calculating the color difference of an IT8.7/2 color chart with a CIELAB color difference formula for scanning and simulation images. Microscopic spatial sampling was used in selecting the image pixels for the calculations to highlight the loss of color information. A ratio, called the relative imaging definition (RID), was defined to express the correlation between image resolution and color fidelity. The results show that in order for color differences to remain unrecognizable, the imaging resolution should be at least 10 times higher than the physical dimension of the smallest feature in the object being studied.
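The CIELAB color difference the paper computes is, in its simplest form (CIE76), a Euclidean distance between two (L*, a*, b*) triples. The abstract does not say which variant was used, so CIE76 is shown here as the baseline.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two
    CIELAB (L*, a*, b*) triples. Later formulas (CIE94, CIEDE2000)
    refine this with perceptual weightings."""
    return math.dist(lab1, lab2)

print(delta_e_76((50.0, 0.0, 0.0), (50.0, 3.0, 4.0)))  # -> 5.0
```

Differences below roughly 2.3 ΔE units are often quoted as imperceptible, which is the kind of threshold against which "unrecognizable" color differences are judged.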

  2. Fpack and Funpack Utilities for FITS Image Compression and Uncompression

    NASA Technical Reports Server (NTRS)

    Pence, W.

    2008-01-01

    Fpack is a utility program for optimally compressing images in the FITS (Flexible Image Transport System) data format (see http://fits.gsfc.nasa.gov). The associated funpack program restores the compressed image file back to its original state (as long as a lossless compression algorithm is used). These programs may be run from the host operating system command line and are analogous to the gzip and gunzip utility programs except that they are optimized for FITS format images and offer a wider choice of compression algorithms. Fpack stores the compressed image using the FITS tiled image compression convention (see http://fits.gsfc.nasa.gov/fits_registry.html). Under this convention, the image is first divided into a user-configurable grid of rectangular tiles, and then each tile is individually compressed and stored in a variable-length array column in a FITS binary table. By default, fpack usually adopts a row-by-row tiling pattern. The FITS image header keywords remain uncompressed for fast access by FITS reading and writing software. The tiled image compression convention can in principle support any number of different compression algorithms. The fpack and funpack utilities call on routines in the CFITSIO library (http://heasarc.gsfc.nasa.gov/fitsio) to perform the actual compression and uncompression of the FITS images; the library currently supports the GZIP, Rice, Hcompress, and IRAF PLIO pixel-list compression algorithms.
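The tiled-compression scheme described above, a grid of independently compressed rectangular tiles with one row per tile by default, can be sketched as follows. zlib stands in here for the Rice/GZIP/Hcompress codecs fpack actually offers, and the FITS binary-table packaging is omitted.

```python
import zlib
import numpy as np

def tile_compress(image, tile_rows=1, tile_cols=None):
    """Compress an image tile by tile, as in the FITS tiled-image
    convention; the default tile is one full row, matching fpack's
    usual row-by-row pattern."""
    rows, cols = image.shape
    tile_cols = tile_cols or cols
    tiles = []
    for r0 in range(0, rows, tile_rows):
        for c0 in range(0, cols, tile_cols):
            tile = image[r0:r0 + tile_rows, c0:c0 + tile_cols]
            tiles.append(zlib.compress(np.ascontiguousarray(tile).tobytes()))
    return tiles

image = np.arange(16, dtype=np.int32).reshape(4, 4)  # toy 4x4 image
tiles = tile_compress(image)
print(len(tiles))  # -> 4
```

Because each tile decompresses independently, a reader can extract a subregion without inflating the whole image, which is part of the convention's appeal.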

  3. From Ephemeral to Legitimate: An Inquiry into Television's Material Traces in Archival Spaces, 1950s-1970s

    ERIC Educational Resources Information Center

    Bratslavsky, Lauren Michelle

    2013-01-01

    The dissertation offers a historical inquiry about how television's material traces entered archival spaces. Material traces refer to both the moving image products and the assortment of documentation about the processes of television as industrial and creative endeavors. By identifying the development of television-specific archives and…

  4. The Internet as a Medium of Training for Picture Archival and Communication Systems (PACS).

    ERIC Educational Resources Information Center

    Majid, Shaheen; Misra, Ramesh Kumar

    2002-01-01

    Explores the potential of Web-based training for PACS (Picture Archival and Communication Systems) used in radiology departments for the storage and archiving of patients' medical images. Reports results of studies in three hospitals in Malaysia, Singapore and the Philippines that showed that the Internet can be used effectively for training.…

  5. USNO Image and Catalog Archive Server - Naval Oceanography Portal

    Science.gov Websites


  6. Enterprise utilization of "always on-line" diagnostic study archive.

    PubMed

    McEnery, Kevin W; Suitor, Charles T; Thompson, Stephen K; Shepard, Jeffrey S; Murphy, William A

    2002-01-01

    To meet demands for enterprise image distribution, an "always on-line" image storage archive architecture was implemented before soft-copy interpretation began. It was presumed that instant availability of historical diagnostic studies would elicit substantial utilization. Beginning November 1, 2000, an enterprise distribution archive was activated (Stentor, San Francisco, CA). As of August 8, 2001, 83,052 studies were available for immediate access without the need for retrieval from long-term archive. Image storage and retrieval logs for the period from June 12, 2001 to August 8, 2001 were analyzed. A total of 41,337 retrieval requests were noted for the 83,052 studies available as of August 8, 2001. Computed radiography represented 16.8% of retrieval requests; digital radiography, 16.9%; computed tomography (CT), 44.5%; magnetic resonance (MR), 19.2%; and ultrasonography, 2.6%. A total of 51.5% of study retrievals were for studies less than 72 hours old. Requests for studies more than 100 days old represented 9.9% of all accessions, 9.7% of CT accessions, and 15.4% of MR accessions. Utilization of the archive shows a substantial proportion of retrievals occurring less than 72 hours after study completion; however, significant interest in historical CT and MR examinations was also shown.

  7. Autosophy: an alternative vision for satellite communication, compression, and archiving

    NASA Astrophysics Data System (ADS)

    Holtz, Klaus; Holtz, Eric; Kalienky, Diana

    2006-08-01

    Satellite communication and archiving systems are now designed according to an outdated Shannon information theory in which all data are transmitted as meaningless bit streams. Video bit rates, for example, are determined by screen size, color resolution, and scanning rates. The video "content" is irrelevant, so that totally random images require the same bit rates as blank images. An alternative system design, based on the newer Autosophy information theory, is now evolving, which transmits data "content" or "meaning" in a universally compatible 64-bit format. This would allow mixing all multimedia transmissions in the Internet's packet stream. The new system design uses self-assembling data structures, which grow like data crystals or data trees in electronic memories, for both communication and archiving. The advantages for satellite communication and archiving may include: very high lossless image and video compression, unbreakable encryption, resistance to transmission errors, universally compatible data formats, self-organizing error-proof mass memories, immunity to the Internet's Quality of Service problems, and error-proof secure communication protocols. Legacy data transmission formats can be converted by simple software patches or integrated chipsets to be forwarded through any medium - satellites, radio, Internet, cable - without needing to be reformatted. This may result in orders-of-magnitude improvements for all communication and archiving systems.

  8. CLIPS++: Embedding CLIPS into C++

    NASA Technical Reports Server (NTRS)

    Obermeyer, Lance; Miranker, Daniel P.

    1994-01-01

    This paper describes a set of C++ extensions to the CLIPS language and their embodiment in CLIPS++. These extensions and the implementation approach of CLIPS++ provide a new level of embeddability with C and C++. These extensions are a C++ include statement and a defcontainer construct; (include (c++-header-file.h)) and (defcontainer (c++-type)). The include construct allows C++ functions to be embedded in both the LHS and RHS of CLIPS rules. The header file in an include construct is the same header file the programmer uses for his/her own C++ code, independent of CLIPS. The defcontainer construct allows the inference engine to treat C++ class instances as CLIPS deftemplate facts. Consequently existing C++ class libraries may be transparently imported into CLIPS. These C++ types may use advanced features like inheritance, virtual functions, and templates. The implementation has been tested with several class libraries, including Rogue Wave Software's Tools.h++, GNU's libg++, and USL's C++ Standard Components. The execution speed of CLIPS++ has been determined to be 5 to 700 times the execution speed of CLIPS 6.0 (10 to 20X typical).

  9. Apparatus and methods for supplying auxiliary steam in a combined cycle system

    DOEpatents

    Gorman, William G.; Carberg, William George; Jones, Charles Michael

    2002-01-01

    To provide auxiliary steam, a low pressure valve is opened in a combined cycle system to divert low pressure steam from the heat recovery steam generator to a header for supplying steam to a second combined cycle's steam turbine seals, sparging devices and cooling steam for the steam turbine if the steam turbine and gas turbine lie on a common shaft with the generator. Cooling steam is supplied the gas turbine in the combined cycle system from the high pressure steam turbine. Spent gas turbine cooling steam may augment the low pressure steam supplied to the header by opening a high pressure valve whereby high and low pressure steam flows are combined. An attemperator is used to reduce the temperature of the combined steam in response to auxiliary steam flows above a predetermined flow and a steam header temperature above a predetermined temperature. The auxiliary steam may be used to start additional combined cycle units or to provide a host unit with steam turbine cooling and sealing steam during full-speed no-load operation after a load rejection.

  10. OASIS: A Data Fusion System Optimized for Access to Distributed Archives

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Kong, M.; Good, J. C.

    2002-05-01

    The On-Line Archive Science Information Services (OASIS) is accessible as a java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

  11. Medical image archive node simulation and architecture

    NASA Astrophysics Data System (ADS)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    Managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, `filmless' radiology (and other imagery) systems. These generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data over a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computer Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirements, implementation of the MIADS will be a complex procedure. One of the key challenges in implementing a MIADS is to select a cost-effective, scalable system architecture that meets the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we emphasize system scalability and upgradability issues. Furthermore, we discuss our simulation results in detail.
The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape access time, number of drives, number of exams per patient, number of Central Processing Units, patient grouping, and priority impacts. The MIADS, which could be a key component of a broader data repository system, will be able to communicate with and obtain data from existing hospital information systems. We will discuss the external interfaces enabling MIADS to communicate with and obtain data from existing Radiology Information Systems such as the Picture Archiving and Communication System (PACS). Our system design encompasses the broader aspects of the archive node, which could include multimedia data such as image, audio, video, and free text data. This system is designed to be integrated with current hospital PACS through a Digital Imaging and Communications in Medicine interface. However, the system can also be accessed through the Internet using Hypertext Transport Protocol or Simple File Transport Protocol. Our design and simulation work will be key to implementing a successful, scalable medical image archive and distribution system.
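    As a flavor of the kind of ingest/retrieval simulation described above, a toy model of retrieval requests contending for a fixed number of tape drives can be written in a few lines of Python. The arrival rate, tape access time, and drive counts below are illustrative placeholders, not the paper's actual parameters:

```python
import heapq
import random

def simulate(num_drives, mean_arrival, tape_access, num_requests, seed=1):
    """Toy event-driven model: retrieval requests queue for tape drives;
    returns the mean wait (seconds) before a drive becomes free."""
    random.seed(seed)
    free_at = [0.0] * num_drives          # time at which each drive next frees up
    heapq.heapify(free_at)
    t, waits = 0.0, []
    for _ in range(num_requests):
        t += random.expovariate(1.0 / mean_arrival)   # next request arrival
        drive_free = heapq.heappop(free_at)           # earliest-available drive
        start = max(t, drive_free)
        waits.append(start - t)                       # queueing delay
        heapq.heappush(free_at, start + tape_access)  # drive busy for one access
    return sum(waits) / len(waits)

# Adding drives should cut the mean retrieval wait, all else being equal.
wait_2 = simulate(num_drives=2, mean_arrival=60.0, tape_access=90.0, num_requests=2000)
wait_4 = simulate(num_drives=4, mean_arrival=60.0, tape_access=90.0, num_requests=2000)
print(wait_2, wait_4)
```

Sweeping `num_drives` or `tape_access` over a range is the single-variable analogue of the configuration studies the paper performs.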

  12. CCDST: A free Canadian climate data scraping tool

    NASA Astrophysics Data System (ADS)

    Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.

    2015-02-01

    In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database where the data are spread across multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access to, and simplify analysis of, climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs the URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge the data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To demonstrate this, we present a case study of the temporal dynamics of blowing-snow events in which the tool yielded ~3.1 weeks of time savings. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
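    The scrape-and-merge pattern described above, rewriting the date parameters in a station URL and concatenating the per-page results with only one header kept, can be sketched in Python. The URL template, station ID, and CSV layout here are hypothetical illustrations, not the actual NCDIA interface used by the CCDST:

```python
# Hypothetical URL pattern; the real NCDIA query parameters differ.
URL_TEMPLATE = ("https://climate.example.ca/data?stationID={station}"
                "&Year={year}&Month={month}&format=csv")

def monthly_urls(station, start_year, end_year):
    """Iteratively modify the date parameters of a station URL."""
    return [URL_TEMPLATE.format(station=station, year=y, month=m)
            for y in range(start_year, end_year + 1)
            for m in range(1, 13)]

def merge_csv_pages(pages):
    """Merge downloaded CSV pages, keeping only the first header row."""
    merged = []
    for i, text in enumerate(pages):
        lines = text.strip().splitlines()
        merged.extend(lines if i == 0 else lines[1:])  # drop repeated headers
    return "\n".join(merged)

urls = monthly_urls(6842, 2010, 2010)   # 6842 is a made-up station ID
print(len(urls))                        # 12 monthly pages for one year

# Merging works the same whether pages come from the network or a cache.
page1 = "Date,Temp\n2010-01-01,-12.5\n2010-01-02,-10.1"
page2 = "Date,Temp\n2010-02-01,-8.3"
print(merge_csv_pages([page1, page2]))
```

In the real tool each URL would be fetched before merging; the fetch step is omitted here so the sketch stays self-contained.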

  13. Medical information, communication, and archiving system (MICAS): Phase II integration and acceptance testing

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). This project was initiated in the summer of 1995, with the first phase installed during the first half of 1997 and the second phase during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including a modality worklist, study caches, and NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor phased approach to PACS implementation is designed as an enterprise-wide PACS to provide images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor open-system phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  14. Migration of medical image data archived using mini-PACS to full-PACS.

    PubMed

    Jung, Haijo; Kim, Hee-Joung; Kang, Won-Suk; Lee, Sang-Ho; Kim, Sae-Rome; Ji, Chang Lyong; Kim, Jung-Han; Yoo, Sun Kook; Kim, Ki-Hwang

    2004-06-01

    This study evaluated the migration to full-PACS of medical image data archived using mini-PACS at two hospitals of the Yonsei University Medical Center, Seoul, Korea. A major concern in the migration of medical data is to match the image data from the mini-PACS with the hospital OCS (Ordered Communication System). Prior to carrying out the actual migration process, the principles, methods, and anticipated results for the migration with respect to both cost and effectiveness were evaluated. Migration gateway workstations were established and a migration software tool was developed. The actual migration process was performed based on the results of several migration simulations. Our conclusions were that a migration plan should be carefully prepared and tailored to the individual hospital environment because the server system, archive media, network, OCS, and policy for data management may be unique.

  15. West Flank Coso, CA FORGE 3D geologic model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    This is an x,y,z file of the West Flank FORGE 3D geologic model. The model was created in EarthVision by Dynamic Graphics, Inc., with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum-tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with X,Y,Z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header, and all the fields in the data file are identified in the header.
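    A file laid out this way, header lines carrying the metadata followed by whitespace-delimited X, Y, Z and lithology columns, can be read with a few lines of Python. The `#` comment convention, column order, and sample values below are assumptions for illustration, not the documented FORGE file format:

```python
# Minimal reader for a headered x,y,z lithology file. Assumes '#'-prefixed
# header lines and X Y Z lithology columns; the actual FORGE files may differ.

SAMPLE = """\
# projection: UTM zone 11N, NAD83 (assumed example values)
# grid spacing: 100 m
# fields: X Y Z lithology
417100.0 3984200.0 1150.0 granodiorite
417200.0 3984200.0 1145.0 granodiorite
417300.0 3984200.0 1020.0 basalt
"""

def parse_xyz(text):
    """Split a headered x,y,z text file into metadata lines and data rows."""
    header, rows = [], []
    for line in text.splitlines():
        if line.startswith("#"):
            header.append(line.lstrip("# ").strip())
        elif line.strip():
            x, y, z, lith = line.split()
            rows.append((float(x), float(y), float(z), lith))
    return header, rows

header, rows = parse_xyz(SAMPLE)
print(len(header), len(rows))   # 3 header lines, 3 grid points
```

Keeping the metadata in the header, as these datasets do, means the data rows stay a plain numeric table that any gridding tool can ingest.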

  16. Fallon FORGE 3D Geologic Model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    An x,y,z scattered-data file for the 3D geologic model of the Fallon FORGE site. The model was created in EarthVision by Dynamic Graphics, Inc., with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum-tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with X,Y,Z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header, and all the fields in the data file are identified in the header.

  17. Semiconductor bridge (SCB) detonator

    DOEpatents

    Bickes, R.W. Jr.; Grubelich, M.C.

    1999-01-19

    The present invention is a low-energy detonator for high-density secondary-explosive materials initiated by a semiconductor bridge (SCB) igniter that comprises a pair of electrically conductive lands connected by a semiconductor bridge. The semiconductor bridge is in operational or direct contact with the explosive material, whereby current flowing through the semiconductor bridge causes initiation of the explosive material. Header wires connected to the electrically conductive lands and electrical feed-throughs of the header posts of explosive devices are substantially coaxial to the direction of current flow through the SCB, i.e., substantially coaxial to the SCB length. 3 figs.

  18. LANDSAT-D data format control book. Volume 6, appendix G: GSFC HDT-AM inventory tape (GHIT-AM)

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The data format specifications of the Goddard HDT inventory tapes (GHITS), which accompany shipments of archival digital multispectral scanner image data (HDT-AM tapes), are defined. The GHIT is a nine-track, 1600-BPI tape which conforms to the ANSI standard and serves as an inventory and description of the image data included in the shipment. The archival MSS tapes (HDT-AMs) contain radiometrically corrected but geometrically uncorrected image data plus certain ancillary data necessary to perform the geometric corrections.

  19. Knowledge-driven information mining in remote-sensing image archives

    NASA Astrophysics Data System (ADS)

    Datcu, M.; Seidel, K.; D'Elia, S.; Marchetti, P. G.

    2002-05-01

    Users in all domains require information or information-related services that are focused, concise, reliable, low cost and timely and which are provided in forms and formats compatible with the user's own activities. In the current Earth Observation (EO) scenario, the archiving centres generally only offer data, images and other "low level" products. The user's needs are being only partially satisfied by a number of, usually small, value-adding companies applying time-consuming (mostly manual) and expensive processes relying on the knowledge of experts to extract information from those data or images.

  20. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL-patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house.
The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
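    A minimal version of the independent monitor described above, checking disk usage on a server and flagging a threshold crossing for the image-management team, might look like the sketch below. The threshold and the print-instead-of-email reporting are hypothetical simplifications; the production system also watched processes, network, and CPU load and reported by email:

```python
import shutil

def check_disk(path, warn_fraction=0.90):
    """Return an alert string if the filesystem holding `path` is fuller
    than warn_fraction, else None. Threshold is an assumed example value."""
    usage = shutil.disk_usage(path)
    used_fraction = usage.used / usage.total
    if used_fraction > warn_fraction:
        return (f"ALERT: {path} at {used_fraction:.0%} used "
                f"({usage.free // 2**30} GiB free)")
    return None

# In the real system a non-None alert would be emailed to the image
# management team; here we simply print the result.
alert = check_disk("/", warn_fraction=0.90)
print(alert or "disk usage within limits")
```

Running such a check from a scheduler on each storage, retrieval, and teleradiology server gives the kind of fleet-wide monitoring the paper describes without any central agent.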

  1. Measurements of 100 'Critical' Minor Planets from NEAT Archive

    NASA Astrophysics Data System (ADS)

    Deshmukh, Shishir

    2017-07-01

    Uncertainties associated with the orbits of minor planets can be reduced by analyzing archival imagery, as attempted in the current investigation. Archival images from NEAT and NASA’s Skymorph database were analyzed using standard software to identify the minor planets listed in the critical list. Findings for each minor planet were submitted to the Minor Planet Center (MPC) to enable better orbital solutions.

  2. Digital information management: a progress report on the National Digital Mammography Archive

    NASA Astrophysics Data System (ADS)

    Beckerman, Barbara G.; Schnall, Mitchell D.

    2002-05-01

    Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians in terms of accessible data with which to make a diagnosis and to researchers performing studies on breast cancer. Mammography was chosen for the project because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, the front-end portal, and the archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites, the Universities of Pennsylvania, Chicago, North Carolina and Toronto, and BWXT Y12, are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.

  3. Dynamic flat panel detector versus image intensifier in cardiac imaging: dose and image quality

    NASA Astrophysics Data System (ADS)

    Vano, E.; Geiger, B.; Schreiner, A.; Back, C.; Beissel, J.

    2005-12-01

    The practical aspects of the dosimetric and imaging performance of a digital x-ray system for cardiology procedures were evaluated. The system was configured with an image intensifier (II) and later upgraded to a dynamic flat panel detector (FD). Entrance surface air kerma (ESAK) to phantoms of 16, 20, 24 and 28 cm of polymethyl methacrylate (PMMA) and the image quality of a test object were measured. Images were evaluated directly on the monitor and with numerical methods (noise and signal-to-noise ratio). Information contained in the DICOM header for dosimetry audit purposes was also tested. ESAK values per frame (or kerma rate) for the most commonly used cine and fluoroscopy modes for different PMMA thicknesses and for field sizes of 17 and 23 cm for II, and 20 and 25 cm for FD, produced similar results in the evaluated system with both technologies, ranging between 19 and 589 µGy/frame (cine) and 5 and 95 mGy min⁻¹ (fluoroscopy). Image quality for these dose settings was better for the FD version. The 'study dosimetric report' is comprehensive, and its numerical content is sufficiently accurate. There is potential in the future to set those systems with dynamic FD to lower doses than are possible in the current II versions, especially for digital cine runs, or to benefit from improved image quality.

  4. MIDG-Emerging grid technologies for multi-site preclinical molecular imaging research communities.

    PubMed

    Lee, Jasper; Documet, Jorge; Liu, Brent; Park, Ryan; Tank, Archana; Huang, H K

    2011-03-01

    Molecular imaging is the visualization and identification of specific molecules in anatomy for insight into metabolic pathways, tissue consistency, and tracing of solute transport mechanisms. This paper presents the Molecular Imaging Data Grid (MIDG), which utilizes emerging grid technologies in preclinical molecular imaging to facilitate data sharing and discovery between preclinical molecular imaging facilities and their collaborating investigator institutions to expedite translational sciences research. Grid-enabled archiving, management, and distribution of animal-model imaging datasets help preclinical investigators to monitor, access and share their imaging data remotely, and encourage preclinical imaging facilities to share published imaging datasets as resources for new investigators. The system architecture of the Molecular Imaging Data Grid is described in a four-layer diagram. A data model for preclinical molecular imaging datasets is also presented, based on imaging modalities currently used in a molecular imaging center. The MIDG system components and connectivity are presented. Finally, the workflow steps for grid-based archiving, management, and retrieval of preclinical molecular imaging data are described. Initial performance tests of the Molecular Imaging Data Grid system have been conducted at the USC IPILab using dedicated VMware servers. System connectivity, evaluated datasets, and preliminary results are presented. The results show the system's feasibility, its limitations, and directions for future research. Translational and interdisciplinary research in medicine is increasingly interested in cellular and molecular biology activity at the preclinical level, utilizing molecular imaging methods on animal models. The task of integrated archiving, management, and distribution of these preclinical molecular imaging datasets at preclinical molecular imaging facilities is challenging due to disparate imaging systems and multiple off-site investigators.
A Molecular Imaging Data Grid design, implementation, and initial evaluation are presented to demonstrate a secure and novel data grid solution for sharing preclinical molecular imaging data across the wide-area network (WAN).

  5. A Complete Public Archive for the Einstein Imaging Proportional Counter

    NASA Technical Reports Server (NTRS)

    Helfand, David J.

    1996-01-01

    Consistent with our proposal to the Astrophysics Data Program in 1992, we have completed the design, construction, documentation, and distribution of a flexible and complete archive of the data collected by the Einstein Imaging Proportional Counter. Along with software and data delivered to the High Energy Astrophysics Science Archive Research Center at Goddard Space Flight Center, we have compiled and, where appropriate, published catalogs of point sources, soft sources, hard sources, extended sources, and transient flares detected in the database along with extensive analyses of the instrument's backgrounds and other anomalies. We include in this document a brief summary of the archive's functionality, a description of the scientific catalogs and other results, a bibliography of publications supported in whole or in part under this contract, and a list of personnel whose pre- and post-doctoral education consisted in part in participation in this project.

  6. Fast high resolution reconstruction in multi-slice and multi-view cMRI

    NASA Astrophysics Data System (ADS)

    Velasco Toledo, Nelson; Romero Castro, Eduardo

    2015-01-01

    Cardiac magnetic resonance imaging (cMRI) is a useful tool in diagnosis, prognosis and research since it functionally tracks the heart structure. Although useful, this imaging technique is limited in spatial resolution because the heart is a constantly moving organ; other uncontrolled conditions, such as patient movements and volumetric changes during the apnea periods when data are acquired, further limit the time available to capture high-quality information. This paper presents a very fast and simple strategy to reconstruct high-resolution 3D images from a set of low-resolution series of 2D images. The strategy is based on an information-reallocation algorithm which uses the DICOM header to relocate voxel intensities in a regular grid. An interpolation method is then applied to fill empty places with estimated data: the interpolation resamples the low-resolution information to estimate the missing values. As a final step, a Gaussian filter denoises the result. A reconstructed-image evaluation is performed using a super-resolution reconstructed image as reference. The evaluation reveals that the method maintains the general heart structure with a small loss of detailed information (edge sharpening and blurring); some artifacts related to input-information quality are detected. The proposed method requires little time and few computational resources.
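    The three-step pipeline described, reallocate measured intensities onto a regular grid using position metadata, interpolate the empty cells, then denoise with a Gaussian filter, can be illustrated in one dimension with NumPy. The positions, intensities, and kernel here are synthetic stand-ins; the paper works on 3D grids built from DICOM header geometry:

```python
import numpy as np

# Step 1: reallocation. Place samples at the grid positions the
# (here synthetic) header metadata assigns them; the rest stay empty.
grid = np.full(20, np.nan)                    # regular target grid
positions = np.array([1, 4, 7, 11, 15, 18])   # where samples actually landed
intensities = np.array([3.0, 5.0, 4.0, 8.0, 6.0, 2.0])
grid[positions] = intensities

# Step 2: interpolation. Fill the empty cells from the known samples.
idx = np.arange(grid.size)
known = ~np.isnan(grid)
filled = np.interp(idx, idx[known], grid[known])

# Step 3: denoising. A small Gaussian-like kernel smooths the result.
kernel = np.array([0.25, 0.5, 0.25])
smooth = np.convolve(filled, kernel, mode="same")

print(np.isnan(smooth).any())   # False: no empty voxels remain
```

Replacing the 1-D `np.interp` with a 3-D gridded interpolator and the kernel with a true 3-D Gaussian gives the volumetric analogue of this sketch.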

  7. Development and implementation of ultrasound picture archiving and communication system

    NASA Astrophysics Data System (ADS)

    Weinberg, Wolfram S.; Tessler, Franklin N.; Grant, Edward G.; Kangarloo, Hooshang; Huang, H. K.

    1990-08-01

    The Department of Radiological Sciences at the UCLA School of Medicine is developing an archiving and communication system (PACS) for digitized ultrasound images. In its final stage the system will involve the acquisition and archiving of ultrasound studies from four different locations including the Center for Health Sciences, the Department for Mental Health and the Outpatient Radiology and Endoscopy Departments with a total of 200-250 patient studies per week. The concept comprises two stages of image manipulation for each ultrasound work area. The first station is located close to the examination site and accommodates the acquisition of digital images from up to five ultrasound devices and provides for instantaneous display and primary viewing and image selection. Completed patient studies are transferred to a main workstation for secondary review, further analysis and comparison studies. The review station has an on-line storage capacity of 10,000 images with a resolution of 512x512 8 bit data to allow for immediate retrieval of active patient studies of up to two weeks. The main workstations are connected through the general network and use one central archive for long term storage and a film printer for hardcopy output. First phase development efforts concentrate on the implementation and testing of a system at one location consisting of a number of ultrasound units with video digitizer and network interfaces and a microcomputer workstation as host for the display station with two color monitors, each allowing simultaneous display of four 512x512 images. The discussion emphasizes functionality, performance and acceptance of the system in the clinical environment.

  8. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With the recent high demand for ultra-high-definition (UHD) content to be screened not only in high-end digital movie theaters but also in the home environment, film archives full of movies in high definition and above are of great interest to UHD content providers. Movies captured with traditional film technology represent a virtually unlimited source of UHD content. The goal of maintaining complete image information is also related to the choice of scanning resolution and of spatial resolution for further distribution. It might seem that scanning the film material at the highest possible resolution using state-of-the-art film scanners, and distributing it at that resolution, is the right choice. The information content of the digitized images is, however, limited, and various degradations reduce it further. Digital distribution of the content at the highest image resolution may therefore be unnecessary or uneconomical. In other cases, the highest possible resolution is essential if we want to preserve fine scene details or film-grain structure for archiving purposes. This paper deals with the image-detail content analysis of archive film records. The resolution limit in the captured scene image and the factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow recommendations to be derived for optimal distribution of digitized video content intended for various display devices with lower resolutions. The obtained results are illustrated on a spatial-downsampling use case, and a performance evaluation of the proposed techniques is presented.

  9. Using modern imaging techniques to old HST data: a summary of the ALICE program.

    NASA Astrophysics Data System (ADS)

    Choquet, Elodie; Soummer, Remi; Perrin, Marshall; Pueyo, Laurent; Hagan, James Brendan; Zimmerman, Neil; Debes, John Henry; Schneider, Glenn; Ren, Bin; Milli, Julien; Wolff, Schuyler; Stark, Chris; Mawet, Dimitri; Golimowski, David A.; Hines, Dean C.; Roberge, Aki; Serabyn, Eugene

    2018-01-01

    Direct imaging of extrasolar systems is a powerful technique to study the physical properties of exoplanetary systems and understand their formation and evolution mechanisms. The detection and characterization of these objects are challenged by their high contrast with their host star. Several observing strategies and post-processing algorithms have been developed for ground-based high-contrast imaging instruments, enabling the discovery of directly-imaged and spectrally-characterized exoplanets. The Hubble Space Telescope (HST), pioneer in directly imaging extrasolar systems, has nevertheless often been limited to the detection of bright debris disk systems, with sensitivity limited by the difficulty of implementing an optimal PSF subtraction strategy, which is readily offered on ground-based telescopes in pupil tracking mode. The Archival Legacy Investigations of Circumstellar Environments (ALICE) program is a consistent re-analysis of the 10-year-old coronagraphic archive of HST's NICMOS infrared imager. Using post-processing methods developed for ground-based observations, we used the whole archive to calibrate PSF temporal variations and improve NICMOS's detection limits. We have now delivered ALICE-reprocessed science products for the whole NICMOS archival dataset back to the community. These science products, as well as the ALICE pipeline, were used to prototype the JWST coronagraphic data and reduction pipeline. The ALICE program has enabled the detection of 10 faint debris disk systems never imaged before in the near-infrared and several substellar companion candidates, all of which we are in the process of characterizing through follow-up observations with both ground-based facilities and HST-STIS coronagraphy. In this publication, we provide a summary of the results of the ALICE program, advertise its science products and discuss the prospects of the program.

  10. Earth analog image digitization of field, aerial, and lab experiment studies for Planetary Data System archiving.

    NASA Astrophysics Data System (ADS)

    Williams, D. A.; Nelson, D. M.

    2017-12-01

    A portion of the earth analog image archive at the Ronald Greeley Center for Planetary Studies (RGCPS), the NASA Regional Planetary Information Facility at Arizona State University, is being digitized and will be added to the Planetary Data System (PDS) for public use. This will be a first addition of terrestrial data to the PDS specifically for comparative planetology studies. Digitization is separated into four tasks. First is the scanning of aerial photographs of volcanic and aeolian structures and flows. The second task is to scan field site images taken from ground and low-altitude aircraft of volcanic structures, lava flows, lava tubes, dunes, and wind streaks. The third image set to be scanned includes photographs of lab experiments from the NASA Planetary Aeolian Laboratory wind tunnels, vortex generator, and of wax models. Finally, rare NASA documents are being scanned and formatted as PDF files. Thousands of images are to be scanned for this project. Archiving of the data will follow the PDS4 standard, where the entire project is classified as a single bundle, with individual subjects (i.e., the Amboy Crater volcanic structure in the Mojave Desert of California) as collections. Within the collections, each image is considered a product, with a unique ID and associated XML document. Documents describing the image data, including the subject and context, will be included with each collection. Once complete, the data will be hosted by a PDS data node and available for public search and download. As one of the first earth analog datasets to be archived by the PDS, this project could prompt the digitizing and making available of historic datasets from other facilities for the scientific community.
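    The bundle → collection → product hierarchy with a per-image XML label can be sketched with the Python standard library. This is a drastically simplified, PDS4-flavored illustration, not a schema-valid label: the element names are only suggestive of the standard, the logical identifier is invented, and real PDS4 labels use the official namespaces and many more classes:

```python
import xml.etree.ElementTree as ET

def make_label(logical_id, title, file_name):
    """Build a simplified, PDS4-flavored XML label for one image product.
    Element names are illustrative; real labels follow the PDS4 schemas."""
    root = ET.Element("Product_Observational")
    ident = ET.SubElement(root, "Identification_Area")
    ET.SubElement(ident, "logical_identifier").text = logical_id
    ET.SubElement(ident, "title").text = title
    fa = ET.SubElement(root, "File_Area_Observational")
    ET.SubElement(ET.SubElement(fa, "File"), "file_name").text = file_name
    return ET.tostring(root, encoding="unicode")

# Hypothetical product ID echoing the bundle/collection/product hierarchy.
label = make_label(
    "urn:example:rgcps:analog_images:amboy_crater:img_0001",
    "Amboy Crater aerial photograph",
    "amboy_0001.tif")
print(label[:40])
```

One such label per scanned image, grouped into per-subject collection directories, mirrors the bundle/collection/product structure the project adopts.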

  11. Sleep patterns and match performance in elite Australian basketball athletes.

    PubMed

    Staunton, Craig; Gordon, Brett; Custovic, Edhem; Stanger, Jonathan; Kingsley, Michael

    2017-08-01

    To assess sleep patterns and associations between sleep and match performance in elite Australian female basketball players. Prospective cohort study. Seventeen elite female basketball players were monitored across two consecutive in-season competitions (30 weeks). Total sleep time and sleep efficiency were determined using triaxial accelerometers for Baseline, Pre-match, Match-day and Post-match timings. Match performance was determined using the basketball efficiency statistic (EFF). The effects of match schedule (Regular versus Double-Header; Home versus Away) and sleep on EFF were assessed. The Double-Header condition changed the pattern of sleep when compared with the Regular condition (F (3,48) =3.763, P=0.017), where total sleep time Post-match was 11% less for Double-Header (mean±SD; 7.2±1.4h) compared with Regular (8.0±1.3h; P=0.007). Total sleep time for Double-Header was greater Pre-match (8.2±1.7h) compared with Baseline (7.1±1.6h; P=0.022) and Match-day (7.3±1.5h; P=0.007). Small correlations existed between sleep metrics at Pre-match and EFF for pooled data (r=-0.39 to -0.22; P≥0.238). Relationships between total sleep time and EFF ranged from moderate negative to large positive correlations for individual players (r=-0.37 to 0.62) and reached significance for one player (r=0.60; P=0.025). Match schedule can affect the sleep patterns of elite female basketball players. A large degree of inter-individual variability existed in the relationship between sleep and match performance; nevertheless, sleep monitoring might assist in the optimisation of performance for some athletes. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  12. Types and analysis of defects in welding junctions of the header to steam generator shells on power-generating units with VVER-1000

    NASA Astrophysics Data System (ADS)

    Ozhigov, L. S.; Voevodin, V. N.; Mitrofanov, A. S.; Vasilenko, R. L.

    2016-10-01

    Investigation objects were metal templates, which were cut during the repair of welding junction no. 111 (header to the steam generator shell) on a power-generating unit with VVER-1000 at the South-Ukraine NPP, and substances of mud deposits collected from the walls of this junction. Investigations were carried out using metallography, optical microscopy, and scanning electron microscopy with energy-dispersive microanalysis, using an MMO-1600-AT metallurgical microscope and a JEOL JSM-7001F scanning electron microscope with a Schottky cathode. The investigations revealed iron- and copper-enriched particles in the corrosion pits and mud deposits in the area of welding junction no. 111. It is shown that, when in contact with the steel header surface, these particles can form microgalvanic cells causing iron-dissolution reactions and pit corrosion of the metal. Microcracks are present near the corrosion pits; these can result from the stress state of the metal beneath the pits, together with the observed twinning effects. The hypothesis is put forward that pitting corrosion of the metal occurred during the first operation period of the power-generating unit under the ammonia water chemistry conditions (WCC). The formation of corrosion pits, and of cracks nucleating from them, was arrested during further operation under the morpholine WCC. The absence of macrocracks in the metal of the templates verifies that, during operation, welding junction no. 111 operated under loads not exceeding those permitted by the design requirements. The durability of the welding junction of the header to the steam generator shell depends significantly on the technological schedule of chemical cleaning and steam generator shutdown cooling.

  13. Rendering an archive in three dimensions

    NASA Astrophysics Data System (ADS)

    Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.

    2003-05-01

    We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including those yielded by radiological processes such as MRI, ultrasound and others. Intended as a repository and distribution mechanism for such medical data, we created the National Online Volumetric Archive (NOVA) as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss such factors as the current legal and health information privacy policy affecting the collection of human medical images, retrieval and management of information and technical implementation. This project culminated in the launching of a website that includes downloadable datasets and a prototype data submission system.

  14. HEASARC Software Archive

    NASA Technical Reports Server (NTRS)

    White, Nicholas (Technical Monitor); Murray, Stephen S.

    2003-01-01

    (1) Chandra Archive: SAO has maintained the interfaces through which HEASARC gains access to the Chandra Data Archive. At HEASARC's request, we have implemented an anonymous ftp copy of a major part of the public archive and we keep that archive up-to-date. SAO has participated in the ADEC interoperability working group, establishing guidelines for interoperability standards and prototyping such interfaces. We have provided an NVO-based prototype interface, intended to serve the HEASARC-led NVO demo project. HEASARC's Astrobrowse interface was maintained and updated. In addition, we have participated in design discussions surrounding HEASARC's Caldb project. We have attended the HEASARC Users Group meeting and presented CDA status and developments. (2) Chandra CALDB: SAO has maintained and expanded the Chandra CALDB by including four new data file types, defining the corresponding CALDB keyword/identification structures. We have provided CALDB upgrades for the public (CIAO) and for Standard Data Processing. Approximately 40 new files have been added to the CALDB in these version releases. There have been ten of these CALDB upgrades in the past year, each with unique index configurations. In addition, with inputs from software, archive, and calibration scientists, as well as CIAO/SDP software developers, we have defined a generalized expansion of the existing CALDB interface and indexing structure. The purpose of this is to make the CALDB more generally applicable and useful in new and future missions that will be supported archivally by HEASARC. The generalized interface will identify additional configurational keywords and permit more extensive calibration parameter and boundary condition specifications for unique file selection. HEASARC scientists and developers from SAO and GSFC have become involved in this work, which is expected to produce a new interface for general use within the current year. 
(3) DS9: One of the decisions that came from last year's HEADCC meeting was to make the ds9 image display program the primary vehicle for displaying line graphics (as well as images). The first step required to make this possible was to enhance the line graphics capabilities of ds9. SAO therefore spent considerable effort upgrading ds9 to use Tcl 8.4 so that the BLT line graphics package could be built and imported into ds9 from source code, rather than from a pre-built (and generally outdated) shared library. This task, which is nearly complete, allows us to extend BLT as needed for the HEAD community. Following HEADCC discussion concerning archiving and the display of archived data, we extended ds9 to support full access to many astronomical Web-based archives sites, including HEASARC, MAST, CHANDRA, SKYVIEW, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. Using ds9's new internal Web access capabilities, these archives can be accessed via their Web page. FITS images, plots, spectra, and journal abstracts can be referenced, down-loaded, and displayed directly and easily in ds9. For more information, see: http://hea-www.harvard.edu/saord/ds9. Also after the HEADCC discussion concerning region filtering, we extended the Funtools sample implementation of region filtering as described in: http://hea-www.harvard.edu/saord/funtools/regions.html. In particular, we added several new composite regions for event and image filtering, including elliptical and box annuli. We also extended the panda (Pie AND Annulus) region support to include box pandas and elliptical pandas. These new composite regions are especially useful in programs that need to count photons in each separate region using only a single pass through the data. Support for these new regions was added to ds9. In the same vein, we developed new region support for filtering images using simple FITS image masks, i.e. 8-bit or 16-bit FITS images where the value of a pixel is the region id number for that pixel. 
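    The single-pass, per-region photon counting that these mask regions enable can be sketched in a few lines (a toy illustration, not Funtools code; the mask follows the convention above, where each pixel's value is its region id):

```python
from collections import Counter

def counts_per_region(image, mask):
    """Count per-region photon totals in a single pass over the pixels.

    `image` holds photon counts per pixel; `mask` holds a region id per
    pixel (0 = outside any region), mirroring the FITS image masks
    described in the text. Illustrative sketch only.
    """
    totals = Counter()
    for img_row, mask_row in zip(image, mask):
        for value, region in zip(img_row, mask_row):
            if region:                      # skip pixels outside all regions
                totals[region] += value
    return dict(totals)

image = [[1, 2], [3, 4]]
mask  = [[1, 1], [2, 0]]
counts_per_region(image, mask)   # {1: 3, 2: 3}
```

    Because every pixel already knows its region id, composite shapes such as elliptical annuli or pandas cost no more than simple ones.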
    Other important enhancements to DS9 this year include support for multiple world coordinate systems, three-dimensional event file binning, image smoothing, region groups and tags, the ability to save images in a number of image formats (such as JPEG, TIFF, PNG, FITS), improvements in support for integrating external analysis tools, and support for the virtual observatory. In particular, a full-featured web browser has been implemented within DS9. This provides support for full access to HEASARC archive sites such as SKYVIEW and W3BROWSE, in addition to other astronomical archive sites such as MAST, CHANDRA, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. From within DS9, the archives can be searched, and FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed. The web browser provides the basis for the built-in help facility. All DS9 documentation, including the reference manual, FAQ, Known Features, and contact information, is now available to the user without the need for external display applications. New versions of DS9 may be downloaded and installed using this facility. Two important features used in the analysis of high-energy astronomical data have been implemented in the past year. The first is support for binning photon event data in three dimensions. By binning the third dimension in time or energy, users are easily able to detect variable X-ray sources and identify other physical properties of their data. Second, a number of fast smoothing algorithms have been implemented in DS9, which allow users to smooth their data in real time. Algorithms for boxcar, tophat, and Gaussian smoothing are supported.
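    As a rough illustration of the simplest of these smoothing kernels, a one-dimensional boxcar (moving-average) smoother might look like this (sketch only, with windows truncated at the edges; DS9's own implementation is 2-D and optimized for real-time use):

```python
def boxcar_smooth(row, width=3):
    """Boxcar (moving-average) smoothing of one row of pixel values.

    Each output pixel is the mean of a `width`-wide window centered on
    the input pixel; windows are truncated at the array edges.
    """
    half = width // 2
    out = []
    for i in range(len(row)):
        window = row[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

boxcar_smooth([0, 0, 9, 0, 0])   # [0.0, 3.0, 3.0, 3.0, 0.0]
```

    A single bright pixel is spread evenly over its neighbors, which is exactly the visual effect of real-time smoothing on sparse photon data.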

  15. Benefits of cloud computing for PACS and archiving.

    PubMed

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost-effective disaster recovery for archived data to fully featured PACS and vendor-neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, to deliver radiology reports and imaging studies to referring physicians, and to provide redundant data storage. Vendor-managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center, creating a reduction in total cost of ownership for the healthcare provider.

  16. Autocorrelation techniques for soft photogrammetry

    NASA Astrophysics Data System (ADS)

    Yao, Wu

    In this thesis, research is carried out on image processing, image-matching search strategies, feature type and image matching, and optimal window size in image matching. To make comparisons, the soft photogrammetry package SoftPlotter is used. Two aerial photographs from the Iowa State University campus high flight 94 are scanned into digital format. In order to create a stereo model from them, interior orientation, single-photograph rectification, and stereo rectification are performed. Two new image matching methods, multi-method image matching (MMIM) and unsquare window image matching, are developed and compared. MMIM is used to determine the optimal window size in image matching. Twenty-four check points from four different types of ground features are used for checking the results from image matching. Comparison between these four types of ground features shows that the methods developed here improve the speed and the precision of image matching. A process called direct transformation is described and compared with the multiple steps in image processing. The results from image processing are consistent with those from SoftPlotter. A modified LAN image header is developed and used to store the information about the stereo model and image matching. A comparison is also made between cross correlation image matching (CCIM), least difference image matching (LDIM), and least squares image matching (LSIM). The quality of image matching in relation to ground features is compared using two methods developed in this study: the coefficient surface for CCIM and the difference surface for LDIM. To reduce the amount of computation in image matching, the best-track searching algorithm, developed in this research, is used instead of the whole-range searching algorithm.
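    The similarity measure underlying CCIM can be illustrated with a minimal one-dimensional sketch (hypothetical, using a whole-range search for clarity; the thesis's best-track algorithm exists precisely to restrict this search):

```python
import math

def ncc(template, window):
    """Normalized cross-correlation coefficient between two equal-length
    patches: +1 for a perfect match, near 0 for unrelated signals."""
    n = len(template)
    mt, mw = sum(template) / n, sum(window) / n
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    den = math.sqrt(sum((t - mt) ** 2 for t in template) *
                    sum((w - mw) ** 2 for w in window))
    return num / den if den else 0.0

def best_match(template, signal):
    """Slide the template over the signal and return the offset with the
    highest correlation (whole-range search, for illustration)."""
    scores = [ncc(template, signal[i:i + len(template)])
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

best_match([1, 5, 1], [0, 0, 1, 5, 1, 0])   # 2
```

    LDIM would replace the correlation score with a sum of absolute differences (minimized rather than maximized); the search loop is unchanged.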

  17. LBT Distributed Archive: Status and Features

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Smareglia, R.; Thompson, D.; Grede, G.

    2011-07-01

    After the first release of the LBT Distributed Archive, this successful collaboration is continuing within the LBT corporation. The IA2 (Italian Center for Astronomical Archive) team has updated the LBT DA with new features in order to facilitate user data retrieval while abiding by VO standards. To facilitate the integration of data from any new instruments, we have migrated to a new database, developed new data distribution software, and enhanced features in the LBT User Interface. The DBMS engine has been changed to MySQL. Consequently, the data handling software now uses Java thread technology to update and synchronize the main storage archives on Mt. Graham and in Tucson, as well as archives in Trieste and Heidelberg, with all metadata and proprietary data. The LBT UI has been updated with additional features allowing users to search by instrument and by some of the more important characteristics of the images. Finally, instead of a simple cone search service over all LBT image data, new instrument-specific SIAP and cone search services have been developed. They will be published in the IVOA framework later this fall.

  18. Open source software in a practical approach for post processing of radiologic images.

    PubMed

    Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea

    2015-03-01

    The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet the basic requirements such as free availability, stand-alone application, presence of graphical user interface, ease of installation and advanced features beyond simple display monitor. Capabilities of data import, data export, metadata, 2D viewer, 3D viewer, support platform and usability of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to eight. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView, VR Render; while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.

  19. JPL, NASA and the Historical Record: Key Events/Documents in Lunar and Mars Exploration

    NASA Technical Reports Server (NTRS)

    Hooks, Michael Q.

    1999-01-01

    This document represents a presentation about the Jet Propulsion Laboratory (JPL) historical archives in the area of Lunar and Martian exploration. The JPL archives document the history of JPL's flight projects, research and development activities, and administrative operations. The archives exist in a variety of formats. The presentation reviews the information available through the JPL archives web site, the information available through the Regional Planetary Image Facility web site, and the information on past missions available through those web sites. The presentation also reviews the NASA historical resources at the NASA History Office and the National Archives and Records Administration.

  20. BATSE imaging survey of the Galactic plane

    NASA Technical Reports Server (NTRS)

    Grindlay, J. E.; Barret, D.; Bloser, P. F.; Zhang, S. N.; Robinson, C.; Harmon, B. A.

    1997-01-01

    The burst and transient source experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO) provides all sky monitoring capability, occultation analysis and occultation imaging which enables new and fainter sources to be searched for in relatively crowded fields. The occultation imaging technique is used in combination with an automated BATSE image scanner, allowing an analysis of large data sets of occultation images for detections of candidate sources and for the construction of source catalogs and data bases. This automated image scanner system is being tested on archival data in order to optimize the search and detection thresholds. The image search system, its calibration results and preliminary survey results on archival data are reported on. The aim of the survey is to identify a complete sample of black hole candidates in the galaxy and constrain the number of black hole systems and neutron star systems.

  1. Archive of digital boomer subbottom profile data collected in the Atlantic Ocean offshore northeast Florida during USGS cruises 03FGS01 and 03FGS02 in September and October of 2003

    USGS Publications Warehouse

    Calderon, Karynna; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2012-01-01

    In September and October of 2003, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey, conducted geophysical surveys of the Atlantic Ocean offshore northeast Florida from St. Augustine, Florida, to the Florida-Georgia border. This report serves as an archive of unprocessed digital boomer subbottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03FGS01 tells us the data were collected in 2003 as part of cooperative work with the Florida Geological Survey (FGS) and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. 
The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by hydrophone receivers, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 seconds) and recorded for specific intervals of time (for example, 100 milliseconds). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Refer to the handwritten FACS operation log (PDF, 442 KB) for diagrams and descriptions of acquisition geometry, which varied throughout the cruises. Table 1 displays a summary of acquisition parameters. See the digital FACS equipment logs (PDF, 9-13 KB each) for details about the acquisition equipment used. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y (Barry and others, 1975) format (rev. 0), except for the first 3,200 bytes of the card image header, which are stored in ASCII format instead of the standard EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are Graphics Interchange Format (GIF) images that were filtered and gained using SU software. Refer to the Software page for details about the processing and links to example SU processing scripts and USGS software for viewing the SEG Y files (Zihlman, 1992).
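    A reader of these files therefore has to cope with either encoding of the 3,200-byte card-image header: EBCDIC per the SEG Y rev. 0 standard, but ASCII in this archive. A heuristic sketch (the codec name "cp500" is an assumption for one common IBM EBCDIC code page; check the actual code page of your files):

```python
def decode_card_header(raw):
    """Decode a 3,200-byte SEG Y card-image ("textual") header.

    Heuristic: ASCII text uses only 7-bit bytes, while EBCDIC letters
    and digits occupy high byte values, so any byte >= 0x80 suggests
    EBCDIC. Sketch only; "cp500" is one common EBCDIC code page.
    """
    if max(raw) < 0x80:                  # pure 7-bit bytes: already ASCII
        return raw.decode("ascii"), "ascii"
    return raw.decode("cp500"), "ebcdic"

# One 80-character "card" of a hypothetical header, ASCII-encoded
card = b"C 1 CLIENT USGS  AREA OFFSHORE NE FLORIDA".ljust(80, b" ")
decode_card_header(card)[1]              # 'ascii'
```

    Tools such as Seismic Unix apply the same kind of check before printing the header, which is why these ASCII-variant files remain readable with standard software.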

  2. ISMRM Raw Data Format: A Proposed Standard for MRI Raw Datasets

    PubMed Central

    Inati, Souheil J.; Naegele, Joseph D.; Zwart, Nicholas R.; Roopchansingh, Vinai; Lizak, Martin J.; Hansen, David C.; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E.; Sørensen, Thomas S.; Hansen, Michael S.

    2015-01-01

    Purpose This work proposes the ISMRM Raw Data (ISMRMRD) format as a common MR raw data format, which promotes algorithm and data sharing. Methods A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using MRI scanners from four vendors, converted to ISMRMRD format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Results Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. Conclusion The proposed raw data format solves a practical problem for the MRI community. It may serve as a foundation for reproducible research and collaborations. The ISMRMRD format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. PMID:26822475
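    The core idea of a flexible header plus tagged frames of k-space data can be sketched with plain dataclasses (illustrative field names only, not the actual ISMRMRD API; the real format defines many more header fields per acquisition):

```python
from dataclasses import dataclass, field

@dataclass
class Acquisition:
    """One tagged frame of k-space data: a small per-readout header
    (here just two hypothetical tags) plus the complex samples."""
    scan_counter: int
    coil: int
    samples: list          # complex k-space samples for this readout

@dataclass
class RawDataset:
    """A raw dataset: one flexible XML header describing the experiment,
    followed by an ordered list of tagged acquisitions."""
    header_xml: str
    acquisitions: list = field(default_factory=list)

    def append(self, acq):
        self.acquisitions.append(acq)

ds = RawDataset(header_xml="<ismrmrdHeader>...</ismrmrdHeader>")
ds.append(Acquisition(scan_counter=0, coil=0, samples=[1 + 0j, 0.5 + 0.5j]))
```

    A vendor converter's job is then simply to translate a proprietary file into this neutral structure, after which any reconstruction code can consume it.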

  3. Precision Effects for Solar Image Coordinates Within the FITS World Coordinate System

    NASA Technical Reports Server (NTRS)

    Thompson, W. T.

    2010-01-01

    The FITS world coordinate system (WCS) provides a number of tools for precisely specifying the spatial coordinates of an image. Many of the finer details that the WCS addresses have not historically been taken into account in solar image processing. This paper examines various effects which can affect the expression of coordinates in FITS headers, to determine under what conditions such effects need to be taken into account in data analysis, and under what conditions they can be safely ignored. Effects which are examined include perspective, parallax, spherical projection, optical axis determination, speed-of-light effects, stellar aberration, gravitational deflection, and scattering and refraction at radio wavelengths. Purely instrumental effects, such as misalignment or untreated optical aberrations, are not considered. Since the value of the solar radius is an experimental quantity, the effect of adopting a specific radius value is also examined. These effects are examined in the context of a previous paper outlining a WCS standard for encoding solar coordinates in FITS files. Aspects of that previous paper are clarified and extended in the present work.
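    The linear core of the WCS pixel-to-world mapping, before any of the subtler effects the paper analyzes, is an offset-and-scale transform around a reference pixel. A sketch with made-up solar values (real solar headers add a spherical projection such as TAN on top of this, which the paper shows matters near the limb):

```python
def pix_to_world(px, py, crpix, crval, cdelt):
    """Linear core of the FITS WCS mapping: world coordinate =
    CRVAL + CDELT * (pixel - CRPIX), per axis, using FITS's 1-based
    pixel convention. Ignores projection, parallax, aberration, etc.
    """
    x = crval[0] + cdelt[0] * (px - crpix[0])
    y = crval[1] + cdelt[1] * (py - crpix[1])
    return x, y

# Hypothetical solar image: disk center at pixel (512, 512),
# plate scale 2.5 arcsec/pixel, reference value (0, 0) arcsec
pix_to_world(612, 512, crpix=(512, 512), crval=(0.0, 0.0), cdelt=(2.5, 2.5))
# (250.0, 0.0) arcsec from disk center
```

    The effects the paper catalogs (perspective, speed-of-light, choice of solar radius, and so on) all enter as corrections to where this simple transform says a feature lies.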

  4. MER Telemetry Processor

    NASA Technical Reports Server (NTRS)

    Lee, Hyun H.

    2012-01-01

    MERTELEMPROC processes telemetered data in data product format and generates Experiment Data Records (EDRs) for many instruments (HAZCAM, NAVCAM, PANCAM, microscopic imager, Moessbauer spectrometer, APXS, RAT, and EDLCAM) on the Mars Exploration Rover (MER). If the data is compressed, then MERTELEMPROC decompresses the data with an appropriate decompression algorithm. There are two compression algorithms (ICER and LOCO) used in MER. This program fulfills a MER-specific need to generate Level 1 products within a 60-second time requirement. EDRs generated by this program are used by merinverter, marscahv, marsrad, and marsjplstereo to generate higher-level products for the mission operations. MERTELEMPROC was the first GDS program to process the data product. Metadata of the data product is in XML format. The software allows user-configurable input parameters, per-product processing (not stream-based processing), and fail-over is allowed if the leading image header is corrupted. It is used within the MER automated pipeline. MERTELEMPROC is part of the OPGS (Operational Product Generation Subsystem) automated pipeline, which analyzes images returned by in situ spacecraft and creates Level 1 products to assist in operations, science, and outreach.

  5. The Kanzelhöhe Online Data Archive

    NASA Astrophysics Data System (ADS)

    Pötzi, W.; Hirtenfellner-Polanec, W.; Temmer, M.

    The Kanzelhöhe Observatory provides high-cadence full-disk observations of solar activity phenomena like sunspots, flares and prominence eruptions on a regular basis. The data are available for download from the KODA (Kanzelhöhe Observatory Data Archive) which is freely accessible. The archive offers sunspot drawings back to 1950 and high cadence H-α data back to 1973. Images from other instruments, like white-light and CaIIK, are available since 2007 and 2010, respectively. In the following we describe how to access the archive and the format of the data.

  6. New secure communication-layer standard for medical image management (ISCL)

    NASA Astrophysics Data System (ADS)

    Kita, Kouichi; Nohara, Takashi; Hosoba, Minoru; Yachida, Masuyoshi; Yamaguchi, Masahiro; Ohyama, Nagaaki

    1999-07-01

    This paper introduces a summary of the standard draft of ISCL 1.00, which will be published officially by MEDIS-DC. ISCL is an abbreviation of Integrated Secure Communication Layer Protocols for Secure Medical Image Management Systems. ISCL is a security layer which manages security functions between the presentation layer and the TCP/IP layer. The ISCL mechanism depends on the basic functions of a smart IC card and on a symmetric secret-key mechanism. A symmetric key for each session is generated by the internal authentication function of a smart IC card with a random number. ISCL has three functions, which assure authentication, confidentiality, and integrity. The entity authentication process is carried out through a three-path, four-way method using the internal and external authentication functions of a smart IC card. The confidentiality algorithm and the MAC algorithm for integrity are selectable. ISCL protocols communicate through Message Blocks, each consisting of a Message Header and Message Data. The ISCL protocols are being evaluated by applying them to a regional collaboration system for image diagnosis and to an on-line secure electronic storage system for medical images. These projects are supported by the Medical Information System Development Center, and they show that ISCL is useful for maintaining security.
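    The Message Block framing with an integrity check can be sketched as follows (the 4-byte length header and HMAC-SHA256 are illustrative assumptions; the actual ISCL draft negotiates its own MAC algorithm and derives the per-session key via the smart IC card):

```python
import hashlib
import hmac
import struct

def build_message_block(session_key, payload):
    """Frame a payload as Message Header + Message Data + MAC, echoing
    ISCL's Message Block structure and its integrity function.
    Header layout and MAC choice are illustrative assumptions."""
    header = struct.pack(">I", len(payload))          # 4-byte length field
    mac = hmac.new(session_key, header + payload, hashlib.sha256).digest()
    return header + payload + mac

def verify_message_block(session_key, block):
    """Recompute the MAC over header + data and compare in constant time."""
    header, payload, mac = block[:4], block[4:-32], block[-32:]
    expected = hmac.new(session_key, header + payload, hashlib.sha256).digest()
    return hmac.compare_digest(mac, expected)

key = b"per-session symmetric key"   # in ISCL, produced via the IC card
blk = build_message_block(key, b"image data")
verify_message_block(key, blk)       # True
```

    Confidentiality would be layered on top by encrypting the Message Data with the selected algorithm before the MAC is applied.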

  7. An open source toolkit for medical imaging de-identification.

    PubMed

    González, David Rodríguez; Carpenter, Trevor; van Hemert, Jano I; Wardlaw, Joanna

    2010-08-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users.
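    A rule-driven de-identification pass of the kind described can be sketched over a plain dictionary rather than a real DICOM header (the tag lists and the tracking attribute below are hypothetical, not the toolkit's actual configuration):

```python
# Hypothetical policy: tags to strip and tags that may be retained.
REMOVE = {"PatientName", "PatientBirthDate", "PatientAddress"}
KEEP = {"Modality", "StudyDate"}

def deidentify(header, trial_id):
    """Strip identifying tags, retain whitelisted ones, and add tracking
    information, as the toolkit does for multicentre trial deployments."""
    out = {tag: value for tag, value in header.items()
           if tag in KEEP and tag not in REMOVE}
    out["ClinicalTrialSubjectID"] = trial_id     # tracking information
    return out

hdr = {"PatientName": "DOE^JANE", "Modality": "MR", "StudyDate": "20100801"}
deidentify(hdr, "TRIAL-042")
```

    The point of making the policy data-driven is exactly the flexibility the abstract emphasizes: each trial supplies its own REMOVE/KEEP lists instead of hard-coding one anonymisation scheme.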

  8. DICOM Standard Conformance in Veterinary Medicine in Germany: a Survey of Imaging Studies in Referral Cases.

    PubMed

    Brühschwein, Andreas; Klever, Julius; Wilkinson, Tom; Meyer-Lindenberg, Andrea

    2018-02-01

    In 2016, the recommendations of the DICOM Standards Committee for the use of veterinary identification DICOM tags had its 10th anniversary. The goal of our study was to survey veterinary DICOM standard conformance in Germany regarding the specific identification tags veterinarians should use in veterinary diagnostic imaging. We hypothesized that most veterinarians in Germany do not follow the guidelines of the DICOM Standards Committee. We analyzed the metadata of 488 imaging studies of referral cases from 115 different veterinary institutions in Germany by computer-aided DICOM header readout. We found that 25 (5.1%) of the imaging studies fully complied with the "veterinary DICOM standard" in this survey. The results confirmed our hypothesis that the recommendations of the DICOM Standards Committee for the consistent and advantageous use of veterinary identification tags have found minimal acceptance amongst German veterinarians. DICOM does not only enable connectivity between machines, DICOM also improves communication between veterinarians by sharing correct and valuable metadata for better patient care. Therefore, we recommend that lecturers, universities, societies, authorities, vendors, and other stakeholders should increase their effort to improve the spread of the veterinary DICOM standard in the veterinary world.

  9. You Can See Film through Digital: A Report from Where the Archiving of Motion Picture Film Stands

    NASA Astrophysics Data System (ADS)

    Tochigi, Akira

    In recent years, digital technology has brought drastic change to the archiving of motion picture film. By collecting digital media as well as film, many conventional film archives have transformed themselves into moving image archives or audiovisual archives. As well, digital technology has expanded the possibility of the restoration of motion picture film in comparison with conventional photochemical (analog) restoration. This paper first redefines some fundamental terms regarding the archiving of motion picture film and discusses the conditions which need consideration for film archiving in Japan. With a few examples of the recent restoration projects conducted by National Film Center of the National Museum of Modern Art, Tokyo, this paper then clarifies new challenges inherent in digital restoration and urges the importance of better appreciation of motion picture film.

  10. Direct cooled power electronics substrate

    DOEpatents

    Wiles, Randy H [Powell, TN; Wereszczak, Andrew A [Oak Ridge, TN; Ayers, Curtis W [Kingston, TN; Lowe, Kirk T [Knoxville, TN

    2010-09-14

    The disclosure describes directly cooling a three-dimensional, direct metallization (DM) layer in a power electronics device. To enable sufficient cooling, coolant flow channels are formed within the ceramic substrate. The direct metallization layer (typically copper) may be bonded to the ceramic substrate, and semiconductor chips (such as IGBT and diodes) may be soldered or sintered onto the direct metallization layer to form a power electronics module. Multiple modules may be attached to cooling headers that provide in-flow and out-flow of coolant through the channels in the ceramic substrate. The modules and cooling header assembly are preferably sized to fit inside the core of a toroidal shaped capacitor.

  11. DTS Raw Data Guelph, ON Canada

    DOE Data Explorer

    Thomas Coleman

    2013-07-31

    Unprocessed active distributed temperature sensing (DTS) data from 3 boreholes in the Guelph, ON Canada region. Data from borehole 1 was collected during a fluid injection while data from boreholes 2 and 3 were collected under natural gradient conditions in a lined borehole. The column labels/headers (in the first row) define the time since start of measurement in seconds and the row labels/headers (in the first column) are the object IDs that are defined in the metadata. Each object ID is a sampling location whose exact location is defined in the metadata file. Data in each cell are temperature in Celsius at time and sampling location as defined above.
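    The row/column layout described above can be read with nothing more than a CSV parser. The sketch below uses made-up sample values and a hypothetical first-column label ("id"); only the layout (time headers across the first row, object IDs down the first column, temperatures in Celsius in the cells) is taken from the dataset description.

```python
import csv
import io

# Illustrative parse of the described layout: first row = elapsed seconds,
# first column = object IDs, cells = temperature in degrees Celsius.
# All values below are invented for demonstration.
raw = io.StringIO(
    "id,0,30,60\n"
    "OBJ001,12.1,12.4,13.0\n"
    "OBJ002,11.8,11.9,12.2\n"
)

rows = list(csv.reader(raw))
times = [float(t) for t in rows[0][1:]]                       # seconds since start
temps = {r[0]: [float(v) for v in r[1:]] for r in rows[1:]}   # object ID -> series

# Temperature at sampling location OBJ001, 60 s after the start of measurement:
print(temps["OBJ001"][times.index(60.0)])  # -> 13.0
```

    The object IDs would then be resolved to physical sampling locations via the accompanying metadata file.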

  12. FUEL SUBASSEMBLY CONSTRUCTION FOR RADIAL FLOW IN A NUCLEAR REACTOR

    DOEpatents

    Treshow, M.

    1962-12-25

    An assembly of fuel elements for a boiling water reactor arranged for radial flow of the coolant is described. The ingress for the coolant is through a central header tube, perforated with parallel circumferential rows of openings each having a lip to direct the coolant flow downward. Around the central tube there are a number of equally spaced concentric trays, closely fitting the central header tube. Cylindrical fuel elements are placed in a regular pattern around the central tube, piercing the trays. A larger tube encloses the arrangement, with space provided for upward flow of coolant beyond the edge of the trays. (AEC)

  13. Stack configurations for tubular solid oxide fuel cells

    DOEpatents

    Armstrong, Timothy R.; Trammell, Michael P.; Marasco, Joseph A.

    2010-08-31

    A fuel cell unit includes an array of solid oxide fuel cell tubes having porous metallic exterior surfaces, interior fuel cell layers, and interior surfaces, each of the tubes having at least one open end; and, at least one header in operable communication with the array of solid oxide fuel cell tubes for directing a first reactive gas into contact with the porous metallic exterior surfaces and for directing a second reactive gas into contact with the interior surfaces, the header further including at least one busbar disposed in electrical contact with at least one surface selected from the group consisting of the porous metallic exterior surfaces and the interior surfaces.

  14. Clinical applications of an ATM/Ethernet network in departments of neuroradiology and radiotherapy.

    PubMed

    Cimino, C; Pizzi, R; Fusca, M; Bruzzone, M G; Casolino, D; Sicurello, F

    1997-01-01

    An integrated system for the multimedia management of images and clinical information has been developed at the Istituto Nazionale Neurologico C. Besta in Milan. The Institute's physicians have the daily need of consulting images coming from various modalities. The high volume of archived material and the need to retrieve and display new and past images and clinical information have motivated the development of a Picture Archiving and Communication System (PACS) for the automatic management of images and clinical data, related not only to the Radiology Department, but also to the Radiotherapy Department for 3D virtual simulation, to remote teleconsulting, and eventually to all the wards, outpatient clinics, and laboratories.

  15. The Cancer Digital Slide Archive - TCGA

    Cancer.gov

    Dr. David Gutman and Dr. Lee Cooper developed The Cancer Digital Slide Archive (CDSA), a web platform for accessing pathology slide images of TCGA samples. Find out how they did it and how to use the CDSA website in this Case Study.

  16. Current status of the joint Mayo Clinic-IBM PACS project

    NASA Astrophysics Data System (ADS)

    Hangiandreou, Nicholas J.; Williamson, Byrn, Jr.; Gehring, Dale G.; Persons, Kenneth R.; Reardon, Frank J.; Salutz, James R.; Felmlee, Joel P.; Loewen, M. D.; Forbes, Glenn S.

    1994-05-01

    A multi-phase collaboration between Mayo Clinic and IBM-Rochester was undertaken, with the goal of developing a picture archiving and communication system for routine clinical use in the Radiology Department. The initial phase of this project (phase 0) was started in 1988. The current system has been fully integrated into the clinical practice and, to date, over 6.5 million images from 16 imaging modalities have been archived. Phase 3 of this project has recently concluded.

  17. Cost-effective data storage/archival subsystem for functional PACS

    NASA Astrophysics Data System (ADS)

    Chen, Y. P.; Kim, Yongmin

    1993-09-01

    Not the least of the requirements of a workable PACS is the ability to store and archive vast amounts of information. A medium-size hospital will generate between 1 and 2 TBytes of data annually on a fully functional PACS. A high-speed image transmission network coupled with a comparably high-speed central data storage unit can make local memory and magnetic disks in the PACS workstations less critical and, in an extreme case, unnecessary. Under these circumstances, the capacity and performance of the central data storage subsystem and database are critical in determining the response time at the workstations, thus significantly affecting clinical acceptability. The central data storage subsystem not only needs to provide sufficient capacity to store about ten days' worth of images (five days' worth of new studies and, on the average, about one comparison study for each new study), but must also supply images to the requesting workstation in a timely fashion. The database must provide fast retrieval responses upon users' requests for images. This paper analyzes the advantages and disadvantages of multiple parallel transfer disks versus RAID disks for the short-term central data storage subsystem, as well as an optical disk jukebox versus a digital tape recorder subsystem for the long-term archive. Furthermore, an example high-performance, cost-effective storage subsystem that integrates RAID disks and a high-speed digital tape subsystem as a PACS data storage/archival unit is presented.

  18. Hierarchical storage of large volumes of multidetector CT data using distributed servers

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris; Bandon, David

    2006-03-01

    Multidetector scanners and hybrid multimodality scanners can generate large numbers of high-resolution images resulting in very large datasets. In most cases, these datasets are generated for the sole purpose of producing secondary processed images and 3D rendered images as well as oblique and curved multiplanar reformatted images. It is therefore not essential to archive the original images after they have been processed. We have developed an architecture of distributed archive servers for temporary storage of large image datasets for 3D rendering and image processing without the need for long-term storage in the PACS archive. With the relatively low cost of storage devices it is possible to configure these servers to hold several months or even years of data, long enough to allow subsequent re-processing if required by specific clinical situations. We tested the latest generation of RAID servers provided by Apple with a capacity of 5 TBytes. We implemented peer-to-peer data access software based on our open-source image management software OsiriX, allowing remote workstations to directly access DICOM image files located on the server through a technology called "Bonjour". This architecture offers seamless integration of multiple servers and workstations without the need for a central database or complex workflow management tools. It allows efficient access to image data from multiple workstations for image analysis and visualization without the need for image data transfer. It provides a convenient alternative to a centralized PACS architecture while avoiding complex and time-consuming data transfer and storage.

  19. Archive of post-Hurricane Isabel coastal oblique aerial photographs collected during U.S. Geological Survey Field Activity 03CCH01 from Ocean City, Maryland, to Fort Caswell, North Carolina and Inland from Waynesboro to Redwood, Virginia, September 21 - 23, 2003

    USGS Publications Warehouse

    Subino, Janice A.; Morgan, Karen L.M.; Krohn, M. Dennis; Dadisman, Shawn V.

    2013-01-01

    On September 21 - 23, 2003, the United States Geological Survey (USGS) conducted an oblique aerial photographic survey along the Atlantic coast from Ocean City, Md., to Fort Caswell, N.C., and inland oblique aerial photographic survey from Waynesboro to Redwood, Va., aboard a Navajo Piper twin-engine airplane. The coastal survey was conducted at an altitude of 500 feet (ft) and approximately 1,000 ft offshore. For the inland photos, the aircraft tried to stay approximately 500 ft above the terrain. These coastal photos were used to document coastal changes like beach erosion and overwash caused by Hurricane Isabel, while the inland photos looked for potential landslides caused by heavy rains. The photos may also be used as baseline data for future coastal change analysis. The USGS and the National Aeronautics and Space Administration (NASA) surveyed the impact zone of Hurricane Isabel to better understand the changes in vulnerability of the Nation’s coasts to extreme storms (Morgan, 2009). This report serves as an archive of photographs collected during the September 21 - 23, 2003, post-Hurricane Isabel coastal and inland oblique aerial survey along with associated survey maps, KML files, navigation files, digital Field Activity Collection System (FACS) logs, and Federal Geographic Data Committee (FGDC) metadata. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03CCH01 tells us the data were collected in 2003 for the Coastal Change Hazards (CCH) study and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the ID number. 
The photographs provided here are Joint Photographic Experts Group (JPEG) scanned images of the analog 35 millimeter (mm) color positive slides. The photograph locations are estimates of the location of the plane (see the Navigation page). The metadata values for photo creation time, GPS latitude, GPS longitude, GPS position (latitude and longitude), keywords, credit, artist, caption, copyright, and contact were added to each photograph's EXIF header using EXIFtool (Subino and others, 2012). Photographs can be opened directly with any JPEG-compatible image viewer by clicking on a thumbnail on the contact sheet, or, when viewing the Google Earth KML file, by clicking on the marker and then clicking on either the thumbnail or the link below the thumbnail. Nathaniel Plant (USGS - St. Petersburg, Fla.), and Ann Marie Ascough (formerly contracted at the USGS - St. Petersburg, Fla.) helped with the creation of KML files. To view the photos and survey maps, proceed to the Photos and Maps page.

  20. Managing an archive of weather satellite images

    NASA Technical Reports Server (NTRS)

    Seaman, R. L.

    1992-01-01

    The author's experiences in building and maintaining an archive of hourly weather satellite pictures at NOAO are described. This archive has proven very popular with visiting and staff astronomers, especially on windy days and cloudy nights. Given access to a source of such pictures, a suite of simple shell and IRAF CL scripts can provide a great deal of robust functionality with little effort. These pictures and associated data products such as surface analysis (radar) maps and National Weather Service forecasts are updated hourly at anonymous ftp sites on the Internet, although your local Atmospheric Sciences Department may prove to be a more reliable source. The raw image formats are unfamiliar to most astronomers, but reading them into IRAF is straightforward. Techniques for performing this format conversion at the host computer level are described which may prove useful for other chores. Pointers are given to sources of data and of software, including a package of example tools. These tools include shell and Perl scripts for downloading pictures, maps, and forecasts, as well as IRAF scripts and host-level programs for translating the images into IRAF and GIF formats and for slicing & dicing the resulting images. Hints for displaying the images and for making hardcopies are given.

  1. From Metric Image Archives to Point Cloud Reconstruction: Case Study of the Great Mosque of Aleppo in Syria

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, P.; Khalil, O. Al

    2017-08-01

    The paper presents photogrammetric archives from Aleppo (Syria), collected between 1999 and 2002 by the Committee for maintenance and restoration of the Great Mosque in partnership with the Engineering Unit of the University of Aleppo. During that period, terrestrial photogrammetric data and geodetic surveys of the Great Omayyad mosque were recorded for documentation purposes and geotechnical studies. During the recent war in Syria, the Mosque has unfortunately been seriously damaged and its minaret has been completely destroyed. The paper presents a summary of the documentation available from the past projects as well as solutions of 3D reconstruction based on the processing of the photogrammetric archives with the latest 3D image-based techniques.

  2. HST archive primer, version 4.1

    NASA Technical Reports Server (NTRS)

    Fruchter, A. (Editor); Baum, S. (Editor)

    1994-01-01

    This version of the HST Archive Primer provides the basic information a user needs to know to access the HST archive via StarView, the new user interface to the archive. Using StarView, users can search for observations of interest, find calibration reference files, and retrieve data from the archive. Both the terminal version of StarView and the X-windows version feature a name resolver which simplifies searches of the HST archive based on target name. In addition, the X-windows version of StarView allows preview of all public HST data; compressed versions of public images are displayed via SAOIMAGE, while spectra are plotted using the public plotting package, XMGR. Finally, the version of StarView described here features screens designed for observers preparing Cycle 5 HST proposals.

  3. ISMRM Raw data format: A proposed standard for MRI raw datasets.

    PubMed

    Inati, Souheil J; Naegele, Joseph D; Zwart, Nicholas R; Roopchansingh, Vinai; Lizak, Martin J; Hansen, David C; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E; Sørensen, Thomas S; Hansen, Michael S

    2017-01-01

    This work proposes the ISMRM Raw Data format as a common MR raw data format, which promotes algorithm and data sharing. A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using magnetic resonance imaging scanners from four vendors, converted to ISMRM Raw Data format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. The proposed raw data format solves a practical problem for the magnetic resonance imaging community. It may serve as a foundation for reproducible research and collaborations. The ISMRM Raw Data format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. Magn Reson Med 77:411-421, 2017. © 2016 Wiley Periodicals, Inc.
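    The "tagged frames of k-space data" idea can be illustrated with a toy binary layout. This sketch is not the actual ISMRM Raw Data format (which defines its own acquisition headers and containers); it only demonstrates the general pattern of a small fixed tag followed by a variable-length payload of complex samples.

```python
import struct

# Toy tagged-frame layout (NOT the real ISMRMRD encoding): each frame is a
# little-endian tag (scan counter, number of samples) followed by the complex
# k-space samples stored as float32 (real, imag) pairs.
def pack_frame(counter, samples):
    payload = b"".join(struct.pack("<ff", s.real, s.imag) for s in samples)
    return struct.pack("<II", counter, len(samples)) + payload

def unpack_frame(buf):
    counter, n = struct.unpack_from("<II", buf, 0)
    vals = struct.unpack_from("<%df" % (2 * n), buf, 8)
    samples = [complex(vals[i], vals[i + 1]) for i in range(0, 2 * n, 2)]
    return counter, samples

frame = pack_frame(7, [1 + 2j, 3 - 4j])
print(unpack_frame(frame))  # -> (7, [(1+2j), (3-4j)])
```

    Because every frame carries its own tag, readers can skip or filter acquisitions without understanding vendor-specific details, which is the interoperability property the format is designed for.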

  4. Ultrasound Picture Archiving And Communication Systems

    NASA Astrophysics Data System (ADS)

    Koestner, Ken; Hottinger, C. F.

    1982-01-01

    The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. Various ultrasonic imaging modalities are quite different in data volume and speed requirements. Static imaging, for example B-Scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage. Data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. In order to transmit and save this data at a real-time rate requires a transfer rate of 8.6 Megabaud. A real-time archiving system would be complicated by the necessity of specialized hardware to interpolate between scan lines and perform desirable greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
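    The transfer-rate figures quoted above follow directly from the stated image dimensions; the arithmetic can be checked in a few lines:

```python
# Static B-scan: 512 x 512 points, 6 bits deep, over a 9600-baud serial line.
bits_static = 512 * 512 * 6
t_serial = bits_static / 9600
print(round(t_serial))     # about 164 s, i.e. roughly three minutes

# Real-time: 100 scan lines x 480 points x 6 bits, at 30 frames per second.
rate = 100 * 480 * 6 * 30
print(rate / 1e6)          # 8.64 Mbit/s, the "8.6 Megabaud" in the text
```

    The 256K-byte storage figure likewise corresponds to 512 x 512 points at one byte per point before any data compression.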

  5. One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge

    1987-10-01

    A method for converting paper-written electrocardiograms to one dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the back-ground noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing, and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.
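    Steps (2) and (3) of the process above can be sketched with pure Python: threshold the grayscale scan into a binary image, then recover a 1-D signal by tracing the topmost ink pixel in each column. The tiny array and threshold value below are invented for illustration; the paper's actual histogram-based filtering is more involved.

```python
# Hypothetical 5x6 grayscale scan: 0 = white paper, larger = darker ink.
gray = [
    [0,   0, 200,   0,   0,   0],
    [0, 210,   0, 190,   0,   0],
    [180, 0,   0,   0, 200,   0],
    [0,   0,   0,   0,   0, 220],
    [0,   0,   0,   0,   0,   0],
]

THRESHOLD = 128  # assumed fixed threshold; the paper uses histogram filtering
binary = [[1 if px > THRESHOLD else 0 for px in row] for row in gray]

def trace(binary):
    """Per column, return the row index of the topmost ink pixel (the 1-D ECG signal)."""
    signal = []
    for c in range(len(binary[0])):
        rows = [r for r in range(len(binary)) if binary[r][c]]
        signal.append(rows[0] if rows else None)
    return signal

print(trace(binary))  # -> [2, 1, 0, 1, 2, 3]
```

    The resulting list of row indices per column is the compact 1-D waveform that would then be written to floppy disk for archival.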

  6. Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Hoffmann, Samantha L.; Avila, Roberto J.

    2017-06-01

    The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.

  7. Stewardship of very large digital data archives

    NASA Technical Reports Server (NTRS)

    Savage, Patric

    1991-01-01

    An archive is a permanent store. There are relatively few very large digital data archives in existence. Most business records are expired within five or ten years. Many kinds of business records that do have long lives are embedded in databases that are continually updated and re-issued cyclically. Also, many permanent business records are actually archived as microfilm, fiche, or optical disk images, their digital version being an operational convenience rather than an archive. The problems foreseen in stewarding the very large digital data archives that will accumulate during the mission of the Earth Observing System (EOS) are addressed. The paper focuses on the function of shepherding archived digital data into an endless future. Stewardship entails storing and protecting the archive and providing meaningful service to the community of users. The steward will (1) provide against loss due to physical phenomena; (2) assure that data is not lost due to storage technology obsolescence; and (3) maintain data in a current formatting methodology.

  8. A review of images of nurses and smoking on the World Wide Web.

    PubMed

    Sarna, Linda; Bialous, Stella Aguinaga

    2012-01-01

    With the advent of the World Wide Web, historic images previously having limited distributions are now widely available. As tobacco use has evolved, so have images of nurses related to smoking. Using a systematic search, the purpose of this article is to describe types of images of nurses and smoking available on the World Wide Web. Approximately 10,000 images of nurses and smoking published over the past century were identified through search engines and digital archives. Seven major themes were identified: nurses smoking, cigarette advertisements, helping patients smoke, "naughty" nurse, teaching women to smoke, smoking in and outside of health care facilities, and antitobacco images. The use of nursing images to market cigarettes was known but the extent of the use of these images has not been reported previously. Digital archives can be used to explore the past, provide a perspective for understanding the present, and suggest directions for the future in confronting negative images of nursing. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Mass-storage management for distributed image/video archives

    NASA Astrophysics Data System (ADS)

    Franchi, Santina; Guarda, Roberto; Prampolini, Franco

    1993-04-01

    The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue was addressed in the project of the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog the image/video coding technique with its related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow cataloging devices and modifying device status and device network location. The medium level manages image/video files on a physical basis; it handles file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move, and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
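    The three function levels described above can be sketched as a small manager object. Device names, tiers, and the selection predicate below are invented for illustration; only the layering (catalog devices, migrate files physically, archive by query) comes from the abstract.

```python
# Sketch of the three-level storage management idea: lower level catalogs
# devices, medium level migrates files between tiers, upper level archives
# files selected by a user-defined query. All names here are hypothetical.

class StorageManager:
    def __init__(self):
        self.devices = {}  # device name -> {"tier": ..., "files": set()}

    # lower level: catalog a device and its tier
    def add_device(self, name, tier):
        self.devices[name] = {"tier": tier, "files": set()}

    # medium level: physical migration of one file between devices
    def migrate(self, filename, src, dst):
        self.devices[src]["files"].remove(filename)
        self.devices[dst]["files"].add(filename)

    # upper level: archive all files on src matching a user-defined query
    def archive(self, predicate, src, dst):
        for f in [f for f in self.devices[src]["files"] if predicate(f)]:
            self.migrate(f, src, dst)

mgr = StorageManager()
mgr.add_device("fast-disk", "low access time")
mgr.add_device("jukebox", "high capacity")
mgr.devices["fast-disk"]["files"].update({"scan01.img", "clip02.vid"})
mgr.archive(lambda f: f.endswith(".vid"), "fast-disk", "jukebox")
print(sorted(mgr.devices["jukebox"]["files"]))  # -> ['clip02.vid']
```

    In the real system the predicate would be a database query, and the migration policy would weigh delivery/visualization requirements against archiving cost.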

  10. Modular heat exchanger

    DOEpatents

    Culver, Donald W.

    1978-01-01

    A heat exchanger for use in nuclear reactors includes a heat exchange tube bundle formed from similar modules each having a hexagonal shroud containing a large number of thermally conductive tubes which are connected with inlet and outlet headers at opposite ends of each module, the respective headers being adapted for interconnection with suitable inlet and outlet manifold means. In order to adapt the heat exchanger for operation in a high temperature and high pressure environment and to provide access to all tube ports at opposite ends of the tube bundle, a spherical tube sheet is arranged in sealed relation across the chamber with an elongated duct extending outwardly therefrom to provide manifold means for interconnection with the opposite end of the tube bundle.

  11. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.

  12. Use of film digitizers to assist radiology image management

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Staab, Edward V.

    1996-05-01

    The purpose of this development effort was to evaluate the possibility of using digital technologies to solve image management problems in the Department of Radiology at the University of Florida. The three problem areas investigated were local interpretation of images produced in remote locations, distribution of images to areas outside of radiology, and film handling. In all cases the use of a laser film digitizer interfaced to an existing Picture Archiving and Communication System (PACS) was investigated as a solution to the problem. In each case the volume of studies involved was evaluated to estimate the impact of the solution on the network, archive, and workstations. Communications were stressed in the analysis of the needs for all image transmission. The operational aspects of the solution were examined to determine the needs for training, service, and maintenance. The remote sites requiring local interpretation included a rural hospital needing coverage for after-hours studies, the University of Florida student infirmary, and the emergency room. Distribution of images to the intensive care units was studied to improve image access and patient care. Handling of films originating from remote sites and those requiring urgent reporting was evaluated to improve management functions. The results of our analysis and the decisions that were made based on the analysis are described below. In the cases where systems were installed, a description of the system and its integration into the PACS system is included. For all three problem areas, although we could move images via a digitizer to the archive and a workstation, there was no way to inform the radiologist that a study needed attention. In the case of outside films, the patient did not always have a medical record number that matched one in our Radiology Information System (RIS). In order to incorporate all studies for a patient, we needed common locations for orders, reports, and images. 
RIS orders were generated for each outside study to be interpreted and a medical record number assigned if none existed. All digitized outside films were archived in the PACS archive for later review or comparison use. The request generated by the RIS requesting a diagnostic interpretation was placed at the PACS workstation to alert the radiologists that unread images had arrived and a box was added to the workstation user interface that could be checked by the radiologist to indicate that a report had been dictated. The digitizer system solved several problems, unavailable films in the emergency room, teleradiology, and archiving of outside studies that had been read by University of Florida radiologists. In addition to saving time for outside film management, we now store the studies for comparison purposes, no longer lose emergency room films, generate diagnostic reports on emergency room films in a timely manner (important for billing and reimbursement), and can handle the distributed nature of our business. As changes in health care drive management changes, existing tools can be used in new ways to help make the transition easier. In this case, adding digitizers to an existing PACS network helped solve several image management problems.

  13. Automatic machine learning based prediction of cardiovascular events in lung cancer screening data

    NASA Astrophysics Data System (ADS)

    de Vos, Bob D.; de Jong, Pim A.; Wolterink, Jelmer M.; Vliegenthart, Rozemarijn; Wielingen, Geoffrey V. F.; Viergever, Max A.; Išgum, Ivana

    2015-03-01

    Calcium burden determined in CT images acquired in lung cancer screening is a strong predictor of cardiovascular events (CVEs). This study investigated whether subjects undergoing such screening who are at risk of a CVE can be identified using automatic image analysis and subject characteristics. Moreover, the study examined whether these individuals can be identified using solely image information, or if a combination of image and subject data is needed. A set of 3559 male subjects undergoing the Dutch-Belgian lung cancer screening trial was included. Low-dose non-ECG-synchronized chest CT images acquired at baseline were analyzed (1834 scanned at the University Medical Center Groningen, 1725 at the University Medical Center Utrecht). Aortic and coronary calcifications were identified using previously developed automatic algorithms. A set of features describing the number, volume, and size distribution of the detected calcifications was computed. Age of the participants was extracted from the image headers. Features describing participants' smoking status, smoking history, and past CVEs were obtained. CVEs that occurred within three years after the imaging were used as outcome. Support vector machine classification was performed employing different feature sets: sets of only image features, or a combination of image and subject-related characteristics. Classification based solely on the image features resulted in an area under the ROC curve (Az) of 0.69. A combination of image and subject features resulted in an Az of 0.71. The results demonstrate that subjects undergoing lung cancer screening who are at risk of CVE can be identified using automatic image analysis. Adding subject information slightly improved the performance.

  14. Architecture for a PACS primary diagnosis workstation

    NASA Astrophysics Data System (ADS)

    Shastri, Kaushal; Moran, Byron

    1990-08-01

A major factor in determining the overall utility of a medical Picture Archiving and Communications (PACS) system is the functionality of the diagnostic workstation. Meyer-Ebrecht and Wendler [1] have proposed a modular picture computer architecture with high throughput, and Perry et al. [2] have defined performance requirements for radiology workstations. In order to be clinically useful, a primary diagnosis workstation must not only provide the functions of current viewing systems (e.g. mechanical alternators [3,4]), such as acceptable image quality, simultaneous viewing of multiple images, and rapid switching of image banks, but must also provide a diagnostic advantage over the current systems. This includes window-level functions on any image, simultaneous display of multi-modality images, rapid image manipulation, image processing, dynamic image display (cine), electronic image archival, hardcopy generation, image acquisition, network support, and an easy-to-use interface. Implementation of such a workstation requires an underlying hardware architecture which provides high-speed image transfer channels, local storage facilities, and image processing functions. This paper describes the hardware architecture of the Siemens Diagnostic Reporting Console (DRC), which meets these requirements.

  15. The public cancer radiology imaging collections of The Cancer Imaging Archive.

    PubMed

    Prior, Fred; Smith, Kirk; Sharma, Ashish; Kirby, Justin; Tarbox, Lawrence; Clark, Ken; Bennett, William; Nolan, Tracy; Freymann, John

    2017-09-19

    The Cancer Imaging Archive (TCIA) is the U.S. National Cancer Institute's repository for cancer imaging and related information. TCIA contains 30.9 million radiology images representing data collected from approximately 37,568 subjects. This data is organized into collections by tumor-type with many collections also including analytic results or clinical data. TCIA staff carefully de-identify and curate all incoming collections prior to making the information available via web browser or programmatic interfaces. Each published collection within TCIA is assigned a Digital Object Identifier that references the collection. Additionally, researchers who use TCIA data may publish the subset of information used in their analysis by requesting a TCIA generated Digital Object Identifier. This data descriptor is a review of a selected subset of existing publicly available TCIA collections. It outlines the curation and publication methods employed by TCIA and makes available 15 collections of cancer imaging data.

  16. EIR: enterprise imaging repository, an alternative imaging archiving and communication system.

    PubMed

    Bian, Jiang; Topaloglu, Umit; Lane, Cheryl

    2009-01-01

The enormous number of studies performed at the Nuclear Medicine Department of the University of Arkansas for Medical Sciences (UAMS) generates a huge volume of PET/CT images daily. A DICOM workstation had been used as a "mini-PACS" to route all studies, an arrangement that historically proved slow for various reasons. However, replacing the workstation with a commercial PACS server is not only cost-inefficient; more often than not, PACS vendors are reluctant to take responsibility for the final integration of these components. Therefore, in this paper, we propose an alternative imaging archiving and communication system called the Enterprise Imaging Repository (EIR). EIR consists of two distinct components: an image-processing daemon and a user-friendly web interface. EIR not only reduces the overall waiting time for transferring a study from the modalities to radiologists' workstations, but also provides a preferable presentation.

  17. TCIApathfinder: an R client for The Cancer Imaging Archive REST API.

    PubMed

    Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis

    2018-06-05

    The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder. Copyright ©2018, American Association for Cancer Research.
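The same REST API can be called from any language with an HTTP client. As a sketch, here is how such a query URL might be assembled in Python; the base URL, endpoint name and parameter names below are assumptions for illustration and should be checked against the current TCIA API documentation:

```python
from urllib.parse import urlencode

# Assumed base URL for a TCIA-style REST query service (illustrative only).
BASE = "https://services.cancerimagingarchive.net/services/v4/TCIA/query"

def build_query(endpoint, **params):
    """Assemble a REST query URL; JSON output is requested by default."""
    params.setdefault("format", "json")
    return f"{BASE}/{endpoint}?{urlencode(params)}"

# Hypothetical call: list image series for one collection.
url = build_query("getSeries", Collection="TCGA-GBM")
# The URL would then be fetched with any HTTP client and the JSON parsed.
```

Wrapping this pattern in per-endpoint functions is essentially what TCIApathfinder does for R users.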

  18. The SSABLE system - Automated archive, catalog, browse and distribution of satellite data in near-real time

    NASA Technical Reports Server (NTRS)

    Simpson, James J.; Harkins, Daniel N.

    1993-01-01

Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, order, and distribution of satellite data based upon the X Window System, high-bandwidth networks, and digital image rendering techniques. SSABLE automatically constructs relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near-real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) has a bitmapped display (monochrome or greater); 2) is running the X Window System; and 3) is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia are typically 20-25 s.

  19. Integration Of An MR Image Network Into A Clinical PACS

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.

    1988-06-01

A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network links the MR scanner, a remote viewing station and a central archiving station. The pediatric PACS connects directly to the archiving unit through an Ethernet TCP/IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database over the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware were moved to the clinical pediatric VAX for full PACS integration. In parallel with the direct transmission of digital images to the pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to monitor patient positioning directly and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.

  20. Document image archive transfer from DOS to UNIX

    NASA Technical Reports Server (NTRS)

    Hauser, Susan E.; Gill, Michael J.; Thoma, George R.

    1994-01-01

    An R&D division of the National Library of Medicine has developed a prototype system for automated document image delivery as an adjunct to the labor-intensive manual interlibrary loan service of the library. The document image archive is implemented by a PC controlled bank of optical disk drives which use 12 inch WORM platters containing bitmapped images of over 200,000 pages of medical journals. Following three years of routine operation which resulted in serving patrons with articles both by mail and fax, an effort is underway to relocate the storage environment from the DOS-based system to a UNIX-based jukebox whose magneto-optical erasable 5 1/4 inch platters hold the images. This paper describes the deficiencies of the current storage system, the design issues of modifying several modules in the system, the alternatives proposed and the tradeoffs involved.

  1. Fiber Optic Communication System For Medical Images

    NASA Astrophysics Data System (ADS)

    Arenson, Ronald L.; Morton, Dan E.; London, Jack W.

    1982-01-01

This paper discusses a fiber optic communication system linking ultrasound devices, computerized tomography scanners, a nuclear medicine computer system, and a digital fluorographic system to a central radiology research computer. These centrally archived images are available for near-instantaneous recall at various display consoles. When a suitable laser optical disk becomes available for mass storage, more extensive image archiving will be added to the network, including digitized images of standard radiographs for comparison purposes and for remote display in such areas as the intensive care units, the operating room, and selected outpatient departments. This fiber optic system allows transfer of high-resolution images in less than a second over distances exceeding 2,000 feet. The advantages of using fiber optic cables instead of typical parallel or serial communication techniques will be described. The switching methodology and communication protocols will also be discussed.

  2. Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Knowlton, Robert C.; Hoo, Kent S.; Huang, H. K.

    1995-05-01

Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to localize physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques probes a different biomedical aspect of the brain, but an effective means to quantify and correlate the disparate imaging datasets, in order to improve clinical decision making, has been lacking. This paper describes several techniques developed for a UNIX-based neurodiagnostic workstation to aid the noninvasive presurgical evaluation of epilepsy patients. These techniques include online access to the picture archiving and communication system (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantitation of structural and functional information contained in the registered images. For illustration, we describe the use of these techniques in a patient case of nonlesional neocortical epilepsy. We also present our future work based on preliminary studies.

  3. Contrast in Terahertz Images of Archival Documents—Part I: Influence of the Optical Parameters from the Ink and Support

    NASA Astrophysics Data System (ADS)

    Bardon, Tiphaine; May, Robert K.; Jackson, J. Bianca; Beentjes, Gabriëlle; de Bruin, Gerrit; Taday, Philip F.; Strlič, Matija

    2017-04-01

This study aims to objectively inform curators when terahertz time-domain (TD) imaging set up in reflection mode is likely to give well-contrasted images of inscriptions in a complex archival document, making it a useful non-invasive alternative to current digitisation processes. To this end, the dispersive refractive indices and absorption coefficients of various archival materials are assessed, and their influence on contrast in terahertz images of historical documents is explored. Sepia ink and inks produced with bistre or verdigris mixed with a solution of gum arabic or rabbit skin glue are unlikely to lead to well-contrasted images. However, dispersions of bone black, ivory black, iron gall ink, malachite, lapis lazuli, minium and vermilion are likely to lead to well-contrasted images. Inscriptions written with lamp black, carbon black and graphite give the best imaging results. The characteristic spectral signatures of iron gall ink, minium and vermilion pellets between 5 and 100 cm-1 relate to a ringing effect at late collection times in TD waveforms transmitted through these pellets. The same ringing effect can be probed in waveforms reflected from iron gall, minium and vermilion ink deposits at the surface of a document. Since the TD waveforms collected for each scanning pixel can be Fourier-transformed into spectral information, terahertz TD imaging in reflection mode can serve as a hyperspectral imaging tool. However, chemical recognition and mapping of the ink is currently limited by the fact that the morphology of the document influences its terahertz spectral response more strongly than the resonant behaviour of the ink does.

  4. Digital management and regulatory submission of medical images from clinical trials: role and benefits of the core laboratory

    NASA Astrophysics Data System (ADS)

    Robbins, William L.; Conklin, James J.

    1995-10-01

    Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.

  5. The SmartGeo Portal: A retrospective

    NASA Astrophysics Data System (ADS)

    Heilmann, Zeno; Satta, Guido; Bonomi, Ernesto

    2016-04-01

The SmartGeo portal was created in a follow-up project that evolved from the geophysical data imaging services of GRIDA3, a Grid computing portal for the geosciences. The aim of the project was to support commercial geotechnical service providers as well as academic researchers working in near-surface geoscience. Starting from the existing services, the SmartGeo portal was set up on new hardware, using the latest version of the grid portal environment EnginFrame. After a first working version was established, the services were reviewed, updated and complemented with new services according to the feedback we received from our partners. One partner, for instance, experienced great difficulty in a project that aimed at delineating the aquifer in order to locate water-polluting substances in an industrial area of Basel. The seismic imaging service inherited from the previous portal employed a data-driven algorithm optimized to provide a first image of the subsurface structure in near real time, directly during data acquisition. In contrast, for data from a geologically very complex and noisy urban environment, our user needed the maximum lateral resolution and noise reduction possible. For this purpose we added two cutting-edge imaging algorithms able to deliver such high-precision results by simultaneously optimizing, for every single image point, all parameters of the mathematical model, a procedure which increased the computational effort by one or two orders of magnitude, respectively. Thus, parallel computing on grid infrastructure served to maximize image resolution rather than to generate real-time results. This also proved very useful for the data of an academic partner, recorded to image the structure of a shallow sedimentary basin, where we obtained strongly improved seismic velocity information using these new algorithms. A general user request was to implement interactive data visualization tools.
To fulfill this demand we took advantage of the cloud computing framework's ability to integrate a VNC server that exports the display of any chosen application to the user's screen. Since in some cases incomplete data headers created problems for processing and visualization, we used GUI virtualization via VNC to include a powerful freeware application capable of reviewing and setting data headers in a graphical, user-friendly way. Due to the limited size of the project and its short time frame of two years, some issues could only be identified, not completely resolved. To mention just one: inherent to all shared-infrastructure approaches is users' fear of uploading their data to external hardware and of working on a system that does not provide the same level of privacy as their office workstation. We believe that optional encryption of user data and work spaces, with a key known only to the user, could help dispel these doubts. In this way, a graduated system of data transparency could be created, comprising public, shared, protected and encrypted data.

  6. Data Mining and Knowledge Discovery tools for exploiting big Earth-Observation data

    NASA Astrophysics Data System (ADS)

    Espinoza Molina, D.; Datcu, M.

    2015-04-01

The continuous increase in the size of the archives and in the variety and complexity of Earth-Observation (EO) sensors requires new methodologies and tools that allow the end-user to access a large image repository, to extract and infer knowledge about the patterns hidden in the images, to retrieve dynamically a collection of relevant images, and to support the creation of emerging applications (e.g., change detection, global monitoring, disaster and risk management, image time series, etc.). In this context, we are concerned with providing a platform for data mining and knowledge discovery of content from EO archives. The platform's goal is to implement a communication channel between Payload Ground Segments and the end-user, who receives the data content coded in an understandable format, associated with semantics, ready for immediate exploitation. It will provide the user with automated tools to explore and understand the content of highly complex image archives. The challenge lies in extracting meaningful information from, and understanding observations of, large extended areas, over long periods of time, with a broad variety of EO imaging sensors in synergy with other related measurements and data. The platform is composed of several components: 1) ingestion of EO images and related data, providing basic features for image analysis; 2) a query engine based on metadata, semantics and image content; 3) data mining and knowledge discovery tools for supporting the interpretation and understanding of image content; and 4) semantic definition of the image content via machine learning methods. All these components are integrated with and supported by a relational database management system, ensuring the integrity and consistency of terabytes of Earth-Observation data.

  7. Toward a National Computerized Database for Moving Image Materials.

    ERIC Educational Resources Information Center

    Gartenberg, Jon

    This report summarizes a project conducted by a group of catalogers from film archives devoted to nitrate preservation, which explored ways of developing a database to provide a complete film and television information service that would be available nationwide and could contain filmographic data, information on holdings in archives and…

  8. 76 FR 43960 - NARA Records Reproduction Fees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    ... transferred to NARA and maintain its fee schedule on NARA's Web site http://www.archives.gov . The proposed... document is faint or too dark, it requires additional time to obtain a readable image. In TABLE 1 below... our Web site ( http://www.archives.gov ) annually when announcing that records reproduction fees will...

  9. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  10. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  11. 36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  12. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  13. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  14. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  15. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...

  16. Web tools for large-scale 3D biological images and atlases

    PubMed Central

    2012-01-01

Background: Large-scale volumetric biomedical image data of three or more dimensions pose a significant challenge for distributed browsing and visualisation. Many images now exceed 10 GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data by delivering compressed tiled images, enabling users to browse very large volume data in the context of a standard web browser. The system provides interactive visualisation of grey-level and colour 3D images, including multiple image layers and spatial-data overlays. Results: The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D, and we have implemented a matching server to deliver the protocol along with a series of Ajax/JavaScript clients that run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135 GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provides multi-layer image views with user-controlled colour filtering and overlays. Conclusions: Interactive browsing of arbitrary sections through large biomedical image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of fast access to large image archives without requiring whole-image download or client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135 GB for a single image volume. PMID:22676296
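The core idea behind serving tiled sections can be sketched in a few lines (a toy illustration, not the IIP3D server's actual code): the client requests only the tiles covering its viewport, so the full section never has to cross the network.

```python
def tile(section, tx, ty, size):
    """Return the (tx, ty) tile of a 2D section stored as nested lists.

    Tiles are size x size; edge tiles are simply cropped, so no padding
    is ever transmitted."""
    rows = section[ty * size:(ty + 1) * size]
    return [r[tx * size:(tx + 1) * size] for r in rows]

# Toy 4x4 "section" of grey values; a viewer would request tiles on demand.
section = [[10 * y + x for x in range(4)] for y in range(4)]
top_left = tile(section, 0, 0, 2)  # [[0, 1], [10, 11]]
```

In a real server each tile would additionally be compressed (e.g. as JPEG) before delivery, which is where the bandwidth savings described above come from.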

  17. Elements of a next generation time-series ASCII data file format for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Webster, C. J.

    2015-12-01

Data in ASCII comma separated value (CSV) format are recognized as the most simple, straightforward and readable type of data present in the geosciences. Many scientific workflows developed over the years rely on data using this simple format. However, there is a need for a lightweight ASCII header format standard that is easy to create and easy to work with. Current OGC grade XML standards are complex and difficult to implement for researchers with few resources. Ideally, such a format should provide the data in CSV for easy consumption by generic applications such as spreadsheets. The format should use an existing time standard. The header should be easily human readable as well as machine parsable. The metadata format should be extendable to allow vocabularies to be adopted as they are created by external standards bodies. The creation of such a format will increase the productivity of software engineers and scientists because fewer translators and checkers would be required.
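One way such a lightweight header could look, and be parsed, is sketched below; the `#`-prefixed key/value header lines and the specific keys are invented for illustration, not part of any existing standard:

```python
import csv
import io

SAMPLE = """\
# title: station air temperature
# time_standard: ISO 8601 (UTC)
# units: degC
time,temp
2015-01-01T00:00:00Z,3.2
2015-01-01T01:00:00Z,2.9
"""

def parse(text):
    """Split '#' header lines into a metadata dict; parse the rest as CSV."""
    meta, data_lines = {}, []
    for line in text.splitlines():
        if line.startswith("#"):
            key, _, val = line[1:].partition(":")
            meta[key.strip()] = val.strip()
        else:
            data_lines.append(line)
    rows = list(csv.DictReader(io.StringIO("\n".join(data_lines))))
    return meta, rows

meta, rows = parse(SAMPLE)
```

Generic CSV consumers can recover the plain data block simply by dropping lines that start with `#`, which preserves the spreadsheet-friendliness the abstract calls for.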

  18. Selection and implementation of a distributed phased archive for a multivendor incremental approach to PACS

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wandtke, John; Robinson, Arvin E.

    1999-07-01

The selection criteria for the archive were based on the objectives of the Medical Information, Communication and Archive System (MICAS), a multi-vendor incremental approach to PACS. These objectives include interoperability between all components; seamless integration of the Radiology Information System (RIS) with MICAS and, eventually, other hospital databases; demonstrated DICOM compliance of every component prior to acceptance; and automated workflow that can be programmed to meet changes in the healthcare environment. The long-term multi-modality archive is being implemented in three or more phases, with the first phase designed to provide a 12- to 18-month storage solution. This decision was made because the cost per GB of storage is rapidly decreasing and the speed at which data can be retrieved is increasing with time. The open solution selected allows incorporation of leading-edge, 'best of breed' hardware and software and provides maximum flexibility of workflow both within and outside of radiology. The selected solution is media independent, supports multiple jukeboxes, provides expandable storage capacity and will provide redundancy and fault tolerance at minimal cost. Required attributes of the archive include a scalable archive strategy, a virtual image database with global query, and an object-oriented database. The selection process took approximately 10 months, with Cemax-Icon being the vendor selected. Prior to the signing of a purchase order, Cemax-Icon performed a site survey, agreed upon the acceptance test protocol and provided a written guarantee of connectivity between their archive and the imaging modalities and other MICAS components.

  19. Multiphoton fluorescence lifetime imaging of chemotherapy distribution in solid tumors

    NASA Astrophysics Data System (ADS)

    Carlson, Marjorie; Watson, Adrienne L.; Anderson, Leah; Largaespada, David A.; Provenzano, Paolo P.

    2017-11-01

    Doxorubicin is a commonly used chemotherapeutic employed to treat multiple human cancers, including numerous sarcomas and carcinomas. Furthermore, doxorubicin possesses strong fluorescent properties that make it an ideal reagent for modeling drug delivery by examining its distribution in cells and tissues. However, while doxorubicin fluorescence and lifetime have been imaged in live tissue, its behavior in archival samples that frequently result from drug and treatment studies in human and animal patients, and murine models of human cancer, has to date been largely unexplored. Here, we demonstrate imaging of doxorubicin intensity and lifetimes in archival formalin-fixed paraffin-embedded sections from mouse models of human cancer with multiphoton excitation and multiphoton fluorescence lifetime imaging microscopy (FLIM). Multiphoton excitation imaging reveals robust doxorubicin emission in tissue sections and captures spatial heterogeneity in cells and tissues. However, quantifying the amount of doxorubicin signal in distinct cell compartments, particularly the nucleus, often remains challenging due to strong signals in multiple compartments. The addition of FLIM analysis to display the spatial distribution of excited state lifetimes clearly distinguishes between signals in distinct compartments such as the cell nuclei versus cytoplasm and allows for quantification of doxorubicin signal in each compartment. Furthermore, we observed a shift in lifetime values in the nuclei of transformed cells versus nontransformed cells, suggesting a possible diagnostic role for doxorubicin lifetime imaging to distinguish normal versus transformed cells. Thus, data here demonstrate that multiphoton FLIM is a highly sensitive platform for imaging doxorubicin distribution in normal and diseased archival tissues.

  20. Peer-to-peer architecture for multi-departmental distributed PACS

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Heuberger, Joris; Pysher, Lance; Ratib, Osman

    2006-03-01

We have elected to explore peer-to-peer technology as an alternative to centralized PACS architecture, in response to the increasing requirement for wide access to images inside and outside the radiology department. The goal is to allow users across the enterprise to access any study at any time without the need for prefetching or routing of images from a central archive. Images can be accessed between different workstations and local storage nodes. We implemented "Bonjour", a remote file-access and service-discovery technology developed by Apple that allows applications to share data and files remotely with optimized data access and transfer. Our open-source image display platform, OsiriX, was adapted so that the local DICOM images referenced in each workstation's SQL database can be accessed directly from any other OsiriX workstation over the network. A server version of the OsiriX Core Data database likewise allows access to distributed archive servers in the same way. The infrastructure implemented allows fast and efficient access to any image, anywhere, at any time, independently of the actual physical location of the data. It also benefits from the performance of distributed low-cost, high-capacity storage servers, which can provide efficient caching of PACS data and were found to be 10 to 20 times faster than accessing the same data from the central PACS archive. It is particularly suitable for large hospitals and academic environments where clinical conferences, interdisciplinary discussions and successive sessions of image processing are often part of complex workflows for patient management and decision making.

  1. Archiving of interferometric radio and mm/submm data at the National Radio Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Lacy, Mark

    2018-06-01

Modern radio interferometers such as ALMA and the VLA are capable of producing ~1 TB/day of data for processing into image products of comparable size. Besides the sheer volume of data, the products themselves can be complicated and are sometimes hard to map into standard astronomical archive metadata. We also face issues similar to those faced by archives at other wavelengths, namely the role of archives as the basis of reprocessing platforms and facilities, and the validation and ingestion of user-derived products. In this talk I shall discuss the plans of NRAO in these areas over the next decade.

  2. VizieR Online Data Catalog: B213 filament 150 and 260GHz emission maps (Bracco+, 2017)

    NASA Astrophysics Data System (ADS)

    Bracco, A.; Palmeirim, P.; Andre, P.; Adam, R.; Ade, P.; Bacmann, A.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; D'Addabbo, A.; Desert, F.-X.; Didelon, P.; Doyle, S.; Goupy, J.; Konyves, V.; Kramer, C.; Lagache, G.; Leclercq, S.; Macias-Perez, J. F.; Maury, A.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Motte, F.; Pajot, F.; Pascale, E.; Peretto, N.; Perotto, L.; Pisano, G.; Ponthieu, N.; Reveret, V.; Rigby, A.; Ritacco, A.; Rodriguez, L.; Romero, C.; Roy, A.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2017-07-01

    We present the continuum emission maps at 150 and 260GHz of the B213 filament in the Taurus molecular complex obtained with the NIKA camera at the IRAM 30m telescope. Observations were performed during the first NIKA open pool, in February 2014, and are presented in the paper. The maps' FWHM angular resolution is 24" (see Fig. 1). Due to the scanning strategy, the noise rms is relatively constant in the central part of the maps but increases rapidly towards the edges. Scales larger than 2' are filtered out during data reduction. The image coordinates can be found in the FITS header. Three maps per frequency are provided: flux density and noise (both in MJy/sr), and time-per-pixel (in seconds). (2 data files).

  3. USGS Releases Landsat Orthorectified State Mosaics

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) National Remote Sensing Data Archive, located at the USGS Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota, maintains the Landsat orthorectified data archive. Within the archive are Landsat Enhanced Thematic Mapper Plus (ETM+) data that have been pansharpened and orthorectified by the Earth Satellite Corporation. This imagery has acquisition dates ranging from 1999 to 2001 and was created to provide users with access to quality-screened, high-resolution satellite images with global coverage over the Earth's landmasses.

  4. A new system for digital image acquisition, storage and presentation in an accident and emergency department

    PubMed Central

    Clegg, G; Roebuck, S; Steedman, D

    2001-01-01

    Objectives—To develop a computer based storage system for clinical images—radiographs, photographs, ECGs, text—for use in teaching, training, reference and research within an accident and emergency (A&E) department. Exploration of methods to access and utilise the data stored in the archive. Methods—Implementation of a digital image archive using flatbed scanner and digital camera as capture devices. A sophisticated coding system based on ICD 10. Storage via an "intelligent" custom interface. Results—A practical solution to the problems of clinical image storage for teaching purposes. Conclusions—We have successfully developed a digital image capture and storage system, which provides an excellent teaching facility for a busy A&E department. We have revolutionised the practice of the "hand-over meeting". PMID:11435357

  5. What Is A Picture Archiving And Communication System (PACS)?

    NASA Astrophysics Data System (ADS)

    Marceau, Carla

    1982-01-01

    A PACS is a digital system for acquiring, storing, moving and displaying picture or image information. It is an alternative to film jackets that has been made possible by recent breakthroughs in computer technology: telecommunications, local area nets and optical disks. The fundamental concept of the digital representation of image information is introduced. It is shown that freeing images from a material representation on film or paper leads to a dramatic increase in flexibility in our use of the images. The ultimate goal of a medical PACS system is a radiology department without film jackets. The inherent nature of digital images and the power of the computer allow instant free "copies" of images to be made and thrown away. These copies can be transmitted to distant sites in seconds, without the "original" ever leaving the archives of the radiology department. The result is a radiology department with much freer access to patient images and greater protection against lost or misplaced image information. Finally, images in digital form can be treated as data for the computer in image processing, which includes enhancement, reconstruction and even computer-aided analysis.

  6. The Hubble Spectroscopic Legacy Archive

    NASA Astrophysics Data System (ADS)

    Peeples, M.; Tumlinson, J.; Fox, A.; Aloisi, A.; Fleming, S.; Jedrzejewski, R.; Oliveira, C.; Ayres, T.; Danforth, C.; Keeney, B.; Jenkins, E.

    2017-04-01

    With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The goal of the Hubble Spectroscopic Legacy Archive (HSLA) is to provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data are packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability makes the data easy for users to quickly access, assess the quality of, and download for archival science. The first generation of these products for the far-ultraviolet (FUV) modes of COS was made available online via the Mikulski Archive for Space Telescopes (MAST) in early 2016 and updated in early 2017; future releases will include COS/NUV and STIS/UV data.

  7. Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive

    NASA Technical Reports Server (NTRS)

    Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka, J. A.

    1995-01-01

    Life sciences data collected before, during and after spaceflight are valuable and often irreplaceable. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.

  8. The Cancer Imaging Archive (TCIA) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    TCIA is NCI’s repository for publicly shared cancer imaging data. TCIA collections include radiology and pathology images, clinical and clinical-trial data, image-derived annotations and quantitative features, and a growing collection of related ‘omics data from both clinical and pre-clinical studies.

  9. Photo CD and Other Digital Imaging Technologies: What's out There and What's It For?

    ERIC Educational Resources Information Center

    Chen, Ching-Chih

    1993-01-01

    Describes Kodak's Photo CD technology and its impact on digital imaging. Color desktop publishing, image processing and preservation, image archival storage, and interactive multimedia development, as well as the equipment, software, and services that make these applications possible, are described. Contact information for developers and…

  10. MineScan: non-image data monitoring and mining from imaging modalities

    NASA Astrophysics Data System (ADS)

    Zaidi, Shayan M.; Huff, Dov; Bhalodia, Pankit; Mongkolwat, Pattanasak; Channin, David S.

    2003-05-01

    This project is intended to capture and interactively display non-image information routinely generated by imaging modalities. This information relates to the device's performance of the individual procedures and is not necessarily available in other information streams such as DICOM headers. While originally intended for use in servicing the modalities, this information can also be presented to radiologists and administrators within the department for both micro- and macro-management purposes. This data can help hospital administrators and radiologists manage available resources and discover clues to indicate what modifications in hospital operations might significantly improve its ability to provide efficient patient care. Data is collected from a departmental CT scanner. The data consists of a running record of exams followed by a list of processing records logged over a 24-hour period. MineScan extracts information from these records and stores it into a database. A statistical program is run once a day to collect relevant metrics. MineScan can be accessed via a Web browser or through an advanced prototype PACS workstation. This information, if provided in real-time, can be used to manage operations in a busy department. Even when provided historically, the data can be used to assess current activity, analyze trends and plan future operations.

  11. Archive of digital and digitized analog boomer seismic reflection data collected during USGS cruise 96CCT02 in Copano, Corpus Christi, and Nueces Bays and Corpus Christi Bayou, Texas, July 1996

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.

    2007-01-01

    In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
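The archived trace data are standard SEG-Y, which any SEG-Y-aware tool can read. As a minimal sketch (not Seismic Unix itself), the snippet below peeks at a few fields of the 400-byte binary file header that follows the 3200-byte EBCDIC textual header; byte positions follow the SEG-Y rev 1 standard, and the header bytes built here are synthetic, purely for illustration.

```python
import struct

def read_segy_binary_header(buf: bytes) -> dict:
    """Extract a few standard fields from a SEG-Y file's binary header."""
    bh = buf[3200:3600]  # 400-byte binary file header after the textual header
    return {
        "sample_interval_us": struct.unpack(">h", bh[16:18])[0],  # file bytes 3217-3218
        "samples_per_trace":  struct.unpack(">h", bh[20:22])[0],  # file bytes 3221-3222
        "format_code":        struct.unpack(">h", bh[24:26])[0],  # file bytes 3225-3226
    }

# Synthetic header: 1 ms sampling, 2000 samples/trace, IBM float (format code 1).
hdr = bytearray(3600)
hdr[3216:3218] = struct.pack(">h", 1000)
hdr[3220:3222] = struct.pack(">h", 2000)
hdr[3224:3226] = struct.pack(">h", 1)

parsed = read_segy_binary_header(bytes(hdr))
print(parsed)  # {'sample_interval_us': 1000, 'samples_per_trace': 2000, 'format_code': 1}
```

In practice one would simply use Seismic Unix or a library such as ObsPy on the archived files; the point here is only how little structure is needed to inspect a SEG-Y header.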

  12. The Hubble Spectroscopic Legacy Archive

    NASA Astrophysics Data System (ADS)

    Peeples, Molly S.; Tumlinson, Jason; Fox, Andrew; Aloisi, Alessandra; Ayres, Thomas R.; Danforth, Charles; Fleming, Scott W.; Jenkins, Edward B.; Jedrzejewski, Robert I.; Keeney, Brian A.; Oliveira, Cristina M.

    2016-01-01

    With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The Hubble Spectroscopic Legacy Archive will provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data will be packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability will make the data easy for users to quickly access, assess the quality of, and download for archival science starting in Cycle 24, with the first generation of these products for the FUV modes of COS available online via MAST in early 2016.

  13. HEAVY WATER MODERATED NEUTRONIC REACTOR

    DOEpatents

    Szilard, L.

    1958-04-29

    A nuclear reactor of the type which utilizes uranium fuel elements and a liquid coolant is described. The fuel elements are in the form of elongated tubes and are disposed within outer tubes extending through a tank containing heavy water, which acts as a moderator. The ends of the fuel tubes are connected by inlet and discharge headers, and liquid bismuth is circulated between the headers and through the fuel tubes for cooling. Helium is circulated through the annular space between the outer tubes in the tank and the fuel tubes to cool the water moderator to prevent boiling. The fuel tubes are covered with a steel lining, and suitable control means, heat exchange means, and pumping means for the coolants are provided to complete the reactor assembly.

  14. VizieR Online Data Catalog: Proper motions and photometry of stars in NGC 3201 (Sariya+, 2017)

    NASA Astrophysics Data System (ADS)

    Sariya, D. P.; Jiang, I.-G.; Yadav, R. K. S.

    2017-07-01

    To determine the PMs of the stars in this work, we used archive images (http://archive.eso.org/eso/esoarchivemain.html) from observations made with the 2.2m ESO/MPI telescope at La Silla, Chile. The telescope hosts a mosaic camera, the Wide-Field Imager (WFI), consisting of a 4×2 array of eight CCD chips. Since each CCD has an array of 2048×4096 pixels, WFI produces images with a 34×33 arcmin² field of view. The first-epoch observing run contains two images in the B, V and I bands, each with a 240 s exposure time, observed on 1999 December 05. In the second epoch, we have 35 images in the V filter, each with a 40 s exposure time, observed during the period 2014 April 02-05. Thus the epoch gap between the two data sets is ~14.3 years. (2 data files).
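The quoted ~14.3-year epoch gap follows directly from the two observing dates given in the abstract:

```python
from datetime import date

# Epoch gap between the two WFI observing runs quoted in the abstract.
first_epoch = date(1999, 12, 5)   # B, V, I images, 240 s exposures
second_epoch = date(2014, 4, 2)   # start of the 35-image V-band run

gap_years = (second_epoch - first_epoch).days / 365.25
print(round(gap_years, 1))  # 14.3
```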

  15. The development of a digitising service centre for natural history collections

    PubMed Central

    Tegelberg, Riitta; Haapala, Jaana; Mononen, Tero; Pajari, Mika; Saarenmaa, Hannu

    2012-01-01

    Abstract Digitarium is a joint initiative of the Finnish Museum of Natural History and the University of Eastern Finland. It was established in 2010 as a dedicated shop for the large-scale digitisation of natural history collections. Digitarium offers service packages based on the digitisation process, including tagging, imaging, data entry, georeferencing, filtering, and validation. During the process, all specimens are imaged, and distance workers take care of the data entry from the images. The customer receives the data in Darwin Core Archive format, as well as images of the specimens and their labels. Digitarium also offers the option of publishing images through Morphbank, sharing data through GBIF, and archiving data for long-term storage. Service packages can also be designed on demand to respond to the specific needs of the customer. The paper also discusses logistics, costs, and intellectual property rights (IPR) issues related to the work that Digitarium undertakes. PMID:22859879

  16. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    NASA Astrophysics Data System (ADS)

    Tokareva, Victoria

    2018-04-01

    New-generation medicine demands a better quality of analysis, increasing the amount of data collected during checkups, while simultaneously decreasing the invasiveness of procedures. Thus it becomes urgent not only to develop advanced modern hardware, but also to implement the special software infrastructure needed to use it in everyday clinical practice: so-called Picture Archiving and Communication Systems (PACS). Developing a distributed PACS is a challenging task in present-day medical informatics. The paper discusses the architecture of a distributed PACS server for processing large, high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards for medical imaging software. The MapReduce paradigm is proposed for server-side image reconstruction, and the details of utilizing the Hadoop framework for this task are discussed, with the aim of making the distributed PACS design as ergonomic and as adapted to the needs of end users as possible.
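The MapReduce pattern the paper proposes can be sketched without Hadoop at all. The toy below is a hypothetical illustration, not the paper's implementation: mappers emit (slice_id, tile) pairs from raw data chunks, a shuffle groups them by slice, and a reducer assembles each image slice; the tile format and names are assumptions for the sketch.

```python
from collections import defaultdict

def map_chunk(chunk):
    """Mapper: emit (slice_id, (tile_index, pixel_rows)) for one raw chunk."""
    slice_id, tile_index, rows = chunk
    yield slice_id, (tile_index, rows)

def reduce_slice(slice_id, tiles):
    """Reducer: concatenate tiles in index order to rebuild one image slice."""
    rows = []
    for _, tile_rows in sorted(tiles):
        rows.extend(tile_rows)
    return slice_id, rows

# Raw chunks arrive in arbitrary order: (slice_id, tile_index, pixel_rows).
chunks = [(0, 1, [[3, 4]]), (0, 0, [[1, 2]]), (1, 0, [[9, 9]])]

# Shuffle phase: group mapper output by key (slice_id).
grouped = defaultdict(list)
for chunk in chunks:
    for key, value in map_chunk(chunk):
        grouped[key].append(value)

slices = dict(reduce_slice(k, v) for k, v in grouped.items())
print(slices)  # {0: [[1, 2], [3, 4]], 1: [[9, 9]]}
```

In a real Hadoop deployment the shuffle and the distribution of mappers/reducers across nodes are handled by the framework; only the map and reduce functions are task-specific.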

  17. The Birmingham Burn Centre archive: A photographic history of post-war burn care in the United Kingdom.

    PubMed

    Hardwicke, Joseph; Kohlhardt, Angus; Moiemen, Naiem

    2015-06-01

    The Medical Research Council Burns and Industrial Injuries Unit at the Birmingham Accident Hospital pioneered civilian burn care and research in the United Kingdom during the post-war years. A photographic archive has been discovered that documents this period from 1945 to 1975. The aim of this project was to sort, digitize and archive the images in a secure format for future reference. The photographs detail the management of burns patients, from injury causation and surgical intervention, to nursing care, rehabilitation and long-term follow-up. A total of 2650 images files were collected from over 600 patients. Many novel surgical, nursing, dressing and rehabilitation strategies are documented and discussed. We have chosen to report part of the archive under the sections of (1) aseptic and antimicrobial burn care; (2) burn excision and wound closure; (3) rehabilitation, reconstruction and long-term outcomes; (4) accident prevention; and (5) response to a major burns incident. The Birmingham collection gives us a valuable insight into the approach to civilian burn care in the post-war years, and we present a case from the archive to the modern day, the longest clinical photographic follow-up to date. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  18. VizieR Online Data Catalog: C/2012 F6 (Lemmon) and C/2012 S1 (ISON) maps (Cordiner+, 2014)

    NASA Astrophysics Data System (ADS)

    Cordiner, M. A.; Remijan, A. J.; Boissier, J.; Milam, S. N.; Mumma, M. J.; Charnley, S. B.; Paganini, L.; Villanueva, G.; Bockelee-Morvan, D.; Kuan, Y.-J.; Chuang, Y.-L.; Lis, D. C.; Biver, N.; Crovisier, J.; Minniti, D.; Coulson, I. M.

    2017-04-01

    WCS-calibrated FITS image files of the molecular flux maps shown in Figure 1 for HCN, HNC and H2CO observed in comets C/2012 F6 (Lemmon) and C/2012 S1 (ISON) using ALMA. The files are labeled with the corresponding comet and molecule names. The files are standard two-dimensional FITS images, which can be opened in FITS image viewers such as SAOimage DS9, CASA viewer, or Starlink Gaia. GIMP and Adobe Photoshop can also be used, provided the appropriate plugins are present. The images contain flux values (in units of Jansky km/s per beam) as a function of celestial coordinate in the J2000 equatorial system. Due to the cometary motions, the absolute coordinate systems are accurate only at the start of the observations (dates and times are given in Table 1). These images are the result of integrating the (3D) ALMA data cubes over the full widths of the observed spectral lines (equivalent to collapsing the data cubes along their respective spectral/velocity axes). The beam dimensions (BMAJ and BMIN), corresponding to the angular resolution of the images, are given in the image headers in units of degrees.

    object.dat:

        Code       Name    Elem (d)    q (AU)     e          i (deg)    H1 (mag)
        C/2012 F6  Lemmon  2456375.5   0.7312461  0.9985125  82.607966  7.96
        C/2012 S1  Ison    2456624.5   0.0124515  0.9998921  64.401571  6.11

    (2 data files).
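Because FITS headers are plain ASCII (80-character "cards" in 2880-byte blocks, each `KEYWORD = value / comment`), the beam keywords mentioned above can be pulled out with a few lines of standard-library code. In practice astropy.io.fits is the standard tool; the parser and the sample BMAJ/BMIN values below are illustrative only.

```python
def parse_fits_cards(header_bytes: bytes) -> dict:
    """Very small FITS header reader: map 'KEYWORD = value' cards to strings."""
    cards = {}
    text = header_bytes.decode("ascii")
    for i in range(0, len(text), 80):       # headers are 80-character cards
        card = text[i:i + 80]
        key = card[:8].strip()
        if key == "END":                    # END card terminates the header
            break
        if card[8:10] == "= ":              # value indicator per the standard
            cards[key] = card[10:].split("/")[0].strip()
    return cards

# Synthetic header with illustrative beam keywords (degrees, as in the archive).
header = "".join(card.ljust(80) for card in [
    "SIMPLE  =                    T",
    "BMAJ    =         1.389E-04 / beam major axis (deg)",
    "BMIN    =         1.111E-04 / beam minor axis (deg)",
    "END",
]).encode("ascii")

beam = parse_fits_cards(header)
print(float(beam["BMAJ"]) * 3600)  # beam major axis converted to arcsec, ~0.5
```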

  19. ABISM: an interactive image quality assessment tool for adaptive optics instruments

    NASA Astrophysics Data System (ADS)

    Girard, Julien H.; Tourneboeuf, Martin

    2016-07-01

    ABISM (Automatic Background Interactive Strehl Meter) is an interactive tool for evaluating the image quality of astronomical images. It works on seeing-limited point spread functions (PSFs) but was developed in particular for the diffraction-limited PSFs produced by adaptive optics (AO) systems. Within the VLT service mode (SM) operations framework, ABISM is designed to help support astronomers or telescope and instrument operators (TIOs) quickly measure the Strehl ratio (SR) during or right after an observing block (OB), to evaluate whether it meets the requirements/predictions or whether it has to be repeated and will remain in the SM queue. It is a Python-based tool with a graphical user interface (GUI) that can be used with little AO knowledge. The night astronomer (NA) or TIO can launch ABISM in one click, and the program is able to read keywords from the FITS header to avoid mistakes. A significant effort was also put into making ABISM robust (and forgiving), with a high rate of repeatability. As a matter of fact, ABISM is able to automatically correct for bad pixels, eliminate stellar neighbours, properly estimate and fit the background, etc.

  20. Home Economics Archive: Research, Tradition and History (HEARTH)

    Science.gov Websites

    HEARTH (Home Economics Archive: Research, Tradition and History) is a core electronic collection of books and journals in Home Economics, with additional information, images and readings on the history of Home Economics. Ithaca, NY: Albert R. Mann Library, Cornell University.

  1. 36 CFR § 1225.24 - When can an agency apply previously approved schedules to electronic records?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...

  2. Clinical aspects of the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Forbes, Glenn S.; Morin, Richard L.; Pavlicek, William

    1991-07-01

    A joint project between Mayo Clinic and IBM to develop a picture archival and communications system has been under development for three years. This project began as a potential solution to a pressing archival problem in magnetic resonance imaging. The project has grown to encompass a much larger sphere of activity including workstations, image retrieval, and report archival. This report focuses on the clinical aspects involved in the design, development, and implementation of such a system. In particular, emphasis is placed on the clinical impact of the system both inside and outside of the radiology department. The primary concerns have centered on fidelity of archival data, ease of use, and diagnostic efficacy. The project to date has been limited to neuroradiology practice. This group consists of nine staff radiologists and fellows. Administrative policy decisions regarding the accessibility and availability of digital data in the clinical environment have been much more difficult and complex than originally conceived. Based on the observations thus far, the authors believe the system will become a useful and valuable adjunct to clinical practice of radiology.

  3. Extraction of Dems and Orthoimages from Archive Aerial Imagery to Support Project Planning in Civil Engineering

    NASA Astrophysics Data System (ADS)

    Cogliati, M.; Tonelli, E.; Battaglia, D.; Scaioni, M.

    2017-12-01

    Archive aerial photos represent a valuable heritage, providing information about land content and topography in past years. Today, the availability of low-cost and open-source solutions for photogrammetric processing of close-range and drone images offers the chance to produce outputs such as DEMs and orthoimages in an easy way. This paper aims to demonstrate how, and to what level of accuracy, digitized archive aerial photos may be used within such low-cost software (Agisoft Photoscan Professional®) to generate photogrammetric outputs. Different steps of the photogrammetric processing workflow are presented and discussed. The main conclusion is that this procedure can provide some final products, which however do not feature the high accuracy and resolution that may be obtained using high-end photogrammetric software packages specifically designed for aerial survey projects. In the last part, a case study is presented on the use of a four-epoch archive of aerial images to analyze an area where a tunnel is to be excavated.

  4. Impact of digital radiography on clinical workflow.

    PubMed

    May, G A; Deer, D D; Dackiewicz, D

    2000-05-01

    It is commonly accepted that digital radiography (DR) improves workflow and patient throughput compared with traditional film radiography or computed radiography (CR). DR eliminates the film development step and the time to acquire the image from a CR reader. In addition, the wide dynamic range of DR is such that the technologist can perform the quality-control (QC) step directly at the modality in a few seconds, rather than having to transport the newly acquired image to a centralized QC station for review. Furthermore, additional workflow efficiencies can be achieved with DR by employing tight radiology information system (RIS) integration. In the DR imaging environment, this provides for patient demographic information to be automatically downloaded from the RIS to populate the DR Digital Imaging and Communications in Medicine (DICOM) image header. To learn more about this workflow efficiency improvement, we performed a comparative study of workflow steps under three different conditions: traditional film/screen x-ray, DR without RIS integration (ie, manual entry of patient demographics), and DR with RIS integration. This study was performed at the Cleveland Clinic Foundation (Cleveland, OH) using a newly acquired amorphous silicon flat-panel DR system from Canon Medical Systems (Irvine, CA). Our data show that DR without RIS results in substantial workflow savings over traditional film/screen practice. There is an additional 30% reduction in total examination time using DR with RIS integration.
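The RIS-to-modality hand-off described above amounts to mapping RIS demographic fields onto the standard patient-module attributes of the DICOM image header, so the technologist never types them at the console. The sketch below illustrates that mapping only; the RIS field names and the record are hypothetical (in a real deployment the demographics arrive via DICOM Modality Worklist), though the tags shown are the standard DICOM patient-module tags.

```python
# Hypothetical RIS field -> (DICOM tag, attribute name) mapping.
RIS_TO_DICOM = {
    "name":       ("(0010,0010)", "PatientName"),
    "mrn":        ("(0010,0020)", "PatientID"),
    "birth_date": ("(0010,0030)", "PatientBirthDate"),
    "sex":        ("(0010,0040)", "PatientSex"),
}

def build_patient_module(ris_record: dict) -> dict:
    """Populate DICOM patient-module attributes from an RIS demographic record."""
    header = {}
    for ris_field, (tag, attribute) in RIS_TO_DICOM.items():
        if ris_field in ris_record:
            header[tag] = (attribute, ris_record[ris_field])
    return header

# Example record as it might be downloaded from the RIS (values illustrative).
demo = {"name": "DOE^JANE", "mrn": "000123", "birth_date": "19700101", "sex": "F"}
patient_module = build_patient_module(demo)
print(patient_module["(0010,0010)"])  # ('PatientName', 'DOE^JANE')
```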

  5. ONLINE satellite images and educational material: the Danish Galathea 3 world expedition under and after

    NASA Astrophysics Data System (ADS)

    Bay Hasager, Charlotte; Brøgger Sørensen, Peter; Baltazar Andersen, Ole; Badger, Merete; Højerslev, Niels Kristian; Høyer, Jacob L.; Løkkegaard, Bo; Lichtenegger, Jürg; Nyborg, Lotte; Saldo, Roberto

    2010-05-01

    Students and teachers may use ONLINE satellite images in the classroom. Images have been archived since August 2006, and the archive has been updated every day since. This means that a series of nearly four years of daily global images is available online. The parameters include ocean surface temperature, sea level anomaly, ocean wave height, ocean winds, global ozone in the atmosphere and clouds, and sea ice in the Arctic and Antarctica. During the Galathea 3 expedition, which took place from August 2006 to April 2007, many other high-resolution (local to regional) satellite images were also acquired and stored in the archive. After the end of the expedition, however, only global satellite data are collected and stored. Use Google Earth at http://galathea.dtu.dk/GE_e.html to access the images. The expedition included 50 science projects, and educational material has been developed based on them. There are around 20 educational projects in English at http://galathea3.emu.dk/satelliteeye/index_uk.html and 90 in Danish at http://vg3.dk/, freely available and based on the science. All the educational projects in English deal with satellite image analysis and information. In addition, a short educational film (15 min) for students and teachers at the higher upper-secondary level, on the use of satellite images during the expedition and in some science projects onboard, is available in English. The film is called ‘Galathea's Eye' and is available at http://virtuelgalathea3.dk/om/videoer. All projects in English were developed in the ‘Satellite Eye for Galathea 3' project, supported by Egmontfonden and ESA Eduspace. The satellite images were mainly from ESA and Eduspace. The Danish projects are also supported by Tips og Lottopuljen of the Ministry of Education.

  6. WFIRST Science Operations at STScI

    NASA Astrophysics Data System (ADS)

    Gilbert, Karoline; STScI WFIRST Team

    2018-06-01

    With sensitivity and resolution comparable to the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST Mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.

  7. Alluvial fan in China

    NASA Image and Video Library

    2008-09-05

    This image captures the beauty of a major alluvial fan in Tsinghai, a province located in northwestern China. This archival image was taken from the NASA Space Shuttle in 1997 as part of the EarthKAM mission.

  8. On missing Data Treatment for degraded video and film archives: a survey and a new Bayesian approach.

    PubMed

    Kokaram, Anil C

    2004-03-01

    Image sequence restoration has been steadily gaining in importance with the increasing prevalence of visual digital media. The demand for content increases the pressure on archives to automate their restoration activities for preservation of the cultural heritage that they hold. There are many defects that affect archived visual material and one central issue is that of Dirt and Sparkle, or "Blotches." Research in archive restoration has been conducted for more than a decade and this paper places that material in context to highlight the advances made during that time. The paper also presents a new and simpler Bayesian framework that achieves joint processing of noise, missing data, and occlusion.
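A defining property of the Dirt and Sparkle ("blotch") defects discussed above is that they appear in a single frame only, which is why simple temporal spike detectors work as a first pass. As an illustration only (not the paper's Bayesian framework, which jointly models noise, missing data, and occlusion), here is a minimal pure-Python sketch of such a detector; the threshold value is an arbitrary assumption:

```python
def detect_blotches(prev, curr, nxt, threshold=30):
    """Flag pixels whose grey level differs strongly from BOTH temporal
    neighbours -- the classic spike-detection heuristic for Dirt and
    Sparkle ("blotches") in film scans.

    Each frame is a list of rows of integer grey levels; returns a
    same-shaped boolean mask (True = suspected blotch).
    """
    mask = []
    for r, row in enumerate(curr):
        mask_row = []
        for c, v in enumerate(row):
            d_back = abs(v - prev[r][c])  # difference to previous frame
            d_fwd = abs(v - nxt[r][c])    # difference to next frame
            # A blotch exists in one frame only, so both temporal
            # differences are large; true motion usually affects
            # mainly one of the two differences.
            mask_row.append(d_back > threshold and d_fwd > threshold)
        mask.append(mask_row)
    return mask

# A 1x3 toy "frame" with a bright impulse in the middle frame:
prev = [[10, 10, 10]]
curr = [[10, 200, 10]]
nxt = [[10, 12, 10]]
print(detect_blotches(prev, curr, nxt))  # → [[False, True, False]]
```

Real detectors motion-compensate the neighbouring frames before differencing; this sketch omits that step entirely.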

  9. Final Technical Report for DE-SC0002014- July 29, 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramirez, NC

    2011-07-29

    The project was titled “National Biorepository for Children’s and Women’s Cancer.” The funding received by the Biopathology Center (BPC) at the Research Institute at Nationwide Children’s Hospital was used to procure equipment and add resources to establish a national digital archive of tissues of children’s and women’s cancers to advance treatment and research. As planned in the proposal, the project allowed the BPC to procure two high-speed imaging robots and hire imaging technicians to scan a large collection of children’s and women’s cancer tissues. The BPC team focused on completed clinical trials, some dating back nearly 30 years, conducted by the Children’s Oncology Group (and its precursor groups) as well as the Gynecologic Oncology Group. A total of 139 clinical trials were imaged as part of the archive project, allowing the team to generate 29,488 images that are currently stored at the Ohio Supercomputer Center in Columbus, Ohio. The images are now integrated with the Virtual Imaging for Pathology, Education and Research (VIPER) application, which allows the BPC to make the digital archive available via the Internet to approved researchers, eliminating the use of glass slides for this collection. Eliminating glass slides reduces shipping costs, reduces breakage, and allows these cases to be reviewed quickly by experts on a standard desktop computer.

  10. The Power of Imaging.

    ERIC Educational Resources Information Center

    Haapaniemi, Peter

    1990-01-01

    Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to tiny fraction of space required by originals and discusses current applications. Highlights include image processing system at National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…

  11. Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.

    PubMed

    Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray

    2003-07-01

    The archival tools used for digital images in advertising do not fulfill clinical requirements and are only beginning to be developed. Storing a large number of conventional photographic slides requires a great deal of space and special conditions, and in spite of special precautions, degradation of the slides still occurs; the most common degradation is the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed that integrates a database and an image-browser system and can build and locate needed archive files in a matter of seconds with the click of a button. The hardware and software required by this system are commercially available. There are 25,200 patients recorded in the database, involving 24,331 procedures; the image files cover 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database-and-browser software for archiving, which allows labeling of individual photographs with demographic information, as well as browsing. Digitized images are not only more efficient and economical than conventional slide images, but they also facilitate clinical studies.

  12. From PACS to Web-based ePR system with image distribution for enterprise-level filmless healthcare delivery.

    PubMed

    Huang, H K

    2011-07-01

    The concept of PACS (picture archiving and communication system) was initiated in 1982 during the SPIE medical imaging conference in Newport Beach, CA. Since then, PACS has matured into an everyday clinical tool for image archiving, communication, display, and review. This paper follows the continuous development of PACS technology, including Web-based PACS, PACS and the ePR (electronic patient record), and enterprise PACS to ePR with image distribution (ID). The concept of a large-scale Web-based enterprise PACS and ePR with image distribution is presented along with its implementation, clinical deployment, and operation. The Hong Kong Hospital Authority's (HKHA) integration of its home-grown clinical management system (CMS) with PACS and ePR with image distribution is used as a case study. The current concept and design criteria of the HKHA enterprise integration of the CMS, PACS, and ePR-ID for filmless healthcare delivery are discussed, followed by its work-in-progress and current status.

  13. Galileo SSI/Ida Radiometrically Calibrated Images V1.0

    NASA Astrophysics Data System (ADS)

    Domingue, D. L.

    2016-05-01

    This data set includes Galileo Orbiter SSI radiometrically calibrated images of the asteroid 243 Ida, created using ISIS software and assuming nadir pointing. This is an original delivery of radiometrically calibrated files, not an update to existing files. All images archived include the asteroid within the image frame. Calibration was performed in 2013-2014.

  14. Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, S.T.C.; Knowlton, R.; Hoo, K.S.

    1995-12-31

    Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to visualize the location of physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques studies different biomedical aspects of the brain, but an effective means to quantify and correlate the disparate imaging datasets, in order to improve clinical decision-making processes, is lacking. This paper describes several techniques developed in a UNIX-based neurodiagnostic workstation to aid the non-invasive presurgical evaluation of epilepsy patients. These techniques include on-line access to the picture archiving and communication system (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantification of the structural and functional information contained in the registered images. For illustration, the authors describe the use of these techniques in a patient case of non-lesional neocortical epilepsy. They also present future work based on preliminary studies.

  15. The Hubble Legacy Archive: Data Processing in the Era of AstroDrizzle

    NASA Astrophysics Data System (ADS)

    Strolger, Louis-Gregory; Hubble Legacy Archive Team, The Hubble Source Catalog Team

    2015-01-01

    The Hubble Legacy Archive (HLA) expands the utility of Hubble Space Telescope wide-field imaging data by providing high-level composite images and source lists, perusable and immediately available online. The latest HLA data release (DR8.0) marks a fundamental change in how these image combinations are produced, using DrizzlePac tools and Astrodrizzle to reduce geometric distortion and provide improved source catalogs for all publicly available data. We detail the HLA data processing and source list schemas, what products are newly updated and available for WFC3 and ACS, and how these data products are further utilized in the production of the Hubble Source Catalog. We also discuss plans for future development, including updates to WFPC2 products and field mosaics.

  16. Development of the SOFIA Image Processing Tool

    NASA Technical Reports Server (NTRS)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5-meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible-light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area-of-interest locations. A tool was developed that can both extract and process data from the archive files.

  17. Turbine vane structure

    DOEpatents

    Irwin, John A.

    1980-08-19

    A liquid-cooled stator blade assembly for a gas turbine engine includes an outer shroud having a pair of liquid inlets and a pair of liquid outlets supplied through a header, with means including tubes that support the header radially outwardly of the shroud and also couple the header with the pair of liquid inlets and outlets. A pair of turbine vanes extend radially between the shroud and a vane platform to define a gas turbine motive fluid passage therebetween. Each of the vanes is cooled by an internal body casting of superalloy material with a grooved layer of highly heat-conductive material that includes spaced-apart flat-surface trailing edges in alignment with a flat trailing edge of the casting, joined to wall segments of the liner which are juxtaposed with respect to the internal casting to form an array of parallel liquid inlet passages on one side of the vane and a second plurality of parallel liquid return passages on the opposite side of the vane. A superalloy, heat- and wear-resistant imperforate skin covers the outer surface of the composite blade, including the internal casting and the heat-conductive layer. A separate trailing edge section includes an internal casting and an outer skin butt-connected to the end surfaces of the internal casting and the heat-conductive layer, forming an easily assembled liquid-cooled trailing edge section in the turbine vane.

  18. Storage media for computers in radiology.

    PubMed

    Dandu, Ravi Varma

    2008-11-01

    The introduction and wide acceptance of digital technology in medical imaging has resulted in an exponential increase in the amount of data produced by the radiology department. There is an insatiable need for storage space to archive this ever-growing volume of image data. Healthcare facilities should plan the type and size of the storage media they need based not just on the volume of data but also on considerations such as the speed and ease of access, redundancy, security, costs, and the longevity of the archival technology. This article reviews the various digital storage media and compares their merits and demerits.

  19. The Desert Laboratory Repeat Photography Collection - An Invaluable Archive Documenting Landscape Change

    USGS Publications Warehouse

    Webb, Robert H.; Boyer, Diane E.; Turner, Raymond M.; Bullock, Stephen H.

    2007-01-01

    The Desert Laboratory Repeat Photography Collection, the largest collection of its kind in the world, is housed at the U.S. Geological Survey (USGS) in Tucson, Arizona. The collection preserves thousands of photos taken precisely in the same places but at different times. This archive of 'repeat photographs' documents changes in the desert landscape and vegetation of the American Southwest, and also includes images from northwestern Mexico and Kenya. These images are an invaluable asset to help understand the effects of climate variation and land-use practices on arid and semiarid environments.

  20. Developing Generic Image Search Strategies for Large Astronomical Data Sets and Archives using Convolutional Neural Networks and Transfer Learning

    NASA Astrophysics Data System (ADS)

    Peek, Joshua E. G.; Hargis, Jonathan R.; Jones, Craig K.

    2018-01-01

    Astronomical instruments produce petabytes of images every year, vastly more than can be inspected by a member of the astronomical community in search of a specific population of structures. Fortunately, the sky is mostly black, and source extraction algorithms have been developed to provide searchable catalogs of unconfused sources like stars and galaxies. These tools often fail for studies of more diffuse structures like the interstellar medium and unresolved stellar structures in nearby galaxies, leaving astronomers interested in observations of photodissociation regions, stellar clusters, and diffuse interstellar clouds without the crucial ability to search. In this work we present a new path forward for finding structures in large data sets similar to an input structure using convolutional neural networks, transfer learning, and machine-learning clustering techniques. We show applications to archival data in the Mikulski Archive for Space Telescopes (MAST).
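The search strategy described above — embed each image cutout with a pretrained CNN, then rank archive entries by their closeness to the query in feature space — reduces at its core to nearest-neighbour ranking over feature vectors. A toy sketch with invented three-component "features" (real features would be activations from a pretrained network, and the archive names here are made up):

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)


def most_similar(query_vec, archive, top_k=3):
    """Rank archived cutouts by feature-space similarity to the query.

    `archive` maps an image ID to its feature vector; in practice the
    vectors come from a CNN layer and the ranking uses an index
    structure rather than a full sort.
    """
    ranked = sorted(archive.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [img_id for img_id, _ in ranked[:top_k]]


# Toy archive of three "cutouts" described by 3-component features:
archive = {
    "cloud_a": [0.9, 0.1, 0.0],
    "cluster_b": [0.1, 0.9, 0.1],
    "cloud_c": [0.8, 0.2, 0.1],
}
print(most_similar([1.0, 0.0, 0.0], archive, top_k=2))  # → ['cloud_a', 'cloud_c']
```

Transfer learning enters where the features are produced: a network trained on everyday images is reused, unmodified, as a generic feature extractor for astronomical cutouts.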

  1. Outsourced central archiving: an information bridge in a multi-IMAC environment

    NASA Astrophysics Data System (ADS)

    Gustavsson, Staffan; Tylen, Ulf; Carlsson, Goeran; Angelhed, Jan-Erik; Wintell, Mikael; Helmersson, Roger; Norrby, Clas

    2001-08-01

    In 1998, three hospitals merged to form the Sahlgrenska University Hospital, bringing total radiology production to 325,000 examinations per year. Two different PACS and RIS with different and incompatible archiving solutions had been in use since 1996: one PACS was of commercial origin and the other was developed in-house, and together they managed one third of the total production. Due to differences in standards compliance and system architecture, communication was unsatisfactory. To improve efficiency, communication, and the level of service to our customers, the situation was evaluated, and it was decided to build a transparent virtual radiology department based on a modular approach. A common RIS and a central DICOM image archive were chosen as the central nodes of a star-configured system, and Web technology was chosen for the distribution of images and reports. The reasons for these decisions, as well as the present status of the installation, are described and discussed in this paper.

  2. Cine film replacement: digital archival requirements and remaining obstacles.

    PubMed

    Holmes, D R; Wondrow, M A; Bell, M R; Nissen, S E; Cusma, J T

    1998-07-01

    The acceptance of the Digital Imaging and Communication in Medicine (DICOM) standard and the Compact Disk-Recordable (CD-R) as the interchange medium have been critical developments for laboratories that need to move forward on the cine replacement front, while at the same time retain a means to communicate with other centers. One remaining essential component which has not been satisfactorily addressed is the issue of how digital image data should be archived within an institution. Every laboratory must consider the diverse issues which affect the choice of a digital archiving system. These factors include technical and economic issues, along with the clinical routines prevailing in their laboratory. A complete understanding of the issues will lead to the formulation of multiple options which may prove acceptable and will help to overcome the last obstacle which remains for the complete replacement of cine film in the cardiac catheterization laboratory.

  3. Visual Systems for Interactive Exploration and Mining of Large-Scale Neuroimaging Data Archives

    PubMed Central

    Bowman, Ian; Joshi, Shantanu H.; Van Horn, John D.

    2012-01-01

    While technological advancements in neuroimaging scanner engineering have improved the efficiency of data acquisition, electronic data capture methods will likewise significantly expedite the populating of large-scale neuroimaging databases. As they do, and as these archives grow in size, a particular challenge lies in examining and interacting with the information that these resources contain, through the development of compelling, user-driven approaches for data exploration and mining. In this article, we introduce the informatics visualization for neuroimaging (INVIZIAN) framework for the graphical rendering of, and dynamic interaction with, the contents of large-scale neuroimaging data sets. We describe the rationale behind INVIZIAN, detail its development, and demonstrate its usage in examining a collection of over 900 T1-anatomical magnetic resonance imaging (MRI) image volumes from across a diverse set of clinical neuroimaging studies drawn from a leading neuroimaging database. Using a collection of cortical surface metrics and means for examining brain similarity, INVIZIAN graphically displays brain surfaces as points in a coordinate space and enables classification of clusters of neuroanatomically similar MRI images and data mining. As an initial step toward addressing the need for such user-friendly tools, INVIZIAN provides a highly unique means to interact with large quantities of electronic brain imaging archives in ways suitable for hypothesis generation and data mining. PMID:22536181

  4. Aircraft scanner data availability via the version 0 Information Management System

    NASA Technical Reports Server (NTRS)

    Mah, G. R.

    1995-01-01

    As part of the Earth Observing System Data and Information System (EOSDIS) development, NASA and other government agencies have developed an operational prototype of the Information Management System (IMS). The IMS provides access to the data archived at the Distributed Active Archive Centers (DAACs) by allowing users to search through metadata describing the (image) data. Criteria based on sensor name or type, date and time, and geographic location are used to search the archive. Graphical representations of coverage and browse images are available to further refine a user's selection. Previously, the EROS Data Center (EDC) DAAC had identified the Advanced Solid-state Array Spectrometer (ASAS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), NS-001, and Thermal Infrared Multispectral Scanner (TIMS) as precursor data sets similar to those the DAAC will handle in the Earth Observing System era. Currently, the EDC DAAC staff, in cooperation with NASA, has transcribed TIMS, NS-001, and Thematic Mapper Simulation (TMS) data from Ames Research Center, as well as TIMS data from Stennis Space Center. During the transcription process, the IMS metadata and browse images were created to populate the inventory at the EDC DAAC. These data sets are now available in the IMS and may be requested from any of the DAACs via the IMS.

  5. Quickly Creating Interactive Astronomy Illustrations

    ERIC Educational Resources Information Center

    Slater, Timothy F.

    2015-01-01

    An innate advantage for astronomy teachers is having numerous breathtaking images of the cosmos available to capture students' curiosity, imagination, and wonder. Internet-based astronomy image libraries are numerous and easy to navigate. The Astronomy Picture of the Day, the Hubble Space Telescope image archive, and the NASA Planetary…

  6. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  7. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  8. yourSky: Custom Sky-Image Mosaics via the Internet

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph

    2003-01-01

    yourSky (http://yourSky.jpl.nasa.gov) is a computer program that supplies custom astronomical image mosaics of sky regions specified by requesters using client computers connected to the Internet. [yourSky is an upgraded version of the software reported in Software for Generating Mosaics of Astronomical Images (NPO-21121), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 16a.] A requester no longer has to engage in the tedious process of determining what subset of images is needed, nor even to know how the images are indexed in image archives. Instead, in response to a requester's specification of the size and location of the sky area (and, optionally, of the desired data set and type, resolution, coordinate system, projection, and image format), yourSky automatically retrieves the component image data from archives totaling tens of terabytes stored on computer tape and disk drives at multiple sites, and assembles the component images into a mosaic image by use of a high-performance parallel code. yourSky runs on the server computer where the mosaics are assembled. Because yourSky includes a Web-interface component, no special client software is needed: ordinary Web browser software is sufficient.

  9. Toward public volume database management: a case study of NOVA, the National Online Volumetric Archive

    NASA Astrophysics Data System (ADS)

    Fletcher, Alex; Yoo, Terry S.

    2004-04-01

    Public databases today can be constructed with a wide variety of authoring and management structures. The widespread appeal of Internet search engines suggests that public information be made open and available to common search strategies, making accessible information that would otherwise be hidden by the infrastructure and software interfaces of a traditional database management system. We present the construction and organizational details for managing NOVA, the National Online Volumetric Archive. As an archival effort of the Visible Human Project for supporting medical visualization research, archiving 3D multimodal radiological teaching files, and enhancing medical education with volumetric data, our overall database structure is simplified; archives grow by accruing information, but seldom have to modify, delete, or overwrite stored records. NOVA is being constructed and populated so that it is transparent to the Internet; that is, much of its internal structure is mirrored in HTML allowing internet search engines to investigate, catalog, and link directly to the deep relational structure of the collection index. The key organizational concept for NOVA is the Image Content Group (ICG), an indexing strategy for cataloging incoming data as a set structure rather than by keyword management. These groups are managed through a series of XML files and authoring scripts. We cover the motivation for Image Content Groups, their overall construction, authorship, and management in XML, and the pilot results for creating public data repositories using this strategy.
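Since NOVA's Image Content Groups are managed through a series of XML files, an ICG index can be read with any standard XML parser. The element and attribute names below are invented for illustration — the paper does not publish the schema — but the sketch shows the set-structured cataloging idea: a group is a named set of volumes rather than a keyword list.

```python
import xml.etree.ElementTree as ET

# A hypothetical Image Content Group index in the spirit of NOVA's
# XML-managed catalog; every element and attribute name is invented.
ICG_XML = """
<icg id="head-mri-t1">
  <title>T1-weighted head MRI volumes</title>
  <member volume="vhp_male_head" modality="MRI"/>
  <member volume="case_0042" modality="MRI"/>
</icg>
"""

root = ET.fromstring(ICG_XML)
# The group catalogs its members as a set structure, not keywords:
members = [m.get("volume") for m in root.findall("member")]
print(root.get("id"), members)  # → head-mri-t1 ['vhp_male_head', 'case_0042']
```

Mirroring such an index into static HTML, as the abstract describes, is then a matter of templating each group into a page that search engines can crawl and link to.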

  10. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADT)/demographics, orders, appointment notifications, doctor updates, and results.
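A projected intake like the 5.0 TB per year quoted above translates into archive capacity by a short calculation. A hedged sketch — the growth rate and the RAID/redundancy overhead factor are assumed inputs, not figures from the paper:

```python
def archive_capacity_tb(tb_per_year, years, growth_rate=0.0, overhead=1.0):
    """Rough capacity planning for an image archive: sum the yearly
    intake (optionally growing by `growth_rate` each year), then
    multiply by a redundancy/RAID overhead factor.  All values in TB.
    """
    total = 0.0
    yearly = tb_per_year
    for _ in range(years):
        total += yearly
        yearly *= (1.0 + growth_rate)  # next year's intake
    return total * overhead

# 5.0 TB/year of new imaging data, flat intake over 5 years, with an
# assumed 1.25x overhead for RAID parity and spares:
print(archive_capacity_tb(5.0, 5, overhead=1.25))  # → 31.25
```

Swapping in a nonzero `growth_rate` models the exponential data growth that several of the abstracts in this listing describe.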

  11. Sample EPA Biotech Form

    EPA Pesticide Factsheets

    This sample “EPA Biotech Form” is a header sheet that will accompany all biotechnology submission choices, including MCANs, TERAs, Tier I and Tier II exemption, and biotechnology Test Market Exemption Applications (TMEAs).

  12. An Archive of Digital Images.

    ERIC Educational Resources Information Center

    Fantini, M.; And Others

    1990-01-01

    Describes the architecture of the prototype of an image management system that has been used to develop an application concerning images of frescoes in the Sistine Chapel in the Vatican. Hardware and software design are described, the use of local area networks (LANs) is discussed, and data organization is explained. (15 references) (LRW)

  13. Bridging the integration gap between imaging and information systems: a uniform data concept for content-based image retrieval in computer-aided diagnosis.

    PubMed

    Welter, Petra; Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno, Thomas M

    2011-01-01

    It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not been established in clinical practice yet. As a widely unattended gap of integration, a unified data concept for CBIR-based CAD results and reporting is lacking. Picture archiving and communication systems and the workflow of radiologists must be considered for successful data integration to be achieved. We suggest that CBIR systems applied to CAD should integrate their results in a picture archiving and communication systems environment such as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme is presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process.

  14. Bridging the integration gap between imaging and information systems: a uniform data concept for content-based image retrieval in computer-aided diagnosis

    PubMed Central

    Riesmeier, Jörg; Fischer, Benedikt; Grouls, Christoph; Kuhl, Christiane; Deserno (né Lehmann), Thomas M

    2011-01-01

    It is widely accepted that content-based image retrieval (CBIR) can be extremely useful for computer-aided diagnosis (CAD). However, CBIR has not been established in clinical practice yet. As a widely unattended gap of integration, a unified data concept for CBIR-based CAD results and reporting is lacking. Picture archiving and communication systems and the workflow of radiologists must be considered for successful data integration to be achieved. We suggest that CBIR systems applied to CAD should integrate their results in a picture archiving and communication systems environment such as Digital Imaging and Communications in Medicine (DICOM) structured reporting documents. A sample DICOM structured reporting template adaptable to CBIR and an appropriate integration scheme is presented. The proposed CBIR data concept may foster the promulgation of CBIR systems in clinical environments and, thereby, improve the diagnostic process. PMID:21672913

  15. Protein Crystal Growth

    NASA Technical Reports Server (NTRS)

    2003-01-01

    In order to rapidly and efficiently grow crystals, tools were needed to automatically identify and analyze the growing process of protein crystals. To meet this need, Diversified Scientific, Inc. (DSI), with the support of a Small Business Innovation Research (SBIR) contract from NASA's Marshall Space Flight Center, developed CrystalScore(trademark), the first automated image acquisition, analysis, and archiving system designed specifically for the macromolecular crystal-growing community. It offers automated hardware control, image and data archiving, image processing, a searchable database, and surface plotting of experimental data. CrystalScore is currently being used by numerous pharmaceutical companies and academic and nonprofit research centers. DSI, located in Birmingham, Alabama, was awarded the patent "Method for acquiring, storing, and analyzing crystal images" on March 4, 2003. Another DSI product made possible by Marshall SBIR funding is VaporPro(trademark), a unique, comprehensive system that allows for the automated control of vapor diffusion for crystallization experiments.

  16. Building the Pipeline for Hubble Legacy Archive Grism data

    NASA Astrophysics Data System (ADS)

    Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Lombardi, M.; Micol, A.; Rosa, M.; Stoehr, F.; Walsh, J. R.

    2008-10-01

    The Pipeline for Hubble Legacy Archive Grism data (PHLAG) is currently being developed as an end-to-end pipeline for the Hubble Legacy Archive (HLA). The inputs to PHLAG are slitless spectroscopic HST data with only the basic calibrations from standard HST pipelines applied; the outputs are fully calibrated, Virtual Observatory-compatible spectra, which will be made available through a static HLA archive. We give an overview of the various aspects of PHLAG. The pipeline consists of several subcomponents -- data preparation, data retrieval, image combination, object detection, spectral extraction using the aXe software, and quality control -- which are discussed in detail. As a pilot project, PHLAG is currently being applied to NICMOS G141 grism data. Examples of G141 spectra reduced with PHLAG are shown.
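The subcomponent chain named in the abstract (data preparation through quality control) is the classic staged-pipeline pattern: thread one data set through an ordered list of stage functions. A minimal sketch with placeholder stage bodies — the real PHLAG stages are, of course, far more involved, and the object/spectra counts below are invented:

```python
def run_pipeline(data, stages):
    """Thread `data` (a dict) through an ordered list of (name, fn)
    stages, logging each stage as it completes."""
    for name, stage in stages:
        data = stage(data)
        data.setdefault("log", []).append(name)
    return data


# Placeholder stages mirroring the subcomponents named in the abstract;
# the bodies are stand-ins, not real reductions.
stages = [
    ("data preparation", lambda d: d),
    ("data retrieval", lambda d: d),
    ("image combination", lambda d: d),
    ("object detection", lambda d: {**d, "objects": 3}),
    ("spectral extraction", lambda d: {**d, "spectra": d["objects"]}),
    ("quality control", lambda d: d),
]

result = run_pipeline({"dataset": "NICMOS G141"}, stages)
print(result["log"][-1], result["spectra"])  # → quality control 3
```

The per-stage log is what makes such a pipeline auditable: a failed product can be traced to the last stage that touched it.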

  17. The Moving Image in Education Research: Reassembling the Body in Classroom Video Data

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2016-01-01

    While audio recordings and observation might have dominated past decades of classroom research, video data is now the dominant form of data in the field. Ubiquitous videography is standard practice today in archiving the body of both the teacher and the student, and vast amounts of classroom and experiment clips are stored in online archives. Yet…

  18. Earth imaging and scientific observations by SSTI ``Clark'' a NASA technology demonstration spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1997-01-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of ``Clark,'' a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  19. Impact on dose and image quality of a software-based scatter correction in mammography.

    PubMed

    Monserrat, Teresa; Prieto, Elena; Barbés, Benigno; Pina, Luis; Elizalde, Arlette; Fernández, Belén

    2018-06-01

    Background In 2014, Siemens developed a new software-based scatter correction (Progressive Reconstruction Intelligently Minimizing Exposure [PRIME]), enabling grid-less digital mammography. Purpose To compare doses and image quality between PRIME (grid-less) and standard (with anti-scatter grid) modes. Material and Methods Contrast-to-noise ratio (CNR) was measured for various polymethylmethacrylate (PMMA) thicknesses, and the dose values reported by the mammography unit were recorded. CDMAM phantom images were acquired for various PMMA thicknesses and the inverse Image Quality Figure (IQF inv ) was calculated. Values of incident entrance surface air kerma (ESAK) and average glandular dose (AGD) were obtained from the DICOM header for a total of 1088 pairs of clinical cases. Two experienced radiologists subjectively compared the image quality of a total of 149 pairs of clinical cases. Results CNR values were higher and doses were lower in PRIME mode for all thicknesses. IQF inv values in PRIME mode were lower for all thicknesses except 40 mm of PMMA equivalent, for which IQF inv was slightly greater in PRIME mode. A mean reduction of 10% in ESAK and 12% in AGD was obtained in PRIME mode with respect to standard mode. The clinical image quality of PRIME and standard acquisitions was similar in most cases (84% for the first radiologist and 67% for the second). Conclusion The use of PRIME software reduces, on average, the radiation dose to the breast without affecting image quality. This reduction is greater for thinner and denser breasts.
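    The phantom comparison above hinges on the CNR figure of merit. As a minimal NumPy sketch of how such a value can be computed from two regions of interest (the ROI placement, synthetic data, and the exact CNR definition used here are assumptions, since QA protocols differ):

    ```python
    import numpy as np

    def cnr(image, roi_signal, roi_background):
        """Contrast-to-noise ratio from two rectangular ROIs.

        Each ROI is a (row_slice, col_slice) pair. This uses the common
        (mean_signal - mean_background) / std_background convention;
        mammography QA protocols may define CNR differently.
        """
        sig = image[roi_signal]
        bg = image[roi_background]
        return (sig.mean() - bg.mean()) / bg.std()

    # Synthetic example: background of ~100 with noise, a +20 contrast insert.
    rng = np.random.default_rng(0)
    img = rng.normal(100.0, 5.0, size=(64, 64))
    img[16:32, 16:32] += 20.0
    value = cnr(img, (np.s_[16:32], np.s_[16:32]), (np.s_[40:56], np.s_[40:56]))
    ```

    With a 20-unit contrast over 5-unit noise, the returned value is close to 4; a scatter-correction mode that lowers the noise floor raises this figure at the same dose.
    
    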

  20. The Digital Fish Library: Using MRI to Digitize, Database, and Document the Morphological Diversity of Fish

    PubMed Central

    Berquist, Rachel M.; Gledhill, Kristen M.; Peterson, Matthew W.; Doan, Allyson H.; Baxter, Gregory T.; Yopak, Kara E.; Kang, Ning; Walker, H. J.; Hastings, Philip A.; Frank, Lawrence R.

    2012-01-01

    Museum fish collections possess a wealth of anatomical and morphological data that are essential for documenting and understanding biodiversity. Obtaining access to specimens for research, however, is not always practical and frequently conflicts with the need to maintain the physical integrity of specimens and the collection as a whole. Non-invasive three-dimensional (3D) digital imaging therefore serves a critical role in facilitating the digitization of these specimens for anatomical and morphological analysis as well as facilitating an efficient method for online storage and sharing of this imaging data. Here we describe the development of the Digital Fish Library (DFL, http://www.digitalfishlibrary.org), an online digital archive of high-resolution, high-contrast, magnetic resonance imaging (MRI) scans of the soft tissue anatomy of an array of fishes preserved in the Marine Vertebrate Collection of Scripps Institution of Oceanography. We have imaged and uploaded MRI data for over 300 marine and freshwater species, developed a data archival and retrieval system with a web-based image analysis and visualization tool, and integrated these into the public DFL website to disseminate data and associated metadata freely over the web. We show that MRI is a rapid and powerful method for accurately depicting the in-situ soft-tissue anatomy of preserved fishes in sufficient detail for large-scale comparative digital morphology. However, these 3D volumetric data require a sophisticated computational and archival infrastructure in order to be broadly accessible to researchers and educators. PMID:22493695

  1. An efficient architecture to support digital pathology in standard medical imaging repositories.

    PubMed

    Marques Godinho, Tiago; Lebre, Rui; Silva, Luís Bastião; Costa, Carlos

    2017-07-01

    In the past decade, digital pathology and whole-slide imaging (WSI) have been gaining momentum with the proliferation of digital scanners from different manufacturers. The literature reports significant advantages associated with the adoption of digital images in pathology, namely, improvements in diagnostic accuracy and better support for telepathology. Moreover, it also offers new clinical and research applications. However, numerous barriers have been slowing the adoption of WSI, among which the most important are performance issues associated with storage and distribution of huge volumes of data, and lack of interoperability with other hospital information systems, most notably Picture Archive and Communications Systems (PACS) based on the DICOM standard. This article proposes an architecture of a Web Pathology PACS fully compliant with DICOM standard communications and data formats. The solution includes a PACS Archive responsible for storing whole-slide imaging data in DICOM WSI format and offers a communication interface based on the most recent DICOM Web services. The second component is a zero-footprint viewer that runs in any web-browser. It consumes data using the PACS archive standard web services. Moreover, it features a tiling engine especially suited to deal with the WSI image pyramids. These components were designed with special focus on efficiency and usability. The performance of our system was assessed through a comparative analysis of the state-of-the-art solutions. The results demonstrate that it is possible to have a very competitive solution based on standard workflows. Copyright © 2017 Elsevier Inc. All rights reserved.
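    The zero-footprint viewer described above needs a tiling engine to map a multi-resolution WSI pyramid onto fixed-size tiles. A sketch of that bookkeeping (the 256-pixel tile size, halving-per-level scheme, and slide dimensions are illustrative assumptions, not the article's parameters):

    ```python
    import math

    def pyramid_levels(width, height, tile=256, min_dim=256):
        """Return (level, width, height, tiles_x, tiles_y) for each level of a
        WSI pyramid, halving resolution per level until the image fits
        within min_dim in both dimensions."""
        level = 0
        levels = []
        while True:
            tx = math.ceil(width / tile)
            ty = math.ceil(height / tile)
            levels.append((level, width, height, tx, ty))
            if width <= min_dim and height <= min_dim:
                break
            width = max(1, width // 2)
            height = max(1, height // 2)
            level += 1
        return levels

    # A hypothetical 80,000 x 60,000 pixel slide:
    levels = pyramid_levels(80_000, 60_000)
    ```

    At full resolution this example slide decomposes into 313 x 235 tiles, while the coarsest level is a single tile; a viewer only ever fetches the handful of tiles intersecting the current viewport, which is what keeps gigapixel slides responsive over DICOM Web services.
    
    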

  2. The Digital Fish Library: using MRI to digitize, database, and document the morphological diversity of fish.

    PubMed

    Berquist, Rachel M; Gledhill, Kristen M; Peterson, Matthew W; Doan, Allyson H; Baxter, Gregory T; Yopak, Kara E; Kang, Ning; Walker, H J; Hastings, Philip A; Frank, Lawrence R

    2012-01-01

    Museum fish collections possess a wealth of anatomical and morphological data that are essential for documenting and understanding biodiversity. Obtaining access to specimens for research, however, is not always practical and frequently conflicts with the need to maintain the physical integrity of specimens and the collection as a whole. Non-invasive three-dimensional (3D) digital imaging therefore serves a critical role in facilitating the digitization of these specimens for anatomical and morphological analysis as well as facilitating an efficient method for online storage and sharing of this imaging data. Here we describe the development of the Digital Fish Library (DFL, http://www.digitalfishlibrary.org), an online digital archive of high-resolution, high-contrast, magnetic resonance imaging (MRI) scans of the soft tissue anatomy of an array of fishes preserved in the Marine Vertebrate Collection of Scripps Institution of Oceanography. We have imaged and uploaded MRI data for over 300 marine and freshwater species, developed a data archival and retrieval system with a web-based image analysis and visualization tool, and integrated these into the public DFL website to disseminate data and associated metadata freely over the web. We show that MRI is a rapid and powerful method for accurately depicting the in-situ soft-tissue anatomy of preserved fishes in sufficient detail for large-scale comparative digital morphology. However, these 3D volumetric data require a sophisticated computational and archival infrastructure in order to be broadly accessible to researchers and educators.

  3. 47 CFR 11.33 - EAS Decoder.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... time periods expire. (4) Display and logging. A visual message shall be developed from any valid header... input. (8) Decoder Programming. Access to decoder programming shall be protected by a lock or other...

  4. XDS-I Gateway Development for HIE Connectivity with Legacy PACS at Gil Hospital.

    PubMed

    Simalango, Mikael Fernandus; Kim, Youngchul; Seo, Young Tae; Choi, Young Hwan; Cho, Yong Kyun

    2013-12-01

    The ability to support healthcare document sharing is imperative in a health information exchange (HIE). Sharing imaging documents or images, however, can be challenging, especially when they are stored in a picture archiving and communication system (PACS) archive that does not support document sharing via standard HIE protocols. This research proposes a standard-compliant imaging gateway that enables connectivity between a legacy PACS and the entire HIE. Investigation of the PACS solutions used at Gil Hospital was conducted. An imaging gateway application was then developed using a Java technology stack. Imaging document sharing capability enabled by the gateway was tested by integrating it into Gil Hospital's order communication system and its HIE infrastructure. The gateway can acquire radiology images from a PACS storage system, provide and register the images to Gil Hospital's HIE for document sharing purposes, and make the images retrievable by a cross-enterprise document sharing document viewer. Development of an imaging gateway that mediates communication between a PACS and an HIE can be considered a viable option when the PACS does not support the standard protocol for cross-enterprise document sharing for imaging. Furthermore, the availability of common HIE standards expedites the development and integration of the imaging gateway with an HIE.

  5. XDS-I Gateway Development for HIE Connectivity with Legacy PACS at Gil Hospital

    PubMed Central

    Simalango, Mikael Fernandus; Kim, Youngchul; Seo, Young Tae; Cho, Yong Kyun

    2013-01-01

    Objectives The ability to support healthcare document sharing is imperative in a health information exchange (HIE). Sharing imaging documents or images, however, can be challenging, especially when they are stored in a picture archiving and communication system (PACS) archive that does not support document sharing via standard HIE protocols. This research proposes a standard-compliant imaging gateway that enables connectivity between a legacy PACS and the entire HIE. Methods Investigation of the PACS solutions used at Gil Hospital was conducted. An imaging gateway application was then developed using a Java technology stack. Imaging document sharing capability enabled by the gateway was tested by integrating it into Gil Hospital's order communication system and its HIE infrastructure. Results The gateway can acquire radiology images from a PACS storage system, provide and register the images to Gil Hospital's HIE for document sharing purposes, and make the images retrievable by a cross-enterprise document sharing document viewer. Conclusions Development of an imaging gateway that mediates communication between a PACS and an HIE can be considered a viable option when the PACS does not support the standard protocol for cross-enterprise document sharing for imaging. Furthermore, the availability of common HIE standards expedites the development and integration of the imaging gateway with an HIE. PMID:24523994

  6. The impact of image storage organization on the effectiveness of PACS.

    PubMed

    Hindel, R

    1990-11-01

    Picture archiving communication system (PACS) requires efficient handling of large amounts of data. Mass storage systems are cost effective but slow, while very fast systems, like frame buffers and parallel transfer disks, are expensive. The image traffic can be divided into inbound traffic generated by diagnostic modalities and outbound traffic into workstations. At the contact points with medical professionals, the responses must be fast. Archiving, on the other hand, can employ slower but less expensive storage systems, provided that the primary activities are not impeded. This article illustrates a segmentation architecture meeting these requirements based on a clearly defined PACS concept.

  7. Storage media for computers in radiology

    PubMed Central

    Dandu, Ravi Varma

    2008-01-01

    The introduction and wide acceptance of digital technology in medical imaging has resulted in an exponential increase in the amount of data produced by the radiology department. There is an insatiable need for storage space to archive this ever-growing volume of image data. Healthcare facilities should plan the type and size of the storage media they need, based not just on the volume of data but also on considerations such as the speed and ease of access, redundancy, security, costs, as well as the longevity of the archival technology. This article reviews the various digital storage media and compares their merits and demerits. PMID:19774182

  8. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Chéreau, F.

    2008-08-01

    VirGO is the next generation Visual Browser for the ESO Science Archive Facility developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium adding capabilities for browsing professional astronomical data. VirGO gives astronomers the possibility to easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a `real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards which allow access to images and spectra from external data centers, and interaction with the ESO SAF web interface or any other VO applications supporting the PLASTIC messaging system. The main website for VirGO is at http://archive.eso.org/cms/virgo.

  9. A PDS Archive for Observations of Mercury's Na Exosphere

    NASA Astrophysics Data System (ADS)

    Backes, C.; Cassidy, T.; Merkel, A. W.; Killen, R. M.; Potter, A. E.

    2016-12-01

    We present a data product consisting of ground-based observations of Mercury's sodium exosphere. We have amassed a sizeable dataset of several thousand spectral observations of Mercury's exosphere from the McMath-Pierce solar telescope. Over the last year, a data reduction pipeline has been developed and refined to process and reconstruct these spectral images into low resolution images of sodium D2 emission. This dataset, which extends over two decades, will provide an unprecedented opportunity to analyze the dynamics of Mercury's mid to high-latitude exospheric emissions, which have long been attributed to solar wind ion bombardment. This large archive of observations will be of great use to the Mercury science community in studying the effects of space weather on Mercury's tenuous exosphere. When completely processed, images in this dataset will show the observed spatial distribution of Na D2 in the Mercurian exosphere, have measurements of this sodium emission per pixel in units of kilorayleighs, and be available through NASA's Planetary Data System. The overall goal of the presentation will be to provide the Planetary Science community with a clear picture of what information and data this archival product will make available.

  10. PandASoft: Open Source Instructional Laboratory Administration Software

    NASA Astrophysics Data System (ADS)

    Gay, P. L.; Braasch, P.; Synkova, Y. N.

    2004-12-01

    PandASoft (Physics and Astronomy Software) is software for organizing and archiving a department's teaching resources and materials. An easy to use, secure interface allows faculty and staff to explore equipment inventories, see what laboratory experiments are available, find handouts, and track what has been used in different classes in the past. Divided into five sections: classes, equipment, laboratories, links, and media, its database cross links materials, allowing users to see what labs are used with which classes, what media and equipment are used with which labs, or simply what equipment is lurking in which room. Written in PHP and MySQL, this software can be installed on any UNIX / Linux platform, including Macintosh OS X. It is designed to allow users to easily customize the headers, footers and colors to blend with existing sites - no programming experience required. While initial data input is labor intensive, the system will save time later by allowing users to quickly answer questions related to what is in inventory, where it is located, how many are in stock, and where online they can learn more. It will also provide a central location for storing PDFs of handouts, and links to applets and cool sites at other universities. PandASoft comes with over 100 links to online resources pre-installed. We would like to thank Dr. Wolfgang Rueckner and the Harvard University Science Center for providing computers and resources for this project.

  11. Canine and feline fundus photography and videography using a nonpatented 3D printed lens adapter for a smartphone.

    PubMed

    Espinheira Gomes, Filipe; Ledbetter, Eric

    2018-05-11

    To describe an indirect funduscopy imaging technique for dogs and cats using low cost and widely available equipment: a smartphone, a three-dimensional (3D) printed indirect lens adapter, and a 40 diopters (D) indirect ophthalmoscopy lens. Fundus videography was performed in dogs and cats using a 40D indirect ophthalmoscopy lens and a smartphone fitted with a 3D printed indirect lens adapter. All animals were pharmacologically dilated with topical tropicamide 1% solution. Eyelid opening and video recording were performed using standard binocular indirect ophthalmoscopy technique. All videos were uploaded to a computer, and still images were selected and acquired for archiving purposes. Fundic images were manipulated to represent the true anatomy of the fundus. It was possible to promptly obtain good quality images from normal and diseased retinas using the nonpatented 3D printed, lens adapter for a smartphone. Fundic imaging using a smartphone can be performed with minimal investment. This simple imaging modality can be used by veterinary ophthalmologists and general practitioners to acquire, archive, and share images of the retina. The quality of images obtained will likely improve with developments in smartphone camera software and hardware. © 2018 American College of Veterinary Ophthalmologists.

  12. VizieR Online Data Catalog: Astrometric monitoring of ultracool dwarf binaries (Dupuy+, 2017)

    NASA Astrophysics Data System (ADS)

    Dupuy, T. J.; Liu, M. C.

    2017-09-01

    In Table 1 we list all 33 binaries in our Keck+CFHT astrometric monitoring sample, along with three other binaries that have published orbit and parallax measurements. We began obtaining resolved Keck AO astrometry in 2007-2008, and we combined our new astrometry with available data in the literature or public archives (e.g., HST and Gemini) to refine our orbital period estimates and thereby our prioritization for Keck observations. We present here new Keck/NIRC2 AO imaging and non-redundant aperture-masking observations, in addition to a re-analysis of our own previously published data and publicly available archival data for our sample binaries. Table 2 gives our measured astrometry and flux ratios for all Keck AO data used in our orbital analysis spanning 2003 Apr 15 to 2016 May 13. In total there are 339 distinct measurements (unique bandpass and epoch for a given target), where 302 of these are direct imaging and 37 are non-redundant aperture masking. Eight of the imaging measurements are from six unpublished archival data sets. See section 3.1.1 for further details. In addition to our Keck AO monitoring, we also obtained data for three T dwarf binaries over a three-year HST program using the Advanced Camera for Surveys (ACS) Wide Field Camera (WFC) in the F814W bandpass. See section 3.1.2 for further details. Many of our sample binaries have HST imaging data in the public archive. We have re-analyzed the available archival data coming from the WFPC2 Planetary Camera (WFPC2-PC1), ACS High Resolution Channel (ACS-HRC), and NICMOS Camera 1 (NICMOS-NIC1). See section 3.1.3 for further details. We present here an updated analysis of our data from the Hawaii Infrared Parallax Program that uses the CFHT facility infrared camera WIRCam. Our observing strategy and custom astrometry pipeline are described in detail in Dupuy & Liu (2012, J/ApJS/201/19). See section 3.2 for further explanations. (10 data files).

  13. United States European Command

    Science.gov Websites

    Content on the U.S. European Command website may be translated by selecting a different language on the header. Except where otherwise noted, the language translation is performed by Google Translate, a third-party service.

  14. SU-E-J-237: Image Feature Based DRR and Portal Image Registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, X; Chang, J

    Purpose: Two-dimensional (2D) matching of the kV X-ray and digitally reconstructed radiography (DRR) images is an important setup technique for image-guided radiotherapy (IGRT). In our clinics, mutual-information-based methods are used for this purpose on commercial linear accelerators, but manual corrections are often needed. This work demonstrated the feasibility of using a feature-based image transform to register kV and DRR images. Methods: The scale invariant feature transform (SIFT) method was implemented to detect matching image details (or key points) between the kV and DRR images. These key points represent high image intensity gradients and thus scale-invariant features. Due to the poor contrast of our kV images, direct application of the SIFT method yielded many detection errors. To assist the finding of key points, the center coordinates of the kV and DRR images were read from the DICOM header, and the two groups of key points with similar relative positions to their corresponding centers were paired up. Using these points, a rigid transform (with scaling, horizontal and vertical shifts) was estimated. We also artificially introduced vertical and horizontal shifts to test the accuracy of our registration method on anterior-posterior (AP) and lateral pelvic images. Results: The results provided a satisfactory overlay of the transformed kV image onto the DRR image. The introduced vs. detected shifts were fit with a linear regression. In the AP image experiments, linear regression analysis showed slopes of 1.15 and 0.98 with R2 values of 0.89 and 0.99 for the horizontal and vertical shifts, respectively. The corresponding results were 1.2 and 1.3 with R2 of 0.72 and 0.82 for the lateral image shifts. Conclusion: This work provided an alternative technique for kV-to-DRR alignment. Further improvements in estimation accuracy and image contrast tolerance are underway.
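    Once key points are paired, the scaling-plus-shifts transform the abstract describes can be recovered by linear least squares. A sketch of that estimation step only (the no-rotation model and the synthetic matches are assumptions based on the abstract's description; SIFT detection itself, e.g. via OpenCV, is omitted):

    ```python
    import numpy as np

    def fit_scale_shift(src, dst):
        """Least-squares fit of dst ~= s * src + (tx, ty) from paired 2D points.

        src, dst: (N, 2) arrays of matched key-point coordinates.
        Returns (s, tx, ty). The model has an isotropic scale and two
        translations but no rotation, mirroring the transform described
        in the abstract.
        """
        src = np.asarray(src, dtype=float)
        dst = np.asarray(dst, dtype=float)
        n = len(src)
        # Design matrix rows: [x, 1, 0] -> x' and [y, 0, 1] -> y'
        A = np.zeros((2 * n, 3))
        b = np.empty(2 * n)
        A[0::2, 0] = src[:, 0]; A[0::2, 1] = 1.0
        A[1::2, 0] = src[:, 1]; A[1::2, 2] = 1.0
        b[0::2] = dst[:, 0]
        b[1::2] = dst[:, 1]
        s, tx, ty = np.linalg.lstsq(A, b, rcond=None)[0]
        return s, tx, ty

    # Recover a known transform from noiseless synthetic matches.
    pts = np.array([[10, 20], [200, 40], [50, 180], [120, 120]], dtype=float)
    moved = 1.05 * pts + np.array([3.0, -7.0])
    s, tx, ty = fit_scale_shift(pts, moved)
    ```

    With real detections, outlier rejection (e.g. the center-relative pairing described above, or RANSAC) would precede this fit, since a few mismatched key points can dominate the least-squares solution.
    
    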

  15. The Washington University Central Neuroimaging Data Archive

    PubMed Central

    Gurney, Jenny; Olsen, Timothy; Flavin, John; Ramaratnam, Mohana; Archie, Kevin; Ransford, James; Herrick, Rick; Wallace, Lauren; Cline, Jeanette; Horton, Will; Marcus, Daniel S

    2016-01-01

    Since the early 2000s, much of the neuroimaging work at Washington University (WU) has been facilitated by the Central Neuroimaging Data Archive (CNDA), an XNAT-based imaging informatics system. The CNDA is uniquely related to XNAT, as it served as the original codebase for the XNAT open source platform. The CNDA hosts data acquired in over 1000 research studies, encompassing 36,000 subjects and more than 60,000 imaging sessions. Most imaging modalities used in modern human research are represented in the CNDA, including magnetic resonance (MR), positron emission tomography (PET), computed tomography (CT), nuclear medicine (NM), computed radiography (CR), digital radiography (DX), and ultrasound (US). However, the majority of the imaging data in the CNDA are MR and PET of the human brain. Currently, about 20% of the total imaging data in the CNDA is available by request to external researchers. CNDA's available data includes large sets of imaging sessions and in some cases clinical, psychometric, tissue, or genetic data acquired in the study of Alzheimer's disease, brain metabolism, cancer, HIV, sickle cell anemia, and Tourette syndrome. PMID:26439514

  16. Shape optimized headers and methods of manufacture thereof

    DOEpatents

    Perrin, Ian James

    2013-11-05

    Disclosed herein is a shape optimized header comprising a shell that is operative for collecting a fluid; wherein an internal diameter and/or a wall thickness of the shell vary with a change in pressure and/or a change in a fluid flow rate in the shell; and tubes; wherein the tubes are in communication with the shell and are operative to transfer fluid into the shell. Disclosed herein is a method comprising fixedly attaching tubes to a shell; wherein the shell is operative for collecting a fluid; wherein an internal diameter and/or a wall thickness of the shell vary with a change in pressure and/or a change in a fluid flow rate in the shell; and wherein the tubes are in communication with the shell and are operative to transfer fluid into the shell.

  17. XAFS Data Interchange: A single spectrum XAFS data file format.

    PubMed

    Ravel, B; Newville, M

    We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.
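    The email-like header plus columns-of-numbers layout makes XDI-style files readable with a few lines of code in any language. The following is an illustrative Python reader, not an implementation of the official XDI grammar; the sample content, metadata names, and header-terminator line are simplified assumptions:

    ```python
    def parse_header_and_table(text):
        """Parse an XDI-style file: '#'-prefixed 'Family.member: value'
        header lines followed by whitespace-separated numeric columns.

        Returns (metadata_dict, list_of_columns). Comment lines without a
        colon (version line, separators) are skipped.
        """
        meta, rows = {}, []
        for line in text.splitlines():
            line = line.strip()
            if not line:
                continue
            if line.startswith('#'):
                body = line.lstrip('#').strip()
                if ':' in body:
                    name, _, value = body.partition(':')
                    meta[name.strip()] = value.strip()
            else:
                rows.append([float(tok) for tok in line.split()])
        # Transpose rows into columns, matching the columns-of-numbers layout.
        return meta, [list(col) for col in zip(*rows)]

    sample = """# XDI/1.0
    # Element.symbol: Fe
    # Element.edge: K
    # -------------
    7100.0  0.12
    7110.0  0.98
    7120.0  1.41
    """
    meta, cols = parse_header_and_table(sample)
    ```

    The metadata lands in an associative array (here `meta["Element.symbol"]` is `"Fe"`) and the data table in per-column lists, which is exactly the property that lets spreadsheets and plotting tools import such files unmodified.
    
    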

  18. XAFS Data Interchange: A single spectrum XAFS data file format

    NASA Astrophysics Data System (ADS)

    Ravel, B.; Newville, M.

    2016-05-01

    We propose a standard data format for the interchange of XAFS data. The XAFS Data Interchange (XDI) standard is meant to encapsulate a single spectrum of XAFS along with relevant metadata. XDI is a text-based format with a simple syntax which clearly delineates metadata from the data table in a way that is easily interpreted both by a computer and by a human. The metadata header is inspired by the format of an electronic mail header, representing metadata names and values as an associative array. The data table is represented as columns of numbers. This format can be imported as is into most existing XAFS data analysis, spreadsheet, or data visualization programs. Along with a specification and a dictionary of metadata types, we provide an application-programming interface written in C and bindings for programming dynamic languages.

  19. Monte Carlo Uncertainty Quantification for an Unattended Enrichment Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarman, Kenneth D.; Smith, Leon E.; Wittman, Richard S.

    As a case study for uncertainty analysis, we consider a model flow monitor for measuring enrichment in gas centrifuge enrichment plants (GCEPs) that could provide continuous monitoring of all declared gas flow and provide high-accuracy gas enrichment estimates as a function of time. The monitor system could include NaI(Tl) gamma-ray spectrometers, a pressure signal-sharing device to be installed on an operator's pressure gauge or a dedicated inspector pressure sensor, and temperature sensors attached to the outside of the header pipe, to provide pressure, temperature, and gamma-ray spectra measurements of UF6 gas flow through unit header pipes. Our study builds on previous modeling and analysis methods development for enrichment monitor concepts and a software tool that was developed at Oak Ridge National Laboratory to generate and analyze synthetic data.

  20. Network acceleration techniques

    NASA Technical Reports Server (NTRS)

    Crowley, Patricia (Inventor); Maccabe, Arthur Barney (Inventor); Awrach, James Michael (Inventor)

    2012-01-01

    Splintered offloading techniques with receive batch processing are described for network acceleration. Such techniques offload specific functionality to a NIC while maintaining the bulk of the protocol processing in the host operating system ("OS"). The resulting protocol implementation allows the application to bypass the protocol processing of the received data. This can be accomplished by moving data from the NIC directly to the application through direct memory access ("DMA") and batch processing the receive headers in the host OS when the host OS is interrupted to perform other work. Batch processing receive headers allows the data path to be separated from the control path. Unlike operating system bypass, however, the operating system still fully manages the network resource and has relevant feedback about traffic and flows. Embodiments of the present disclosure can therefore address the challenges of networks with extreme bandwidth delay products (BWDP).

  1. Zero-Copy Objects System

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott C.

    2011-01-01

    Zero-Copy Objects System software enables application data to be encapsulated in layers of communication protocol without being copied. Indirect referencing enables application source data, either in memory or in a file, to be encapsulated in place within an unlimited number of protocol headers and/or trailers. Zero-copy objects (ZCOs) are abstract data access representations designed to minimize I/O (input/output) in the encapsulation of application source data within one or more layers of communication protocol structure. They are constructed within the heap space of a Simple Data Recorder (SDR) data store to which all participating layers of the stack must have access. Each ZCO contains general information enabling access to the core source data object (an item of application data), together with (a) a linked list of zero or more specific extents that reference portions of this source data object, and (b) linked lists of protocol header and trailer capsules. The concatenation of the headers (in ascending stack sequence), the source data object extents, and the trailers (in descending stack sequence) constitutes the transmitted data object constructed from the ZCO. This scheme enables a source data object to be encapsulated in a succession of protocol layers without ever having to be copied from a buffer at one layer of the protocol stack to an encapsulating buffer at a lower layer of the stack. For large source data objects, the savings in copy time and reduction in memory consumption may be considerable.
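    The header/extent/trailer layout described above can be modeled compactly. This is a toy sketch, not the flight software's API (all names, the layer labels, and the assumption that each layer wraps the ZCO from the top of the stack downward are illustrative); it shows headers and trailers accumulating per layer while the source data is referenced by extents and only concatenated at transmission time:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ZeroCopyObject:
        """Toy model of a ZCO: source data referenced by (offset, length)
        extents, plus per-layer header and trailer capsule lists."""
        source: bytes
        extents: list = field(default_factory=list)   # (offset, length) pairs
        headers: list = field(default_factory=list)   # one entry per layer
        trailers: list = field(default_factory=list)

        def encapsulate(self, header, trailer=b""):
            # Each protocol layer adds capsules in place; the payload bytes
            # are never copied during encapsulation.
            self.headers.append(header)
            self.trailers.append(trailer)

        def assemble(self):
            # Only when building the wire image are bytes concatenated:
            # outermost (lowest-layer) header first, then the referenced
            # extents of the source data, then trailers ending with the
            # lowest layer's.
            body = b"".join(self.source[o:o + n] for o, n in self.extents)
            return b"".join(reversed(self.headers)) + body + b"".join(self.trailers)

    # Hypothetical two-layer encapsulation of a slice of application data.
    zco = ZeroCopyObject(source=b"...application data...", extents=[(3, 16)])
    zco.encapsulate(b"[BP]")             # upper protocol layer (illustrative)
    zco.encapsulate(b"[LTP]", b"[crc]")  # lower layer with a trailer (illustrative)
    wire = zco.assemble()
    ```

    The assembled result is `[LTP][BP]application data[crc]`: both layers wrapped the same uncopied extent, and the single concatenation happens once, at transmission.
    
    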

  2. Distributed file management for remote clinical image-viewing stations

    NASA Astrophysics Data System (ADS)

    Ligier, Yves; Ratib, Osman M.; Girard, Christian; Logean, Marianne; Trayser, Gerhard

    1996-05-01

    The Geneva PACS is based on a distributed architecture, with different archive servers used to store all the image files produced by digital imaging modalities. Images can then be visualized on different display stations with the Osiris software. Image visualization requires the image file to be physically present on the local station. Thus, images must be transferred from archive servers to local display stations in an acceptable way, meaning fast and user friendly, with the notion of a file hidden from users. The transfer of image files is done according to different schemes, including prefetching and direct image selection. Prefetching allows the retrieval of previous studies of a patient in advance. Direct image selection is also provided in order to retrieve images on request. When images are transferred locally to the display station, they are stored in Papyrus files, each file containing a set of images. File names are used by the Osiris viewing software to open image sequences. But file names alone are not explicit enough to properly describe the content of the file. A specific utility has been developed to present a list of patients, and for each patient a list of exams which can be selected and automatically displayed. The system has been successfully tested in different clinical environments. It will soon be extended hospital-wide.
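
    The two retrieval schemes described above can be sketched with a toy cache model (hypothetical API and study names, not the Geneva PACS code):

```python
class PrefetchingStation:
    """Toy model of the two retrieval schemes: prefetching pulls a
    patient's previous studies in advance, while direct selection
    fetches a study on request if it is not already local."""

    def __init__(self, archive):
        self.archive = archive    # patient_id -> list of study names
        self.local_cache = set()  # studies present on the display station

    def prefetch(self, patient_id):
        # Retrieve every previous study for this patient before it is needed.
        fetched = [s for s in self.archive.get(patient_id, [])
                   if s not in self.local_cache]
        self.local_cache.update(fetched)
        return fetched

    def open_study(self, study):
        if study not in self.local_cache:
            self.local_cache.add(study)   # direct selection on request
            return "fetched on demand"
        return "served from cache"

station = PrefetchingStation({"P001": ["CT-1994", "CT-1995"]})
pre = station.prefetch("P001")       # prior studies retrieved in advance
hit = station.open_study("CT-1994")  # already local thanks to prefetching
miss = station.open_study("MR-1996")
```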

  3. The data facility of the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

    NASA Technical Reports Server (NTRS)

    Nielsen, Pia J.; Green, Robert O.; Murray, Alex T.; Eng, Bjorn T.; Novack, H. Ian; Solis, Manuel; Olah, Martin

    1993-01-01

    AVIRIS operations at the Jet Propulsion Laboratory include a significant data task. The AVIRIS data facility is responsible for data archiving, data calibration, quality monitoring and distribution. Since 1987, the data facility has archived over one terabyte of AVIRIS data and distributed these data to science investigators as requested. In this paper we describe recent improvements in the AVIRIS data facility.

  4. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    NASA Astrophysics Data System (ADS)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to the Lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows, and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.

  5. Dealing with extreme data diversity: extraction and fusion from the growing types of document formats

    NASA Astrophysics Data System (ADS)

    David, Peter; Hansen, Nichole; Nolan, James J.; Alcocer, Pedro

    2015-05-01

    The growth in text data available online is accompanied by a growth in the diversity of available documents. Corpora with extreme heterogeneity in terms of file formats, document organization, page layout, text style, and content are common. The absence of meaningful metadata describing the structure of online and open-source data leads to text extraction results that contain no information about document structure and are cluttered with page headers and footers, web navigation controls, advertisements, and other items that are typically considered noise. We describe an approach to document structure and metadata recovery that uses visual analysis of documents to infer the communicative intent of the author. Our algorithm identifies the components of documents such as titles, headings, and body content, based on their appearance. Because it operates on an image of a document, our technique can be applied to any type of document, including scanned images. Our approach to document structure recovery considers a finer-grained set of component types than prior approaches. In this initial work, we show that a machine learning approach to document structure recovery using a feature set based on the geometry and appearance of images of documents achieves a 60% greater F1-score than a baseline random classifier.

  6. HST Archival Imaging of the Light Echoes of SN 1987A

    NASA Astrophysics Data System (ADS)

    Lawrence, S. S.; Hayon, M.; Sugerman, B. E. K.; Crotts, A. P. S.

    2002-12-01

    We have undertaken a search for light echo signals from Supernova 1987A that have been serendipitously recorded in images taken near the 30 Doradus region of the Large Magellanic Cloud by HST. We used the MAST interface to create a database of the 1282 WF/PC, WFPC2 and STIS images taken within 15 arcminutes of the supernova, between 1992 April and 2002 June. These 1282 images are grouped into 125 distinct epochs and pointings, with each epoch containing between 1 and 42 separate exposures. Sorting this database with various programs, aided by the STScI Visual Target Tuner, we have identified 63 pairs of WFPC2 imaging epochs that are not centered on the supernova but that have a significant amount of spatial overlap between their fields of view. These image data were downloaded from the public archive, cleaned of cosmic rays, and blinked to search for light echoes at radii larger than 2 arcminutes from the supernova. Our search to date has focused on those pairs of epochs with the largest degree of overlap. Of 16 pairs of epochs scanned to date, we have detected 3 strong light echoes and one faint, tentative echo signal. We will present direct and difference images of these and any further echoes, as well as the 3-D geometric, photometric and color properties of the echoing dust structures. In addition, a set of 20 epochs of WF/PC and WFPC2 imaging centered on SN 1987A remain to be searched for echoes within 2 arcminutes of the supernova. We will discuss our plans to integrate the high spatial-resolution HST snapshots of the echoes with our extensive, well-time-sampled, ground-based imaging data. We gratefully acknowledge the support of this undergraduate research project through an HST Archival Research Grant (HST-AR-09209.01-A).

  7. The Next Landsat Satellite: The Landsat Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Irons, James R.; Dwyer, John L.; Barsi, Julia A.

    2012-01-01

    The Landsat program is one of the longest running satellite programs for Earth observations from space. The program was initiated by the launch of Landsat 1 in 1972. Since then a series of six more Landsat satellites were launched and at least one of those satellites has been in operations at all times to continuously collect images of the global land surface. The Department of Interior (DOI) U.S. Geological Survey (USGS) preserves data collected by all of the Landsat satellites at their Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. This 40-year data archive provides an unmatched record of the Earth's land surface that has undergone dramatic changes in recent decades due to the increasing pressure of a growing population and advancing technologies. EROS provides the ability for anyone to search the archive and order digital Landsat images over the internet for free. The Landsat data are a public resource for observing, characterizing, monitoring, trending, and predicting land use change over time providing an invaluable tool for those addressing the profound consequences of those changes to society. The most recent launch of a Landsat satellite occurred in 1999 when Landsat 7 was placed in orbit. While Landsat 7 remains in operation, the National Aeronautics and Space Administration (NASA) and the DOI/ USGS are building its successor satellite system currently called the Landsat Data Continuity Mission (LDCM). NASA has the lead for building and launching the satellite that will carry two Earth-viewing instruments, the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI will take images that measure the amount of sunlight reflected by the land surface at nine wavelengths of light with three of those wavelengths beyond the range of human vision. 
TIRS will collect coincident images that measure light emitted by the land surface as a function of surface temperature at two longer wavelengths well beyond the range of human vision. The DOI/USGS is developing the ground system that will command and control the LDCM satellite in orbit and manage the OLI and TIRS data transmitted by the satellite. DOI/USGS will thus operate the satellite and collect, archive, and distribute the image data as part of the EROS archive. DOI/USGS has committed to renaming LDCM as Landsat 8 following launch. By either name, the satellite and its sensors will extend the 40-year archive with images sufficiently consistent with data from earlier Landsat satellites to allow multi-decadal, broad-area studies of our dynamic landscapes. The next Landsat satellite and ground system are on schedule for a January 2013 launch.

  8. VizieR Online Data Catalog: Star formation in active and normal galaxies (Tsai+, 2015)

    NASA Astrophysics Data System (ADS)

    Tsai, M.; Hwang, C.-Y.

    2015-11-01

    We selected 104 active galaxies from the lists of Melendez et al. (2010MNRAS.406..493M), Condon et al. 1991 (cat. J/ApJ/378/65), and Ho & Ulvestad 2001 (cat. J/ApJS/133/77). Most of the sources are identified as Active Galactic Nuclei (AGNs), and a few of them are classified as Luminous InfraRed Galaxies (LIRGs). We obtained 3.6 and 8μm infrared images of these galaxies from the Spitzer Archive (http://sha.ipac.caltech.edu/applications/Spitzer/SHA/) and 8GHz images from the VLA archive (http://archive.nrao.edu/archive/archiveimage.html). We also selected a nearby AGN sub-sample containing 21 radio-selected AGNs for further spatial analysis. We selected 25 nearby AGNs exhibiting no detected radio emission in order to compare with the results of the radio-selected sources. For comparison, we also selected normal galaxies with distances less than 15Mpc from the catalog of Tully 1994 (see cat. VII/145). We only selected the galaxies that have Spitzer archive data and are not identified as AGNs in either the Veron-Cetty & Veron 2006 (see cat. VII/258) AGN catalog or in the NED database (http://ned.ipac.caltech.edu/). Our results for the radio-selected and the non-radio-selected active galaxies are listed in Table1, and those for the normal galaxies are listed in Table2. (2 data files).

  9. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  10. The new European Hubble archive

    NASA Astrophysics Data System (ADS)

    De Marchi, Guido; Arevalo, Maria; Merin, Bruno

    2016-01-01

    The European Hubble Archive (hereafter eHST), hosted at ESA's European Space Astronomy Centre, was released for public use in October 2015. The eHST is now fully integrated with the other ESA science archives to ensure long-term preservation of the Hubble data, consisting of more than 1 million observations from 10 different scientific instruments. The public HST data, the Hubble Legacy Archive, and the high-level science data products are now all available to scientists through a single, carefully designed and user friendly web interface. In this talk, I will show how the eHST can help boost archival research, including how to search on sources in the field of view thanks to precise footprints projected onto the sky, how to obtain enhanced previews of imaging data and interactive spectral plots, and how to directly link observations with already published papers. To maximise the scientific exploitation of Hubble's data, the eHST offers connectivity to virtual observatory tools, easily integrates with the recently released Hubble Source Catalog, and is fully accessible through ESA's archives multi-mission interface.

  11. Databases and archiving for cryoEM

    PubMed Central

    Patwardhan, Ardan; Lawson, Catherine L.

    2017-01-01

    Cryo-EM in structural biology is currently served by three public archives – EMDB for 3DEM reconstructions, PDB for models built from 3DEM reconstructions and EMPIAR for the raw 2D image data used to obtain the 3DEM reconstructions. These archives play a vital role for both the structural community and the wider biological community in making the data accessible so that results may be reused, reassessed and integrated with other structural and bioinformatics resources. The important role of the archives is underpinned by the fact that many journals mandate the deposition of data to PDB and EMDB on publication. The field is currently undergoing transformative changes where on the one hand high-resolution structures are becoming a routine occurrence while on the other hand electron tomography is enabling the study of macromolecules in the cellular context. Concomitantly the archives are evolving to best serve their stakeholder communities. In this chapter we describe the current state of the archives, resources available for depositing, accessing, searching, visualising and validating data, on-going community-wide initiatives and opportunities and challenges for the future. PMID:27572735

  12. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Hatziminaoglou, Evanthia; Chéreau, Fabien

    2009-03-01

    VirGO is the next generation Visual Browser for the ESO Science Archive Facility (SAF) developed in the Virtual Observatory Project Office. VirGO enables astronomers to discover and select data easily from millions of observations in a visual and intuitive way. It allows real-time access and the graphical display of a large number of observations by showing instrumental footprints and image previews, as well as their selection and filtering for subsequent download from the ESO SAF web interface. It also permits the loading of external FITS files or VOTables, as well as the superposition of Digitized Sky Survey images to be used as background. All data interfaces are based on Virtual Observatory (VO) standards that allow access to images and spectra from external data centres, and interaction with the ESO SAF web interface or any other VO applications.

  13. 49 CFR 192.155 - Welded branch connections.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... not reduced, taking into account the stresses in the remaining pipe wall due to the opening in the pipe or header, the shear stresses produced by the pressure acting on the area of the branch opening...

  14. 19 CFR 10.223 - Articles eligible for preferential treatment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... “sleeve header,” of woven or weft-inserted warp knit construction and of coarse animal hair or man-made... expenses incurred in the growth, production, manufacture, or other processing of the components, findings...

  15. 19 CFR 10.223 - Articles eligible for preferential treatment.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... “sleeve header,” of woven or weft-inserted warp knit construction and of coarse animal hair or man-made... expenses incurred in the growth, production, manufacture, or other processing of the components, findings...

  16. 19 CFR 10.223 - Articles eligible for preferential treatment.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... “sleeve header,” of woven or weft-inserted warp knit construction and of coarse animal hair or man-made... expenses incurred in the growth, production, manufacture, or other processing of the components, findings...

  17. 19 CFR 10.223 - Articles eligible for preferential treatment.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... “sleeve header,” of woven or weft-inserted warp knit construction and of coarse animal hair or man-made... expenses incurred in the growth, production, manufacture, or other processing of the components, findings...

  18. 19 CFR 10.223 - Articles eligible for preferential treatment.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... “sleeve header,” of woven or weft-inserted warp knit construction and of coarse animal hair or man-made... expenses incurred in the growth, production, manufacture, or other processing of the components, findings...

  19. 49 CFR 192.155 - Welded branch connections.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... not reduced, taking into account the stresses in the remaining pipe wall due to the opening in the pipe or header, the shear stresses produced by the pressure acting on the area of the branch opening...

  20. 49 CFR 192.155 - Welded branch connections.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... not reduced, taking into account the stresses in the remaining pipe wall due to the opening in the pipe or header, the shear stresses produced by the pressure acting on the area of the branch opening...

  1. 46 CFR 38.20-1 - Venting-T/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... pressure of 10 percent of the relief valve setting is insufficient to move the gases through any but an...) Vents and headers shall be so installed as to prevent excessive stresses on safety relief valve...

  2. 46 CFR 38.20-1 - Venting-T/ALL.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... pressure of 10 percent of the relief valve setting is insufficient to move the gases through any but an...) Vents and headers shall be so installed as to prevent excessive stresses on safety relief valve...

  3. 46 CFR 38.20-1 - Venting-T/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... pressure of 10 percent of the relief valve setting is insufficient to move the gases through any but an...) Vents and headers shall be so installed as to prevent excessive stresses on safety relief valve...

  4. 46 CFR 38.20-1 - Venting-T/ALL.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... pressure of 10 percent of the relief valve setting is insufficient to move the gases through any but an...) Vents and headers shall be so installed as to prevent excessive stresses on safety relief valve...

  5. 46 CFR 38.20-1 - Venting-T/ALL.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... pressure of 10 percent of the relief valve setting is insufficient to move the gases through any but an...) Vents and headers shall be so installed as to prevent excessive stresses on safety relief valve...

  6. 49 CFR 192.155 - Welded branch connections.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... not reduced, taking into account the stresses in the remaining pipe wall due to the opening in the pipe or header, the shear stresses produced by the pressure acting on the area of the branch opening...

  7. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal quality images. The aim of this paper is to assess a newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, Siemens Axiom Artis model (Siemens AG, Medical Solutions Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
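
    The SNR measurement mentioned above can be illustrated on a fixed region of interest. The definition below (mean over standard deviation) is a common convention and an assumption, since the abstract does not give Fluoro-QC's exact formula:

```python
import statistics

def roi_snr(pixels):
    """SNR of a region of interest as mean / standard deviation
    (population SD); returns infinity for a perfectly uniform ROI."""
    mean = statistics.fmean(pixels)
    sd = statistics.pstdev(pixels)
    return mean / sd if sd else float("inf")

# A fixed, programmatic ROI avoids the user dependence the study
# observed with hand-drawn regions.
roi = [100, 102, 98, 101, 99, 100, 103, 97]
snr = roi_snr(roi)
```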

  8. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, Daniel; Kwatra, S. C.

    1993-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.
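
    The browsing and progressive-resolution advantages of subbanding come from splitting an image into a coarse band plus detail bands. A one-level Haar split of an image row is a minimal generic sketch (not the authors' coder):

```python
def haar_split(signal):
    """One-level Haar subband split: a low-pass (pairwise average) band
    usable alone as a half-resolution browse image, and a high-pass
    (pairwise difference) band that restores full resolution."""
    low = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    high = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return low, high

def haar_merge(low, high):
    # Lossless reconstruction from the two subbands.
    out = []
    for l, h in zip(low, high):
        out += [l + h, l - h]
    return out

row = [10, 12, 14, 20, 16, 16, 8, 4]
low, high = haar_split(row)   # "low" alone gives the browse preview
restored = haar_merge(low, high)
```

    Repeating the split on the low band yields the multi-resolution pyramid that makes progressive transmission possible.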

  9. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, D.; Kwatra, S. C.

    1992-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.

  10. Distributing medical images with internet technologies: a DICOM web server and a DICOM java viewer.

    PubMed

    Fernàndez-Bayó, J; Barbero, O; Rubies, C; Sentís, M; Donoso, L

    2000-01-01

    With the advent of filmless radiology, it becomes important to be able to distribute radiologic images digitally throughout an entire hospital. A new approach based on World Wide Web technologies was developed to accomplish this objective. This approach involves a Web server that allows the query and retrieval of images stored in a Digital Imaging and Communications in Medicine (DICOM) archive. The images can be viewed inside a Web browser with use of a small Java program known as the DICOM Java Viewer, which is executed inside the browser. The system offers several advantages over more traditional picture archiving and communication systems (PACS): It is easy to install and maintain, is platform independent, allows images to be manipulated and displayed efficiently, and is easy to integrate with existing systems that are already making use of Web technologies. The system is user-friendly and can easily be used from outside the hospital if a security policy is in place. The simplicity and flexibility of Internet technologies makes them highly preferable to the more complex PACS workstations. The system works well, especially with magnetic resonance and computed tomographic images, and can help improve and simplify interdepartmental relationships in a filmless hospital environment.

  11. Sandia Simple Particle Tracking (Sandia SPT) v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony, Stephen M.

    2015-06-15

    Sandia SPT is software developed to accompany a methods book chapter that provides an introduction to labeling and tracking individual proteins. The Sandia Simple Particle Tracking code uses techniques common to the image processing community; its value is that it facilitates implementing the methods described in the book chapter by providing the necessary open-source code. The code performs single-particle spot detection (or segmentation and localization) followed by tracking (or connecting the detected particles into trajectories). The book chapter, which along with the headers in each file constitutes the documentation for the code, is: Anthony, S.M.; Carroll-Portillo, A.; Timlin, J.A., Dynamics and Interactions of Individual Proteins in the Membrane of Living Cells. In Anup K. Singh (Ed.), Single Cell Protein Analysis, Methods in Molecular Biology. Springer.

  12. Selection and quality assessment of Landsat data for the North American forest dynamics forest history maps of the US

    USGS Publications Warehouse

    Schleeweis, Karen; Goward, Samuel N.; Huang, Chengquan; Dwyer, John L.; Dungan, Jennifer L.; Lindsey, Mary A.; Michaelis, Andrew; Rishmawi, Khaldoun; Masek, Jeffery G.

    2016-01-01

    Using the NASA Earth Exchange platform, the North American Forest Dynamics (NAFD) project mapped forest history wall-to-wall, annually, for the contiguous US (1986–2010) using the Vegetation Change Tracker algorithm. As with any effort to identify real changes in remotely sensed time-series, data gaps, shifts in seasonality, misregistration, inconsistent radiometry and cloud contamination can be sources of error. We discuss the NAFD image selection and processing stream (NISPS) that was designed to minimize these sources of error. The NISPS image quality assessments highlighted issues with the Landsat archive and metadata, including inadequate georegistration, unreliability of the pre-2009 L5 cloud cover assessment algorithm, missing growing-season imagery, and a paucity of clear views. Assessment maps of Landsat 5–7 image quantities and qualities are presented that offer novel perspectives on the growing-season archive considered for this study. Over 150,000 Landsat images were considered for the NAFD project. Optimally, one high-quality cloud-free image in each year, or a total of 12,152 images, would be used. However, to accommodate data gaps and cloud/shadow contamination, 23,338 images were needed. In 220 specific path-row image years no acceptable images were found, resulting in data gaps in the annual national map products.

  13. [Central online quality assurance in radiology: an IT solution exemplified by the German Breast Cancer Screening Program].

    PubMed

    Czwoydzinski, J; Girnus, R; Sommer, A; Heindel, W; Lenzen, H

    2011-09-01

    Physical-technical quality assurance is one of the essential tasks of the National Reference Centers in the German Breast Cancer Screening Program. For this purpose the mammography units are required to transfer the measured values of the constancy tests on a daily basis and all phantom images created for this purpose on a weekly basis to the reference centers. This is a serious logistical challenge. To meet these requirements, we developed an innovative software tool. By the end of 2005, we had already developed web-based software (MammoControl) allowing the transmission of constancy test results via entry forms. For automatic analysis and transmission of the phantom images, we then introduced an extension (MammoControl DIANA). This was based on Java, Java Web Start, the NetBeans Rich Client Platform, the Pixelmed Java DICOM Toolkit and the ImageJ library. MammoControl DIANA was designed to run locally in the mammography units. This allows automated on-site image analysis. Both results and compressed images can then be transmitted to the reference center. We developed analysis modules for the daily and monthly consistency tests and additionally for a homogeneity test. The software we developed facilitates the immediate availability of measurement results, phantom images, and DICOM header data in all reference centers. This allows both targeted guidance and short response time in the case of errors. We achieved a consistent IT-based evaluation with standardized tools for the entire screening program in Germany. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Enterprise-class Digital Imaging and Communications in Medicine (DICOM) image infrastructure.

    PubMed

    York, G; Wortmann, J; Atanasiu, R

    2001-06-01

    Most current picture archiving and communication systems (PACS) are designed for a single department or a single modality. Few PACS installations have been deployed that support the needs of the hospital or the entire Integrated Delivery Network (IDN). The authors propose a new image management architecture that can support a large, distributed enterprise.

  15. Development of a Multi-Centre Clinical Trial Data Archiving and Analysis Platform for Functional Imaging

    NASA Astrophysics Data System (ADS)

    Driscoll, Brandon; Jaffray, David; Coolens, Catherine

    2014-03-01

    Purpose: To provide clinicians & researchers participating in multi-centre clinical trials with a central repository for large volume dynamic imaging data as well as a set of tools for providing end-to-end testing and image analysis standards of practice. Methods: There are three main pieces to the data archiving and analysis system; the PACS server, the data analysis computer(s) and the high-speed networks that connect them. Each clinical trial is anonymized using a customizable anonymizer and is stored on a PACS only accessible by AE title access control. The remote analysis station consists of a single virtual machine per trial running on a powerful PC supporting multiple simultaneous instances. Imaging data management and analysis is performed within ClearCanvas Workstation® using custom designed plug-ins for kinetic modelling (The DCE-Tool®), quality assurance (The DCE-QA Tool) and RECIST. Results: A framework has been set up currently serving seven clinical trials spanning five hospitals with three more trials to be added over the next six months. After initial rapid image transfer (+ 2 MB/s), all data analysis is done server side making it robust and rapid. This has provided the ability to perform computationally expensive operations such as voxel-wise kinetic modelling on very large data archives (+20 GB/50k images/patient) remotely with minimal end-user hardware. Conclusions: This system is currently in its proof of concept stage but has been used successfully to send and analyze data from remote hospitals. Next steps will involve scaling up the system with a more powerful PACS and multiple high powered analysis machines as well as adding real-time review capabilities.

  16. The Planetary Data System Web Catalog Interface--Another Use of the Planetary Data System Data Model

    NASA Technical Reports Server (NTRS)

    Hughes, S.; Bernath, A.

    1995-01-01

    The Planetary Data System Data Model consists of a set of standardized descriptions of entities within the Planetary Science Community. These can be real entities in the space exploration domain, such as spacecraft, instruments, and targets; conceptual entities, such as data sets, archive volumes, and data dictionaries; or the archive data products themselves, such as individual images, spectra, series, and qubes.
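
    The descriptions in the PDS data model are expressed as plain-text labels of KEYWORD = VALUE pairs attached to each product. As a hypothetical minimal sketch of reading such a label (real PDS3 labels also use OBJECT nesting and multi-line values, ignored here for brevity):

    ```python
    def parse_pds3_label(text: str) -> dict:
        """Parse flat KEYWORD = VALUE pairs from a PDS3-style label.
        Skips /* comments */, END/END_OBJECT lines, and nesting."""
        label = {}
        for line in text.splitlines():
            line = line.split("/*")[0].strip()  # drop trailing comments
            if "=" not in line or line.upper().startswith("END"):
                continue
            key, _, value = line.partition("=")
            label[key.strip()] = value.strip().strip('"')
        return label
    ```
    
    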

  17. Enhancement of spectral quality of archival aerial photographs using satellite imagery for detection of land cover

    NASA Astrophysics Data System (ADS)

    Siok, Katarzyna; Jenerowicz, Agnieszka; Woroszkiewicz, Małgorzata

    2017-07-01

    Archival aerial photographs are often the only reliable source of information about an area. However, they are single-band images that do not allow unambiguous detection of particular forms of land cover. The authors of this article therefore seek to develop a method of coloring panchromatic aerial photographs that increases the spectral information of such images. The study used data integration algorithms based on pansharpening, implemented in commonly used remote sensing programs: ERDAS, ENVI, and PCI. Aerial photos and Landsat multispectral data recorded in 1987 and 2016 were chosen. This study proposes the use of modified intensity-hue-saturation and Brovey methods. The use of these methods enabled the addition of red-green-blue (RGB) components to monochrome images, thus enhancing their interpretability and spectral quality. The limitations of the proposed method relate to the availability of RGB satellite imagery, the accuracy of the mutual orientation of the aerial and satellite data, and the imperfection of archival aerial photographs. Therefore, it should be expected that the results of coloring will not be perfect compared to the results of the fusion of recent data with a similar ground sampling resolution, but they will still allow a more accurate and efficient classification of land cover registered on archival aerial photographs.
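
    The Brovey method mentioned above fuses a high-resolution panchromatic image with coregistered, resampled multispectral bands by redistributing the panchromatic intensity across the bands. A minimal NumPy sketch of the standard Brovey transform (the paper uses a modified variant, so this is illustrative only):

    ```python
    import numpy as np

    def brovey(pan: np.ndarray, ms: np.ndarray) -> np.ndarray:
        """Brovey pansharpening.

        pan: (H, W) float panchromatic image.
        ms:  (3, H, W) float multispectral bands (R, G, B), already
             resampled to the pan grid and coregistered.
        Each band is scaled by pan / (R + G + B), so the fused bands
        sum to the panchromatic intensity at every pixel.
        """
        intensity = ms.sum(axis=0)
        ratio = np.divide(pan, intensity, out=np.zeros_like(pan),
                          where=intensity > 0)  # guard against empty pixels
        return ms * ratio  # broadcasts (3, H, W) * (H, W)
    ```
    
    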

  18. Video Analytics for Indexing, Summarization and Searching of Video Archives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trease, Harold E.; Trease, Lynn L.

    This paper will be submitted to the proceedings of the Eleventh IASTED International Conference on Signal and Image Processing. Given a video or video archive, how does one effectively and quickly summarize, classify, and search the information contained within the data? This paper addresses these issues by describing a process for the automated generation of a table of contents and keyword, topic-based index tables that can be used to catalogue, summarize, and search large amounts of video data. Having the ability to index and search the information contained within the videos, beyond just metadata tags, provides a mechanism to extract and identify "useful" content from image and video data.
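
    The abstract does not detail the indexing algorithm; one common building block for a video table of contents is shot-boundary detection from frame-histogram differences. A sketch under that assumption (the bin count and threshold below are arbitrary, and this is not the authors' pipeline):

    ```python
    import numpy as np

    def shot_boundaries(frames, threshold=0.5):
        """Flag candidate shot boundaries where the normalised grey-level
        histogram changes sharply between consecutive frames.

        frames: iterable of 2-D arrays of pixel values in [0, 256).
        Returns the indices of frames that start a new shot.
        """
        cuts = []
        prev = None
        for i, frame in enumerate(frames):
            hist, _ = np.histogram(frame, bins=16, range=(0, 256))
            hist = hist / hist.sum()
            # half the L1 distance between histograms lies in [0, 1]
            if prev is not None and 0.5 * np.abs(hist - prev).sum() > threshold:
                cuts.append(i)
            prev = hist
        return cuts
    ```

    The detected boundaries give the entries of a table of contents; keyword and topic indexing would then operate on the frames (or transcripts) within each shot.
    
    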

  19. Analysis of full disc Ca II K spectroheliograms. I. Photometric calibration and centre-to-limb variation compensation

    NASA Astrophysics Data System (ADS)

    Chatzistergos, Theodosios; Ermolli, Ilaria; Solanki, Sami K.; Krivova, Natalie A.

    2018-01-01

    Context. Historical Ca II K spectroheliograms (SHG) are unique in representing long-term variations of the solar chromospheric magnetic field. They usually suffer from numerous problems and lack photometric calibration. Thus, accurate processing of these data is required to obtain meaningful results from their analysis. Aims: In this paper we aim to develop an automatic processing and photometric calibration method that provides precise and consistent results when applied to historical SHG. Methods: The proposed method is based on the assumption that the centre-to-limb variation of the intensity in quiet-Sun regions does not vary with time. We tested the accuracy of the proposed method on various sets of synthetic images that mimic problems encountered in historical observations. We also tested our approach on a large sample of images randomly extracted from seven different SHG archives. Results: The tests carried out on the synthetic data show that the maximum relative errors of the method are generally <6.5%, while the average error is <1%, even when rather poor quality observations are considered. In the absence of strong artefacts the method returns images that differ from the ideal ones by <2% in any pixel. The method gives consistent values for both plage and network areas. We also show that our method returns consistent results for images from different SHG archives. Conclusions: Our tests show that the proposed method is more accurate than other methods presented in the literature. Our method can also be applied to process images from photographic archives of solar observations at wavelengths other than Ca II K.
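
    The centre-to-limb compensation step can be pictured as dividing each image by an azimuthally averaged radial intensity profile, leaving a contrast image in which the quiet Sun is ~1. The sketch below is a simplified stand-in for the authors' method (the binning, median statistic, and masking choices are assumptions, and the real pipeline must also handle plages, artefacts, and photometric calibration):

    ```python
    import numpy as np

    def remove_clv(img, cx, cy, rsun, nbins=50):
        """Divide out an azimuthally averaged centre-to-limb profile.

        img: 2-D full-disc image; (cx, cy): disc centre in pixels;
        rsun: disc radius in pixels. Returns a contrast image that is
        ~1 in quiet-Sun regions and 0 outside the disc.
        """
        yy, xx = np.indices(img.shape)
        r = np.hypot(xx - cx, yy - cy) / rsun              # radius in solar radii
        bins = np.clip((r * nbins).astype(int), 0, nbins)  # radial bin per pixel
        # median intensity per bin approximates the quiet-Sun CLV profile
        profile = np.array([np.median(img[bins == b]) if np.any(bins == b) else 0.0
                            for b in range(nbins + 1)])
        clv = profile[bins]
        safe = np.where(clv > 0, clv, 1.0)                 # avoid divide-by-zero
        return np.where((r <= 1) & (clv > 0), img / safe, 0.0)
    ```
    
    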

  20. First results of MAO NASU SS bodies photographic archive digitizing

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Andruk, V.; Shatokhina, S.; Golovnya, V.; Yizhakevych, O.; Kulyk, I.

    2013-05-01

    The MAO NASU glass archive contains about 1800 photographic plates of planets and their satellites (including nearly 80 images of Uranus, Neptune, and Pluto), about 1700 plates of minor planets, and about 900 plates of comets. The plates were taken during 1949-1999 with 11 telescopes of different focal lengths, mostly the Double Wide-Angle Astrograph (F/D = 2000/400) and the Double Long-Focus Astrograph (F/D = 5500/400) of MAO NASU. Observational sites are Kyiv and Lviv (Ukraine), Biurakan (Armenia), Abastumani (Georgia), Mt. Maidanak (Uzbekistan), and Quito (Ecuador). Tables give the numbers of plates, subdivided by year and object. The database of plate metadata (DBGPA) is openly accessible on the MAO computer cluster (http://gua.db.ukr-vo.org). The database accumulates the archives of four Ukrainian observatories involved in the UkrVO national project. Together with the archive managing system, the database serves as a test area for the Joint Digital Archive (JDA), the core of the UkrVO.
