Sample records for a technology archive system

  1. Technical Concept Document. Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-02-28

    February 1994 INFORMAL TECHNICAL REPORT For The SOFTWARE TECHNOLOGY FOR ADAPTABLE, RELIABLE SYSTEMS (STARS) Technical Concept Document, Central Archive for Reusable Defense Software (CARDS)...in accordance with the DFARS Special Works Clause. Developed by: This document, developed under the Software Technology for Adaptable, Reliable Systems...

  2. Expert Systems Technology and Its Implication for Archives. National Archives Technical Information Paper No. 9.

    ERIC Educational Resources Information Center

    Michelson, Avra

    This report introduces archivists to the potential of expert systems for improving archives administration and alerts them to ways in which they can expect intelligent technologies to impact federal record-keeping systems and scholarly research methods. The report introduces the topic by describing expert systems used in three Fortune 500…

  3. Long-Term Preservation and Advanced Access Services to Archived Data: The Approach of a System Integrator

    NASA Astrophysics Data System (ADS)

    Petitjean, Gilles; de Hauteclocque, Bertrand

    2004-06-01

    EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in the Earth Observation and Meteorology fields. The concern of valuable data owners is not only long-term preservation but also the integration of the archive into their information system, in particular with efficient access to archived data for their user community. The system integrator addresses this requirement with a methodology combining an understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability. The system integrator completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on the analysis of user needs. Thanks to its knowledge of, and independence from, the storage market, and through the analysis of user requirements, the system integrator will recommend the best compromise for implementing an archive management facility. It will provide a solution that is able to evolve to take advantage of storage technology progress. But preserving data for the long term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situations: multiple data set copies using operational procedures, active quality control of the archived data, and a migration policy optimising the cost of ownership.

  4. Preservation Environments

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    2004-01-01

    The long-term preservation of digital entities requires mechanisms to manage the authenticity of massive data collections that are written to archival storage systems. Preservation environments impose authenticity constraints and manage the evolution of storage system technology by building infrastructure-independent solutions. This seeming paradox, the need for large archives without dependence upon vendor-specific solutions, is resolved through the use of data grid technology. Data grids provide the storage repository abstractions that make it possible to migrate collections between vendor-specific products while ensuring the authenticity of the archived data. Data grids provide the software infrastructure that interfaces vendor-specific storage archives to preservation environments.
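    As an illustration of the storage-repository abstraction this record refers to, the sketch below shows one minimal way such an interface could look; the driver class, catalog layout, and function names are assumptions for this example, not the paper's design.

      # Minimal sketch of a data-grid style storage abstraction: collections
      # reference logical names plus fixity metadata, so a whole collection
      # can migrate between vendor-specific back ends with authenticity checks.
      import hashlib
      import shutil
      from pathlib import Path

      class PosixDriver:
          """One possible back end; a tape or object-store driver would
          expose the same get/put interface."""
          def __init__(self, root):
              self.root = Path(root)
              self.root.mkdir(parents=True, exist_ok=True)

          def put(self, logical_name, source):
              shutil.copyfile(source, self.root / logical_name)

          def get(self, logical_name):
              return self.root / logical_name

      def checksum(path):
          return hashlib.sha256(Path(path).read_bytes()).hexdigest()

      def migrate_collection(catalog, src, dst):
          """Move every logical object from src to dst, verifying the stored
          checksum so authenticity survives the technology migration."""
          for logical_name, recorded_digest in catalog.items():
              staged = src.get(logical_name)
              if checksum(staged) != recorded_digest:
                  raise ValueError(f"fixity failure for {logical_name}")
              dst.put(logical_name, staged)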

  5. Medical image digital archive: a comparison of storage technologies

    NASA Astrophysics Data System (ADS)

    Chunn, Timothy; Hutchings, Matt

    1998-07-01

    A cost-effective, high capacity digital archive system is one of the remaining key factors that will enable a radiology department to eliminate film as an archive medium. The ever-increasing amount of digital image data is creating the need for huge archive systems that can reliably store and retrieve millions of images and hold from a few terabytes of data to possibly hundreds of terabytes. Selecting the right archive solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, conformance to open standards, archive availability and reliability, security, cost, achievable benefits and cost savings, investment protection, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. New technologies will be discussed, such as DVD and high performance tape. Price and performance comparisons will be made at different archive capacities, plus the effect of file size on random and pre-fetch retrieval time will be analyzed. The concept of automated migration of images from high performance RAID disk storage devices to high capacity Nearline storage devices will be introduced as a viable way to minimize overall storage costs for an archive.

  6. Strategy of Planetary Data Archives in Japanese Missions for Planetary Data System

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Murakami, S. Y.

    2017-12-01

    To preserve data acquired by Japanese planetary explorations for a long time, we need a data archiving strategy in a form suitable for our resources. The Planetary Data System (PDS) developed by NASA is an excellent system for saving data over a long period. Especially with the current version 4 (PDS4), it is possible to create a highly complete data archive using information technology. Historically, Japanese planetary missions archived data through individual scientists in their own ways, but in the past decade JAXA has been aiming to conform data to the PDS with long-term preservation in mind. Hayabusa and Akatsuki data are archived in PDS3. Kaguya (SELENE) data have been newly converted from the original format to PDS3. Hayabusa2, BepiColombo, and future planetary explorations will release data in PDS4. The cooperation of engineers who are familiar with information technology is indispensable for scientists to create data archives. In addition, experience, information sharing, and a system to support them are essential. Establishing such a support system remains a challenge in Japan.

  7. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We established and have operated an integrated data system for managing, archiving, and sharing marine geology and geophysical data around Korea, produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent through continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, and DB maintenance. The system comprises two databases, an ARCHIVE DB and a GIS DB. The ARCHIVE DB stores archived data in the original forms and formats received from data providers, while the GIS DB manages all other compiled, processed, and reproduced data and information for data services and GIS application services. The relational database management system Oracle 11g was adopted as the DBMS, and open-source GIS technologies were applied for GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS on PostgreSQL for the GIS database. For convenient use of geophysical data in the SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.
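    As a hedged illustration of the kind of spatial search such a PostGIS-backed GIS DB could answer, the sketch below issues a bounding-box query with psycopg2; the table and column names are invented for the example and are not from the KIOST system.

      # Hypothetical spatial search against a PostGIS-backed GIS DB;
      # table and column names are invented for illustration.
      import psycopg2

      QUERY = """
          SELECT survey_id, title, acquired_on
          FROM geophysical_surveys
          WHERE ST_Intersects(
              footprint,
              ST_MakeEnvelope(%s, %s, %s, %s, 4326)  -- lon/lat bounding box
          )
          ORDER BY acquired_on;
      """

      def find_surveys(bbox):
          with psycopg2.connect("dbname=gisdb") as conn:
              with conn.cursor() as cur:
                  cur.execute(QUERY, bbox)
                  return cur.fetchall()

      # Example: surveys intersecting a box east of the Korean peninsula.
      # rows = find_surveys((129.0, 34.0, 132.0, 38.0))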

  8. EOSDIS: Archive and Distribution Systems in the Year 2000

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Lake, Alla

    2000-01-01

    Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousand non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to support approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of the NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem, including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive. It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.

  9. Technologically Enhanced Archival Collections: Using the Buddy System

    ERIC Educational Resources Information Center

    Holz, Dayna

    2006-01-01

    Based in the context of challenges faced by archives when managing digital projects, this article explores options of looking outside the existing expertise of archives staff to find collaborative partners. In teaming up with other departments and organizations, the potential scope of traditional archival digitization projects is expanded beyond…

  10. The Role of Computers in Archives.

    ERIC Educational Resources Information Center

    Cook, Michael

    1989-01-01

    Discusses developments in information technologies, their present state of application, and their general significance for the future of archives and records management systems. The likely impact of future technological developments is considered and the need for infrastructural standards, professional cooperation, and training is emphasized.…

  11. Applying the Technology Acceptance Model in a Study of the Factors Affecting Usage of the Taiwan Digital Archives System

    ERIC Educational Resources Information Center

    Hong, Jon-Chao; Hwang, Ming-Yueh; Hsu, Hsuan-Fang; Wong, Wan-Tzu; Chen, Mei-Yung

    2011-01-01

    The rapid development of information and communication technology and the popularization of the Internet have given a boost to digitization technologies. Since 2001, The National Science Council (NSC) of Taiwan has invested a large amount of funding in the National Digital Archives Program (NDAP) to develop digital content. Some studies have…

  12. Long-term archiving and data access: modelling and standardization

    NASA Technical Reports Server (NTRS)

    Hoc, Claude; Levoir, Thierry; Nonon-Latapie, Michel

    1996-01-01

    This paper reports on the multiple difficulties inherent in the long-term archiving of digital data, and in particular on the different possible causes of definitive data loss. It defines the basic principles which must be respected when creating long-term archives. Such principles concern both the archival systems and the data. The archival systems should have two primary qualities: independence of architecture with respect to technological evolution, and generic-ness, i.e., the capability of ensuring identical service for heterogeneous data. These characteristics are implicit in the Reference Model for Archival Services, currently being designed within an ISO-CCSDS framework. A system prototype has been developed at the French Space Agency (CNES) in conformance with these principles, and its main characteristics will be discussed in this paper. Moreover, the data archived should be capable of abstract representation regardless of the technology used, and should, to the extent that it is possible, be organized, structured, and described with the help of existing standards. The immediate advantage of standardization is illustrated by several concrete examples. Both the positive facets and the limitations of this approach are analyzed. The advantages of developing an object-oriented data model within this context are then examined.

  13. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    NASA Astrophysics Data System (ADS)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
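    A minimal sketch of the container-plus-metadata idea this record describes: pairing an exported image with a small preservation manifest so the stack can be reconstructed later. The file names and manifest fields below are illustrative assumptions, not a standard.

      # Sketch: pair an exported container image with a preservation manifest
      # so the technology stack can be reconstructed later.
      import hashlib
      import json
      import time
      from pathlib import Path

      def write_manifest(image_tar, entrypoint, description):
          image_tar = Path(image_tar)
          digest = hashlib.sha256(image_tar.read_bytes()).hexdigest()
          manifest = {
              "image_file": image_tar.name,
              "sha256": digest,            # fixity for the archived image
              "entrypoint": entrypoint,    # how to rerun the analysis
              "description": description,
              "archived_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
          }
          out = image_tar.with_suffix(".manifest.json")
          out.write_text(json.dumps(manifest, indent=2))
          return out

      # e.g. write_manifest("pipeline-v1.tar", "run_pipeline.sh",
      #                     "Container preserving the 2014 processing stack")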

  14. Improving the security of international ISO container traffic by centralizing the archival of inspection results

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2004-09-01

    To increase the security and throughput of ISO traffic through international terminals, more technology must be applied to the problem. A transnational central archive of inspection records is discussed that can be accessed by national agencies as ISO containers approach their borders. The intent is to improve the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented, along with their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed. Sample x-ray data from systems in use today are shown that could be entered into such a system. Data from other inspection technologies are shown to be easily integrated, as is the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time, and capacity. Suggestions are also made for pilot projects based on existing border inspection processes.

  15. Buckets: Smart Objects for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2001-01-01

    Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search and retrieval functionality of archives, repositories, search engines, search interfaces and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs, in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Some of the responsibilities given to buckets are the enforcement of their terms and conditions, and the maintenance and display of their contents.
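    A toy sketch of the bucket idea under assumed names: the object aggregates its own contents, enforces its own terms and conditions, and displays itself, leaving the archive "dumb". The class and method names are illustrative, not from the paper.

      # Toy "bucket": an aggregative object that carries its contents,
      # terms and conditions, and display logic with it.
      class Bucket:
          def __init__(self, identifier, terms):
              self.identifier = identifier
              self.terms = terms
              self.elements = {}          # name -> payload bytes
              self.accepted_terms = set() # users who accepted the terms

          def add(self, name, payload):
              self.elements[name] = payload

          def accept_terms(self, user):
              self.accepted_terms.add(user)

          def retrieve(self, user, name):
              # The bucket, not the archive, enforces terms and conditions.
              if user not in self.accepted_terms:
                  raise PermissionError(f"{user} has not accepted: {self.terms}")
              return self.elements[name]

          def display(self):
              # The bucket also knows how to present its own contents.
              return f"bucket {self.identifier}: " + ", ".join(sorted(self.elements))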

  16. The American Archival Profession and Information Technology Standards.

    ERIC Educational Resources Information Center

    Cox, Richard J.

    1992-01-01

    Discussion of the use of standards by archivists highlights the U.S. MARC AMC (Archives-Manuscript Control) format for reporting archival records and manuscripts; their interest in specific standards being developed for the OSI (Open Systems Interconnection) reference model; and the management of records in electronic formats. (16 references) (LAE)

  17. Intelligent Systems Technologies and Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.

    2004-01-01

    The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.

  18. Using Object Storage Technology vs Vendor Neutral Archives for an Image Data Repository Infrastructure.

    PubMed

    Bialecki, Brian; Park, James; Tilkin, Mike

    2016-08-01

    The intent of this project was to use object storage and its database, which has the ability to add custom extensible metadata to an imaging object being stored within the system, to harness the power of its search capabilities, and to close the technology gap that healthcare faces. This creates a non-disruptive tool that can be used natively by both legacy systems and the healthcare systems of today which leverage more advanced storage technologies. The base infrastructure can be populated alongside current workflows without any interruption to the delivery of services. In certain use cases, this technology can be seen as a true alternative to the VNA (Vendor Neutral Archive) systems implemented by healthcare today. The scalability, security, and ability to process complex objects makes this more than just storage for image data and a commodity to be consumed by PACS (Picture Archiving and Communication System) and workstations. Object storage is a smart technology that can be leveraged to create vendor independence, standards compliance, and a data repository that can be mined for truly relevant content by adding additional context to search capabilities. This functionality can lead to efficiencies in workflow and a wealth of minable data to improve outcomes into the future.
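    As a hedged illustration (not the authors' implementation), the sketch below attaches custom key/value metadata to an imaging object in any S3-compatible object store via boto3; the bucket name, key, and metadata fields are invented for the example.

      # Illustration: extensible custom metadata on an imaging object in an
      # S3-compatible object store, later harvestable for search indexing.
      import boto3

      s3 = boto3.client("s3")

      with open("study0001.dcm", "rb") as body:
          s3.put_object(
              Bucket="imaging-repository",
              Key="studies/2016/study0001.dcm",
              Body=body,
              Metadata={                  # extensible key/value context
                  "modality": "CT",
                  "accession": "A123456",
                  "body-part": "CHEST",
              },
          )

      # head_object returns the same metadata, which an index can harvest
      # to support content-based search across the repository.
      meta = s3.head_object(Bucket="imaging-repository",
                            Key="studies/2016/study0001.dcm")["Metadata"]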

  19. A system approach to archival storage

    NASA Technical Reports Server (NTRS)

    Corcoran, John W.

    1991-01-01

    The introduction and viewgraphs of a discussion on a system approach to archival storage, presented at the National Space Science Data Center (NSSDC) Mass Storage Workshop, are included. The use of D-2 iron particle media for archival storage is discussed, along with how acceleration factors relating short-term tests to archival lifetimes can be justified. Ampex Recording Systems is transferring D-2 video technology to data storage applications and encountering concerns about corrosion. To protect the D-2 standard, Battelle tests were done on all four tapes in the Class 2 environment. Error rates were measured before and after the test on both exposed and control groups.

  20. Science information systems: Archive, access, and retrieval

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1991-01-01

    The objective of this research is to develop technology for the automated characterization and interactive retrieval and visualization of very large, complex scientific data sets. Technologies will be developed for the following specific areas: (1) rapidly archiving data sets; (2) automatically characterizing and labeling data in near real-time; (3) providing users with the ability to browse contents of databases efficiently and effectively; (4) providing users with the ability to access and retrieve system independent data sets electronically; and (5) automatically alerting scientists to anomalies detected in data.

  1. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    Product Archive System (PAS), as a background system, is the core part of the Product Archive and Distribution System (PADS), which is the data management center of the Ground Application System for the HY-1B satellite, hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of up-to-date methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, to give the system several desirable characteristics, including ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (the Network Communicator, File Collector, File Copy, Task Collector, Metadata Extractor, Product Data Archive, and Metadata Catalogue Import modules) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  2. Trade-off study of data storage technologies

    NASA Technical Reports Server (NTRS)

    Kadyszewski, R. V.

    1977-01-01

    The need to store and retrieve large quantities of data at modest cost has generated the need for an economical, compact, archival mass storage system. Very significant improvements in the state of the art of mass storage systems have been accomplished through the development of a number of magnetic, electro-optical, and other related devices. This study was conducted in order to perform a trade-off analysis between these data storage devices and the related technologies, and to determine an optimum approach for an archival mass data storage system, based upon a comparison of the projected capabilities and characteristics of these devices to yield operational systems in the early 1980s.

  3. JNDMS Task Authorization 2 Report

    DTIC Science & Technology

    2013-10-01

    uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database. Barnyard is an open source tool designed to work with Snort to take...ITI Information Technology Infrastructure; J2EE Java 2 Enterprise Edition; JAR Java Archive, an archive file format defined by Java standards; JDBC Java Database Connectivity; JDW JNDMS Data Warehouse; JNDMS Joint Network and Defence Management System...

  4. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

    Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage is digitized; scientific and experimental data come from high-speed imaging sensors; governments produce national defense satellite images; hospitals generate medical and healthcare imaging records; and individuals collect personal photos from digital cameras. With these massive amounts of precious and irreplaceable data and knowledge, what standard technologies can be applied to preserve the data and yet provide an interoperable framework for accessing it across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international standard MPEG technologies to preserve digital image content.

  5. Archive and records management-Fiscal year 2010 offline archive media trade study

    USGS Publications Warehouse

    Bodoh, Tom; Boettcher, Ken; Gacke, Ken; Greenhagen, Cheryl; Engelbrecht, Al

    2010-01-01

    This document is a trade study comparing offline digital archive storage technologies. The document compares and assesses several technologies and recommends which technologies could be deployed as the next generation standard for the U.S. Geological Survey (USGS). Archives must regularly migrate to the next generation of digital archive technology, and the technology selected must maintain data integrity until the next migration. This document is the fiscal year 2010 (FY10) revision of a study completed in FY01 and revised in FY03, FY04, FY06, and FY08.

  6. Content Platforms Meet Data Storage, Retrieval Needs

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Earth is under a constant barrage of information from space. Whether from satellites orbiting our planet, spacecraft circling Mars, or probes streaking toward the far reaches of the Solar System, NASA collects massive amounts of data from its spacefaring missions each day. NASA's Earth Observing System (EOS) satellites, for example, provide daily imagery and measurements of Earth's atmosphere, oceans, vegetation, and more. The Earth Observing System Data and Information System (EOSDIS) collects all of that science data and processes, archives, and distributes it to researchers around the globe; EOSDIS recently reached a total archive volume of 4.5 petabytes. Try to store that amount of information in your standard, four-drawer file cabinet, and you would need 90 million to get the job done. To manage the flood of information, NASA has explored technologies to efficiently collect, archive, and provide access to EOS data for scientists today and for years to come. One such technology is now providing similar capabilities to businesses and organizations worldwide.

  7. Integration experiences and performance studies of A COTS parallel archive systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-01-01

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, with more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.
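    The single-file-to-many-tapes idea can be pictured with a toy stripe mover: one large file is split into byte ranges copied in parallel, with ordinary files standing in for tape drives. The paths, stripe count, and chunk size below are assumptions for illustration, not the authors' implementation.

      # Toy sketch: copy one large file as N byte-range stripes in parallel,
      # each stripe to its own destination (stand-ins for tape drives).
      import os
      from concurrent.futures import ProcessPoolExecutor

      CHUNK = 64 * 1024 * 1024  # 64 MiB I/O unit

      def copy_stripe(job):
          src, dst, offset, length = job
          with open(src, "rb") as fin, open(dst, "wb") as fout:
              fin.seek(offset)
              remaining = length
              while remaining > 0:
                  buf = fin.read(min(CHUNK, remaining))
                  if not buf:
                      break
                  fout.write(buf)
                  remaining -= len(buf)
          return dst

      def parallel_archive(src, destinations):
          size = os.path.getsize(src)
          stripe = (size + len(destinations) - 1) // len(destinations)
          jobs = [(src, dst, i * stripe, max(0, min(stripe, size - i * stripe)))
                  for i, dst in enumerate(destinations)]
          with ProcessPoolExecutor(max_workers=len(destinations)) as pool:
              return list(pool.map(copy_stripe, jobs))

      # e.g. parallel_archive("big.dat", [f"stripe{i}.dat" for i in range(4)])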

  8. Integration experiments and performance studies of a COTS parallel archive system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-bung; Scott, Cody; Grider, Gary

    2010-06-16

    Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interfaces, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, with more caching and less robust semantics. Currently the number of extremely scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products, including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.

  9. Intelligent Systems Technologies to Assist in Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; McConaughy, Gail; Lynnes, Christopher; McDonald, Kenneth; Kempler, Steven

    2003-01-01

    With the launch of several Earth observing satellites over the last decade, we are now in a data rich environment. From NASA's Earth Observing System (EOS) satellites alone, we are accumulating more than 3 TB per day of raw data and derived geophysical parameters. The data products are being distributed to a large user community comprising scientific researchers, educators and operational government agencies. Notable progress has been made in the last decade in facilitating access to data. However, to realize the full potential of the growing archives of valuable scientific data, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system. Potential Intelligent Archive concepts include: 1) Mining archived data holdings using Intelligent Data Understanding algorithms to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services involved in a scientific enterprise; 3) Recognizing the value of results, indexing and formatting them for easy access, and delivering them to concerned individuals; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building (i.e., the transformations from data to information to knowledge) instead of just data pipelining; and 5) Being aware of other nodes in the knowledge building system, participating in open systems interfaces and protocols for virtualization, and collaborative interoperability. This paper presents some of these concepts and identifies issues to be addressed by research in future intelligent systems technology.

  10. Data archiving and serving system implementation in CLEP's GRAS Core System

    NASA Astrophysics Data System (ADS)

    Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai

    2017-04-01

    The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management, and application, and it is also the operation control center for in-orbit satellite and payload operation management. Chang'E-1, Chang'E-2, and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software that allow the creation of a simplified version of the CLEPDB data archiving software, and it implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming it to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution stores data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem, and the Scientific Application & Research Subsystem. The serving solution serves data to the various business systems, scientific researchers, and public users. Data-driven and component-clustering methods were adopted in this system; the former solves real-time data archiving and data persistence, while the latter keeps the archiving and serving functions continuously able to support new data from Chang'E missions. Meanwhile, this approach saves software development cost as well.
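    A hypothetical sketch of the client side of such an acquisition-to-archive flow, pushing a product file plus minimal metadata to an archiving service over HTTP; the endpoint URL and field names are invented for illustration and are not from the CLEP system.

      # Hypothetical producer-side client: push a data product and minimal
      # metadata to an archiving service, receiving a storage receipt.
      import requests

      def archive_product(path, subsystem, product_id):
          with open(path, "rb") as payload:
              resp = requests.post(
                  "http://archive.example/api/products",
                  files={"product": payload},
                  data={"subsystem": subsystem, "product_id": product_id},
                  timeout=60,
              )
          resp.raise_for_status()
          return resp.json()["archive_id"]  # receipt for the stored product

      # archive_product("CE3_GRAS_0001.pds", "preprocessing", "CE3-0001")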

  11. Constraint based scheduling for the Goddard Space Flight Center distributed Active Archive Center's data archive and distribution system

    NASA Technical Reports Server (NTRS)

    Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distribution System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few terabytes of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which was developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue. The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
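    To make the contrast with a FIFO queue concrete, the toy sketch below dispatches tasks by priority while honoring precedence and a single contended resource; it is not HTRS or Kronos, and all names and numbers are illustrative assumptions.

      # Toy contrast with FIFO: priority dispatch under precedence and a
      # single contended resource (total bandwidth).
      import heapq

      def schedule(tasks, bandwidth):
          """tasks maps name -> (priority, needed_bandwidth, prerequisites)."""
          done, waves = set(), []
          while len(done) < len(tasks):
              ready = [(-prio, name, bw)
                       for name, (prio, bw, deps) in tasks.items()
                       if name not in done and deps <= done]
              if not ready:
                  raise ValueError("cycle in precedence constraints")
              heapq.heapify(ready)              # highest priority first
              wave, free = [], bandwidth
              while ready and free > 0:
                  _, name, bw = heapq.heappop(ready)
                  if bw <= free:                # resource constraint
                      wave.append(name)
                      free -= bw
              if not wave:
                  raise ValueError("a task exceeds the total bandwidth")
              done.update(wave)
              waves.append(wave)
          return waves

      # Ingestion precedes distribution; distribution outranks reprocessing:
      # schedule({"ingest":  (5, 40, set()),
      #           "distrib": (4, 30, {"ingest"}),
      #           "reproc":  (1, 50, {"ingest"})}, bandwidth=60)
      # -> [["ingest"], ["distrib"], ["reproc"]]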

  12. The NASA Planetary Data System Roadmap Study for 2017 - 2026

    NASA Astrophysics Data System (ADS)

    McNutt, R. L., Jr.; Gaddis, L. R.; Law, E.; Beyer, R. A.; Crombie, M. K.; Ebel, D. S. S.; Ghosh, A.; Grayzeck, E.; Morgan, T. H.; Paganelli, F.; Raugh, A.; Stein, T.; Tiscareno, M. S.; Weber, R. C.; Banks, M.; Powell, K.

    2017-12-01

    NASA's Planetary Data System (PDS) is the formal archive of >1.2 petabytes of data from planetary exploration, science, and research. Initiated in 1989 to address an overall lack of attention to mission data documentation, access, and archiving, the PDS has evolved into an online collection of digital data managed and served by a federation of six science discipline nodes and two technical support nodes. Several ad hoc mission-oriented data nodes also provide complex data interfaces and access for the duration of their missions. The recent Planetary Data System Roadmap Study for 2017 to 2026 involved 15 planetary science community members who collectively prepared a report summarizing the results of an intensive examination of the current state of the PDS and its organization, management, practices, and data holdings (https://pds.jpl.nasa.gov/roadmap/PlanetaryDataSystemRMS17-26_20jun17.pdf). The report summarizes the history of the PDS, its functions and characteristics, and how it has evolved to its present form; also included are extensive references and documentary appendices. The report recognizes that, as a complex, evolving archive system, the PDS must constantly respond to new pressures and opportunities. The report provides details on the challenges now facing the PDS, 19 detailed findings, suggested remediations, and a summary of what the future may hold for planetary data archiving. The findings cover topics such as user needs and expectations, data usability and discoverability (i.e., metadata, data access, documentation, and training), tools and file formats, use of current information technologies, and responses to increases in data volume, variety, complexity, and number of data providers. In addition, the study addresses the possibility of archiving software, laboratory data, and measurements of physical samples. Finally, the report discusses the current structure and governance of the PDS and its impact on how archive growth, technology, and new developments are enabled and managed within the PDS. The report, with its findings, acknowledges the ongoing and expected challenges to be faced in the future and the need to maintain an edge in the use of emerging technologies, and it represents a guide for the evolution of the PDS over the next decade.

  13. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  14. Picture archiving and communication in radiology.

    PubMed

    Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale

    2003-01-01

    After over 80 years of exclusively archiving radiologic films, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and space, but most importantly, it makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS system, in this case, is the starting point of the process of electronic archiving, which however is the task of the PACS. The latter can be used as a legally compliant radiologic archive provided that it conforms to certain specifications, such as the use of optical long-term storage media or media with an electronic record of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist who requires images and reports of previous radiologic examinations, or by the referring physician of the ward. Modern PACS, by means of their Web servers, allow greatly simplified remote access to images and data while ensuring the required regulatory compliance and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should be able to minimize the risk of accidental data destruction, and should prevent unauthorized access to the archive with security measures that are adequate to current knowledge and keep pace with technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. Technology can now readily solve problems that were extremely complex up to a few years ago, such as the connection between equipment and the archiving system, owing also to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize the problems of remote transmission of data and images within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as digital data security profiles and the different systems that should ensure them. Among these, electronic signature algorithms should be mentioned; in Italy they are validated by law and can therefore be used in legally compliant digital archives.
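    As a hedged illustration of the indexing step such archives depend on, the sketch below extracts typical catalogue fields from a DICOM file with the pydicom library; the field selection is an assumption for the example, not a prescribed profile.

      # Sketch: extract the index fields a PACS-style archive typically keys
      # on from a DICOM file, using the pydicom library (header only).
      import pydicom

      def index_record(path):
          ds = pydicom.dcmread(path, stop_before_pixels=True)
          return {
              "patient_id": ds.get("PatientID", ""),
              "study_uid": ds.get("StudyInstanceUID", ""),
              "series_uid": ds.get("SeriesInstanceUID", ""),
              "modality": ds.get("Modality", ""),
              "study_date": ds.get("StudyDate", ""),
          }

      # record = index_record("exam0001.dcm")  # feed into the archive catalogue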

  15. Kodak Optical Disk and Microfilm Technologies Carve Niches in Specific Applications.

    ERIC Educational Resources Information Center

    Gallenberger, John; Batterton, John

    1989-01-01

    Describes the Eastman Kodak Company's microfilm and optical disk technologies and their applications. Topics discussed include WORM technology; retrieval needs and cost effective archival storage needs; engineering applications; jukeboxes; optical storage options; systems for use with mainframes and microcomputers; and possible future…

  16. Strategic Plan for Information Systems and Technology, Fiscal Years 1994-1998.

    ERIC Educational Resources Information Center

    National Archives and Records Administration, Washington, DC.

    The information systems and technology management program of the National Archives and Records Administration (NARA) establishes broad policy guidance and technical standards for information management to ensure that appropriate resource sharing can occur, while providing cost-effective support for mission requirements of program offices. The NARA…

  17. Optical Disk Technology and Information.

    ERIC Educational Resources Information Center

    Goldstein, Charles M.

    1982-01-01

    Provides basic information on videodisks and potential applications, including inexpensive online storage, random access graphics to complement online information systems, hybrid network architectures, office automation systems, and archival storage. (JN)

  18. Using dCache in Archiving Systems oriented to Earth Observation

    NASA Astrophysics Data System (ADS)

    Garcia Gil, I.; Perez Moreno, R.; Perez Navarro, O.; Platania, V.; Ozerov, D.; Leone, R.

    2012-04-01

    The objective of the LAST activity (Long term data Archive Study on new Technologies) is to perform an independent study on best practices and an assessment of different archiving technologies that are mature for operation in the short and mid-term time frame, or available in the long term, with emphasis on technologies better suited to satisfy the requirements of ESA, LTDP, and other European and Canadian EO partners in terms of digital information preservation and data accessibility and exploitation. During the last phase of the project, several archiving solutions were tested in order to evaluate their suitability. In particular, dCache aims to provide a file system tree view of the data repository, exchanging data with backend (tertiary) storage systems and providing space management, pool attraction, dataset replication, hot spot determination, and recovery from disk or node failures. Connected to a tertiary storage system, dCache simulates unlimited direct-access storage space; data exchanges to and from the underlying HSM are performed automatically and invisibly to the user. dCache was created to meet the requirements of large computer centers and universities with large amounts of data, which put their efforts together and founded EMI (European Middleware Initiative). At present, dCache is mature enough to be implemented, being used by several research centers of relevance (e.g., the LHC, storing up to 50 TB/day). This solution has not been used so far in Earth Observation, and the results of the study are summarized in this article, focusing on the capacities over a simulated environment to get in line with the ESA requirements for geographically distributed storage. The challenge of a geographically distributed storage system can be summarized as providing maximum quality for storage and dissemination services at minimum cost.
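    A generic HSM-flavoured sketch (not dCache internals) of the disk-to-tape migration this record mentions: files unused beyond a threshold move from the disk pool to a tape back end. The paths and threshold are assumptions for illustration.

      # Generic HSM-style migration policy: files cold for longer than the
      # threshold move from the disk pool to the tape back end.
      import shutil
      import time
      from pathlib import Path

      DISK_POOL = Path("/pool/disk")
      TAPE_BACKEND = Path("/pool/tape")   # stand-in for the tertiary system
      COLD_AFTER = 30 * 24 * 3600         # 30 days without access

      def migrate_cold_files():
          migrated = []
          now = time.time()
          for f in DISK_POOL.rglob("*"):
              if f.is_file() and now - f.stat().st_atime > COLD_AFTER:
                  target = TAPE_BACKEND / f.relative_to(DISK_POOL)
                  target.parent.mkdir(parents=True, exist_ok=True)
                  shutil.move(str(f), str(target))
                  migrated.append(target)
          return migrated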

  19. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  20. CARDS: A blueprint and environment for domain-specific software reuse

    NASA Technical Reports Server (NTRS)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  1. Evolution of Archival Storage (from Tape to Memory)

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.

    2015-01-01

    Over the last three decades, there has been a significant evolution in storage technologies supporting the archival of remote sensing data. This section provides a brief survey of how these technologies have evolved. Three main technologies are considered - tape, hard disk, and solid state disk. Their historical evolution is traced, summarizing how reductions in cost have made it possible to store larger volumes of data on faster media. The cost per GB of media is only one of the considerations in determining the best approach to archival storage. Active archives generally require faster response to user requests for data than permanent archives. Archive costs also have to include facilities and other capital costs, operations costs, software licenses, utility costs, etc. For meeting requirements in any organization, typically a mix of technologies is needed.

  2. Archival storage solutions for PACS

    NASA Astrophysics Data System (ADS)

    Chunn, Timothy

    1997-05-01

    While there are many, one of the inhibitors to the widespread diffusion of PACS systems has been the lack of robust, cost-effective digital archive storage solutions. Moreover, an automated Nearline solution is key to a central, sharable data repository, enabling many applications such as PACS, telemedicine and teleradiology, and information warehousing and data mining for research such as patient outcome analysis. Selecting the right solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, configuration architecture and flexibility, subsystem availability and reliability, security requirements, system cost, achievable benefits and cost savings, investment protection, strategic fit, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. Price and performance comparisons will be made at different archive capacities, plus the effect of file size on storage system throughput will be analyzed. The concept of automated migration of images from high performance, high cost storage devices to high capacity, low cost storage devices will be introduced as a viable way to minimize overall storage costs for an archive. The concept of access density will also be introduced and applied to the selection of the most cost effective archive solution.
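    Since the record introduces access density without defining it here, the sketch below works the idea through under an assumed definition (retrievals per gigabyte of archive per hour); the paper's exact formulation is not reproduced, and all numbers are illustrative.

      # Worked sketch of "access density" under an assumed definition:
      # retrievals per gigabyte of archive per hour.
      def access_density(retrievals_per_hour, archive_gb):
          return retrievals_per_hour / archive_gb

      # A small, busy short-term archive vs a large, quiet long-term archive:
      busy = access_density(retrievals_per_hour=400, archive_gb=500)      # 0.8
      quiet = access_density(retrievals_per_hour=40, archive_gb=50_000)   # 0.0008

      # High access density favours disk/RAID tiers; low access density lets
      # a tape or optical library serve the archive at lower cost per GB.
      print(f"busy={busy:.4f}  quiet={quiet:.4f} retrievals/GB/hour")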

  3. Lecture archiving on a larger scale at the University of Michigan and CERN

    NASA Astrophysics Data System (ADS)

    Herr, Jeremy; Lougheed, Robert; Neal, Homer A.

    2010-04-01

    The ATLAS Collaboratory Project at the University of Michigan has been a leader in the area of collaborative tools since 1999. Its activities include the development of standards, software and hardware tools for lecture archiving, and making recommendations for videoconferencing and remote teaching facilities. Starting in 2006 our group became involved in classroom recordings, and in early 2008 we spawned CARMA, a University-wide recording service. This service uses a new portable recording system that we developed. Capture, archiving and dissemination of rich multimedia content from lectures, tutorials and classes are increasingly widespread activities among universities and research institutes. A growing array of related commercial and open source technologies is becoming available, with several new products introduced in the last couple of years. As the result of a new close partnership between U-M and CERN IT, a market survey of these products was conducted and a summary of the results is presented here. It is informing an ambitious effort in 2009 to equip many CERN rooms with automated lecture archiving systems, on a much larger scale than before. This new technology is being integrated with CERN's existing webcast, CDS, and Indico applications.

  4. Geodetic Seamless Archive Centers Modernization - Information Technology for Exploiting the Data Explosion

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.

    2010-12-01

    Space geodetic science and other disciplines using geodetic products have benefited immensely from open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and the distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC and UNAVCO) have been funded to expand the original GSAC capability to multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into their daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data, a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time, enabling data users to select sites suitable for their application, for example by identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.

  5. Supporting users through integrated retrieval, processing, and distribution systems at the Land Processes Distributed Active Archive Center

    USGS Publications Warehouse

    Kalvelage, Thomas A.; Willems, Jennifer

    2005-01-01

    The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
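
    Reprojection and subsetting services of the kind mentioned above can be illustrated with the GDAL library; this is a generic sketch with placeholder file names, not the LP DAAC's actual implementation.

      # Generic sketch of a reformatting/reprojection service using GDAL.
      # Input and output file names are placeholders.
      from osgeo import gdal

      gdal.UseExceptions()

      # Reproject a granule to geographic lat/lon, clip it to a bounding
      # box, and write the result as GeoTIFF.
      gdal.Warp(
          "subset_wgs84.tif",                          # output file
          "granule.hdf",                               # input granule
          dstSRS="EPSG:4326",                          # target projection
          outputBounds=(-105.0, 35.0, -100.0, 40.0),   # minX, minY, maxX, maxY
          format="GTiff",
      )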

  6. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    NASA Technical Reports Server (NTRS)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class, and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the Illustra DBMS server, and the UniTree mass storage environment. This paper describes some of the current approaches to successfully integrating these technologies. The framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.
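
    The storage-manager idea — database "large objects" that migrate transparently between a fast local tier and an archival tier — can be sketched abstractly as below. Class and method names are invented, and this is not the actual Illustra/UniTree code.

      # Abstract sketch of a storage-manager class whose large objects
      # migrate between local cache and an archival store. All names are
      # illustrative; this is not the Illustra/UniTree implementation.
      import shutil
      from pathlib import Path

      class ArchivedLargeObject:
          def __init__(self, oid: str, cache_dir: Path, archive_dir: Path):
              self.oid = oid
              self.cache = cache_dir / oid      # fast local copy (may be absent)
              self.archive = archive_dir / oid  # authoritative archival copy

          def read(self) -> bytes:
              # Transparent access: stage from the archive on a cache miss.
              if not self.cache.exists():
                  shutil.copy2(self.archive, self.cache)  # "migration" step
              return self.cache.read_bytes()

          def write(self, data: bytes) -> None:
              # Write through to both tiers so the archive stays authoritative.
              self.cache.write_bytes(data)
              shutil.copy2(self.cache, self.archive)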

  7. Stewardship of very large digital data archives

    NASA Technical Reports Server (NTRS)

    Savage, Patric

    1991-01-01

    An archive is a permanent store. There are relatively few very large digital data archives in existence. Most business records expire within five or ten years. Many kinds of business records that do have long lives are embedded in databases that are continually updated and re-issued cyclically. Also, many permanent business records are actually archived as microfilm, fiche, or optical disk images, their digital version being an operational convenience rather than an archive. The problems foreseen in stewarding the very large digital data archives that will accumulate during the mission of the Earth Observing System (EOS) are addressed, with a focus on the function of shepherding archived digital data into an endless future. Stewardship entails storing and protecting the archive and providing meaningful service to the community of users. The steward will (1) provide against loss due to physical phenomena; (2) assure that data is not lost due to storage technology obsolescence; and (3) maintain data in a current formatting methodology.
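
    The steward's first duty above, guarding against loss from physical phenomena, typically begins with fixity checking against stored checksums. A minimal sketch, assuming a tab-separated path/digest manifest format:

      # Minimal fixity check: recompute checksums and compare with a stored
      # manifest. The manifest format (path<TAB>sha256) is an assumption.
      import hashlib
      from pathlib import Path

      def sha256(path: Path) -> str:
          h = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      def verify(manifest: Path, root: Path) -> list[str]:
          """Return the files whose current checksum no longer matches."""
          damaged = []
          for line in manifest.read_text().splitlines():
              rel, expected = line.split("\t")
              if sha256(root / rel) != expected:
                  damaged.append(rel)
          return damaged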

  8. Distributing medical images with internet technologies: a DICOM web server and a DICOM java viewer.

    PubMed

    Fernàndez-Bayó, J; Barbero, O; Rubies, C; Sentís, M; Donoso, L

    2000-01-01

    With the advent of filmless radiology, it becomes important to be able to distribute radiologic images digitally throughout an entire hospital. A new approach based on World Wide Web technologies was developed to accomplish this objective. This approach involves a Web server that allows the query and retrieval of images stored in a Digital Imaging and Communications in Medicine (DICOM) archive. The images can be viewed with a small Java program, the DICOM Java Viewer, which is executed inside the Web browser. The system offers several advantages over more traditional picture archiving and communication systems (PACS): it is easy to install and maintain, is platform independent, allows images to be manipulated and displayed efficiently, and is easy to integrate with existing systems that are already making use of Web technologies. The system is user-friendly and can easily be used from outside the hospital if a security policy is in place. The simplicity and flexibility of Internet technologies make them highly preferable to the more complex PACS workstations. The system works well, especially with magnetic resonance and computed tomographic images, and can help improve and simplify interdepartmental relationships in a filmless hospital environment.
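
    The query-and-retrieve-over-the-Web idea described here was later standardized as DICOMweb; a present-day sketch of the same interaction might use the QIDO-RS search service. The server URL and patient ID below are placeholders.

      # Query a DICOM archive over HTTP in the spirit described above,
      # using the (later-standardized) DICOMweb QIDO-RS service.
      # The server URL and PatientID are placeholders.
      import requests

      QIDO = "https://pacs.example.org/dicom-web"

      resp = requests.get(
          f"{QIDO}/studies",
          params={"PatientID": "12345", "ModalitiesInStudy": "MR"},
          headers={"Accept": "application/dicom+json"},
          timeout=30,
      )
      resp.raise_for_status()
      for study in resp.json():
          # (0020,000D) is the DICOM tag for Study Instance UID.
          print(study["0020000D"]["Value"][0])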

  9. Cardiology-oriented PACS

    NASA Astrophysics Data System (ADS)

    Silva, Augusto F. d.; Costa, Carlos; Abrantes, Pedro; Gama, Vasco; Den Boer, Ad

    1998-07-01

    This paper describes an integrated system designed to provide efficient means for DICOM-compliant cardiac image archival, transmission and visualization, based on a communications backbone matching recent enabling telematic technologies such as Asynchronous Transfer Mode (ATM) and switched Local Area Networks (LANs). Within a distributed client-server framework, the system was conceived on a modality-based, bottom-up approach, aiming at ultrafast access to short-term archives and seamless retrieval of cardiac video sequences through review stations located in the outpatient referral rooms, intensive and intermediate care units and operating theaters.

  10. Digital microscopy. Bringing new technology into focus.

    PubMed

    2010-06-01

    Digital microscopy enables the scanning of microscope slides so that they can be viewed, analyzed, and archived on a computer. While the technology is not yet widely accepted by pathologists, a switch to digital microscopy systems seems to be inevitable in the near future.

  11. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

    Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system shall rely on the PESSTO Marshall to provide file data and its associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must have strong cross-interaction without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have a sturdy logic in order to perform all operations in coordination with the other PESSTO tools. MySQL replication technology and triggers are used for the synchronization of new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to the overriding of different sources, formats, management fields, storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files), but can be easily updated through a planned Archiving System Configuration Interface (ASCI).
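
    To make the replication-plus-triggers approach concrete, the sketch below creates a trigger that queues newly registered files for ingestion; the table and column names are invented for illustration, not PESSTO's actual schema (requires the mysql-connector-python driver).

      # Sketch of trigger-based synchronization in the spirit described
      # above. Schema, table, and column names are invented.
      import mysql.connector

      conn = mysql.connector.connect(
          host="localhost", user="archive", password="secret",
          database="pessto_archive",
      )
      cur = conn.cursor()

      # When the pipeline registers a new calibrated file, queue it for
      # the archiving system to ingest; MySQL replication then propagates
      # both tables to the replica in one consistent stream.
      cur.execute("""
          CREATE TRIGGER queue_new_file
          AFTER INSERT ON calibrated_files
          FOR EACH ROW
            INSERT INTO ingest_queue (file_id, queued_at)
            VALUES (NEW.id, NOW())
      """)
      conn.commit()
      cur.close()
      conn.close()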

  12. NASDA's earth observation satellite data archive policy for the earth observation data and information system (EOIS)

    NASA Technical Reports Server (NTRS)

    Sobue, Shin-ichi; Yoshida, Fumiyoshi; Ochiai, Osamu

    1996-01-01

    NASDA's new Advanced Earth Observing Satellite (ADEOS) is scheduled for launch in August 1996. ADEOS carries 8 sensors to observe earth environmental phenomena and sends their data to NASDA, NASA, and other foreign ground stations around the world. The downlink data bit rate for ADEOS is 126 MB/s and the total volume of data is about 100 GB per day. To archive and manage such a large quantity of data with high reliability and easy accessibility, it was necessary to develop a new mass storage system with a catalogue information database using advanced database management technology. The data will be archived and maintained in the Master Data Storage Subsystem (MDSS), one subsystem of NASDA's new Earth Observation Data and Information System (EOIS). The MDSS is based on a SONY ID-1 digital tape robotics system. This paper provides an overview of the EOIS system, with a focus on the Master Data Storage Subsystem and the NASDA Earth Observation Center (EOC) archive policy for earth observation satellite data.

  13. State involvement in and use of LANDSAT technology

    NASA Technical Reports Server (NTRS)

    Tessar, P. A.

    1981-01-01

    The background of state involvement in LANDSAT systems planning and the status of state LANDSAT use are reviewed. Major recommendations on data continuity; frequency and pattern of observation; state representation in program management; pointable sensors for a fully operational system; data processing systems; data pricing; data copyright; data archival; and technology transfer are highlighted. Government plans for the LANDSAT system, as reflected in the FY-1982 budget process, are also examined.

  14. Goddard Conference on Mass Storage Systems and Technologies, volume 2

    NASA Technical Reports Server (NTRS)

    Kobler, Ben (Editor); Hariharan, P. C. (Editor)

    1993-01-01

    Papers and viewgraphs from the conference are presented. Discussion topics include the IEEE Mass Storage System Reference Model, data archiving standards, high-performance storage devices, magnetic and magneto-optic storage systems, magnetic and optical recording technologies, high-performance helical scan recording systems, and low end helical scan tape drives. Additional discussion topics addressed the evolution of the identifiable unit for processing (file, granule, data set, or some similar object) as data ingestion rates increase dramatically, and the present state of the art in mass storage technology.

  15. Lessons learned in setting up and running the European copy of HST archive

    NASA Astrophysics Data System (ADS)

    Pirenne, Benoit; Benvenuti, P.; Albrecht, Rudolf; Rasmussen, B. F.

    1993-11-01

    The endeavour of the Hubble Space Telescope (HST) proved once more that factors such as high costs, extremely long preparation time, inherent risk of total failure, limited lifetime and high over-subscription rates make each scientific space mission almost always a unique event. These factors point immediately to the need to store all the data produced by spacecraft so that the scientific community can re-use them in the long term. This calls for the organization of science archives. Together with the Space Telescope Science Institute, the European Coordinating Facility developed an archive system for the HST data. This paper is about the experience gained in setting up and running the European HST Science Data Archive system. Organization, cost versus scientific return and acceptance by the scientists are among the aspects that will be covered. In particular, we will insist on the 'four-pillar' structure that all archive centers should have: a user interface, a catalogue accurately describing the content of the archive, human scientific expertise and, of course, the data themselves. Long-term prospects and problems due to technology changes will be evaluated and solutions proposed. The adaptability of the described system to other scientific space missions or ground-based observatories will be discussed.

  16. GENESIS: GPS Environmental and Earth Science Information System

    NASA Technical Reports Server (NTRS)

    Hajj, George

    1999-01-01

    This presentation reviews the GPS Environmental and Earth Science Information System (GENESIS). The objectives of GENESIS are outlined: (1) data archiving, searching and distribution for science data products derived from space-borne TurboRogue space receivers for GPS science and other ground-based GPS receivers; (2) data browsing using integrated visualization tools; (3) interactive web/Java-based data search and retrieval; (4) a data subscription service; (5) data migration from existing GPS archived data; (6) on-line help and documentation; and (7) participation in the WP-ESIP federation. The presentation reviews the products and services of GENESIS, and the technology behind the system.

  17. A basis for a visual language for describing, archiving and analyzing functional models of complex biological systems

    PubMed Central

    Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J

    2001-01-01

    Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940

  18. An Overview of the Planetary Data System Roadmap Study for 2017 - 2026

    NASA Astrophysics Data System (ADS)

    Morgan, Thomas H.; McNutt, Ralph L.; Gaddis, Lisa; Law, Emily; Beyer, Ross A.; Crombie, Kate; Ebel, Denton; Ghosh, Amitahba; Grayzeck, Edwin J.; Paganelli, Flora; Raugh, Anne C.; Stein, Thomas; Tiscareno, Matthew S.; Weber, Renee; E Banks, Maria; Powell, Kathryn

    2017-10-01

    NASA’s Planetary Data System (PDS) is the formal archive of >1.2 petabytes of data from planetary exploration, science, and research. Initiated in 1989 to address an overall lack of attention to mission data documentation, access, and archiving, the PDS has since evolved into an online collection of digital data managed and served by a federation of 6 science discipline nodes and 2 technical support nodes. Several ad-hoc mission-oriented data nodes also provide complex data interfaces and access for the duration of their missions. The new PDS Roadmap Study for 2017-2026 involved 15 planetary science community members who collectively prepared a report summarizing the results of an intensive examination of the current state of the PDS and its organization, management, practices, and data holdings (https://pds.jpl.nasa.gov/roadmap/PlanetaryDataSystemRMS17-26_20jun17.pdf). The report summarizes PDS history, its functions and characteristics, and its present form; also included are extensive references and documentary appendices. The report recognizes that as a complex evolving system, the PDS must respond to new pressures and opportunities. The report provides details on challenges now facing the PDS, 19 detailed findings and suggested remediations that could be used to respond to these findings, and a summary of the potential future of planetary data archiving. These findings cover topics such as user needs and expectations, data usability and discoverability (i.e., metadata, data access, documentation, and training), tools and file formats, use of current information technologies, and responses to increases in data volume, variety, complexity, and number of data providers. In addition, the study addresses the possibility of archiving software, laboratory data, and physical samples. Finally, the report discusses the current structure and governance of PDS and the impact of this on how archive growth, technology, and new developments are enabled and managed within the PDS. The report, with its findings, acknowledges the ongoing and expected challenges to be faced in the future, the need for maintaining an edge on the use of emerging technologies, and represents a guide for evolution of the PDS for the next decade.

  19. Internet-Based Cervical Cytology Screening System

    DTIC Science & Technology

    2007-04-01

    Recent technological advances in specimen preparation and computerized primary screening make automated approaches to cervical cancer screening possible. In addition, advances in information technology have facilitated the Internet transmission and archival processes in the clinical laboratory. Award Number: W81XWH-04-C-0083.

  20. The amino acid's backup bone - storage solutions for proteomics facilities.

    PubMed

    Meckel, Hagen; Stephan, Christian; Bunse, Christian; Krafzik, Michael; Reher, Christopher; Kohl, Michael; Meyer, Helmut Erich; Eisenacher, Martin

    2014-01-01

    Proteomics methods, especially high-throughput mass spectrometry analysis have been continually developed and improved over the years. The analysis of complex biological samples produces large volumes of raw data. Data storage and recovery management pose substantial challenges to biomedical or proteomic facilities regarding backup and archiving concepts as well as hardware requirements. In this article we describe differences between the terms backup and archive with regard to manual and automatic approaches. We also introduce different storage concepts and technologies from transportable media to professional solutions such as redundant array of independent disks (RAID) systems, network attached storages (NAS) and storage area network (SAN). Moreover, we present a software solution, which we developed for the purpose of long-term preservation of large mass spectrometry raw data files on an object storage device (OSD) archiving system. Finally, advantages, disadvantages, and experiences from routine operations of the presented concepts and technologies are evaluated and discussed.
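
    The backup/archive distinction drawn above can be made concrete: a backup duplicates data and leaves the working copy in place, while an archive moves data to long-term storage once it is inactive. A minimal sketch, with the directory layout assumed:

      # Backup vs. archive, per the distinction above. Paths are assumptions.
      import shutil
      from pathlib import Path

      def backup(raw_file: Path, backup_dir: Path) -> None:
          # Backup: duplicate the file; the working copy stays in place.
          shutil.copy2(raw_file, backup_dir / raw_file.name)

      def archive(raw_file: Path, archive_dir: Path) -> None:
          # Archive: move to long-term storage, verifying before trusting it.
          dest = archive_dir / raw_file.name
          shutil.copy2(raw_file, dest)
          if dest.stat().st_size != raw_file.stat().st_size:
              raise IOError(f"size mismatch archiving {raw_file}")
          raw_file.unlink()  # drop the active copy only after the move succeeds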

  1. Configurable technology development for reusable control and monitor ground systems

    NASA Technical Reports Server (NTRS)

    Uhrlaub, David R.

    1994-01-01

    The control monitor unit (CMU) uses configurable software technology for real-time mission command and control, telemetry processing, simulation, data acquisition, data archiving, and ground operations automation. The base technology is currently planned for the following control and monitor systems: portable Space Station checkout systems; ecological life support systems; Space Station logistics carrier system; and the ground system of the Delta Clipper (SX-2) in the Single-Stage Rocket Technology program. The CMU makes extensive use of commercial technology to increase capability and reduce development and life-cycle costs. The concepts and technology are being developed by McDonnell Douglas Space and Defense Systems for the Real-Time Systems Laboratory at NASA's Kennedy Space Center under the Payload Ground Operations Contract. A second function of the Real-Time Systems Laboratory is development and utilization of advanced software development practices.

  2. Picture archiving and communication system--Part one: Filmless radiology and distance radiology.

    PubMed

    De Backer, A I; Mortelé, K J; De Keulenaer, B L

    2004-01-01

    Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted to other and possibly remote locations over networks, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS) and the radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.

  3. Standby Power

    Science.gov Websites

    Many products draw power 24 hours a day, often without the knowledge of the consumer, unless they are unplugged.

  4. Technology and the Transformation of Archival Description

    ERIC Educational Resources Information Center

    Pitti, Daniel V.

    2005-01-01

    The emergence of computer and network technologies has presented the archival profession with daunting challenges as well as inspiring opportunities. Archivists have been actively imagining and realizing the application of advanced technologies to their professional functions and activities. Using advanced technologies, archivists have been…

  5. Complementary concept for an image archive and communication system in a cardiological department based on CD-medical, an online archive, and networking facilities

    NASA Astrophysics Data System (ADS)

    Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart

    1998-07-01

    The developments in information technologies -- computer hardware, networking and storage media -- have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements for replacing cine film. One of the major drawbacks of cine film is that it can be accessed only at a single time and place. For the four catheter laboratories in our institutions we have designed a complementary concept combining the CD-R, also called CD-medical, as a single-patient storage and exchange medium, with a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents relating to a patient, are part of an electronic patient record. The access, processing and display of documents are supported by an integrated medical application.

  6. Data management and digital delivery of analog data

    USGS Publications Warehouse

    Miller, W.A.; Longhenry, Ryan; Smith, T.

    2008-01-01

    The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in data preservation, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that access to these collections is available. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high-performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium-resolution image files. Single-frame metadata records are also created to improve access, which otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.

  7. ICI optical data storage tape: An archival mass storage media

    NASA Technical Reports Server (NTRS)

    Ruddick, Andrew J.

    1993-01-01

    At the 1991 Conference on Mass Storage Systems and Technologies, ICI Imagedata presented a paper which introduced ICI Optical Data Storage Tape, placing specific emphasis on the media characteristics and presenting initial data illustrating the archival stability of the media. The more exhaustive analysis of the chemical stability of the media that has since been carried out is covered here. Equally important, this paper also addresses archive management issues associated with, for example, the benefits of reduced rewind requirements (to accommodate tape relaxation effects) that result from careful tribology control in ICI Optical Tape media. ICI Optical Tape media was designed to meet the most demanding requirements of archival mass storage. It is envisaged that the volumetric data capacity, long-term stability and low maintenance characteristics demonstrated will have major benefits in increasing reliability and reducing the costs associated with archival storage of large data volumes.

  8. Incorporating Oracle on-line space management with long-term archival technology

    NASA Technical Reports Server (NTRS)

    Moran, Steven M.; Zak, Victor J.

    1996-01-01

    The storage requirements of today's organizations are exploding. As computers continue to escalate in processing power, applications grow in complexity and data files grow in size and number. As a result, organizations are forced to procure more and more megabytes of storage space. This paper focuses on how to expand the storage capacity of a Very Large Database (VLDB) cost-effectively within an Oracle7 data warehouse system by integrating long-term archival storage subsystems with traditional magnetic media. The Oracle architecture described in this paper was based on an actual proof of concept for a customer looking to store archived data on optical disks yet still have access to this data without user intervention. The customer had a requirement to maintain 10 years' worth of data on-line. Data less than a year old still had the potential to be updated and thus resides on conventional magnetic disks. Data older than a year is considered archived and is placed on optical disks. The ability to archive data to optical disk and still have access to that data gives the system a means to retain large amounts of readily accessible data while significantly reducing the cost of total system storage. Therefore, the cost benefits of archival storage devices can be incorporated into the Oracle storage medium and I/O subsystem without losing any of the functionality of transaction processing, while at the same time giving an organization access to all of its data.
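
    One way to express this hot/cold split in a later Oracle release is range partitioning with per-partition tablespaces. The sketch below is a hedged illustration with invented names, not the paper's actual mechanism (the Oracle7 proof of concept necessarily worked differently, since partitioning arrived in Oracle 8); it requires the python-oracledb driver.

      # Sketch of the hot/cold storage split described above: range
      # partitioning with per-partition tablespaces. All names (user,
      # DSN, table, tablespaces) are invented for illustration.
      import oracledb

      ddl = """
      CREATE TABLE measurements (
          taken_at  DATE NOT NULL,
          payload   BLOB
      )
      PARTITION BY RANGE (taken_at) (
          PARTITION p_archive VALUES LESS THAN (DATE '2024-01-01')
              TABLESPACE ts_optical,   -- read-mostly, archival-backed storage
          PARTITION p_current VALUES LESS THAN (MAXVALUE)
              TABLESPACE ts_magnetic   -- updatable, magnetic disk
      )
      """

      with oracledb.connect(user="scott", password="tiger",
                            dsn="dbhost/orclpdb") as conn:
          conn.cursor().execute(ddl)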

  9. Acceptability of picture archiving and communication system (PACS) among hospital healthcare personnel based on a unified theory of acceptance and use of technology.

    PubMed

    Ahmadi, Maryam; Mehrabi, Nahid; Sheikhtaheri, Abbas; Sadeghi, Mojtaba

    2017-09-01

    The picture archiving and communication system (PACS) is a healthcare system technology which manages medical images and integrates equipment through a network. There are some theories about the use and acceptance of technology by people to describe the behavior and attitudes of end users towards information technologies. We investigated the influential factors on users' acceptance of PACS in the military hospitals of Tehran. In this applied analytical and cross-sectional study, 151 healthcare employees of military hospitals who had experience in using the PACS system were investigated. Participants were selected by census. The following variables were considered: performance expectancy, efforts expectancy, social influence, facilitating conditions and behavioral intention. Data were gathered using a questionnaire. Its validity and reliability were approved by a panel of experts and was piloted with 30 hospital healthcare staff (Cronbach's alpha =0.91). Spearman correlation coefficient and multiple linear regression analysis were used in analyzing the data. Expected performance, efforts expectancy, social impact and facilitating conditions had a significant relationship with behavioral intention. The multiple regression analysis indicated that only performance expectancy can predict the user's behavioral intentions to use PACS technology. Performance and effort expectancies are quite influential in accepting the use of PACS in hospitals. All healthcare personnel should become aware that using such technology is necessary in a hospital. Knowing the influencing factors that affect the acceptance of using new technology can help in improving its use, especially in a healthcare system. This can improve the offered healthcare services' quality.

  10. Experimental OAI-Based Digital Library Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L. (Editor); Maly, Kurt (Editor); Zubair, Mohammad (Editor); Rusch-Feja, Diann (Editor)

    2002-01-01

    The objective of the Open Archives Initiative (OAI) is to develop a simple, lightweight framework to facilitate the discovery of content in distributed archives (http://www.openarchives.org). The focus of the workshop held at the 5th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2001) was to bring together researchers in the area of digital libraries who are building OAI-based systems, so that they could share their experiences, the problems they are facing, and the approaches they are taking to address them. The workshop consisted of invited talks from well-established researchers working on building OAI-based digital library systems, along with short paper presentations.

  11. Automated search and retrieval of information from imaged documents using optical correlation techniques

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-10-01

    Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.

  12. SpaceOps 1992: Proceedings of the Second International Symposium on Ground Data Systems for Space Mission Operations

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Second International Symposium featured 135 oral presentations in these 12 categories: Future Missions and Operations; System-Level Architectures; Mission-Specific Systems; Mission and Science Planning and Sequencing; Mission Control; Operations Automation and Emerging Technologies; Data Acquisition; Navigation; Operations Support Services; Engineering Data Analysis of Space Vehicle and Ground Systems; Telemetry Processing, Mission Data Management, and Data Archiving; and Operations Management. Topics focused on improvements in the productivity, effectiveness, efficiency, and quality of mission operations, ground systems, and data acquisition. Also emphasized were accomplishments in management of human factors; use of information systems to improve data retrieval, reporting, and archiving; design and implementation of logistics support for mission operations; and the use of telescience and teleoperations.

  13. CD-based image archival and management on a hybrid radiology intranet.

    PubMed

    Cox, R D; Henri, C J; Bret, P M

    1997-08-01

    This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scalable and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data are made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.

  14. The Power of Imaging.

    ERIC Educational Resources Information Center

    Haapaniemi, Peter

    1990-01-01

    Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to tiny fraction of space required by originals and discusses current applications. Highlights include image processing system at National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…

  15. An open, interoperable, and scalable prehospital information technology network architecture.

    PubMed

    Landman, Adam B; Rokos, Ivan C; Burns, Kevin; Van Gelder, Carin M; Fisher, Roger M; Dunford, James V; Cone, David C; Bogucki, Sandy

    2011-01-01

    Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.

  16. Picture Archiving And Communication Systems (PACS): Introductory Systems Analysis Considerations

    NASA Astrophysics Data System (ADS)

    Hughes, Simon H. C.

    1983-05-01

    Two fundamental problems face any hospital or radiology department that is thinking about installing a Picture Archiving and Communications System (PACS). First, though the need for PACS already exists, much of the relevant technology is just beginning to be developed. Second, the requirements of each hospital are different, so that any attempts to market a single PACS design for use in large numbers of hospitals are likely to meet with the same problems as were experienced with general-purpose Hospital Information Systems. This paper outlines some of the decision processes involved in arriving at specifications for each module of a PACS and indicates design principles which should be followed in order to meet individual hospital requirements, while avoiding the danger of short-term systems obsolescence.

  17. Hubble Space Telescope: the new telemetry archiving system

    NASA Astrophysics Data System (ADS)

    Miebach, Manfred P.

    2000-07-01

    The Hubble Space Telescope (HST), the first of NASA's Great Observatories, was launched on April 24, 1990. The HST was designed for a minimum fifteen-year mission, with on-orbit servicing by the Space Shuttle System planned at approximately three-year intervals. Major changes to the HST ground system were implemented for the third servicing mission in December 1999. The primary objectives of the ground system re-engineering effort, a project called 'Vision 2000 Control Center System (CCS),' are to reduce both development and operating costs significantly for the remaining years of HST's lifetime. Development costs are reduced by providing a more modern hardware and software architecture and utilizing commercial off-the-shelf (COTS) products wherever possible. Part of CCS is a Space Telescope Engineering Data Store, the design of which is based on current Data Warehouse technology. The Data Warehouse (Red Brick), as implemented in the CCS ground system that operates and monitors the Hubble Space Telescope, represents the first use of a commercial Data Warehouse to manage engineering data. The purpose of this data store is to provide a common data source of telemetry data for all HST subsystems. This data store will become the engineering data archive and will provide a queryable database for the user to analyze HST telemetry. Access to the engineering data in the Data Warehouse is platform-independent from an office environment using commercial standards (Unix, Windows 98/NT). The latest Internet technology is used to reach the HST engineering community: a Web-based user interface allows easy access to the data archives. This paper will provide a CCS system overview and will illustrate some of the CCS telemetry capabilities, in particular the use of the new Telemetry Archiving System. Vision 2000 is an ambitious project, but one that is well under way. It will allow the HST program to realize reduced operations costs for the Third Servicing Mission and beyond.
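
    A queryable telemetry store of the kind described invites simple time-window queries per channel. The sketch below runs one against a hypothetical table layout, with SQLite standing in for the actual warehouse; the mnemonic, columns, and file name are invented.

      # Sketch of querying an engineering-telemetry store like the one
      # described above. Table, columns, and mnemonic are hypothetical,
      # and SQLite stands in for the commercial warehouse.
      import sqlite3

      conn = sqlite3.connect("telemetry.db")
      rows = conn.execute(
          """
          SELECT sample_time, value
          FROM telemetry
          WHERE mnemonic = ?            -- e.g. a battery temperature channel
            AND sample_time BETWEEN ? AND ?
          ORDER BY sample_time
          """,
          ("TBATT1", "1999-12-01T00:00:00", "1999-12-02T00:00:00"),
      ).fetchall()
      for t, v in rows:
          print(t, v)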

  18. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports on developments in programs managed by the JPL Office of Telecommunications and Data Acquisition (TDA) are provided. Topics covered include: DSN advanced systems (tracking and ground-based navigation; communications, spacecraft-ground; and station control and system technology) and DSN systems implementation (capabilities for existing projects; capabilities for new projects; TDA program management and analysis; and Goldstone solar system radar).

  19. Operating tool for a distributed data and information management system

    NASA Astrophysics Data System (ADS)

    Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.

    2002-07-01

    The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS successfully employs the newest technologies within its services. This paper presents the solution taken to simplify operations tasks for this large and distributed system.

  20. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    ERIC Educational Resources Information Center

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  1. Data systems and computer science programs: Overview

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  2. Government Information Quarterly. Volume 7, no. 2: National Aeronautics and Space Administration Scientific and Technical Information Programs. Special issue

    NASA Technical Reports Server (NTRS)

    Hernon, Peter (Editor); Mcclure, Charles R. (Editor); Pinelli, Thomas E. (Editor)

    1990-01-01

    NASA scientific and technical information (STI) programs are discussed. Topics include management of information in a research and development agency, the new space and Earth science information systems at NASA's archive, scientific and technical information management, and technology transfer of NASA aerospace technology to other industries.

  3. Evaluation of DVD-R for Archival Applications

    NASA Technical Reports Server (NTRS)

    Martin, Michael D.; Hyon, Jason J.

    2000-01-01

    For more than a decade, CD-ROM and CD-R have provided an unprecedented level of reliability, low cost and cross-platform compatibility to support federal data archiving and distribution efforts. However, it should be remembered that years of effort were required to achieve the standardization that has supported the growth of the CD industry. Incompatibilities in the interpretation of the ISO-9660 standard on different operating systems had to be dealt with, and the imprecise specifications in the Orange Book Part II and Part III led to incompatibilities between CD-R media and CD-R recorders. Some of these issues were presented by the authors at Optical Data Storage '95. The major current problem with the use of CD technology is the growing volume of digital data that needs to be stored. CD-ROM collections of hundreds of volumes and CD-R collections of several thousand volumes are becoming almost too cumbersome to be useful. The emergence of DVD-Recordable (DVD-R) technology promises to reduce the number of discs required for archive applications by a factor of seven while providing improved reliability. It is important to identify problem areas for DVD-R media and provide guidelines to manufacturers, file system developers and users in order to provide reliable data storage and interchange. The Data Distribution Laboratory (DDL) at NASA's Jet Propulsion Laboratory began its evaluation of DVD-R technology in early 1998. The initial plan was to obtain a DVD recorder for preliminary testing, deploy reader hardware to user sites for compatibility testing, evaluate the quality and longevity of DVD-R media, and develop proof-of-concept archive collections to test the reliability and usability of DVD-R media and jukebox hardware.

  4. Life Sciences Data Archive (LSDA) in the Post-Shuttle Era

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre

    2009-01-01

    Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management, the systematic use and reuse of information, experience, and expertise to achieve a specific goal, are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types and geographical locations. These data were largely unknown and unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are addressing these needs through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and be responsive to inquiries from the science communities.

  5. New Technology Changing The Face of Mobile Seismic Networks

    NASA Astrophysics Data System (ADS)

    Brisbourne, A.; Denton, P.; Seis-Uk

    SEIS-UK, a seismic equipment pool and data management facility run by a consortium of four UK universities (Leicester, Leeds, Cambridge and Royal Holloway, London), completed its second phase in 2001. To complement the existing broadband equipment pool, which has been deployed to full capacity to date, the consortium undertook a tender evaluation process for low-power, lightweight sensors and recorders for use on both controlled-source and passive seismic experiments. The preferred option, selected by the consortium, was the Guralp CMG-6TD system, with 150 systems ordered. The CMG-6TD system is a new concept in temporary seismic equipment. A 30 s - 100 Hz force-feedback sensor, an integral 24-bit digitiser and 3-4 Gbyte of solid-state memory are all housed in a single unit. Use of the most recent technologies has kept the power consumption below 1 W and the weight to 3.5 kg per unit. The disk-swap procedure for retrieving data in the field has been superseded by a fast data download technique using FireWire technology. This allows for rapid station servicing, essential when 150 stations are in use, and also ensures the environmental integrity of the system by removing the requirement for a disk access port and an environmentally exposed data disk. The system therefore meets the criteria for controlled-source and passive seismic experiments: (1) the single-unit concept and low weight are designed for rapid deployment on short-term projects; (2) the low power consumption reduces the power-supply requirements, facilitating deployment; (3) the low self-noise and bandwidth of the sensor make it applicable to passive experiments involving natural sources. Further to this acquisition process, in collaboration with external groups, the SEIS-UK data management procedures have been streamlined with the integration of Guralp GCF format data into the PASSCAL PDB software. This allows for rapid dissemination of field data and the production of archive-ready datasets, reducing the time between field recording and data archiving. The archiving procedure for SEIS-UK datasets has been established, with data from experiments carried out with the broadband equipment already in the permanent continuous data archive at the IRIS DMC.

  6. Acceptability of picture archiving and communication system (PACS) among hospital healthcare personnel based on a unified theory of acceptance and use of technology

    PubMed Central

    Ahmadi, Maryam; Mehrabi, Nahid; Sheikhtaheri, Abbas; Sadeghi, Mojtaba

    2017-01-01

    Background and aim: The picture archiving and communication system (PACS) is a healthcare system technology which manages medical images and integrates equipment through a network. There are some theories about the use and acceptance of technology by people to describe the behavior and attitudes of end users towards information technologies. We investigated the influential factors on users’ acceptance of PACS in the military hospitals of Tehran. Methods: In this applied analytical and cross-sectional study, 151 healthcare employees of military hospitals who had experience in using the PACS system were investigated. Participants were selected by census. The following variables were considered: performance expectancy, efforts expectancy, social influence, facilitating conditions and behavioral intention. Data were gathered using a questionnaire. Its validity and reliability were approved by a panel of experts and was piloted with 30 hospital healthcare staff (Cronbach’s alpha =0.91). Spearman correlation coefficient and multiple linear regression analysis were used in analyzing the data. Results: Expected performance, efforts expectancy, social impact and facilitating conditions had a significant relationship with behavioral intention. The multiple regression analysis indicated that only performance expectancy can predict the user’s behavioral intentions to use PACS technology. Conclusion: Performance and effort expectancies are quite influential in accepting the use of PACS in hospitals. All healthcare personnel should become aware that using such technology is necessary in a hospital. Knowing the influencing factors that affect the acceptance of using new technology can help in improving its use, especially in a healthcare system. This can improve the offered healthcare services’ quality. PMID:29038717

  7. Applied Information Systems Research Program (AISRP) Workshop 3 meeting proceedings

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The third workshop of the Applied Information Systems Research Program (AISRP) met at the University of Colorado's Laboratory for Atmospheric and Space Physics in August 1993. The presentations were organized into four sessions: Artificial Intelligence Techniques; Scientific Visualization; Data Management and Archiving; and Research and Technology.

  8. Physician Perceptions of Singleview: A Picture Archiving and Communications System (PACS) Federation Solution

    ERIC Educational Resources Information Center

    Kolowitz, Brian J.

    2012-01-01

    Information Technology is changing the face of medicine. Prior research has shown many physicians believe access to the complete Personal Health Record (PHR) would be beneficial to patient care. Many times these medical records are distributed across system and organizational boundaries. International standards committees, healthcare…

  9. You Can See Film through Digital: A Report from Where the Archiving of Motion Picture Film Stands

    NASA Astrophysics Data System (ADS)

    Tochigi, Akira

    In recent years, digital technology has brought drastic change to the archiving of motion picture film. By collecting digital media as well as film, many conventional film archives have transformed themselves into moving image archives or audiovisual archives. Digital technology has also expanded the possibilities for restoring motion picture film, compared with conventional photochemical (analog) restoration. This paper first redefines some fundamental terms regarding the archiving of motion picture film and discusses the conditions that need consideration for film archiving in Japan. With a few examples of recent restoration projects conducted by the National Film Center of the National Museum of Modern Art, Tokyo, the paper then clarifies new challenges inherent in digital restoration and stresses the importance of a better appreciation of motion picture film.

  10. Use of a thin-section archive and enterprise 3D software for long-term storage of thin-slice CT data sets.

    PubMed

    Meenan, Christopher; Daly, Barry; Toland, Christopher; Nagy, Paul

    2006-01-01

    Rapid advances are changing the technology and applications of multidetector computed tomography (CT) scanners. The major increase in data associated with this new technology, however, breaks most commercial picture archiving and communication system (PACS) architectures by preventing them from delivering data in real time to radiologists and outside clinicians. We proposed a phased model for 3D workflow, installed a thin-section archive, and measured thin-slice data storage over a period of 5 months. A mean of 1,869 CT studies were stored per month, with an average of 643 images per study and a mean total volume of 588 GB/month. We also surveyed 48 radiologists to determine diagnostic use, impressions of thin-slice value, and requirements for retention times. The majority of radiologists thought thin-slice data were helpful for diagnosis and regularly used the application. Permanent storage of thin-slice CT is likely to become best practice and a mission-critical pursuit for the health care enterprise.
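
    As a quick consistency check on these figures (our arithmetic, not the paper's): the reported monthly volume, study count, and images per study agree with uncompressed 512 x 512, 16-bit CT slices.

      \[
        \frac{588\ \mathrm{GB/month}}{1869\ \mathrm{studies/month}} \approx 315\ \mathrm{MB/study},
        \qquad
        \frac{315\ \mathrm{MB}}{643\ \mathrm{images}} \approx 0.49\ \mathrm{MB/image}
        \approx 512 \times 512 \times 2\ \mathrm{bytes}.
      \]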

  11. Firefly: embracing future web technologies

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Goldina, T.; Joliet, E.; Ly, L.; Mi, W.; Wang, C.; Zhang, Lijun; Ciardi, D.; Dubois-Felsmann, G.

    2016-07-01

    At IPAC/Caltech, we have developed the Firefly web archive and visualization system. Used in production for the last eight years in many missions, Firefly gives the scientist significant capabilities to study data. Firefly provided the first completely web-based FITS viewer, as well as a growing set of tabular and plotting visualizers. Further, it will be used for the science user interface of the LSST telescope, which goes online in 2021. Firefly must meet the needs of archive access and visualization for the 2021 LSST telescope and must serve astronomers beyond the year 2030. Recently, our team has faced the fact that the technology behind the Firefly software was becoming obsolete. We were searching for ways to utilize the current breakthroughs in maintaining the stability, testability, speed, and reliability of large web applications, which Firefly exemplifies. In the last year, we have ported Firefly to cutting-edge web technologies. Embarking on this massive overhaul is no small feat, to say the least. Choosing the technologies that will maintain a forward trajectory in a future development project is always hard and often overwhelming. When a team must port 150,000 lines of code for a production-level product, there is little room to make poor choices. This paper will give an overview of the most modern web technologies and the lessons learned in our conversion from a GWT-based system to a React/Redux-based system.

  12. Digital Archive Issues from the Perspective of an Earth Science Data Producer

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.

    2004-01-01

    Contents include the following: Introduction. A Producer Perspective on Earth Science Data. Data Producers as Members of a Scientific Community. Some Unique Characteristics of Scientific Data. Spatial and Temporal Sampling for Earth (or Space) Science Data. The Influence of the Data Production System Architecture. The Spatial and Temporal Structures Underlying Earth Science Data. Earth Science Data File (or Relation) Schemas. Data Producer Configuration Management Complexities. The Topology of Earth Science Data Inventories. Some Thoughts on the User Perspective. Science Data User Communities. Spatial and Temporal Structure Needs of Different Users. User Spatial Objects. Data Search Services. Inventory Search. Parameter (Keyword) Search. Metadata Searches. Documentation Search. Secondary Index Search. Print Technology and Hypertext. Inter-Data Collection Configuration Management Issues. An Archive View. Producer Data Ingest and Production. User Data Searching and Distribution. Subsetting and Supersetting. Semantic Requirements for Data Interchange. Tentative Conclusions. An Object Oriented View of Archive Information Evolution. Scientific Data Archival Issues. A Perspective on the Future of Digital Archives for Scientific Data. References Index for this paper.

  13. Electronic patient record and archive of records in Cardio.net system for telecardiology.

    PubMed

    Sierdziński, Janusz; Karpiński, Grzegorz

    2003-01-01

    In modern medicine, a well-structured patient data set, fast access to it, and reporting capabilities have become important issues. With the dynamic development of information technology (IT), this issue is addressed by building electronic patient record (EPR) archives. We then obtain fast access to patient data, diagnostic and treatment protocols, etc. This results in more efficient, better, and cheaper treatment. The aim of this work was to design a uniform Electronic Patient Record, implemented in the Cardio.net system for telecardiology, allowing cooperation among regional hospitals and reference centers. It includes questionnaires for demographic data and questionnaires supporting the doctor's work (initial diagnosis, final diagnosis, history and physical, ECG at discharge, applied treatment, additional tests, drugs, and daily and periodical reports). A browser is implemented in the EPR archive to facilitate data retrieval. Several tools were used to create the EPR and the EPR archive: XML, PHP, JavaScript, and MySQL. A separate question is the security of data on the WWW server; security is ensured via the Secure Sockets Layer (SSL) protocol and other tools. The EPR in the Cardio.net system is a module enabling the joint work of many physicians and communication among different medical centers.
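
    Purely as an illustration (the paper does not publish its schema), a minimal EPR layout of the kind such a PHP/MySQL archive might use could look like the sketch below, written with Python's built-in sqlite3 so it is self-contained; all table and column names are hypothetical.

        import sqlite3

        conn = sqlite3.connect(":memory:")  # stand-in for the MySQL archive
        conn.executescript("""
        CREATE TABLE patient (
            patient_id  INTEGER PRIMARY KEY,
            name        TEXT NOT NULL,
            birth_date  TEXT                 -- demographic questionnaire data
        );
        CREATE TABLE record (
            record_id   INTEGER PRIMARY KEY,
            patient_id  INTEGER REFERENCES patient(patient_id),
            center      TEXT,                -- regional hospital or reference center
            initial_dx  TEXT,                -- initial diagnosis questionnaire
            final_dx    TEXT,                -- final diagnosis questionnaire
            ecg_xml     TEXT                 -- e.g., ECG at discharge, stored as XML
        );
        """)
        conn.execute("INSERT INTO patient VALUES (1, 'A. Patient', '1950-03-14')")
        conn.execute("INSERT INTO record VALUES (1, 1, 'Regional Hospital A', "
                     "'suspected MI', 'STEMI', '<ecg>...</ecg>')")
        # The EPR browser's retrieval step reduces to a simple join
        for row in conn.execute(
                "SELECT name, final_dx FROM record JOIN patient USING (patient_id)"):
            print(row)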

  14. IFLA General Conference, 1984. Management and Technology Division. Section on Information Technology and Joint Meeting of the Round Table Audiovisual Media, the International Association for Sound Archives, and the International Association for Music Libraries. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Six papers on information technology, the development of information systems for Third World countries, handling of sound recordings, and library automation were presented at the 1984 IFLA conference. They include: (1) "Handling, Storage and Preservation of Sound Recordings under Tropical and Subtropical Climatic Conditions" (Dietrich…

  15. The NEEDS Data Base Management and Archival Mass Memory System

    NASA Technical Reports Server (NTRS)

    Bailey, G. A.; Bryant, S. B.; Thomas, D. T.; Wagnon, F. W.

    1980-01-01

    A Data Base Management System and an Archival Mass Memory System are being developed that will have a 10^12-bit on-line and a 10^13-bit off-line storage capacity. The integrated system will accept packetized data from the data staging area at 50 Mbps, create a comprehensive directory, provide for file management, record the data, perform error detection and correction, accept user requests, retrieve the requested data files and provide the data to multiple users at a combined rate of 50 Mbps. Stored and replicated data files will have a bit error rate of less than 10^-9 even after ten years of storage. The integrated system will be demonstrated to prove the technology late in 1981.
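
    For scale (my arithmetic, not the paper's): filling the 10^12-bit on-line store at the quoted 50 Mbps takes about five and a half hours, and at a 10^-9 bit error rate a full read of that store would expect on the order of a thousand raw bit errors, which is why error detection and correction is part of the design.

        capacity_bits = 1e12
        rate_bps = 50e6          # 50 Mbps combined ingest/retrieval rate
        ber = 1e-9               # worst-case bit error rate after storage

        fill_seconds = capacity_bits / rate_bps
        expected_errors = capacity_bits * ber

        print(f"fill time: {fill_seconds / 3600:.1f} h")              # ~5.6 h
        print(f"raw bit errors per full read: < {expected_errors:.0f}")  # ~1000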

  16. A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.

    PubMed

    Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos

    2016-01-01

    Web-based technologies have been increasingly used in picture archiving and communication systems (PACS), in services related to the storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging, due to the complexity of dealing with huge volumes of data and the associated bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained show that it outperforms conventional approaches, reducing both remote access latency and the required cache storage space.
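
    The paper does not spell out its cache algorithm; as a hedged sketch of the general idea, splitting a DICOM study into independently cacheable fragments and evicting the least-recently-used ones could look like this (fragment granularity, sizes, and the remote-fetch function are assumptions):

        from collections import OrderedDict

        class FragmentCache:
            """LRU cache over study fragments, so hot slices stay local
            while cold ones remain only in the cloud archive."""
            def __init__(self, capacity_bytes):
                self.capacity = capacity_bytes
                self.used = 0
                self.store = OrderedDict()   # (study_uid, frag_no) -> bytes

            def get(self, key):
                if key in self.store:
                    self.store.move_to_end(key)      # mark as recently used
                    return self.store[key]
                data = fetch_from_cloud(key)         # hypothetical remote fetch
                self.put(key, data)
                return data

            def put(self, key, data):
                while self.used + len(data) > self.capacity and self.store:
                    _, old = self.store.popitem(last=False)  # evict LRU fragment
                    self.used -= len(old)
                self.store[key] = data
                self.used += len(data)

        def fetch_from_cloud(key):
            # stand-in for a DICOMweb/WADO retrieval from the outsourced archive
            return b"\x00" * (512 * 1024)

        cache = FragmentCache(capacity_bytes=64 * 2**20)
        fragment = cache.get(("1.2.840.113619.2.55", 0))   # miss, then cached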

  17. The System for Quick Search of the Astronomical Objects and Events in the Digital Plate Archives.

    NASA Astrophysics Data System (ADS)

    Sergeev, A. V.; Sergeeva, T. P.

    From the middle of the XIX century, observatories all over the world have accumulated about three million astronomical plates containing unique information about the Universe, which cannot be obtained or restored with the help of even the newest facilities and technologies, but which may be useful for many modern astronomical investigations. The threat of loss of astronomical plate archives, caused by economic, technical, or other factors, has confronted the world astronomical community with a problem: the preservation of the unique information kept on those plates. The problem can be solved by transforming the information on the plates into digital form and keeping it on electronic media. We have begun the creation of a system for the quick search and analysis of astronomical events and objects in the digital plate archive of the Main Astronomical Observatory of the NAS of Ukraine. Connecting the system to the Internet will allow a remote user (astronomer or observer) to access the digital plate archive and work with it. To make this work efficient, a plate database (a list of the plates with all information about them, together with access software) is being prepared. The modular structure of the system's basic software and a standard format for the plate image files allow the future development of problem-oriented software for special astronomical research.

  18. Cost-effectiveness prospects of picture archiving and communication systems.

    PubMed

    Hindel, R; Preger, W

    1988-01-01

    PAC (picture archiving and communication) systems are widely discussed and promoted as the organizational solution to digital image management in a radiology department. For approximately two decades, digital imaging has increasingly been used for such diagnostic modalities as CT, DSA, MRI, DR (digital radiography), and others. PACS are seen as a step toward high-technology integration and more efficient management. Although the acquisition of such technology is investment intensive, there are well-founded projections that prolonged operation will prove cost justified. Such justification can only partly be derived from cost reduction through PAC with respect to present department management; the major justification is preparation for future economic pressures, which could make the survival of a department without modern technology difficult. Especially in the United States, the political climate favors 'competitive medicine' and reduced government support. Seen in this context, PACS promises to speed the transition of health care services into a business with tight resource management, cost accounting, and marketing. The following paper analyzes cost and revenue in a typical larger radiology department, projects various scenarios of cost reduction by means of digital technology, and concludes with cautious optimism that the investment expenses for a PACS will be justified in the near future by prudent utilization of high technology.

  19. Using technology assessment as the picture archiving and communication system spreads outside radiology to the enterprise.

    PubMed

    Maliff, R P; Launders, J

    2000-05-01

    Picture archiving and communication systems (PACS) are being implemented within radiology departments, and many facilities are entering the next stage of PACS use by deploying PACS to departments outside of radiology and to other facilities located at a distance. Many PACS vendors and department administrators have based cost-justification analyses on the anticipated savings from expanding PACS to these areas. However, many of these cost-savings analyses can be highly suspect in their assumptions and findings. Technology assessment (TA) at the hospital/health system level is an organized, systematic approach to examining the efficacy of a technology in relation to the health system's mission and clinical needs. It can be an organized and unifying approach to aid in the distribution of limited capital resources. As extra-radiology PACS deployment is a costly endeavor, TA may be used to plan for PACS implementation throughout the enterprise. In many organizations, PACS is thought of as a radiology domain, as its first uses were centered on this image-producing service. Now, as PACS technology spreads to other service areas, such as cardiology, dermatology, pathology, orthopedics, obstetrics, etc., the need to incorporate other viewpoints in a system-based PACS is necessary to avoid having independent PACS that may duplicate archives and may not communicate with each other. Meeting the diverse PACS needs of clinical services can be a challenging task; a TA program has been demonstrated to effectively handle the clinical needs, demands, and timeframes of PACS planning and support throughout hospitals and health systems. A hospital-based TA program can assist health care organizations to present PACS as a system-wide need and program rather than a radiology-based program consuming the capital budget. Submitting PACS to the TA review process can identify essential elements in planning and help avoid many of the pitfalls of PACS implementation and operations. Thorough cost and/or return-on-investment analyses, phasing decisions, workflow re-engineering, and outcomes assessment programs are a few of the issues that a TA program can address to help in the transition to a complete electronic image environment. The TA process includes clinician selection, evaluation criteria and their selection for technologies under review, a policy for review/authorization/denial, and measurement of expected outcomes.

  20. A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System

    NASA Astrophysics Data System (ADS)

    Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.

    2010-05-01

    The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.

  1. GEOSPATICAL INFORMATION TECHNOLOGY AND INFORMATION MANAGEMENT QUALITY ASSURANCE

    EPA Science Inventory

    Most of the geospatial data in use originate electronically. As a result, these data are acquired, stored, transformed, processed, presented, and archived electronically. The organized system of computer hardware and software used in these processes is called an Informatio...

  2. Mapping the Socio-Technical Complexity of Australian Science: From Archival Authorities to Networks of Contextual Information

    ERIC Educational Resources Information Center

    McCarthy, Gavan; Evans, Joanne

    2007-01-01

    This article examines the evolution of a national register of the archives of science and technology in Australia and the related development of an archival informatics focused initially on people and their relationships to archival materials. The register was created in 1985 as an in-house tool for the Australian Science Archives Project of the…

  3. Scholarly Communication and Information Technology: Exploring the Impact of Changes in the Research Process on Archives. Rand Reprints.

    ERIC Educational Resources Information Center

    Michelson, Avra; Rothenberg, Jeff

    1993-01-01

    The report considers the interaction of trends in information technology and trends in research practices and the policy implications for archives. The information is divided into 4 sections. The first section, an "Overview of Information Technology Trends," discusses end-user computing, which includes ubiquitous computing, end-user…

  4. Retrospective Analysis of Technological Literacy of K-12 Students in the USA

    ERIC Educational Resources Information Center

    Eisenkraft, Arthur

    2010-01-01

    Assessing technological literacy in the USA will require a large expenditure of resources. While these important initiatives are taking place, it is useful to analyze existing archival data to get a sense of students' understanding of technology. Such archival data exists from the entries submitted to the Toshiba/NSTA ExploraVisions competition…

  5. Superfund Public Information System (SPIS), January 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1999-01-01

    The Superfund Public Information System (SPIS) on CD-ROM contains Superfund data for the United States Environmental Protection Agency. The Superfund data is a collection of three databases: Records of Decision (RODS); Comprehensive Environmental, Response, Compensation, and Liability Information System (CERCLIS); and Archive (NFRAP). Descriptions of these databases and CD contents are listed below. Data content: The CD contains the complete text of the official ROD documents signed and issued by EPA from fiscal years 1982--1996; 147 RODs for fiscal year 1997; and seven RODs for fiscal year 1998. The CD also contains 89 Explanation of Significant Difference (ESD) documents, as well as 48 ROD Amendments. CERCLIS and Archive (NFRAP) data is through January 19, 1999. RODS is the Records Of Decision System. RODS is used to track site clean-ups under the Superfund program to justify the type of treatment chosen at each site. RODS contains information on technology justification, site history, community participation, enforcement activities, site characteristics, scope and role of response action, and remedy. Explanation of Significant Differences (ESDs) are also available on the CD. CERCLIS is the Comprehensive Environmental Response, Compensation, and Liability Information System. It is the official repository for all Superfund site and incident data. It contains comprehensive information on hazardous waste sites, site inspections, preliminary assessments, and remedial status. The system is sponsored by the EPA's Office of Emergency and Remedial Response, Information Management Center. Archive (NFRAP) consists of hazardous waste sites that have no further remedial action planned; only basic identifying information is provided for archive sites. The sites found in the Archive database were originally in the CERCLIS database, but were removed beginning in the fall of 1995.

  6. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Astrophysics Data System (ADS)

    Pilone, D.

    2016-12-01

    As new, high data rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x to over 300 PB by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: highly scalable and extensible ingest and archive of EOSDIS data; going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in; rethinking data distribution and approaches to analysis in a cloud environment; and incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6,000 data products across various science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning its multi-petabyte-scale archive of Earth science data products.
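
    As a hedged illustration of the "serverless" ingest pattern mentioned above (not NASA's actual pipeline; the bucket name is hypothetical, and the event shape follows the standard AWS S3-notification convention), a minimal Lambda-style handler might copy each newly delivered granule into an archive bucket and record a fixity checksum:

        import hashlib
        import boto3

        s3 = boto3.client("s3")
        ARCHIVE_BUCKET = "example-archive"   # hypothetical destination bucket

        def handler(event, context):
            """Triggered by an S3 object-created notification per granule."""
            for rec in event["Records"]:
                src_bucket = rec["s3"]["bucket"]["name"]
                key = rec["s3"]["object"]["key"]
                # Small granules only; a real pipeline would stream large files
                body = s3.get_object(Bucket=src_bucket, Key=key)["Body"].read()
                digest = hashlib.sha256(body).hexdigest()   # fixity at ingest
                s3.put_object(
                    Bucket=ARCHIVE_BUCKET,
                    Key=key,
                    Body=body,
                    Metadata={"sha256": digest},
                )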

  7. A Three-fold Outlook of the Ultra-Efficient Engine Technology Program Office (UEET)

    NASA Technical Reports Server (NTRS)

    Graham, La Quilia E.

    2004-01-01

    The Ultra-Efficient Engine Technology (UEET) Office at NASA Glenn Research Center is a part of the Aeronautics Directorate. Its vision is to develop and hand off revolutionary turbine engine propulsion technologies that will enable future-generation vehicles over a wide range of flight speeds. There are seven different technology area projects of UEET. During my tenure at NASA Glenn Research Center, my assignment was to assist three different areas of UEET simultaneously. I worked with Kathy Zona in Education Outreach, Lynn Boukalik in Knowledge Management, and Denise Busch in Financial Management. All of my tasks were related to the business side of UEET. As an intern with Education Outreach, I created a word search to partner with an exhibit of a turbine engine developed out of the UEET office. This exhibit is a portable model that is presented to students of varying ages. The word search complies with the National Standards for Education, which are part of every science, engineering, and technology teacher's curriculum. I also updated a conference planning/workshop Excel spreadsheet for the UEET Office. I collected and input facility overviews from various venues, both on and off site, to determine where to hold upcoming conferences. I then documented which facilities were compliant with the Federal Emergency Management Agency's (FEMA) Hotel and Motel Fire Safety Act of 1990. The second area in which I worked was Knowledge Management. UEET maintains a large online knowledge management system with extensive documentation that continually needs reviewing, updating, and archiving. Knowledge management is the ability to bring individual or team knowledge to an organizational level so that the information can be stored, shared, reviewed, and archived. Livelink and a secure server are the knowledge management systems that UEET utilizes. Through these systems, I was able to obtain the documents needed for archiving. My assignment was to obtain intellectual property, including reports, presentations, and any other documents related to the project. My next task was to document the author, date of creation, and all other properties of each document. To archive these documents I worked extensively with Microsoft Excel. The third area in which I worked was Financial Management, where I was introduced to different financial accounting systems such as the SAP business accounting system. I also learned the best ways to present financial data and shadowed my mentor as she presented financial data to both UEET's project management and the Resources Analysis and Management Office (RAMO). I analyzed the June 2004 financial data of UEET and used Microsoft Excel to input the results of the data. This process made it easier to present the full cost of the project in the month of June. In addition, I assisted in the end-of-year 2003 reconciliation of purchases for UEET.

  8. Goddard Conference on Mass Storage Systems and Technologies, Volume 1

    NASA Technical Reports Server (NTRS)

    Kobler, Ben (Editor); Hariharan, P. C. (Editor)

    1993-01-01

    Copies of nearly all of the technical papers and viewgraphs presented at the Goddard Conference on Mass Storage Systems and Technologies held in Sep. 1992 are included. The conference served as an informational exchange forum for topics primarily relating to the ingestion and management of massive amounts of data and the attendant problems (data ingestion rates now approach the order of terabytes per day). Discussion topics include the IEEE Mass Storage System Reference Model, data archiving standards, high-performance storage devices, magnetic and magneto-optic storage systems, magnetic and optical recording technologies, high-performance helical scan recording systems, and low end helical scan tape drives. Additional topics addressed the evolution of the identifiable unit for processing purposes as data ingestion rates increase dramatically, and the present state of the art in mass storage technology.

  9. Reusing Information Management Services for Recommended Decadal Study Missions That Facilitate Aerosol and Cloud Studies

    NASA Astrophysics Data System (ADS)

    Alcott, G.; Kempler, S.; Lynnes, C.; Leptoukh, G.; Vollmer, B.; Berrick, S.

    2008-12-01

    NASA's Earth Sciences Division (ESD), and its preceding Earth science organizations, has made great investments in the development and maintenance of data management systems, as well as information technologies, for the purpose of maximizing the use and usefulness of NASA-generated Earth science data. Earth science information systems, evolving with the maturation and implementation of advancing technologies, reside at NASA data centers, known as Distributed Active Archive Centers (DAACs). With information management system infrastructure in place, and system data and user services already developed and operational, only very small delta costs are required to fully support the data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC), one of NASA's DAACs, and their potential reuse for these future missions. Through 14 years of working with instrument teams and the broader science community, GES DISC personnel have developed expertise in atmospheric, water cycle, and atmospheric modeling data and information services, as well as in Earth science missions, information system engineering, operations, and user services, and have built a series of modular, reusable data management components currently in use in several projects. The knowledge and experience gained at the GES DISC lend themselves to providing science-driven information systems in the areas of aerosols, clouds, and atmospheric chemicals to be measured by the recommended Decadal Survey missions. Available reusable capabilities include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive, aka S4PA), data processing (S4 Processor for Measurements, aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. In addition, recent enhancements, such as Open Geospatial Consortium (OGC) interoperability implementations and data fusion prototypes, will be described. As a result of the information management systems developed by NASA's GES DISC, not only are large cost savings realized through system reuse, but maintenance costs are also minimized due to the simplicity of their implementations.

  10. Archival Services and Technologies for Scientific Data

    NASA Astrophysics Data System (ADS)

    Meyer, Jörg; Hardt, Marcus; Streit, Achim; van Wezel, Jos

    2014-06-01

    After analysis and publication, there is no need to keep experimental data online on spinning disks. For reliability and cost reasons, inactive data are moved to tape and put into a data archive. The data archive must provide reliable access for at least ten years, following a recommendation of the German Science Foundation (DFG), but many scientific communities wish to keep data available much longer. Data archival is, on the one hand, purely a bit preservation activity that ensures the bits read are the same as those written years before. On the other hand, enough information must be archived to be able to use and interpret the content of the data. The latter depends on many factors, some community-specific, and remains an area of much debate among archival specialists. The paper describes the current practice of archival and bit preservation in use for different science communities at KIT, for which a combination of organizational services and technical tools is required. The special monitoring to detect tape-related errors, the software infrastructure in use, and the service certification are discussed. Plans and developments at KIT, also in the context of the Large Scale Data Management and Analysis (LSDMA) project, are presented. The technical advantages of the T10 SCSI Stream Commands (SSC-4) and the Linear Tape File System (LTFS) will have a profound impact on the future long-term archival of large data sets.
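
    Bit preservation in the sense used above is commonly implemented as periodic fixity checking. The following minimal sketch (my illustration, not KIT's tooling; the manifest file is hypothetical) records a SHA-256 digest at ingest and verifies it on later reads:

        import hashlib
        import json
        from pathlib import Path

        MANIFEST = Path("archive_manifest.json")   # hypothetical fixity database

        def sha256_of(path: Path) -> str:
            h = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                    h.update(chunk)
            return h.hexdigest()

        def ingest(path: Path) -> None:
            """Record the digest when the file is written to the archive."""
            manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
            manifest[str(path)] = sha256_of(path)
            MANIFEST.write_text(json.dumps(manifest, indent=2))

        def verify(path: Path) -> bool:
            """Re-read the file and confirm the bits are unchanged."""
            manifest = json.loads(MANIFEST.read_text())
            return manifest[str(path)] == sha256_of(path)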

  11. Imaged Document Optical Correlation and Conversion System (IDOCCS)

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-03-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). In addition, many organizations are converting their paper archives to electronic images, which are stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources. The Imaged Document Optical Correlation and Conversion System (IDOCCS) provides a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval capability of document images. The IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and can even determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo, or documents with a particular individual's signature block, can be singled out. With this dual capability, IDOCCS outperforms systems that rely on optical character recognition as a basis for indexing and storing only the textual content of documents for later retrieval.

  12. [Electronic poison information management system].

    PubMed

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database in the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology, by Google Inc., using electronic, web-based forms and data tables. During the first 6 months after system deployment, we used it to archive 1,471 poisoning cases, prepare monthly poisoning reports, and facilitate statistical analysis of the data. The use of the electronic database made the Poison Center's work much easier.

  13. Overview of PACS

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.

    1995-08-01

    Development of the DICOM standard and incremental developments in workstation, network, compression, archiving, and digital x-ray technology have produced cost-effective image communication possibilities for selected medical applications. The emerging markets include modality PACS, mini PACS, and teleradiology. Military and VA programs lead the way in the move to adopt PACS technology. Commercial markets for PACS components and PAC systems are at LR400 million, growing to LR500 million in 1996.

  14. International Federation of Library Associations Annual Conference. Papers of the Management and Technology Division: Information Technology Section (47th, Leipzig, East Germany, August 17-22, 1981).

    ERIC Educational Resources Information Center

    Bradler, Reinhard; And Others

    These seven papers on library management and networks focus on: (1) computerized access to archival and library materials, describing the methodological problems associated with a pilot project in the German Democratic Republic, as well as the efficiency of data bank systems; (2) present and future development of libraries and information centers…

  15. Framework for the quality assurance of 'omics technologies considering GLP requirements.

    PubMed

    Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben

    2017-12-01

    'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Workshop Applying 'omics technologies in Chemical Risk Assessment. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections of independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment.
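
    One common way to meet the "prevent uncontrolled data changes" and traceability requirements sketched above is an append-only, hash-chained audit log. This is a generic illustration of that technique, not a prescribed GLP implementation:

        import hashlib
        import json
        import time

        class AuditLog:
            """Append-only log; each entry chains the previous hash, so any
            retroactive edit to raw-data records breaks the chain."""
            def __init__(self):
                self.entries = []

            def append(self, record: dict) -> None:
                prev = self.entries[-1]["hash"] if self.entries else "0" * 64
                body = json.dumps(record, sort_keys=True)
                digest = hashlib.sha256((prev + body).encode()).hexdigest()
                self.entries.append(
                    {"ts": time.time(), "record": record, "hash": digest})

            def verify(self) -> bool:
                prev = "0" * 64
                for e in self.entries:
                    body = json.dumps(e["record"], sort_keys=True)
                    if e["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
                        return False
                    prev = e["hash"]
                return True

        log = AuditLog()
        log.append({"raw_file": "run42.cel", "sha256": "..."})   # hypothetical entry
        assert log.verify()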

  16. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.

  17. The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC

    DOE PAGES

    Kuznetsov, Valentin; Fischer, Nils Leif; Guo, Yuyi

    2018-03-19

    The CMS experiment at the CERN LHC developed the Workflow Management Archive system to persistently store unstructured framework job report documents produced by distributed workflow management agents. In this paper we present its architecture, implementation, deployment, and integration with the CMS and CERN computing infrastructures, such as central HDFS and Hadoop Spark cluster. The system leverages modern technologies such as a document oriented database and the Hadoop eco-system to provide the necessary flexibility to reliably process, store, and aggregate O(1M) documents on a daily basis. We describe the data transformation, the short and long term storage layers, the query language, along with the aggregation pipeline developed to visualize various performance metrics to assist CMS data operators in assessing the performance of the CMS computing system.
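
    As a rough sketch of the kind of Hadoop-ecosystem aggregation such a system performs (illustrative only; field names like 'task' and 'exit_code' and the HDFS paths are assumptions, not the actual CMS document schema), a daily roll-up over JSON job reports in PySpark could look like:

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("wmarchive-rollup").getOrCreate()

        # Job report documents landed on HDFS as JSON, one document per job
        reports = spark.read.json("hdfs:///wmarchive/2018/03/19/*.json")

        # Daily aggregate: job counts and failure rates per task
        daily = (reports
                 .groupBy("task")
                 .agg(F.count("*").alias("jobs"),
                      F.avg((F.col("exit_code") != 0).cast("double"))
                       .alias("fail_rate")))

        daily.write.mode("overwrite").parquet(
            "hdfs:///wmarchive/aggregated/2018-03-19")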

  18. MIDG-Emerging grid technologies for multi-site preclinical molecular imaging research communities.

    PubMed

    Lee, Jasper; Documet, Jorge; Liu, Brent; Park, Ryan; Tank, Archana; Huang, H K

    2011-03-01

    Molecular imaging is the visualization and identification of specific molecules in anatomy for insight into metabolic pathways, tissue consistency, and the tracing of solute transport mechanisms. This paper presents the Molecular Imaging Data Grid (MIDG), which utilizes emerging grid technologies in preclinical molecular imaging to facilitate data sharing and discovery between preclinical molecular imaging facilities and their collaborating investigator institutions, expediting translational sciences research. Grid-enabled archiving, management, and distribution of animal-model imaging datasets help preclinical investigators to monitor, access, and share their imaging data remotely, and encourage preclinical imaging facilities to share published imaging datasets as resources for new investigators. The system architecture of the Molecular Imaging Data Grid is described in a four-layer diagram. A data model for preclinical molecular imaging datasets is also presented, based on imaging modalities currently used in a molecular imaging center. The MIDG system components and connectivity are presented, and finally the workflow steps for grid-based archiving, management, and retrieval of preclinical molecular imaging data are described. Initial performance tests of the Molecular Imaging Data Grid system have been conducted at the USC IPILab using dedicated VMware servers. System connectivity, evaluated datasets, and preliminary results are presented. The results show the system's feasibility and limitations, and indicate directions for future research. Translational and interdisciplinary research in medicine is increasingly interested in cellular and molecular biology activity at the preclinical level, utilizing molecular imaging methods on animal models. The task of integrated archiving, management, and distribution of these preclinical molecular imaging datasets at preclinical molecular imaging facilities is challenging due to disparate imaging systems and multiple off-site investigators. A Molecular Imaging Data Grid design, implementation, and initial evaluation is presented to demonstrate a secure and novel data grid solution for sharing preclinical molecular imaging data across the wide area network (WAN).

  19. Reusing Information Management Services for Recommended Decadal Study Missions to Facilitate Aerosol and Cloud Studies

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve

    2008-01-01

    NASA's Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies to maximize the use of NASA-generated Earth science data. With information management system infrastructure in place, mature, and operational, only very small delta costs are required to fully support the data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and their reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive, aka S4PA), data processing (S4 Processor for Measurements, aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientists' inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.

  20. Reconstructing a School's Past Using Oral Histories and GIS Mapping.

    ERIC Educational Resources Information Center

    Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna

    2000-01-01

    Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)

  1. Using project life-cycles as guide for timing the archival of scientific data and supporting documentation

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Glassy, J. M.; Fowler, D. K.; Khayat, M.; Olding, S. W.

    2014-12-01

    The NASA Earth Science Data Systems Working Groups (ESDSWG) focus on improving technologies and processes related to science discovery and preservation. One particular group, Data Preservation Practices, is defining a set of guidelines to aid data providers in planning both what to submit for archival and when to submit artifacts, so that the archival process can begin early in a project's life cycle. This has the benefit of leveraging knowledge within the project before staff roll off to other work. In this poster we describe various project archival use cases and identify possible archival life cycles that map closely to the pace and flow of work. To understand "archival life cycles", i.e., distinct project phases that produce archival artifacts such as instrument capabilities, calibration reports, and science data products, the working group initially mapped the archival requirements defined in the Preservation Content Specification to the typical NASA project life cycle. As described in the poster, this work resulted in a well-defined archival life cycle, but only for some types of projects; it did not fit well for the condensed project life cycles experienced in airborne and balloon campaigns. To understand the archival process for projects with compressed cycles, the working group gathered use cases from various communities. This poster describes selected use cases that provided insight into the unique flow of these projects, and proposes archival life cycles that map artifacts to projects with compressed timelines. Finally, the poster concludes with some early recommendations for data providers, which will be captured in a formal Guidelines document to be published in 2015.

  2. Browsing the PDS Image Archive with the Imaging Atlas and Apache Solr

    NASA Astrophysics Data System (ADS)

    Grimes, K. M.; Padams, J. H.; Stanboli, A.; Wagstaff, K. L.

    2018-04-01

    The PDS Image Archive is home to tens of millions of images, nearly 30 million of which are associated with rich metadata. By leveraging the Solr indexing technology and the Imaging Atlas interactive frontend, we enable intuitive archive browsing.
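
    As a hedged sketch of what faceted archive browsing over a Solr index looks like at the HTTP level (the endpoint URL, core name, and field names here are illustrative, not the actual PDS Imaging Atlas schema):

        import requests

        SOLR = "http://localhost:8983/solr/pds_images/select"   # hypothetical core

        params = {
            "q": "target:MARS AND instrument:NAVCAM",  # metadata search
            "rows": 10,
            "fl": "product_id,start_time",             # fields to return
            "facet": "true",
            "facet.field": "mission",                  # per-mission counts for the UI
            "wt": "json",
        }
        resp = requests.get(SOLR, params=params).json()
        for doc in resp["response"]["docs"]:
            print(doc["product_id"], doc.get("start_time"))
        print(resp["facet_counts"]["facet_fields"]["mission"])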

  3. Innovation and stasis: technology and race in Mark Twain's Pudd'nhead Wilson.

    PubMed

    Current, Cynthia A

    2009-01-01

    Mark Twain's Pudd'nhead Wilson demonstrates how technologies of identification attempt to counter how bodies evolve beyond previous constraints—in particular, the constraints of racial classification. Twain develops accounts of subjectivity and racial classification that cover an extraordinary breadth of genealogy, biology, and law, while still invoking elements of randomness and chance. The key to such combinations of fixity and emergence in human identity is the technology of fingerprinting. Twain's speculative engagement with fingerprinting creates a system and medium to classify and secure particular forms of identity, leading to the reassertion of racial values already inherent in science and technology, law, and commerce. Fingerprinting represents the direction that technologies of identity would seek to employ: a movement away from direct visual observation of bodies, whose emergence and change over time make them difficult to categorize, to reliance on archives of information that are increasingly removed from the contexts of meaning and emergence those bodies inhabit; this reflects "one drop" politics, as race becomes increasingly difficult to define visually. The archive itself, then, becomes infected with the spectacular vitality of, and the speculation and risk within, nineteenth-century biological and cultural determinism.

  4. The Third NASA Goddard Conference on Mass Storage Systems and Technologies

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1993-01-01

    This report contains copies of nearly all of the technical papers and viewgraphs presented at the Goddard Conference on Mass Storage Systems and Technologies held in October 1993. The conference served as an informational exchange forum for topics primarily relating to the ingestion and management of massive amounts of data and the attendant problems involved. Discussion topics include the necessary use of computers in the solution of today's infinitely complex problems, the need for greatly increased storage densities in both optical and magnetic recording media, currently popular storage media and magnetic media storage risk factors, and data archiving standards, including a talk on the current status of the IEEE Storage Systems Reference Model (RM). Additional topics addressed system performance, data storage system concepts, communications technologies, data distribution systems, data compression, and error detection and correction.

  5. Commercial applications for optical data storage

    NASA Astrophysics Data System (ADS)

    Tas, Jeroen

    1991-03-01

    Optical data storage has spurred the market for document imaging systems. These systems are increasingly being used to electronically manage the processing, storage and retrieval of documents. Applications range from straightforward archives to sophisticated workflow management systems. The technology is developing rapidly and within a few years optical imaging facilities will be incorporated in most of the office information systems. This paper gives an overview of the status of the market, the applications and the trends of optical imaging systems.

  6. The Phases Differential Astrometry Data Archive. 5. Candidate Substellar Companions to Binary Systems

    DTIC Science & Technology

    2010-12-01

    [Only affiliation and acknowledgment fragments survive in this record: "Mathematics and Astronomy, 105-24, California Institute of Technology, Pasadena, CA 91125, USA"; "Nicolaus Copernicus Astronomical Center, Polish Academy..."; "...Blind Test with support from NASA contract NAS7-03001 (JPL 1336910). PHASES is funded in part by the California Institute of Technology Astronomy..."]

  7. NATIONAL HEALTH & ENVIRONMENTAL EFFECTS RESEARCH LABORATORY BEGINS IMPLEMENTATION OF AN ELECTRONIC SCIENTIFIC DATA MANAGEMENT SYSTEM

    EPA Science Inventory

    Data and records management have changed greatly as a result of progress in computer technology, but many organizations, including the US EPA's National Records Management Program (NRMP) and the U.S. National Archives and Records Administration (NARA), still struggle to escape th...

  8. Supporting Student Research with Semantic Technologies and Digital Archives

    ERIC Educational Resources Information Center

    Martinez-Garcia, Agustina; Corti, Louise

    2012-01-01

    This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…

  9. Seizing the strategic opportunities of emerging technologies by building up innovation system: monoclonal antibody development in China.

    PubMed

    Zhang, Mao-Yu; Li, Jian; Hu, Hao; Wang, Yi-Tao

    2015-11-04

    Monoclonal antibodies (mAbs), as an emerging technology, have become increasingly important in the development of human therapeutic agents. How developing countries such as China could seize this emerging technological opportunity remains a poorly studied issue in the prior literature. Thus, this paper investigates the research and development of mAbs in China based on an innovation system functions approach, and probes the question of how China has been taking advantage of emerging technologies to overcome the challenges of building up a complete innovation system for developing mAbs. Mixed research methods were applied, combining archival data and field interviews. Archival data from the China Food and Drug Administration, Web of Science, the United States Patent and Trademark Office, the Chinese Clinical Trial Registry, and the National Science and Technology Report Service were used to examine the status quo of the technology and research and development (R&D) activities in China, while the opinions of researchers and managers in this field were synthesized from the interviews. From the perspective of innovation system functions, the technological development of mAbs in China is being driven by incentives such as state subsidies and corporate R&D funding. Knowledge diffusion has been well served over the last 10 years through the exchange of information on networks and technology transfer with developed countries. The State has provided clear guidance on the search for emerging mAb technologies. The legitimacy of mAbs in China has gained momentum owing to the implementation of government policies stipulated in "The Eleventh Five-Year Plan" in 2007, as well as national projects such as the "973 Program" and "863 Program", among others. The potential for market formation remains high because of rising local demand and government support. Entrepreneurial activities for mAbs continue to prosper. In addition, the supply of resources has improved with the support of the State. This study finds that a complete innovation system for mAbs has begun to take shape in China. mAb innovators in China are capitalizing on this emerging technological opportunity to participate in the global drive to develop the value chain for innovative drugs. In the long run, the build-up of the research system for mAbs in China could bring about more driving forces for the mAb innovation system.

  10. High-fidelity video and still-image communication based on spectral information: natural vision system and its applications

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Masahiro; Haneishi, Hideaki; Fukuda, Hiroyuki; Kishimoto, Junko; Kanazawa, Hiroshi; Tsuchida, Masaru; Iwama, Ryo; Ohyama, Nagaaki

    2006-01-01

    In addition to the great advancement of high-resolution and large-screen imaging technology, the issue of color is now receiving considerable attention as an aspect distinct from image resolution. It is difficult to reproduce the original color of a subject in conventional imaging systems, and this obstructs the application of visual communication systems in telemedicine, electronic commerce, and digital museums. To break through the limitations of conventional RGB three-primary systems, the "Natural Vision" project aims at an innovative video and still-image communication technology with high-fidelity color reproduction capability, based on spectral information. This paper summarizes the results of the NV project, including the development of multispectral and multiprimary imaging technologies and experimental investigations of applications to medicine, digital archives, electronic commerce, and computer graphics.

  11. Imaged document information location and extraction using an optical correlator

    NASA Astrophysics Data System (ADS)

    Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.

    1999-12-01

    Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). Many of these organizations are converting their paper archives to electronic images, which are then stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources and provide for rapid access to the information contained within these imaged documents. To meet this need, Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides a means for the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and has the potential to determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.

  12. New Developments in NOAA's Comprehensive Large Array-Data Stewardship System

    NASA Astrophysics Data System (ADS)

    Ritchey, N. A.; Morris, J. S.; Carter, D. J.

    2012-12-01

    The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation, which gives focus to the building and sustaining of key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing, and providing access to a host of near-real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October 2011, CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond just remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring various disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), National Geophysical Data Center (NGDC), and the National Oceanographic Data Center (NODC). In parallel with this data migration, CLASS is evolving to a service-oriented architecture utilizing cloud technologies for dissemination, in addition to clearly defined interfaces that allow better collaboration with partners. This evolution will require the implementation of standard access protocols and metadata, which will lead to cost-effective data and information preservation.

  13. A distributed component framework for science data product interoperability

    NASA Technical Reports Server (NTRS)

    Crichton, D.; Hughes, S.; Kelly, S.; Hardman, S.

    2000-01-01

    Correlation of science results from multi-disciplinary communities is a difficult task. Traditionally data from science missions is archived in proprietary data systems that are not interoperable. The Object Oriented Data Technology (OODT) task at the Jet Propulsion Laboratory is working on building a distributed product server as part of a distributed component framework to allow heterogeneous data systems to communicate and share scientific results.

  14. Utilizing data grid architecture for the backup and recovery of clinical image data.

    PubMed

    Liu, Brent J; Zhou, M Z; Documet, J

    2005-01-01

    Grid computing represents the latest and most exciting technology to evolve from the familiar realm of parallel, peer-to-peer, and client-server models. However, there has been limited investigation into the impact of this emerging technology in medical imaging and informatics. In particular, PACS technology, an established clinical image repository system, while having matured significantly during the past ten years, still remains weak in the area of clinical image data backup. Current solutions are expensive or time consuming, and the technology is far from foolproof. Many large-scale PACS archive systems still encounter downtime for hours or days, which has the critical effect of crippling daily clinical operations. In this paper, a review of current backup solutions is presented along with a brief introduction to grid technology. Finally, research and development utilizing the grid architecture for the recovery of clinical image data, in particular PACS image data, is presented. The focus of this paper is centered on applying a grid computing architecture to a DICOM environment, since DICOM has become the standard for clinical image data and PACS utilizes this standard. A federation of PACS can be created, allowing a failed PACS archive to recover its image data from others in the federation in a seamless fashion. The design reflects the five-layer architecture of grid computing: Fabric, Resource, Connectivity, Collective, and Application Layers. The testbed Data Grid is composed of one research laboratory and two clinical sites. The Globus 3.0 Toolkit (co-developed by Argonne National Laboratory and the Information Sciences Institute, USC) is utilized for developing the core and user-level middleware to achieve grid connectivity. The successful implementation and evaluation of a data grid architecture for clinical PACS data backup and recovery will provide an understanding of the methodology for using a Data Grid in clinical image data backup for PACS, as well as establishing benchmarks for performance as grid technology improves. In addition, the testbed can serve as a road map for expanded research into large enterprise- and federation-level data grids to guarantee CA (continuous availability, 99.999% uptime) in a variety of medical data archiving, retrieval, and distribution scenarios.
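
    A minimal sketch of the recovery idea described above (illustrative only; the endpoint names, fetch function, and checksum scheme are my assumptions, not the paper's Globus-based implementation): a failed archive asks the other PACS in the federation for a copy of a study and verifies its integrity before restoring it.

        import hashlib

        # Hypothetical federation of PACS archive endpoints (primary first)
        FEDERATION = ["pacs-a.example.org", "pacs-b.example.org", "pacs-c.example.org"]

        def fetch(host, study_uid):
            # Stand-in for a DICOM C-MOVE / grid transfer from a peer archive.
            # Here we simulate that only pacs-b still holds the study.
            if host == "pacs-b.example.org":
                return b"DICM" + b"\x00" * 128
            return None

        def recover_study(study_uid, expected_sha256):
            """Try each peer in turn; accept the first copy whose bits verify."""
            for host in FEDERATION:
                data = fetch(host, study_uid)
                if data is None:
                    continue                      # peer down or study absent
                if hashlib.sha256(data).hexdigest() == expected_sha256:
                    return data                   # verified replica found
                # checksum mismatch: corrupted replica, keep trying other peers
            raise RuntimeError(f"no verifiable replica of {study_uid} in federation")

        expected = hashlib.sha256(b"DICM" + b"\x00" * 128).hexdigest()
        print(len(recover_study("1.2.840.1", expected)), "bytes recovered")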

  15. An OAIS-Based Hospital Information System on the Cloud: Analysis of a NoSQL Column-Oriented Approach.

    PubMed

    Celesti, Antonio; Fazio, Maria; Romano, Agata; Bramanti, Alessia; Bramanti, Placido; Villari, Massimo

    2018-05-01

    The Open Archival Information System (OAIS) is a reference model for organizing people and resources in a system, and it has already been adopted in care centers and medical systems to efficiently manage clinical data, medical personnel, and patients. Archival storage systems are typically implemented using traditional relational database systems, but relation-oriented technology strongly limits efficiency in managing huge amounts of patients' clinical data, especially in emerging cloud-based systems, which are distributed. In this paper, we present an OAIS healthcare architecture able to manage a huge volume of HL7 clinical documents in a scalable way. Specifically, it is based on a NoSQL column-oriented Data Base Management System deployed in the cloud, so as to benefit from big tables and wide rows available over a virtual distributed infrastructure. We developed a prototype of the proposed architecture at the IRCCS, and we evaluated its efficiency in a real case study.
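
    A minimal sketch of the wide-row idea, assuming a running Cassandra cluster and the Python cassandra-driver; the keyspace, table, and column names are invented for illustration and are not taken from the paper.

      # Sketch: wide-row storage of HL7 clinical documents, one partition per
      # patient, clustered by document timestamp. Names are illustrative.
      from datetime import datetime, timezone
      from cassandra.cluster import Cluster

      cluster = Cluster(["127.0.0.1"])
      session = cluster.connect()
      session.execute("""
          CREATE KEYSPACE IF NOT EXISTS oais
          WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
      """)
      session.execute("""
          CREATE TABLE IF NOT EXISTS oais.hl7_documents (
              patient_id text,
              created    timestamp,
              doc_id     text,
              hl7_body   text,
              PRIMARY KEY (patient_id, created)
          ) WITH CLUSTERING ORDER BY (created DESC)
      """)
      insert = session.prepare(
          "INSERT INTO oais.hl7_documents (patient_id, created, doc_id, hl7_body) "
          "VALUES (?, ?, ?, ?)")
      # One wide row per patient: every new document lands in the same partition.
      session.execute(insert, ("patient-42", datetime.now(timezone.utc),
                               "doc-0001", "MSH|^~\\&|..."))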

  16. The new space and earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, ozone TOMS data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  17. The new space and Earth science information systems at NASA's archive

    NASA Technical Reports Server (NTRS)

    Green, James L.

    1990-01-01

    The on-line interactive systems of the National Space Science Data Center (NSSDC) are examined. The worldwide computer network connections that allow access to NSSDC users are outlined. The services offered by the NSSDC new technology on-line systems are presented, including the IUE request system, Total Ozone Mapping Spectrometer (TOMS) data, and data sets on astrophysics, atmospheric science, land sciences, and space plasma physics. Plans for future increases in the NSSDC data holdings are considered.

  18. Hubble Space Telescope: cost reduction by re-engineering telemetry processing and archiving

    NASA Astrophysics Data System (ADS)

    Miebach, Manfred P.

    1998-05-01

    The Hubble Space Telescope (HST), the first of NASA's Great Observatories, was launched on April 24, 1990. The HST was designed for a minimum fifteen-year mission, with on-orbit servicing by the Space Shuttle System planned at approximately three-year intervals. Major changes to the HST ground system are planned to be in place for the third servicing mission in December 1999. The primary objectives of the ground system re-engineering effort, a project called 'Vision 2000 Control Center Systems (CCS)', are to reduce both development and operating costs significantly for the remaining years of HST's lifetime. Development costs will be reduced by providing a modern hardware and software architecture and utilizing commercial off-the-shelf (COTS) products wherever possible. Operating costs will be reduced by eliminating redundant legacy systems and processes and by providing an integrated ground system geared toward autonomous operation. Part of CCS is a Space Telescope Engineering Data Store, the design of which is based on current Data Warehouse technology. The purpose of this data store is to provide a common source of telemetry data for all HST subsystems. This data store will become the engineering data archive and will include a queryable database for the user to analyze HST telemetry. Access to the engineering data in the Data Warehouse is platform-independent from an office environment using commercial standards, and the latest Internet technology is used to reach the HST engineering community: a Web-based user interface allows easy access to the data archives. This paper provides a high-level overview of the CCS system and illustrates some of the CCS telemetry capabilities, with samples of CCS user interface pages. Vision 2000 is an ambitious project, but one that is well under way. It will allow the HST program to realize reduced operations costs for the Third Servicing Mission and beyond.

  19. ESDORA: A Data Archive Infrastructure Using Digital Object Model and Open Source Frameworks

    NASA Astrophysics Data System (ADS)

    Shrestha, Biva; Pan, Jerry; Green, Jim; Palanisamy, Giriprakash; Wei, Yaxing; Lenhardt, W.; Cook, R. Bob; Wilson, B. E.; Leggott, M.

    2011-12-01

    There is an array of challenges associated with preserving, managing, and using contemporary scientific data. Large volumes, multiple formats and data services, and the lack of a coherent mechanism for metadata/data management are common issues across data centers. It is often difficult to preserve data history and lineage information, along with other descriptive metadata, which hinders the true science value of the archived data products. In this project, we use a digital object abstraction architecture as the information/knowledge framework to address these challenges, building on the following open-source frameworks: the Fedora-Commons Repository, the Drupal Content Management System, Islandora (a Drupal module), and the Apache Solr search engine. The system is an active archive infrastructure for Earth Science data resources, providing ingestion, archiving, distribution, and discovery functionality. An ingestion workflow ingests the data and metadata, during which many different aspects of the data descriptions (including structured and non-structured metadata) are reviewed. The data and metadata are staged during the review phase and published after multiple rounds of review. Each digital object is encoded in XML for long-term preservation of the content and the relations among the digital items. The software architecture provides a flexible, modularized framework for adding pluggable user-oriented functionality. Solr enables word search as well as faceted search, and a home-grown spatial search module is plugged in to let users make spatial selections in a map view. An RDF semantic store within the Fedora-Commons Repository holds information on data lineage, dissemination services, and text-based metadata. We use the semantic notion "isViewerFor" to register internally or externally referenced URLs, which are rendered within the same web browser when possible. With appropriate mapping of content into digital objects, many different data descriptions, including structured metadata, data history, and auditing trails, are captured and coupled with the data content. The semantic store provides a foundation for further uses, including providing a full-fledged Earth Science ontology for data interpretation or lineage tracking. Datasets from the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) as well as from the Modeling and Synthesis Thematic Data Center (MAST-DC) are used in a testing deployment, which allows us to validate the features described here for the integrated system. Overall, we believe that the integrated system is valid, reusable data archive software that provides digital stewardship for Earth Science data content, now and in the future.
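
    The word and faceted search described maps directly onto Solr's standard /select API; a hedged sketch follows, in which the Solr core name and field names are invented.

      # Sketch: faceted keyword search against the archive's Solr index.
      # The core name ("esdora") and field names are invented for illustration.
      import requests

      params = {
          "q": "soil moisture",          # free-text word search
          "facet": "true",
          "facet.field": "data_format",  # facet on a hypothetical metadata field
          "rows": 10,
          "wt": "json",
      }
      resp = requests.get("http://localhost:8983/solr/esdora/select", params=params)
      resp.raise_for_status()
      body = resp.json()
      for doc in body["response"]["docs"]:
          print(doc.get("id"), doc.get("title"))
      # Facet counts come back as a flat [value, count, value, count, ...] list.
      print(body["facet_counts"]["facet_fields"]["data_format"])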

  20. The design of a petabyte archive and distribution system for the NASA ECS project

    NASA Technical Reports Server (NTRS)

    Caulk, Parris M.

    1994-01-01

    The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built: the ECS Science and Data Processing System (SDPS). SDPS is designed to support long-term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating those data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archive Centers (DAACs). Individual DAACs will archive different collections of earth science data and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products and to provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.

  1. A case Study of Applying Object-Relational Persistence in Astronomy Data Archiving

    NASA Astrophysics Data System (ADS)

    Yao, S. S.; Hiriart, R.; Barg, I.; Warner, P.; Gasson, D.

    2005-12-01

    The NOAO Science Archive (NSA) team is developing a comprehensive domain model to capture the science data in the archive. Java and an object model derived from the domain model address the application layer of the archive system well. However, since the RDBMS is the best-proven technology for data management, the challenge is the paradigm mismatch between the object and relational models. Transparent object-relational mapping (ORM) persistence is a successful solution to this challenge. In the data modeling and persistence implementation of NSA, we use Hibernate, a well-accepted ORM tool, to bridge the object model in the business tier and the relational model in the database tier. Thus, the database is isolated from the Java application. The application queries objects directly using a DBMS-independent, object-oriented query API, which frees the application developers from low-level JDBC and SQL so that they can focus on the domain logic. We present the detailed design of the NSA R3 (Release 3) data model and object-relational persistence, including mapping, retrieval, and caching. Persistence-layer optimization and performance tuning are analyzed. The system is being built on J2EE, so the integration of Hibernate into the EJB container and the transaction management are also explored.
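
    The paper's stack is Java, Hibernate, and J2EE; as a compact stand-in for the same transparent-persistence idea, here is the pattern sketched with Python's SQLAlchemy (a deliberate substitution, with invented class and column names): the application works with objects and queries, and the mapper generates the DBMS-specific SQL.

      # The ORM idea sketched with SQLAlchemy instead of the Hibernate/J2EE
      # stack the paper uses; the domain class and columns are invented.
      from sqlalchemy import create_engine, Column, Integer, String
      from sqlalchemy.orm import declarative_base, Session

      Base = declarative_base()

      class Exposure(Base):               # hypothetical archive domain object
          __tablename__ = "exposures"
          id = Column(Integer, primary_key=True)
          proposal_id = Column(String)
          filter_name = Column(String)

      engine = create_engine("sqlite:///:memory:")
      Base.metadata.create_all(engine)

      with Session(engine) as session:
          session.add(Exposure(proposal_id="2005B-0123", filter_name="r"))
          session.commit()
          # Query on objects, not SQL: the mapper emits the SQL behind the scenes.
          r_band = session.query(Exposure).filter_by(filter_name="r").all()
          print([e.proposal_id for e in r_band])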

  2. Archival Theory and the Shaping of Educational History: Utilizing New Sources and Reinterpreting Traditional Ones

    ERIC Educational Resources Information Center

    Glotzer, Richard

    2013-01-01

    Information technology has spawned new evidentiary sources, better retrieval systems for existing ones, and new tools for interpreting traditional source materials. These advances have contributed to a broadening of public participation in civil society (Blouin and Rosenberg 2006). In these culturally unsettled and economically fragile times…

  3. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  4. Scientific and Technological Information in Transactional Files in Government Records and Archives: A RAMP Study.

    ERIC Educational Resources Information Center

    Wimalaratne, K. D. G.

    This long-term Records and Archives Administration Programme (RAMP) study is designed to assist archivists, records managers, and information specialists in identifying for current use and possible archival selection those transactional or case files that contain scientific and technical information (STI), particularly in those instances where…

  5. Fifth NASA Goddard Conference on Mass Storage Systems and Technologies. Volume 1

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1996-01-01

    This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

  6. From Ship-To-Shore In Real Time: Data Transmission, Distribution, Management, Processing, And Archiving Using Telepresence Technologies And The Inner Space Center

    NASA Astrophysics Data System (ADS)

    Coleman, D. F.

    2012-12-01

    Most research vessels are equipped with satellite Internet services with bandwidths capable of being upgraded to support telepresence technologies and live shore-based participation. This capability can be used for real-time data transmission to shore, where it can be distributed, managed, processed, and archived. The University of Rhode Island Inner Space Center utilizes telepresence technologies and a growing network of command centers on Internet2 to participate live with a variety of research vessels and their ocean observing and sampling systems. High-bandwidth video streaming, voice-over-IP telecommunications, and real-time data feeds and file transfers enable users on shore to take part in the oceanographic expeditions as if they were present on the ship, working in the lab. Telepresence-enabled systematic ocean exploration and similar programs represent a significant and growing paradigm shift that can change the future of seagoing ocean observations using research vessels. The required platform is the ship itself, and users of the technology rely on the ship-based technical teams, but remote and distributed shore-based science users, students, educators, and the general public can now take part by being aboard virtually.

  7. Legacy system integration using web technology

    NASA Astrophysics Data System (ADS)

    Kennedy, Richard L.; Seibert, James A.; Hughes, Chris J.

    2000-05-01

    As healthcare moves towards a completely digital, multimedia environment, there is an opportunity to provide cost-effective, highly distributed physician access to clinical information, including radiology-based imaging. To address this opportunity, a Universal Clinical Desktop (UCD) system was developed. A UCD provides a single point of entry into an integrated view of all types of clinical data available within a network of disparate healthcare information systems. To explore the application of a UCD in a hospital environment, a pilot study was established with the University of California Davis Medical Center using technology from Trilix Information Systems. Within this pilot environment, the information systems integrated under the UCD include a radiology information system (RIS), a picture archive and communication system (PACS), and a laboratory information system (LIS).

  8. CCSDS Overview

    NASA Technical Reports Server (NTRS)

    Kearney, Mike

    2013-01-01

    The primary goal of the Consultative Committee for Space Data Systems (CCSDS) is interoperability between the communications and data systems of space agencies' vehicles, facilities, missions, and programs. Of all the technologies used in spaceflight, standardization of communications and data systems brings the most benefit to multi-agency interoperability. CCSDS started in 1982, developing standards at the lower layers of the protocol stack. Its scope has since grown to cover standards throughout the entire ISO communications stack, plus other data systems areas (architecture, archives, security, XML exchange formats, etc.).

  9. High-speed optical 3D sensing and its applications

    NASA Astrophysics Data System (ADS)

    Watanabe, Yoshihiro

    2016-12-01

    This paper reviews high-speed optical 3D sensing technologies for obtaining the 3D shape of a target using a camera. The sensing speeds considered range from 100 to 1,000 fps, exceeding normal camera frame rates, which are typically 30 fps. In particular, contactless, active, and real-time systems are introduced, along with three example applications of this type of sensing technology: surface reconstruction from time-sequential depth images, high-speed 3D user interaction, and high-speed digital archiving.

  10. The HARPS-N archive through a Cassandra, NoSQL database suite?

    NASA Astrophysics Data System (ADS)

    Molinari, Emilio; Guerra, Jose; Harutyunyan, Avet; Lodi, Marcello; Martin, Adrian

    2016-07-01

    The TNG-INAF is developing the science archive for the WEAVE instrument. The underlying architecture of the archive is based on a non-relational database, more precisely an Apache Cassandra cluster, which uses NoSQL technology. In order to test and validate this architecture, we created a local archive, which we populated with all the HARPS-N spectra collected at the TNG since the instrument's start of operations in mid-2012, and we developed tools for the analysis of this data set. The HARPS-N data set is two orders of magnitude smaller than WEAVE's, but we want to demonstrate the ability to walk through a complete data set and produce scientific output as valuable as that produced by an ordinary pipeline, without directly accessing the FITS files. The analytics are performed with Apache Solr and Spark, and on a relational PostgreSQL database. As an example, we produce observables such as metallicity indexes for the targets in the archive and compare the results with those coming from the HARPS-N regular data reduction software. The aim of this experiment is to explore the viability of a high-availability cluster and a distributed NoSQL database as a platform for complex scientific analytics on a large data set, which will then be ported to the WEAVE Archive System (WAS) that we are developing for the WEAVE multi-object fiber spectrograph.
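
    A hedged sketch of the kind of whole-archive analytics described: a PySpark job that averages a metallicity-like index per target over records exported from the archive. The file path and column names are invented for illustration.

      # Sketch of a whole-archive analytic pass with Spark: mean of a
      # metallicity-like index per target. Paths and column names are invented.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("harpsn-indexes").getOrCreate()

      # Hypothetical per-spectrum measurements exported from the archive.
      spectra = spark.read.parquet("/archive/harpsn/indexes.parquet")

      per_target = (spectra
                    .groupBy("target_name")
                    .agg(F.avg("metallicity_index").alias("mean_index"),
                         F.count("*").alias("n_spectra")))
      per_target.orderBy(F.desc("n_spectra")).show(20)
      spark.stop()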

  11. OASIS: A Data Fusion System Optimized for Access to Distributed Archives

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Kong, M.; Good, J. C.

    2002-05-01

    The On-Line Archive Science Information Services (OASIS) is accessible as a Java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration, and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

  12. Sharing and re-use of phylogenetic trees (and associated data) to facilitate synthesis.

    PubMed

    Stoltzfus, Arlin; O'Meara, Brian; Whitacre, Jamie; Mounce, Ross; Gillespie, Emily L; Kumar, Sudhir; Rosauer, Dan F; Vos, Rutger A

    2012-10-22

    Recently, various evolution-related journals adopted policies to encourage or require archiving of phylogenetic trees and associated data. Such attention to practices that promote data sharing reflects rapidly improving information technology and the rapidly expanding potential to use that technology to aggregate and link data from previously published research. Nevertheless, little is known about current practices, or best practices, for publishing trees and associated data so as to promote re-use. Here we summarize results of an ongoing analysis of current practices for archiving phylogenetic trees and associated data, current practices of re-use, and current barriers to re-use. We find that the technical infrastructure is available to support rudimentary archiving, but the frequency of archiving is low. Currently, most phylogenetic knowledge is not easily re-used due to a lack of archiving, lack of awareness of best practices, and lack of community-wide standards for formatting data, naming entities, and annotating data. Most attempts at data re-use seem to end in disappointment. Nevertheless, we find many positive examples of data re-use, particularly those that involve customized species trees generated by grafting to, and pruning from, a much larger tree. The technologies and practices that facilitate data re-use can catalyze synthetic and integrative research. However, success will require engagement from various stakeholders, including individual scientists who produce or consume shareable data, publishers, policy-makers, technology developers, and resource providers. The critical challenges for facilitating re-use of phylogenetic trees and associated data, we suggest, include: a broader commitment to public archiving; more extensive use of globally meaningful identifiers; development of user-friendly technology for annotating, submitting, searching, and retrieving data and their metadata; and development of a minimum reporting standard (MIAPA) indicating which kinds of data and metadata are most important for a re-usable phylogenetic record.
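
    The grafting-and-pruning re-use pattern the authors highlight can be illustrated in a few lines with Biopython's Bio.Phylo, pruning a larger published tree down to a study's taxon set; the tree and taxa here are toy examples.

      # Sketch: derive a customized species tree by pruning a larger published
      # tree down to the taxa of interest. The Newick string is a toy example.
      from io import StringIO
      from Bio import Phylo

      big_tree = Phylo.read(StringIO("((A,B),(C,(D,E)));"), "newick")
      keep = {"A", "C", "D"}  # taxa sampled in the hypothetical new study

      for leaf in list(big_tree.get_terminals()):
          if leaf.name not in keep:
              big_tree.prune(leaf)

      Phylo.write(big_tree, "pruned.nwk", "newick")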

  13. The Planetary Data System—Archiving Planetary Data for the use of the Planetary Science Community

    NASA Astrophysics Data System (ADS)

    Morgan, Thomas H.; McLaughlin, Stephanie A.; Grayzeck, Edwin J.; Vilas, Faith; Knopf, William P.; Crichton, Daniel J.

    2014-11-01

    NASA’s Planetary Data System (PDS) archives, curates, and distributes digital data from NASA’s planetary missions. PDS provides the planetary science community convenient online access to data from NASA’s missions so that researchers can continue to mine these rich data sets for new discoveries. The PDS is a federated system consisting of nodes for specific discipline areas ranging from planetary geology to space physics. Our federation includes an engineering node that provides systems engineering support to the entire PDS. In order to adequately capture complete mission data sets containing not only raw and reduced instrument data, but also the calibration, documentation, and geometry data required to interpret and use these data sets both singly and together (data from multiple instruments, or from multiple missions), PDS personnel work with NASA missions from the initial AO through the end of mission to define, organize, and document the data. This process includes peer review of data sets by members of the science community to ensure that the data sets are scientifically useful, effectively organized, and well documented. PDS makes its holdings easily searchable so that members of the planetary community can both query the archive to find data relevant to specific scientific investigations and easily retrieve the data for analysis. To ensure long-term preservation of data and to make data sets more easily searchable with the new capabilities in information technology now available (and as existing technologies become obsolete), the PDS (together with the COSPAR-sponsored IPDA) developed and deployed a new data archiving system known as PDS4, released in 2013. The LADEE, MAVEN, OSIRIS-REx, InSight, and Mars 2020 missions are using PDS4. ESA has adopted PDS4 for the upcoming BepiColombo mission. The PDS is actively migrating existing data records into PDS4 and developing tools to aid data providers and users. The PDS is also incorporating challenge-based competitions to rapidly and economically develop new tools for both users and data providers. Please visit our User Support Area at the meeting (Booth #114) if you have questions about accessing our data sets or providing data to the PDS.

  14. "Different Strokes for Different Folks": Presenting EAD in Three UK Online Catalogues

    ERIC Educational Resources Information Center

    Hill, Amanda; Stockting, Bill; Higgins, Sarah

    2005-01-01

    This article discusses three different online services providing federated access to finding aids relating to archives found in a number of repositories: the Archives Hub, Access to Archives (A2A) and Navigational Aids for the History of Science, Technology and the Environment (NAHSTE). While the scale of the services is very different, a…

  15. Expanding the Role of an Earth Science Data System: The GHRC Innovations Lab

    NASA Astrophysics Data System (ADS)

    Conover, H.; Ramachandran, R.; Smith, T.; Kulkarni, A.; Maskey, M.; He, M.; Keiser, K.; Graves, S. J.

    2013-12-01

    The Global Hydrology Resource Center (GHRC) is a NASA Earth Science Distributed Active Archive Center (DAAC), managed in partnership by the Earth Science Department at NASA's Marshall Space Flight Center and the University of Alabama in Huntsville's Information Technology and Systems Center (ITSC). Established in 1991, the GHRC processes, archives, and distributes global lightning data from space; airborne and ground-based observations from hurricane science field campaigns and Global Precipitation Measurement (GPM) ground validation experiments; and satellite passive microwave products. GHRC's close association with the University provides a path for technology infusion from the research center into the data center. The ITSC has a long history of designing and operating science data and information systems. In addition to the GHRC and related data management projects, the ITSC also conducts multidisciplinary research in many facets of information technology. The coupling of ITSC research with the operational GHRC data center has enabled the development of new technologies that directly impact the ability of researchers worldwide to apply Earth science data to their specific domains of interest. The GHRC Innovations Lab will provide a showcase for emerging geoinformatics technologies resulting from NASA-sponsored research at the ITSC. Research products to be deployed in the Innovations Lab include: * Data Albums - curated collections of information related to a specific science topic or event, with links to relevant data files from different sources. * Data Prospecting - combines automated data mining techniques with user interaction to allow quick exploration of large volumes of data. * Provenance Browser - provides graphical exploration of data lineage and related contextual information. In the Innovations Lab, these technologies can be targeted to GHRC data sets and tuned to address GHRC user interests. As technologies are tested and matured in the Innovations Lab, the most promising will be selected for incorporation into the GHRC's online tool suite.

  16. Scalable Data Mining and Archiving for the Square Kilometre Array

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.

    2011-12-01

    As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks, such as cataloging both the data and its contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research, present significant software engineering challenges when considered at the scale of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges through a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO) and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata, including temporal bounds and job-relevant processing information. A remote content acquisition (push-pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as continuum imaging and spectral line cube generation, executes the algorithm, and ingests its output (along with relevant extracted metadata). Beyond processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7 and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open-source software, represent a concerted effort to find a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, while ensuring that the data, with its richly annotated catalog of metadata, remains a viable resource for research into the future.
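
    The crawl/identify/extract/ingest pattern is easy to sketch; the following stdlib-only illustration mimics the idea but is not the Apache OODT crawler API, and the staging directory is hypothetical.

      # Stdlib-only sketch of the crawl/identify/extract/ingest pattern; an
      # illustration of the idea, not the Apache OODT crawler framework.
      import mimetypes
      from datetime import datetime, timezone
      from pathlib import Path

      def extract_metadata(path: Path) -> dict:
          """Identify MIME type and pull simple temporal/size metadata."""
          mime, _ = mimetypes.guess_type(path.name)
          stat = path.stat()
          return {
              "file": str(path),
              "mime_type": mime or "application/octet-stream",
              "size_bytes": stat.st_size,
              "modified": datetime.fromtimestamp(stat.st_mtime,
                                                 timezone.utc).isoformat(),
          }

      def crawl(staging_dir: str):
          """Walk a staging area and hand each file's metadata to the catalog."""
          for path in Path(staging_dir).rglob("*"):
              if path.is_file():
                  record = extract_metadata(path)
                  print("ingest:", record)  # stand-in for a catalog/archive call

      # crawl("/staging/evla")  # hypothetical upstream output directory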

  17. Internet based ECG medical information system.

    PubMed

    James, D A; Rowlands, D; Mahnovetski, R; Channells, J; Cutmore, T

    2003-03-01

    Physiological monitoring of humans for medical applications is well established and ready to be adapted to the Internet. This paper describes the implementation of a Medical Information System (the MIS-ECG system) incorporating an Internet-based ECG acquisition device. Traditionally, clinical ECG monitoring is a largely labour-intensive process, with data typically stored on paper. Until recently, ECG monitoring applications were also constrained somewhat by the size of the equipment required. Today's technology enables large, fixed hospital monitoring systems to be replaced by small portable devices. With an increasing emphasis on health management, a truly integrated information system for acquisition, analysis, patient particulars, and archiving is now a realistic possibility. This paper describes recent Internet and technological advances and presents the design and testing of the MIS-ECG system that utilises those advances.

  18. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1994-01-01

    This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.

  19. The Heinz Electronic Library Interactive Online System (HELIOS): Building a Digital Archive Using Imaging, OCR, and Natural Language Processing Technologies.

    ERIC Educational Resources Information Center

    Galloway, Edward A.; Michalek, Gabrielle V.

    1995-01-01

    Discusses the conversion project of the congressional papers of Senator John Heinz into digital format and the provision of electronic access to these papers by Carnegie Mellon University. Topics include collection background, project team structure, document processing, scanning, use of optical character recognition software, verification…

  20. A Sharing Mind Map-Oriented Approach to Enhance Collaborative Mobile Learning with Digital Archiving Systems

    ERIC Educational Resources Information Center

    Chang, Jui-Hung; Chiu, Po-Sheng; Huang, Yueh-Min

    2018-01-01

    With the advances in mobile network technology, the use of portable devices and mobile networks for learning is not limited by time and space. Such use, in combination with appropriate learning strategies, can achieve a better effect. Despite the effectiveness of mobile learning, students' learning direction, progress, and achievement may differ.…

  1. Yahoo Works with Academic Libraries on a New Project to Digitize Books

    ERIC Educational Resources Information Center

    Carlson, Scott; Young, Jeffrey R.

    2005-01-01

    This article reports on the most recent search-engine company to join with academic libraries in digitizing large collections of books to make them easily searchable online. Yahoo Inc. has teamed up with the University of California system, the University of Toronto, and several archives and technology companies on a project that could potentially…

  2. Equipment issues regarding the collection of video data for research

    NASA Astrophysics Data System (ADS)

    Kung, Rebecca Lippmann; Kung, Peter; Linder, Cedric

    2005-12-01

    Physics education research increasingly makes use of video data for analysis of student learning and teaching practice. Collection of these data is conceptually simple, but execution is often fraught with costly and time-consuming complications. This pragmatic paper discusses the development of systems to record and permanently archive audio and video data in real time. We focus on a system based upon consumer video DVD recorders, but also give an overview of other technologies and detail issues common to all systems. We detail common yet unexpected complications, particularly with regard to sound quality and compatibility with transcription software. Information specific to fixed and transportable systems, other technology options, and generic and specific equipment recommendations is given in supplemental appendices.

  3. Taking digital imaging to the next level: challenges and opportunities.

    PubMed

    Hobbs, W Cecyl

    2004-01-01

    New medical imaging technologies, such as multi-detector computed tomography (CT) scanners and positron emission tomography (PET) scanners, are creating new possibilities for non-invasive diagnosis that are leading providers to invest heavily in them. The volume of data produced by such technology is so large that it cannot be "read" using traditional film-based methods, and once in digital form, it creates a massive data integration and archiving challenge. Despite the benefits of digital imaging and archiving, there are several key challenges that healthcare organizations should consider in planning, selecting, and implementing the information technology (IT) infrastructure to support digital imaging. Decisions about storage and image distribution are essentially questions of "where" and "how fast." When planning the digital archiving infrastructure, organizations should think about where they want to store and distribute their images. This is similar to the decisions organizations have to make regarding physical film storage and distribution, except that the portability of images is even greater in a digital environment. The principle of "network effects" seems like a simple concept, yet the effect is not always considered when implementing a technology plan. To fully realize the benefits of digital imaging, the radiology department must integrate the archiving solutions throughout the department and, ultimately, with applications across other departments and enterprises. Medical institutions can derive a number of benefits from implementing digital imaging and archiving solutions like PACS. Hospitals and imaging centers can use the transition from film-based imaging as a foundational opportunity to reduce costs, increase competitive advantage, attract talent, and improve service to patients. The key factors in achieving these goals include attention to the means of data storage, distribution, and protection.

  4. Rocket-Based Combined-Cycle (RBCC) Propulsion Technology Workshop. Tutorial session

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The goal of this workshop was to brief the nation's space transportation and propulsion engineering community on the potential of hypersonic combined-cycle (airbreathing/rocket) propulsion systems for future space transportation applications. Four general topics were examined: (1) selections from the expansive advanced-propulsion archival resource; (2) related propulsion systems technical backgrounds; (3) RBCC engine multimode operations and related subsystem background; and (4) a focused review of the propulsion aspects of current related programs.

  5. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Technical Reports Server (NTRS)

    Pilone, Dan

    2016-01-01

    As new, high-data-rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20-fold, to over 300 PB, by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we provide an overview of NASA's prototyping and lessons learned in applying cloud architectures.

  6. The long hold: Storing data at the National Archives

    NASA Technical Reports Server (NTRS)

    Thibodeau, Kenneth

    1992-01-01

    The National Archives is, in many respects, in a unique position. For example, I find people from other organizations describing an archival medium as one which will last for three to five years. At the National Archives, we deal with the centuries, not years. From our perspective, there is no archival medium for data storage, and we do not expect there will ever be one. Predicting the long-term future of information technology beyond a mere five or ten years approaches the occult arts. But one prediction is probably safe. It is that the technology will continue to change, at least until analysts start talking about the post-information age. If we did have a medium which lasted a hundred years or longer, we probably would not have a device capable of reading it. The issue of obsolescence, as opposed to media stability, is more complex and more costly. It is especially complex at the National Archives because of two other aspects of our peculiar position. The first aspect is that we deal with incoherent data. The second is that we are charged with satisfying unknown and unknowable requirements. A brief overview of these aspects is presented.

  7. Medical image archive node simulation and architecture

    NASA Astrophysics Data System (ADS)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    It is a well-known fact that managed care and new treatment technologies are revolutionizing the healthcare provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, 'filmless' radiology (and other imagery) systems, which generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data over a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computed Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for the reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as diversified user access requirements, implementing a MIADS will be a complex procedure. One of the key challenges is to select a cost-effective, scalable system architecture that meets the ingest/retrieval performance requirements. We have performed an in-depth system engineering study and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results, with emphasis on system scalability and upgradability. Furthermore, we discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements under different system configurations and architectures, varying parameters such as workload, tape access time, number of drives, number of exams per patient, number of Central Processing Units, patient grouping, and priority impacts. The MIADS, which could be a key component of a broader data repository system, will be able to communicate with and obtain data from existing hospital information systems. We discuss the external interfaces enabling MIADS to communicate with and obtain data from existing Radiology Information Systems, such as the Picture Archiving and Communication System (PACS). Our system design encompasses the broader aspects of the archive node, which could include multimedia data such as image, audio, video, and free-text data. The system is designed to be integrated with current hospital PACS through a Digital Imaging and Communications in Medicine (DICOM) interface, but it can also be accessed through the Internet using the Hypertext Transfer Protocol or Simple File Transfer Protocol. Our design and simulation work will be key to implementing a successful, scalable medical image archive and distribution system.
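
    In the same spirit as the paper's simulation (though far simpler), a toy discrete-event model with the simpy package shows how retrieval requests queue for a fixed pool of tape drives; all rates and timings below are invented.

      # Toy discrete-event model of retrieval requests contending for tape
      # drives, in the spirit of the paper's simulation. Timings are invented.
      import random
      import simpy

      MOUNT_S, MEAN_READ_S = 60.0, 120.0  # hypothetical mount and read times (s)

      def retrieval(env, drives, waits):
          """One image retrieval: queue for a drive, then mount and read."""
          arrive = env.now
          with drives.request() as req:
              yield req                    # wait for a free tape drive
              waits.append(env.now - arrive)
              yield env.timeout(MOUNT_S + random.expovariate(1.0 / MEAN_READ_S))

      def arrivals(env, drives, waits):
          """Poisson stream of retrieval requests, about one every 90 s."""
          while True:
              yield env.timeout(random.expovariate(1.0 / 90.0))
              env.process(retrieval(env, drives, waits))

      random.seed(1)
      env = simpy.Environment()
      drives = simpy.Resource(env, capacity=4)  # vary to study scalability
      waits = []
      env.process(arrivals(env, drives, waits))
      env.run(until=8 * 3600)                   # one 8-hour shift
      print(f"{len(waits)} retrievals served, "
            f"mean queue wait {sum(waits) / len(waits):.1f} s")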

  8. Technology assessment and requirements analysis: a process to facilitate decision making in picture archiving and communications system implementation.

    PubMed

    Radvany, M G; Chacko, A K; Richardson, R R; Grazdan, G W

    1999-05-01

    In a time of decreasing resources, managers need a tool to manage their resources effectively, support clinical requirements, and replace aging equipment in order to ensure adequate clinical care. To do this successfully, one must be able to perform technology assessment and capital equipment asset management. The lack of a commercial system that adequately performed technology needs assessment and addressed the unique needs of the military led to the development of an in-house Technology Assessment and Requirements Analysis (TARA) program. The TARA is a tool that provides an unbiased review of clinical operations and the resulting capital equipment requirements for military hospitals. The TARA report allows for the development of acquisition strategies for new equipment, enhances personnel management, and improves and streamlines clinical operations and processes.

  9. Deep Space Network information system architecture study

    NASA Technical Reports Server (NTRS)

    Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.

    1992-01-01

    The purpose of this article is to describe an architecture for the DSN information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990's. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies--i.e., computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.

  10. The challenge of a data storage hierarchy

    NASA Technical Reports Server (NTRS)

    Ruderman, Michael

    1992-01-01

    A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system, implemented on a mainframe, that manages data into permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can connect through the Unix interface of the data archiving system.

  11. JPEG 2000 in advanced ground station architectures

    NASA Astrophysics Data System (ADS)

    Chien, Alan T.; Brower, Bernard V.; Rajan, Sreekanth D.

    2000-11-01

    The integration and management of information from distributed and heterogeneous information producers and providers must be a key foundation of any developing imagery intelligence system. Historically, imagery providers acted as production agencies for imagery, imagery intelligence, and geospatial information. In the future, these imagery producers will evolve to act more like e-business information brokers. The management of imagery and geospatial information (visible, spectral, infrared (IR), radar, elevation, or other feature and foundation data) is crucial from a quality and content perspective. By 2005, there will be significantly advanced collection systems and a myriad of storage devices, along with a number of automated and man-in-the-loop correlation, fusion, and exploitation capabilities. All of these new imagery collection and storage systems will result in a higher volume and greater variety of imagery being disseminated and archived. This paper illustrates the importance, from a collection, storage, exploitation, and dissemination perspective, of the proper selection and implementation of standards-based compression technology for ground station and dissemination/archive networks. It specifically discusses the new compression capabilities featured in JPEG 2000 and how that commercially based technology can provide significant improvements to the overall imagery and geospatial enterprise, both from an architectural perspective and from a user's perspective.
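
    The rate-controlled quality layers that make JPEG 2000 attractive for dissemination networks can be sketched with Pillow, assuming an installation built with its OpenJPEG-backed JPEG 2000 plugin; the file names and compression rates are invented.

      # Sketch: encode an image as JPEG 2000 with several quality layers, so a
      # dissemination server can serve progressively better approximations.
      # Requires Pillow with JPEG 2000 (OpenJPEG) support; names are invented.
      from PIL import Image

      img = Image.open("scene.tif")

      # Lossy, rate-controlled encoding: layers at 40:1, 20:1, and 10:1.
      img.save("scene.jp2", quality_mode="rates",
               quality_layers=[40, 20, 10], irreversible=True)

      # Lossless encoding for the archive master copy.
      img.save("scene_lossless.jp2", irreversible=False)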

  12. PACS storage technology update: holographic storage.

    PubMed

    Colang, John E; Johnston, James N

    2006-01-01

    This paper focuses on the emerging technology of holographic storage and its effect on picture archiving and communication systems (PACS). A review of the emerging technology is presented, including a high-level description of holographic drives and the associated substrate media, the laser and optical technology, and the spatial light modulator. The potential advantages and disadvantages of holographic drive and storage technology are evaluated. PACS administrators face myriad complex and expensive storage solutions, and selecting an appropriate system is time-consuming and costly. Storage technology may become obsolete quickly because of the exponential nature of advances in digital storage media. Holographic storage may turn out to be a low-cost, high-speed, high-volume storage solution of the future; however, data are inconclusive at this early stage of the technology lifecycle. Despite the current lack of quantitative data to support the hypothesis that holographic technology will have a significant effect on PACS and standards of practice, it seems likely from the current information that holographic technology will generate significant efficiencies. This paper assumes the reader has a fundamental understanding of PACS technology.

  13. The National Institutes of Health Clinical Center Digital Imaging Network, Picture Archival and Communication System, and Radiology Information System.

    PubMed

    Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V

    2001-06-01

    In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and the expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing Fibre Channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADT)/demographics, orders, appointment notifications, doctor updates, and results.
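
    The ADT feed such RIS-HIS interfaces carry is pipe-delimited HL7 v2; a stdlib-only sketch of pulling demographics out of a synthetic ADT^A01 message follows (the message content is made up).

      # Stdlib-only sketch: extract demographics from a synthetic HL7 v2 ADT
      # message of the kind a RIS-HIS interface carries. Message is made up.
      ADT = (
          "MSH|^~\\&|HIS|NIHCC|RIS|NIHCC|200106011200||ADT^A01|42|P|2.3\r"
          "PID|1||123456||DOE^JANE||19650401|F\r"
      )

      # Index each segment by its leading identifier (MSH, PID, ...).
      segments = {line.split("|")[0]: line.split("|")
                  for line in ADT.split("\r") if line}
      pid = segments["PID"]
      family, given = pid[5].split("^")[:2]  # PID-5: patient name components
      print("MRN:", pid[3], "Name:", given, family,
            "DOB:", pid[7], "Sex:", pid[8])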

  14. Volume server: A scalable high speed and high capacity magnetic tape archive architecture with concurrent multi-host access

    NASA Technical Reports Server (NTRS)

    Rybczynski, Fred

    1993-01-01

    A major challenge facing data processing centers today is data management. This includes the storage of large volumes of data and access to it. Current media storage for large data volumes is typically off line and frequently off site in warehouses. Access to data archived in this fashion can be subject to long delays, errors in media selection and retrieval, and even loss of data through misplacement or damage to the media. Similarly, designers responsible for architecting systems capable of continuous high-speed recording of large volumes of digital data are faced with the challenge of identifying technologies and configurations that meet their requirements. Past approaches have tended to evaluate the combination of the fastest tape recorders with the highest-capacity tape media and then to compromise technology selection as a consequence of cost. This paper discusses an architecture that addresses both of these challenges and proposes a cost-effective solution based on robots, high-speed helical scan tape drives, and large-capacity media.

  15. DICOM-compliant PACS with CD-based image archival

    NASA Astrophysics Data System (ADS)

    Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.

    1998-07-01

    This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval, and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations, and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated, with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices still function if the master database server cannot be reached. The system features rules-based workflow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed across the storage, retrieval, and teleradiology servers, allowing locally stored image data to be viewed directly in a WWW browser without transferring it to a central WWW server. An independent system monitors disk usage, processes, and network and CPU load on each server, and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software, and applications developed in-house. The system has enabled filmless operation in CT, MR, and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
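
    The basic handshake such DICOM storage and query/retrieve servers expose is verification (C-ECHO); a sketch with the pynetdicom library follows, where the host, port, and AE title are placeholders.

      # Sketch: DICOM verification (C-ECHO) against an archive server, the
      # basic handshake a storage/query-retrieve service exposes. Host, port,
      # and AE title are placeholders; uses the pynetdicom library.
      from pynetdicom import AE

      ae = AE(ae_title="TESTSCU")
      ae.add_requested_context("1.2.840.10008.1.1")  # Verification SOP Class UID

      assoc = ae.associate("pacs.example.org", 11112)
      if assoc.is_established:
          status = assoc.send_c_echo()
          if status:
              print("C-ECHO status: 0x%04X" % status.Status)
          assoc.release()
      else:
          print("association rejected or server unreachable")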

  16. Change Management in the Governance of Schooling: The Rise of Experts, Planners, and Statistics in the Early OECD

    ERIC Educational Resources Information Center

    Tröhler, Daniel

    2014-01-01

    Background/Context: Based on archival material, the following paper analyzes the political strategies of the early OECD stakeholders in transforming schooling from a cultural to a technological system, and how they needed to standardize different existing patterns of thought and institutional behaviors in the member countries. The European…

  17. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program, originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has proved quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Before the data are acceptable for archival, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.

  18. PDS4: Current Status and Future Vision

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Hughes, J. S.; Hardman, S. H.; Law, E. S.; Beebe, R. F.

    2017-12-01

    In 2010, the Planetary Data System began the largest standards and software upgrade in its history, called "PDS4". PDS4 was architected around core principles, applying years of experience and lessons learned working with scientific data returned from robotic solar system missions. In addition to applying those lessons, the PDS team was able to take advantage of modern software and data architecture approaches and emerging information technologies, which have enabled the capture, management, discovery, and distribution of data from planetary science archives world-wide. What has emerged is a foundational set of standards, services, and common tools to construct and enable interoperability of planetary science archives from distributed repositories. Early in the PDS4 development, PDS selected two missions as drivers to be used to validate the PDS4 approach: LADEE and MAVEN. Additionally, PDS partnered with international agencies to begin discussing the architecture, design, and implementation to ensure that PDS4 would be architected as a world-wide standard and platform for archive development and interoperability. Given the evolving requirements, an agile software development methodology known as the "Evolutionary Software Development Lifecycle" was chosen. This led to incremental releases of increasing capability over time, matched against emerging mission and user needs. To date, PDS has performed 16 releases of PDS4, with adoption by over 12 missions world-wide. PDS holdings have also grown from approximately 200 TB in 2010 to approximately 1.3 PB today, bringing the system into the era of big data. The development of PDS4 has not only focused on the construction of compatible archives, but also on increasing access and use of the data in the big data era. As PDS looks forward, it is focused on achieving the recommendation of the Planetary Science Decadal Survey (2013-2022) to "support the ongoing effort to evolve the Planetary Data System to an effective online resource for the NASA and international communities". The foundation laid by the standards, software services, and tools positions PDS to develop and adopt new approaches and technologies that enable users to effectively search, extract, integrate, and analyze the wealth of observational data across international boundaries.
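
    Because PDS4 describes every product with an XML label, basic identification metadata can be read with the standard library alone. The sketch below assumes a label whose Identification_Area follows the usual PDS4 layout and uses the PDS4 core namespace; the file name is a placeholder.

    ```python
    import xml.etree.ElementTree as ET

    # PDS4 core namespace, used by the label's default elements.
    NS = {"pds": "http://pds.nasa.gov/pds4/pds/v1"}

    def read_label(path):
        root = ET.parse(path).getroot()
        ident = root.find("pds:Identification_Area", NS)
        return {
            "logical_identifier": ident.findtext("pds:logical_identifier",
                                                 namespaces=NS),
            "title": ident.findtext("pds:title", namespaces=NS),
            "version": ident.findtext("pds:version_id", namespaces=NS),
        }

    # "example_product.xml" is a placeholder label file.
    print(read_label("example_product.xml"))
    ```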

  19. The Role of NASA's Planetary Data System in the Planetary Spatial Data Infrastructure Initiative

    NASA Astrophysics Data System (ADS)

    Arvidson, R. E.; Gaddis, L. R.

    2017-12-01

    An effort underway in NASA's planetary science community is the Mapping and Planetary Spatial Infrastructure Team (MAPSIT, http://www.lpi.usra.edu/mapsit/). MAPSIT is a community assessment group organized to address a lack of strategic spatial data planning for space science and exploration. Working with MAPSIT, a new initiative of NASA and USGS is the development of a Planetary Spatial Data Infrastructure (PSDI) that builds on extensive knowledge of storing, accessing, and working with terrestrial spatial data. PSDI is a knowledge and technology framework that enables the efficient discovery, access, and exploitation of planetary spatial data to facilitate data analysis, knowledge synthesis, and decision-making. NASA's Planetary Data System (PDS) archives >1.2 petabytes of digital data resulting from decades of planetary exploration and research. The PDS charter focuses on the efficient collection, archiving, and accessibility of these data. The PDS emphasis on data preservation and archiving is complementary to that of the PSDI initiative because the latter utilizes and extends available data to address user needs in the areas of emerging technologies, rapid development of tailored delivery systems, and development of online collaborative research environments. The PDS plays an essential PSDI role because it provides expertise to help NASA missions and other data providers organize and document their planetary data, to collect and maintain the archives with complete, well-documented and peer-reviewed planetary data, to make planetary data accessible by providing online data delivery tools and search services, and ultimately to ensure the long-term preservation and usability of planetary data. The current PDS4 information model extends and expands PDS metadata and relationships between and among elements of the collections. The PDS supports data delivery through several node services, including the Planetary Image Atlas (https://pds-imaging.jpl.nasa.gov/search/), the Orbital Data Explorers (http://ode.rsl.wustl.edu/), and the Planetary Image Locator Tool (PILOT, https://pilot.wr.usgs.gov/); the latter offers ties to the Integrated Software for Imagers and Spectrometers (ISIS), the premier planetary cartographic software package from USGS's Astrogeology Science Team.

  20. Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.

    PubMed

    Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray

    2003-07-01

    Commercially advertised archival tools for digital images do not fulfill clinical requirements and are only beginning to mature. Storing a large collection of conventional photographic slides requires considerable space and special conditions, and despite such precautions the slides still degrade; the most common degradation is the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed to integrate a database with an image browser so that needed files can be built and located in the archive within seconds at the click of a button. The required hardware and software are off-the-shelf market products. The database records 25,200 patients involving 24,331 procedures; the image files cover 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database and browser software for archiving, which allows individual photographs to be labeled with demographic information and browsed. Digitized images are not only more efficient and economical than conventional slides; they also facilitate clinical studies.
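
    The database-plus-browser integration can be illustrated with a minimal relational store that links patient demographics to image files on disk, so a photograph can be located in seconds with a single query. Table and column names below are illustrative, not those of the Chang Gung system.

    ```python
    import sqlite3

    conn = sqlite3.connect("craniofacial.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS patient (
        patient_id INTEGER PRIMARY KEY,
        name TEXT, birth_date TEXT, diagnosis TEXT
    );
    CREATE TABLE IF NOT EXISTS photo (
        photo_id INTEGER PRIMARY KEY,
        patient_id INTEGER REFERENCES patient(patient_id),
        taken_on TEXT, view TEXT, file_path TEXT
    );
    """)

    def photos_for(conn, patient_name):
        # Locate every archived photograph for a patient in one query.
        return conn.execute(
            "SELECT p.taken_on, p.view, p.file_path FROM photo p "
            "JOIN patient t ON t.patient_id = p.patient_id "
            "WHERE t.name = ?", (patient_name,)).fetchall()
    ```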

  1. Remotely sensed data available from the US Geological Survey EROS Data Center

    USGS Publications Warehouse

    Dwyer, John L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy, R.E.; Salomonson, V.V.

    2006-01-01

    The Center for Earth Resources Observation Systems (EROS) is a field center of the geography discipline within the U.S. Geological Survey (USGS) of the Department of the Interior. The EROS Data Center (EDC) was established in the early 1970s as the nation’s principal archive of remotely sensed data. Initially the EDC was responsible for the archive, reproduction, and distribution of black-and-white and color-infrared aerial photography acquired under numerous mapping programs conducted by various Federal agencies including the USGS, Department of Agriculture, Environmental Protection Agency, and NASA. The EDC was also designated the central archive for data acquired by the first satellite sensor designed for broad-scale earth observations in support of civilian agency needs for earth resource information. A four-band multispectral scanner (MSS) and a return-beam vidicon (RBV) camera were initially flown on the Earth Resources Technology Satellite-1, subsequently designated Landsat-1. The synoptic coverage, moderate spatial resolution, and multi-spectral view provided by these data stimulated scientists with an unprecedented perspective from which to study the Earth’s surface and to understand the relationships between human activity and natural systems.

  2. Integration of Geographical Information Systems and Geophysical Applications with Distributed Computing Technologies.

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Aktas, M. S.; Aydin, G.; Fox, G. C.; Gadgil, H.; Sayar, A.

    2005-12-01

    We examine the application of Web Service Architectures and Grid-based distributed computing technologies to geophysics and geo-informatics. We are particularly interested in the integration of Geographical Information System (GIS) services with distributed data mining applications. GIS services provide the general purpose framework for building archival data services, real time streaming data services, and map-based visualization services that may be integrated with data mining and other applications through the use of distributed messaging systems and Web Service orchestration tools. Building upon our previous work in these areas, we present our current research efforts. These include fundamental investigations into increasing XML-based Web service performance, supporting real time data streams, and integrating GIS mapping tools with audio/video collaboration systems for shared display and annotation.

  3. Improve wildlife species tracking—Implementing an enhanced global positioning system data management system for California condors

    USGS Publications Warehouse

    Waltermire, Robert G.; Emmerich, Christopher U.; Mendenhall, Laura C.; Bohrer, Gil; Weinzierl, Rolf P.; McGann, Andrew J.; Lineback, Pat K.; Kern, Tim J.; Douglas, David C.

    2016-05-03

    U.S. Fish and Wildlife Service (USFWS) staff in the Pacific Southwest Region and at the Hopper Mountain National Wildlife Refuge Complex requested technical assistance to improve their global positioning system (GPS) data acquisition, management, and archive in support of the California Condor Recovery Program. The USFWS deployed and maintained GPS units on individual Gymnogyps californianus (California condor) in support of long-term research and daily operational monitoring and management of California condors. The U.S. Geological Survey (USGS) obtained funding through the Science Support Program to provide coordination among project participants, provide GPS Global System for Mobile Communication (GSM) transmitters for testing, and compare GSM/GPS with existing Argos satellite GPS technology. The USFWS staff worked with private companies to design, develop, and fit condors with GSM/GPS transmitters. The Movebank organization, an online database of animal tracking data, coordinated with each of these companies to automatically stream their GPS data into Movebank servers and coordinated with USFWS to improve Movebank software for managing transmitter data, including proofing/error checking of incoming GPS data. The USGS arranged to pull raw GPS data from Movebank into the USGS California Condor Management and Analysis Portal (CCMAP) (https://my.usgs.gov/ccmap) for production and dissemination of a daily map of condor movements including various automated alerts. Further, the USGS developed an automatic archiving system for pulling raw and proofed Movebank data into USGS ScienceBase to comply with the Federal Information Security Management Act of 2002. This improved data management system requires minimal manual intervention resulting in more efficient data flow from GPS data capture to archive status. As a result of the project’s success, Pinnacles National Park and the Ventana Wildlife Society California condor programs became partners and adopted the same workflow, tracking, and data archive system. This GPS tracking data management model and workflow should be applicable and beneficial to other wildlife tracking programs.
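
    The automated pull-and-archive chain described above can be pictured with a short sketch: fetch new GPS fixes from a tracking-data web service, apply a basic proofing step, and append the result to a local archive file. The endpoint URL and field names below are hypothetical stand-ins, not the actual Movebank or ScienceBase interfaces.

    ```python
    import csv
    import json
    from urllib.request import urlopen

    # Hypothetical tracking-data feed, not the real Movebank API.
    FEED_URL = "https://tracking.example.org/api/fixes?since=2016-05-01"

    def pull_and_archive(feed_url, archive_path):
        with urlopen(feed_url) as resp:
            fixes = json.load(resp)  # assume a JSON list of fix records
        with open(archive_path, "a", newline="") as f:
            writer = csv.writer(f)
            for fix in fixes:
                # Basic proofing: drop fixes without valid coordinates.
                if fix.get("lat") is None or fix.get("lon") is None:
                    continue
                writer.writerow([fix["bird_id"], fix["timestamp"],
                                 fix["lat"], fix["lon"]])

    if __name__ == "__main__":
        pull_and_archive(FEED_URL, "condor_fixes.csv")
    ```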

  4. Acquisition plan for Digital Document Storage (DDS) prototype system

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA Headquarters maintains a continuing interest in and commitment to exploring the use of new technology to support productivity improvements in meeting service requirements tasked to the NASA Scientific and Technical Information (STI) Facility, and to support cost effective approaches to the development and delivery of enhanced levels of service provided by the STI Facility. The DDS project has been pursued with this interest and commitment in mind. It is believed that DDS will provide improved archival blowback quality and service for ad hoc requests for paper copies of documents archived and serviced centrally at the STI Facility. It will also develop an operating capability to scan, digitize, store, and reproduce paper copies of 5000 NASA technical reports archived annually at the STI Facility and serviced to the user community. Additionally, it will provide NASA Headquarters and field installations with on-demand, remote, electronic retrieval of digitized, bilevel, bit mapped report images along with branched, nonsequential retrieval of report subparts.

  5. DMFS: A Data Migration File System for NetBSD

    NASA Technical Reports Server (NTRS)

    Studenmund, William

    1999-01-01

    I have recently developed dmfs, a Data Migration File System, for NetBSD. This file system is based on the overlay file system, which is discussed in a separate paper, and provides kernel support for the data migration system being developed by my research group here at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. Our data migration system provides archiving and file migration services. System utilities scan the dmfs file system for recently modified files, and archive them to two separate tape stores. Once a file has been doubly archived, files larger than a specified size will be truncated to that size, potentially freeing up large amounts of the underlying file store. Some sites will choose to retain none of the file (deleting its contents entirely from the file system) while others may choose to retain a portion, for instance a preamble describing the remainder of the file. The dmfs layer coordinates access to the file, retaining user-perceived access and modification times and file size, and restricting access to partially migrated files to the portion actually resident. When a user process attempts to read from the non-resident portion of a file, it is blocked and the dmfs layer sends a request to a system daemon to restore the file. As more of the file becomes resident, the user process is permitted to begin accessing the now-resident portions of the file. For simplicity, our data migration system divides a file into two portions, a resident portion followed by an optional non-resident portion. Also, a file is in one of three states: fully resident, fully resident and archived, and (partially) non-resident and archived. For a file which is only partially resident, any attempt to write or truncate the file, or to read a non-resident portion, will trigger a file restoration. Truncations and writes are blocked until the file is fully restored, so that a restoration which only partially succeeds does not leave the file in an indeterminate state with portions existing only on tape and other portions only in the disk file system. We chose layered file system technology as it permits us to focus on the data migration functionality, and permits end system administrators to choose the underlying file store technology. We chose the overlay layered file system instead of the null layer for two reasons: first, to permit our layer to better preserve metadata integrity, and second, to prevent even root processes from accessing migrated files. This is achieved as the underlying file store becomes inaccessible once the dmfs layer is mounted. We are quite pleased with how the layered file system has turned out. Of the 45 vnode operations in NetBSD, 20 (forty-four percent) required no intervention by our file layer - they are passed directly to the underlying file store. Of the twenty-five we do intercept, nine (such as vop_create()) are intercepted only to ensure metadata integrity. Most of the functionality was concentrated in five operations: vop_read, vop_write, vop_getattr, vop_setattr, and vop_fcntl. The first four are the core operations for controlling access to migrated files and preserving the user experience. vop_fcntl, a call generated for a certain class of fcntl codes, provides the command channel used by privileged user programs to communicate with the dmfs layer.
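
    The three file states and the restore-on-access rule can be modelled compactly. The sketch below illustrates the logic only, not kernel code; the restore_daemon callable stands in for the system daemon that copies data back from tape.

    ```python
    from enum import Enum, auto

    class State(Enum):
        RESIDENT = auto()           # fully resident, not yet archived
        RESIDENT_ARCHIVED = auto()  # fully resident, two tape copies exist
        PARTIAL_ARCHIVED = auto()   # truncated; the tail exists only on tape

    class MigratedFile:
        def __init__(self, size, resident_bytes, restore_daemon):
            self.size = size
            self.resident = resident_bytes
            self.restore_daemon = restore_daemon
            self.state = (State.PARTIAL_ARCHIVED if resident_bytes < size
                          else State.RESIDENT_ARCHIVED)

        def _restore(self):
            # Stands in for blocking while the daemon copies the
            # non-resident tail back from tape.
            self.restore_daemon(self)
            self.resident = self.size
            self.state = State.RESIDENT_ARCHIVED

        def read(self, offset, length):
            if offset + length > self.resident:
                self._restore()  # reads past the resident portion block
            return f"<{length} bytes at offset {offset}>"

        def write(self, offset, data):
            if self.resident < self.size:
                # Writes are blocked until fully restored, so a partial
                # restore never leaves the file split between tape and disk.
                self._restore()
            self.state = State.RESIDENT  # modified data must be re-archived
    ```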

  6. Reconstructing Forty Years of Landsat Observations

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Dwyer, J. L.; Steinwand, D.

    2013-12-01

    In July 1972, NASA launched the Earth Resources Technology Satellite (ERTS), the first of what was to be the series of Earth-observing satellites we now know as the Landsat system. This system, originally conceived in the 1960s within the US Department of the Interior and US Geological Survey (USGS), has continued with little interruption for over 40 years, creating the longest record of satellite-based global land observations. The current USGS archive of Landsat images exceeds 4 million scenes, and the recently launched Landsat 8 platform will extend that archive to nearly 50 years of observations. Clearly, these observations are critical to the study of Earth system processes, and the interaction between these processes and human activities. However, the seven successful Landsat missions represent more of an ad hoc program than a long-term record of consistent observations, due largely to changing Federal policies and challenges finding an operational home for the program. Technologically, these systems evolved from the original Multispectral Scanner System (MSS) through the Thematic Mapper and Enhanced Thematic Mapper Plus (ETM+) systems, to the current Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) systems. Landsat data were collected globally by a network of international cooperators having diverse data management policies. Much of the oldest data were stored on archaic media that could not be retrieved using modern media readers. Collecting these data from various sensors and sources, and reconstructing them into coherent Earth observation records, posed numerous challenges. We present here a brief overview of work done to overcome these challenges and create a consistent, long-term Landsat observation record. Much of the current archive was 'repatriated' from international cooperators and often required the reconstruction of (sometimes absent) metadata for geo-location and radiometric calibration. The older MSS data, some of which had been successfully retrieved from outdated wide band video media, required similar metadata reconstruction. TM data from Landsats 4 and 5 relied on questionable on-board lamp data for calibration, thus the calibration history for these missions was reconstructed to account for sensor degradation over time. To improve continuity between platforms, Landsat 7 and 8 missions employed 'under-flight' maneuvers to reduce inter-calibration error. Data from the various sensors, platforms and sources were integrated into a common metadata standard, with quality assurance information, to ensure understandability of the data for long-term preservation. Because of these efforts, the current Landsat archive can now support the creation of the long-term climate data records and essential climate variables required to monitor changes on the Earth's surface quantitatively over decades of observations.

  7. Fifth NASA Goddard Conference on Mass Storage Systems and Technologies. Volume 2

    NASA Technical Reports Server (NTRS)

    Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)

    1996-01-01

    This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies held September 17 - 19, 1996, at the University of Maryland, University Conference Center in College Park, Maryland. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.

  8. From PACS to Web-based ePR system with image distribution for enterprise-level filmless healthcare delivery.

    PubMed

    Huang, H K

    2011-07-01

    The concept of PACS (picture archiving and communication system) was initiated in 1982 during the SPIE medical imaging conference in Newport Beach, CA. Since then, PACS has matured into an everyday clinical tool for image archiving, communication, display, and review. This paper follows the continuous development of PACS technology, including Web-based PACS, PACS and ePR (electronic patient record), and enterprise PACS to ePR with image distribution (ID). The concept of large-scale Web-based enterprise PACS and ePR with image distribution is presented along with its implementation, clinical deployment, and operation. The Hong Kong Hospital Authority's (HKHA) integration of its home-grown clinical management system (CMS) with PACS and ePR with image distribution is used as a case study. The current concept and design criteria of the HKHA enterprise integration of the CMS, PACS, and ePR-ID for filmless healthcare delivery are discussed, followed by its work-in-progress and current status.

  9. A MySQL Based EPICS Archiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Slominski

    2009-10-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL archiving system (Mya) was developed to scale to the needs of the control system; it currently archives 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate of over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off-the-shelf hardware to meet high-performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
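
    A toy version of such a pipeline can be written with the pyepics client: subscribe to a set of PVs and append (name, timestamp, value) rows to a relational table. Here sqlite3 stands in for the MySQL back end, and the PV names are placeholders.

    ```python
    import sqlite3
    from epics import PV  # pyepics client library

    # sqlite3 stands in for the MySQL back end of the real system;
    # check_same_thread=False because CA callbacks run on another thread.
    conn = sqlite3.connect("archive.db", check_same_thread=False)
    conn.execute("CREATE TABLE IF NOT EXISTS samples "
                 "(pv TEXT, t REAL, value REAL)")

    def on_update(pvname=None, value=None, timestamp=None, **kw):
        # Invoked by the Channel Access client on every monitor event.
        conn.execute("INSERT INTO samples VALUES (?, ?, ?)",
                     (pvname, timestamp, value))
        conn.commit()

    # PV names are placeholders for real accelerator signals.
    pvs = [PV(name, callback=on_update)
           for name in ("LINAC:BEAM:CURRENT", "LINAC:RF:PHASE")]
    ```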

  10. Image standards in tissue-based diagnosis (diagnostic surgical pathology).

    PubMed

    Kayser, Klaus; Görtler, Jürgen; Goldmann, Torsten; Vollmer, Ekkehard; Hufnagl, Peter; Kayser, Gian

    2008-04-18

    Progress in automated image analysis, virtual microscopy, hospital information systems, and interdisciplinary data exchange requires image standards to be applied in tissue-based diagnosis. To describe the theoretical background, practical experiences and comparable solutions in other medical fields to promote image standards applicable for diagnostic pathology. THEORY AND EXPERIENCES: Images used in tissue-based diagnosis present with pathology-specific characteristics. It seems appropriate to discuss their characteristics and potential standardization in relation to the levels of hierarchy in which they appear. All levels can be divided into legal, medical, and technological properties. Standards applied to the first level include regulations or aims to be fulfilled. In legal properties, they have to regulate features of privacy, image documentation, transmission, and presentation; in medical properties, features of disease-image combination, human diagnostics, automated information extraction, and archive retrieval and access; and in technological properties, features of image acquisition, display, formats, transfer speed, safety, and system dynamics. The next lower, second level has to implement the prescriptions of the upper one, i.e. describe how they are implemented. Legal aspects should demand secure encryption for privacy of all patient-related data, image archives that include all images used for diagnostics for a period of 10 years at minimum, accurate annotations of dates and viewing, and precise hardware and software information. Medical aspects should demand standardized patients' files such as DICOM 3 or HL7, including history and previous examinations, information on image display hardware and software, on image resolution and fields of view, on the relation between sizes of biological objects and image sizes, and on access to archives and retrieval. Technological aspects should deal with image acquisition systems (resolution, colour temperature, focus, brightness, and quality evaluation procedures), display resolution data, implemented image formats, storage, cycle frequency, backup procedures, operation system, and external system accessibility. The lowest, third level describes the permitted limits and thresholds in detail. At present, an applicable standard including all mentioned features does not exist to our knowledge; some aspects can be taken from radiological standards (PACS, DICOM 3); others require specific solutions or are not covered yet. The progress in virtual microscopy and the application of artificial intelligence (AI) in tissue-based diagnosis demand fast preparation and implementation of an internationally acceptable standard. The described hierarchic order, as well as analytic investigation of all potentially necessary aspects and details, offers an appropriate tool to specifically determine standardized requirements.

  11. Ocean Wireless Networking and Real Time Data Management

    NASA Astrophysics Data System (ADS)

    Berger, J.; Orcutt, J. A.; Vernon, F. L.; Braun, H. W.; Rajasekar, A.

    2001-12-01

    Recent advances in technology have enabled the exploitation of satellite communications for high-speed (> 64 kbps) duplex communications with oceanographic ships at sea. Furthermore, decreasing costs for high-speed communications have made possible continuous connectivity to the global Internet for delivery of data ashore and communications with scientists and engineers on the ship. Through support from the Office of Naval Research, we have planned a series of tests using the R/V Revelle for real time data delivery of large quantities of underway data (e.g. continuous multibeam profiling) to shore for quality control, archiving, and real-time data availability. The Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics (IGPP) and the San Diego Supercomputer Center (SDSC) were funded by the NSF Information Technology Research (ITR) Program, the California Institute for Telecommunications and Information Technology [Cal-(IT)2] and the Scripps Institution of Oceanography for research entitled: "Exploring the Environment in Time: Wireless Networks & Real-Time Management." We will describe the technology to be used for the real-time seagoing experiment and the planned expansion of the project through support from the ITR grant. The short-term goal is to exercise the communications system aboard ship in various weather conditions and sea states while testing and developing the real-time data quality control and archiving methodology. The long-term goal is to enable continuous observations in the ocean, specifically supporting the goals of the DEOS (Dynamics of Earth and Ocean Systems) observatory program supported through a NSF Major Research Equipment (MRE) program - a permanent presence in the oceans. The impact on scientific work aboard ships, however, is likely to be fundamental. It will be possible to go to sea in the future with limited engineering capability for scientific operations by allowing shore-based quality control of data collected and videoconferencing for problem resolution. Costs for shipboard measurements will be reduced significantly while, at the same time, the quality of data collected will increase and ex-post-facto data archiving will no longer be necessary.

  12. Earth observation archive activities at DRA Farnborough

    NASA Technical Reports Server (NTRS)

    Palmer, M. D.; Williams, J. M.

    1993-01-01

    Space Sector, Defence Research Agency (DRA), Farnborough have been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.

  13. Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive

    NASA Technical Reports Server (NTRS)

    Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka, J. A.

    1995-01-01

    Life sciences data collected before, during and after spaceflight are valuable and often irreplaceable. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.

  14. The development of the Medical Literature Analysis and Retrieval System (MEDLARS)*

    PubMed Central

    Dee, Cheryl Rae

    2007-01-01

    Objective: The research provides a chronology of the US National Library of Medicine's (NLM's) contribution to access to the world's biomedical literature through its computerization of biomedical indexes, particularly the Medical Literature Analysis and Retrieval System (MEDLARS). Method: Using material gathered from NLM's archives and from personal interviews with people associated with developing MEDLARS and its associated systems, the author discusses key events in the history of MEDLARS. Discussion: From the development of the early mechanized bibliographic retrieval systems of the 1940s to the beginnings of online, interactive computerized bibliographic search systems of the early 1970s chronicled here, NLM's contributions to automation and bibliographic retrieval have been extensive. Conclusion: As NLM's technological experience and expertise grew, innovative bibliographic storage and retrieval systems emerged. NLM's accomplishments regarding MEDLARS were cutting edge, placing the library at the forefront of incorporating mechanization and technologies into medical information systems. PMID:17971889

  15. WSTIAC: Weapon Systems Technology Information Analysis Center. Volume 6, Number 1

    DTIC Science & Technology

    2005-01-01

    Fragmentary OCR text from the WSTIAC Newsletter, Winter 2005. Legible fragments include a "Lobster Robots" feature (continued from page 1) comparing underwater robots with terrestrial arthropods, notes on cost changes partially offset by a net decrease of planned quantities to be purchased (-$24.4...), and a discussion of system development and test records, which nearly always involves the use of a Test and Evaluation Master Plan (TEMP).

  16. Enhancement of real-time EPICS IOC PV management for the data archiving system

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Ha

    2015-10-01

    For the operation of a 100-MeV linear proton accelerator, the major driving values and experimental data need to be archived. Different data are required according to the experimental conditions, so functions that can add new data and delete data in real time need to be implemented. In an experimental physics and industrial control system (EPICS) input output controller (IOC), the values of process variables (PVs) are matched with the driving values and data. The PV values are archived in text file format by using the channel archiver. There is no need to create a database (DB) server, just a large hard disk. Through the web, the archived data can be loaded, and new PV values can be archived without stopping the archive engine. The details of the implementation of a data archiving system with the channel archiver are presented, and some preliminary results are reported.
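
    The add-and-delete-in-real-time requirement can be met by periodically re-reading the list of archived PVs and adjusting channel monitors without restarting the engine. The sketch below is a minimal illustration of that reconciliation step; the file name and the subscribe/unsubscribe helpers are placeholders rather than channel archiver internals.

    ```python
    active = {}  # PV name -> subscription handle

    def subscribe(name):
        # Stand-in for opening a real Channel Access monitor.
        print("start archiving", name)
        return object()

    def unsubscribe(handle):
        # Stand-in for clearing the monitor subscription.
        pass

    def reload_pv_list(path="archived_pvs.txt"):
        with open(path) as f:
            wanted = {line.strip() for line in f if line.strip()}
        for name in wanted - active.keys():
            active[name] = subscribe(name)     # newly requested PV
        for name in set(active) - wanted:
            unsubscribe(active.pop(name))      # PV no longer archived
    ```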

  17. The Self-Organized Archive: SPASE, PDS and Archive Cooperatives

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hughes, J. S.; Roberts, D. A.; Walker, R. J.; Joy, S. P.

    2005-05-01

    Information systems with high-quality metadata enable uses and services which often go beyond the original purpose. There are two types of metadata: annotations, which are items that comment on or describe the content of a resource, and identification attributes, which describe the external properties of the resource itself. For example, annotations may indicate which columns are present in a table of data, whereas an identification attribute would indicate the source of the table, such as the observatory, instrument, organization, and data type. When the identification attributes are collected and used as the basis of a search engine, a user can constrain on an attribute and the archive can then self-organize around the constraint, presenting the user with a particular view of the archive. In an archive cooperative, where each participating data system or archive may have its own metadata standards, providing a multi-system search engine requires that individual archive metadata be mapped to a broad-based standard. To explore how cooperative archives can form a larger self-organized archive, we will show how the Space Physics Archive Search and Extract (SPASE) data model will allow different systems to create a cooperative, and will use the Planetary Data System (PDS) plus existing space physics activities as a demonstration.
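
    The self-organizing behaviour is easy to sketch: collect the identification attributes of each resource, enumerate the values actually present to form facets, and let a constraint on any attribute produce a view of the archive. The records and attribute names below are invented for illustration.

    ```python
    records = [
        {"observatory": "Galileo", "instrument": "MAG", "type": "TimeSeries"},
        {"observatory": "Galileo", "instrument": "PLS", "type": "TimeSeries"},
        {"observatory": "Cluster", "instrument": "FGM", "type": "TimeSeries"},
    ]

    def facets(records):
        # Enumerate every attribute value present: the archive "organizes
        # itself" around whatever metadata the holdings actually contain.
        out = {}
        for rec in records:
            for key, value in rec.items():
                out.setdefault(key, set()).add(value)
        return out

    def constrain(records, **conditions):
        # A constraint on any attribute yields a view of the archive.
        return [r for r in records
                if all(r.get(k) == v for k, v in conditions.items())]

    print(facets(records))
    print(constrain(records, observatory="Galileo"))
    ```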

  18. XMM-Newton On-demand Reprocessing Using SaaS Technology

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Fajersztejn, N.; Loiseau, N.; Gabriel, C.

    2014-05-01

    We present here the architectural design of the new on-the-fly reprocessing capabilities that will soon be developed and implemented in the new XMM-Newton Science Operation Centre. The inclusion of processing capabilities into the archive, as we plan it, will be possible thanks to the recent refurbishment of the XMM-Newton science archive, its alignment with the latest web technologies, and the XMM-Newton Remote Interface for Science Analysis (RISA), a revolutionary idea of providing processing capabilities through internet services.

  19. The Geodetic Seamless Archive Centers Service Layer: A System Architecture for Federating Geodesy Data Repositories

    NASA Astrophysics Data System (ADS)

    McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.

    2010-12-01

    Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO, are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) supports metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, the development of enabling technology to facilitate third party repositories in developing these web service capabilities, and enabling the ability to perform data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data while enabling each repository to expose its specific metadata holdings. To address this challenge, we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations, the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
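
    A capabilities-based federation can be sketched in a few lines: each repository advertises which query parameters it supports, and the federator forwards only the constraints a repository can honour. The URLs and the capability vocabulary below are hypothetical, not the actual GSAC-WS API.

    ```python
    import json
    from urllib.request import urlopen
    from urllib.parse import urlencode

    # Hypothetical repository endpoints, not real GSAC-WS services.
    REPOSITORIES = {
        "sopac": "https://sopac.example.org/gsacws",
        "cddis": "https://cddis.example.org/gsacws",
    }

    def get_capabilities(base_url):
        # Each repository describes which query parameters it supports.
        with urlopen(base_url + "/capabilities") as resp:
            return set(json.load(resp)["supported_parameters"])

    def federated_site_query(**constraints):
        results = []
        for name, base in REPOSITORIES.items():
            supported = get_capabilities(base)
            # Forward only the constraints this repository can honour.
            usable = {k: v for k, v in constraints.items() if k in supported}
            with urlopen(base + "/site/search?" + urlencode(usable)) as resp:
                results.extend(json.load(resp))
        return results
    ```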

  20. The Videodisc as a Pilot Project of the Public Archives of Canada.

    ERIC Educational Resources Information Center

    Mole, Dennis

    1981-01-01

    Discusses a project in which a large variety of materials from the collection of the Canadian Public Archives were recorded and played back using laser optical videodisc technology. The videodisc's capabilities for preserving, storing, and retrieving information are discussed. (Author/JJD)

  1. Digitizing and Securing Archived Laboratory Notebooks

    ERIC Educational Resources Information Center

    Caporizzo, Marilyn

    2008-01-01

    The Information Group at Millipore has been successfully using a digital rights management tool to secure the email distribution of archived laboratory notebooks. Millipore is a life science leader providing cutting-edge technologies, tools, and services for bioscience research and biopharmaceutical manufacturing. Consisting of four full-time…

  2. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation, and clinical evaluation of an anesthesia information management system. To record, process, and store perioperative patient data automatically, all bedside monitoring equipment is connected to the system using information integration technology. After statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically against a risk prediction standard and a decision support system, so that the anesthetist can carry out reasonable and safe clinical processes. With clinical processes recorded electronically, standard record tables can be generated and the clinical workflow optimized as well. With the system, many kinds of patient data can be collected, stored, analyzed, and archived; various anesthesia documents can be generated; and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.

  3. Deep Space Network information system architecture study

    NASA Technical Reports Server (NTRS)

    Beswick, C. A.; Markley, R. W. (Editor); Atkinson, D. J.; Cooper, L. P.; Tausworthe, R. C.; Masline, R. C.; Jenkins, J. S.; Crowe, R. A.; Thomas, J. L.; Stoloff, M. J.

    1992-01-01

    The purpose of this article is to describe an architecture for the Deep Space Network (DSN) information system in the years 2000-2010 and to provide guidelines for its evolution during the 1990s. The study scope is defined to be from the front-end areas at the antennas to the end users (spacecraft teams, principal investigators, archival storage systems, and non-NASA partners). The architectural vision provides guidance for major DSN implementation efforts during the next decade. A strong motivation for the study is an expected dramatic improvement in information-systems technologies, such as the following: computer processing, automation technology (including knowledge-based systems), networking and data transport, software and hardware engineering, and human-interface technology. The proposed Ground Information System has the following major features: unified architecture from the front-end area to the end user; open-systems standards to achieve interoperability; DSN production of level 0 data; delivery of level 0 data from the Deep Space Communications Complex, if desired; dedicated telemetry processors for each receiver; security against unauthorized access and errors; and highly automated monitor and control.

  4. Community archiving of imaging studies

    NASA Astrophysics Data System (ADS)

    Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita

    1996-05-01

    The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.

  5. NASA's small spacecraft technology initiative "Clark" spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1996-11-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of "Clark," a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  6. The state of the art of medical imaging technology: from creation to archive and back.

    PubMed

    Gao, Xiaohong W; Qian, Yu; Hui, Rui

    2011-01-01

    Medical imaging has integrated itself well into modern medicine and has revolutionized the medical industry in the last 30 years. Stemming from the discovery of the X-ray by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise not be possible to obtain, medical images pose great challenges: archiving them safe from corruption, loss, and misuse; retrieving them from databases of huge sizes with varying forms of metadata; and reusing them when new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems, and the innovation from the existing acquired image data pools. The focus of this paper is on content-based image retrieval (CBIR), in particular for 3D images, which is exemplified by our online e-learning system, MIRAGE, home to a repository of medical images with a variety of domains and dimensions. In terms of novelties, the facilities of CBIR for 3D images, coupled with image annotation in a fully automatic fashion, have been developed and implemented in the system, resonating with future versatile, flexible and sustainable medical image databases that can reap new innovations.
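
    Content-based retrieval reduces, in its simplest form, to describing each image by a global feature vector and ranking the archive by distance to the query. The sketch below uses a grey-level histogram as the feature; a production 3D CBIR system such as the one described would use far richer features, but the pipeline shape is the same.

    ```python
    import numpy as np

    def histogram_feature(image, bins=32):
        # image: a 2D or 3D numpy array of intensities in [0, 255]
        hist, _ = np.histogram(image, bins=bins, range=(0, 255), density=True)
        return hist

    def rank_archive(query_img, archive_imgs):
        q = histogram_feature(query_img)
        dists = [np.linalg.norm(q - histogram_feature(img))
                 for img in archive_imgs]
        return np.argsort(dists)  # indices of best matches first

    # Synthetic 3D volumes stand in for archived image data.
    archive = [np.random.randint(0, 256, (64, 64, 64)) for _ in range(5)]
    query = archive[3] + np.random.randint(0, 8, (64, 64, 64))
    print(rank_archive(query, archive))  # index 3 should rank first
    ```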

  7. The State of the Art of Medical Imaging Technology: from Creation to Archive and Back

    PubMed Central

    Gao, Xiaohong W; Qian, Yu; Hui, Rui

    2011-01-01

    Medical imaging has integrated itself well into modern medicine and has revolutionized the medical industry in the last 30 years. Stemming from the discovery of the X-ray by Nobel laureate Wilhelm Roentgen, radiology was born, leading to the creation of large quantities of digital images as opposed to film-based media. While this rich supply of images provides immeasurable information that would otherwise not be possible to obtain, medical images pose great challenges: archiving them safe from corruption, loss, and misuse; retrieving them from databases of huge sizes with varying forms of metadata; and reusing them when new tools for data mining and new media for data storage become available. This paper provides a summative account of the creation of medical imaging tomography, the development of image archiving systems, and the innovation from the existing acquired image data pools. The focus of this paper is on content-based image retrieval (CBIR), in particular for 3D images, which is exemplified by our online e-learning system, MIRAGE, home to a repository of medical images with a variety of domains and dimensions. In terms of novelties, the facilities of CBIR for 3D images, coupled with image annotation in a fully automatic fashion, have been developed and implemented in the system, resonating with future versatile, flexible and sustainable medical image databases that can reap new innovations. PMID:21915232

  8. The STARPAHC collection: part of an archive of the history of telemedicine.

    PubMed

    Freiburger, Gary; Holcomb, Mary; Piper, Dave

    2007-01-01

    An early telemedicine project involving NASA, the Papago Tribe (now the Tohono O'odham Indian Nation), the Lockheed Missile and Space Company, the Indian Health Service and the Department of Health, Education and Welfare explored the possibilities of using technology to provide improved health care to a remote population in southern Arizona. The project, called STARPAHC (Space Technology Applied to Rural Papago Advanced Health Care), took place in the 1970s and demonstrated the feasibility of a consortium of public and private partners working together to provide medical care to remote populations via telecommunication. In 2001 the Arizona Health Sciences Library acquired important archival materials documenting the STARPAHC project and in collaboration with the Arizona Telemedicine Program established the Arizona Archive of Telemedicine. The material is likely to interest those studying early attempts to use technology to deliver health care at a distance, as well as those studying the sociological ramifications of technical and scientific projects among indigenous populations.

  9. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    PubMed

    Eysenbach, Gunther; Trudel, Mathieu

    2005-12-30

    Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
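
    The author-side step, archiving every cited URL before submission, starts with simply finding the URLs. The sketch below extracts cited URLs from a manuscript and builds archive-on-demand requests; the archiving endpoint shown is a placeholder, not the documented WebCite interface.

    ```python
    import re
    from urllib.parse import quote

    URL_RE = re.compile(r"https?://[^\s)\]>\"']+")

    def cited_urls(manuscript_text):
        # Collect each distinct URL cited in the manuscript.
        return sorted(set(URL_RE.findall(manuscript_text)))

    def archive_requests(urls, email="author@example.org"):
        # Hypothetical archive-on-demand request for each cited URL.
        return ["https://archive.example.org/archive?url=%s&email=%s"
                % (quote(u, safe=""), email) for u in urls]

    text = "See http://example.org/data and https://example.org/code here."
    for request in archive_requests(cited_urls(text)):
        print(request)
    ```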

  10. The Impact of Developing Technology on Media Communications.

    ERIC Educational Resources Information Center

    MacDonald, Lindsay W.

    1997-01-01

    Examines changes in media communications resulting from new information technologies: communications technologies (networks, World Wide Web, digital set-top box); graphic arts (digital photography, CD and digital archives, desktop design and publishing, printing technology); television and video (digital editing, interactive television, news and…

  11. Farewell to a legendary mission: ESA to hand over the IUE archive to the world scientific community

    NASA Astrophysics Data System (ADS)

    2000-03-01

    The IUE Archive, storing two decades of ultraviolet astronomy, has become a historical reference. It contains more than 110 000 spectra from observations that in most cases cannot be repeated, and is an excellent source for studying variable phenomena. The long time span covered and the stability of the instrument have enabled astronomers to witness events they never thought they would, such as the metamorphosis of a very old star into a beautiful planetary nebula: a hot central star surrounded by glowing gas and dust. The IUE archive was the first astronomical archive accessible online -- back in 1985, when the World Wide Web did not even exist -- and has been a key catalyst for science: it has triggered the publication of 3 600 articles in refereed journals so far, and a whole generation of astrophysicists have used IUE data at some stage. During IUE's lifetime the archive was managed by ESA, from the Villafranca Satellite Tracking Station near Madrid (Spain). But not any longer. The IUE archive will now belong to the world scientific community. ESA has created INES (IUE Newly Extracted Spectra), a distribution system that allows IUE data to be accessed faster and more easily from non-ESA national hosts throughout the world, managed entirely by local experts. INES maintenance costs are minimal, and the system is designed for ready incorporation of whatever innovations might come in the future. "The INES system and its data guarantee that future generations of astronomers will be able to use IUE data as much as they want, regardless of whether they know about the technicalities of the mission or whether there is an improvement in archive technology. And the distributed structure is better adapted to changes in user needs than a single archive centre", says Antonio Talavera from the Laboratory for Space Astrophysics and Theoretical Physics (LAEFF), based at Villafranca. "ESA has created INES using a minimalist engineering approach for the world scientific community, and has made it to last. INES is easy to use and easy to upgrade, and LAEFF in Spain is proud to serve as the hub for the whole world". The INES Principal Centre is at the LAEFF, owned by INTA, the Spanish National Institute for Aerospace Technology. This centre, with a data mirror at the CADC in Victoria (Canada), holds the complete database and provides information not available from national hosts. So far 17 national hosts (listed below) have come online. Together with the Principal Centre, they form an efficient and highly reliable distribution system for the community. The whole process of data retrieval is fully automated and totally transparent to the end user. This distributed structure avoids localised connectivity problems and guarantees availability of data. The release of INES will be celebrated on 21 March with a ceremony at the ESA/VILSPA Satellite Tracking Station in Villafranca near Madrid (see attached agenda and accreditation form). At various other national hosts the release of the INES system will also be celebrated by local academic and demonstration events on different dates. FOOTNOTE ON IUE SATELLITE The ESA/NASA/UK IUE spacecraft, launched in January 1978, became the first space observatory facility available to the whole astronomical community. It marked the beginning of UV astronomy, a field for which space telescopes are essential because UV light does not reach the Earth's surface.
By the time IUE was switched off, in September 1996 -- 14 years later than originally planned -- it had changed the view astronomers had of the universe. Among many other findings, IUE discovered the auroras on Jupiter; detected for the first time the halo of our galaxy -- a large amount of very hot matter in the outskirts of the Milky Way; and measured the size of a black hole in the core of an active galaxy.

  12. Landsat International Cooperators and Global Archive Consolidation

    USGS Publications Warehouse

    2016-04-07

    Landsat missions have always been an important component of U.S. foreign policy, as well as science and technology policy. The program’s longstanding network of International Cooperators (ICs), which operate numerous International Ground Stations (IGS) around the world, embodies the United States’ policy of peaceful use of outer space and the worldwide dissemination of civil space technology for public benefit. Thus, the ICs provide an essential dimension to the Landsat mission. In 2010, the Landsat Global Archive Consolidation (LGAC) effort began, with a goal to consolidate the Landsat data archives of all international ground stations, make the data more accessible to the global Landsat community, and significantly increase the frequency of observations over a given area of interest to improve scientific uses such as change detection and analysis.

  13. Stewardship of NASA's Earth Science Data and Ensuring Long-Term Active Archives

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H.; Behnke, J.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since 1994. EOSDIS manages data from pre-EOS missions dating back to the 1960s, EOS missions that started in 1997, and missions from the post-EOS era. Its data holdings come from many different sources -- satellite and airborne instruments, in situ measurements, field experiments, science investigations, etc. Since the beginning of the EOS Program, NASA has followed an open data policy, with non-discriminatory access to data and no period of exclusive access. NASA has well-established processes for assigning and/or accepting datasets into one of the 12 Distributed Active Archive Centers (DAACs) that are part of EOSDIS. EOSDIS has been evolving through several information technology cycles, adapting to hardware and software changes in the commercial sector. NASA is responsible for maintaining Earth science data as long as users are interested in using them for research and applications, which is well beyond the life of the data-gathering missions. For science data to remain useful over long periods of time, steps must be taken to preserve (1) data bits with no corruption, (2) discoverability and access, (3) readability, (4) understandability, (5) usability, and (6) reproducibility of results. NASA's Earth Science Data and Information System (ESDIS) Project, along with the 12 EOSDIS DAACs, has made significant progress in each of these areas over the last decade, and continues to evolve its active archive capabilities. Particular attention is being paid in recent years to ensuring that the datasets are "published" in an easily accessible and citable manner through a unified metadata model, a Common Metadata Repository (CMR), a coherent view through the earthdata.gov website, and assignment of Digital Object Identifiers (DOI) with well-designed landing/product information pages.
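
    The Common Metadata Repository mentioned above exposes a public search API at cmr.earthdata.nasa.gov. As a minimal sketch of how such a unified metadata model can be queried programmatically, the snippet below searches collection records by keyword and prints any DOI each carries; the keyword, page size, and the exact response fields are assumptions, so the DOI field is read defensively.

      # Minimal sketch: querying NASA's Common Metadata Repository (CMR)
      # for collection records and any DOIs they carry. Keyword and page
      # size are illustrative assumptions.
      import json
      import urllib.request

      CMR_URL = "https://cmr.earthdata.nasa.gov/search/collections.json"
      query = "?keyword=sea%20surface%20temperature&page_size=5"

      with urllib.request.urlopen(CMR_URL + query) as resp:
          entries = json.load(resp)["feed"]["entry"]

      for e in entries:
          # Each record carries a title; a DOI is present where assigned.
          print(e.get("title"), "-", e.get("doi", "no DOI listed"))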

  14. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are increasingly used for planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To facilitate users in handling and visualising spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open-source tools such as OpenLayers. The technology to store searchable geometrical data already exists within PostgreSQL databases in the form of the PostGIS extension. GeoServer is an existing open-source map server, an instance of which has been deployed for the new PSA; it uses the OGC standards to allow, among other things, the sharing, processing and editing of spatial data through the Web Feature Service (WFS) standard, as well as the serving of georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary mission datasets. This can be facilitated through the GIS framework, offering interfaces (both web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
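
    Since the abstract names the OGC WMS and WFS interfaces served through GeoServer, a minimal client-side sketch of a WMS request follows, using the OWSLib Python library; the endpoint URL and layer name are placeholders, not the actual PSA services.

      # Minimal sketch: listing layers and requesting a map image from an
      # OGC WMS endpoint with OWSLib. URL and layer name are hypothetical.
      from owslib.wms import WebMapService

      wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
      print(list(wms.contents))  # layers advertised in the capabilities document

      # GetMap: fetch a georeferenced PNG for a whole-planet bounding box.
      img = wms.getmap(layers=["mex_mola_shaded_relief"],  # hypothetical layer
                       srs="EPSG:4326",
                       bbox=(-180, -90, 180, 90),
                       size=(800, 400),
                       format="image/png")
      with open("map.png", "wb") as f:
          f.write(img.read())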

  15. Archive Inventory Management System (AIMS) — A Fast, Metrics Gathering Framework for Validating and Gaining Insight from Large File-Based Data Archives

    NASA Astrophysics Data System (ADS)

    Verma, R. V.

    2018-04-01

    The Archive Inventory Management System (AIMS) is a software package for understanding the distribution, characteristics, integrity, and nuances of files and directories in large file-based data archives on a continuous basis.
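
    As a minimal sketch of the kind of inventory metrics such a framework gathers -- file sizes, type distribution, and fixity checksums -- the following walks an archive tree with the Python standard library; the archive root path is a placeholder.

      # Minimal sketch: per-file size, extension counts, and an MD5 fixity
      # checksum over a file-based archive. The root path is hypothetical.
      import hashlib
      import os
      from collections import Counter

      def inventory(root):
          total_bytes, extensions = 0, Counter()
          for dirpath, _dirnames, filenames in os.walk(root):
              for name in filenames:
                  path = os.path.join(dirpath, name)
                  total_bytes += os.path.getsize(path)
                  extensions[os.path.splitext(name)[1].lower()] += 1
                  with open(path, "rb") as f:
                      # MD5 is used for integrity (fixity), not security.
                      print(path, hashlib.md5(f.read()).hexdigest())
          print("total bytes:", total_bytes, "| by extension:", dict(extensions))

      inventory("/data/archive")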

  16. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system allows the processing and analysis of large archives of geophysical data obtained both from observations and from modelling. The experience accumulated in developing information-computational web-systems for the computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). The functional capabilities of the system comprise a set of procedures for the mathematical and statistical analysis, processing and visualization of data. At present five data archives are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for the climate change indices approved by the WMO is available. A special module providing visualization of results and output to Encapsulated PostScript, GeoTIFF and ESRI shapefiles was also developed. As the technological basis for the representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. GIS functionality has been integrated with web-portal software to provide a basis for developing the web portal as part of the geoinformation web-system. Such a geoinformation web-system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system should be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.
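
    To make the computational-module idea concrete, here is a minimal sketch of one WMO-recommended climate change index of the kind the system computes: FD, the annual count of frost days (daily minimum temperature below 0 degrees C), evaluated per grid cell over a synthetic array.

      # Minimal sketch: the FD (frost days) index on synthetic daily data.
      import numpy as np

      rng = np.random.default_rng(0)
      # One year of synthetic daily minimum temperatures (deg C), gridded.
      tmin = rng.normal(loc=5.0, scale=10.0, size=(365, 4, 5))  # (day, lat, lon)

      frost_days = np.sum(tmin < 0.0, axis=0)  # FD per grid cell
      print(frost_days)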

  17. Superfund Public Information System (SPIS), June 1998 (on CD-ROM). Data file

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-06-01

    The Superfund Public Information System (SPIS) on CD-ROM contains Superfund data for the United States Environmental Protection Agency. The Superfund data is a collection of four databases: CERCLIS, Archive (NFRAP), RODS, and NPL Sites. Descriptions of these databases and CD contents are listed below. The FolioViews browse and retrieval engine is used as a graphical interface to the data. Users can access simple queries and can do complex searching on key words or fields. In addition, context-sensitive help, a Superfund process overview, and an integrated data dictionary are available. RODS is the Records Of Decision System. RODS is used to track site clean-ups under the Superfund program to justify the type of treatment chosen at each site. RODS contains information on technology justification, site history, community participation, enforcement activities, site characteristics, scope and role of response action, and remedy. Explanations of Significant Differences (ESDs) are also available on the CD. CERCLIS is the Comprehensive Environmental Response, Compensation, and Liability Information System. It is the official repository for all Superfund site and incident data. It contains comprehensive information on hazardous waste sites, site inspections, preliminary assessments, and remedial status. The system is sponsored by the EPA's Office of Emergency and Remedial Response, Information Management Center. Archive (NFRAP) consists of hazardous waste sites that have no further remedial action planned; only basic identifying information is provided for archive sites. The sites found in the Archive database were originally in the CERCLIS database, but were removed beginning in the fall of 1995. NPL Sites (available online) are fact sheets that describe the location and history of Superfund sites. Included are descriptions of the most recent activities and past actions at the sites that have contributed to the contamination. Population estimates, land usages, and nearby resources give background on the local setting surrounding a site.

  18. Implementation of a departmental picture archiving and communication system: a productivity and cost analysis.

    PubMed

    Macyszyn, Luke; Lega, Brad; Bohman, Leif-Erik; Latefi, Ahmad; Smith, Michelle J; Malhotra, Neil R; Welch, William; Grady, Sean M

    2013-09-01

    Digital radiology enhances productivity and results in long-term cost savings. However, the viewing, storage, and sharing of outside imaging studies on compact discs at ambulatory offices and hospitals pose a number of unique challenges to a surgeon's efficiency and clinical workflow. To improve the efficiency and clinical workflow of an academic neurosurgical practice when evaluating patients with outside radiological studies, open-source software and commercial hardware were used to design and implement a departmental picture archiving and communications system (PACS). The implementation of the departmental PACS significantly improved productivity and enhanced collaboration in a variety of clinical settings. Using published data on the rate of information technology problems associated with outside studies on compact discs, this system produced cost savings ranging from $6,250 to $33,600 and from $43,200 to $72,000 for two cohorts, urgent transfer and spine clinic patients, respectively, thereby justifying the costs of the system in less than a year. The implementation of a departmental PACS using open-source software is straightforward and cost-effective and results in significant gains in surgeon productivity when evaluating patients with outside imaging studies.

  19. The Arctic Cooperative Data and Information System: Data Management Support for the NSF Arctic Research Program (Invited)

    NASA Astrophysics Data System (ADS)

    Moore, J.; Serreze, M. C.; Middleton, D.; Ramamurthy, M. K.; Yarmey, L.

    2013-12-01

    The NSF funds the Advanced Cooperative Arctic Data and Information System (ACADIS, http://www.aoncadis.org/), which serves the growing and increasingly diverse data management needs of NSF's arctic research community. The ACADIS investigator team combines experienced data managers, curators and software engineers from NSIDC, UCAR and NCAR. ACADIS fosters scientific synthesis and discovery by providing a secure long-term data archive to NSF investigators. The system provides discovery of and access to arctic-related data from this and other archives. This paper updates the technical components of ACADIS, the implementation of best practices, the value of ACADIS to the community, and the major challenges facing this archive in handling the diverse data coming from NSF Arctic investigators. ACADIS provides sustainable data management, data stewardship services and leadership for the NSF Arctic research community through open data sharing, adherence to best practices and standards, community support and engagement, and capitalizing on appropriate evolving technologies. ACADIS leverages other pertinent projects, capitalizing on emerging technologies and participating in emerging cyberinfrastructure initiatives. The key elements of ACADIS user services to the NSF Arctic community include: data and metadata upload; support for datasets with special requirements; metadata and documentation generation; interoperability and initiatives with other archives; and science support to investigators and the community. Providing a self-service data publishing platform that requires minimal curation oversight while maintaining rich metadata for discovery, access and preservation is challenging. Implementing metadata standards is a first step towards consistent content. The ACADIS Gateway and ADE offer users choices for data discovery and access, with the clear objective of increasing discovery and use of all Arctic data, especially for analysis activities. Metadata is at the core of ACADIS activities, from capturing metadata at the point of data submission to ensuring interoperability, providing data citations, and supporting data discovery. ACADIS metadata efforts include: (1) evolution of the ACADIS metadata profile to increase flexibility in search; (2) documentation guidelines; and (3) metadata standardization efforts. A major activity is now underway to ensure consistency in the metadata profile across all archived datasets. ACADIS is also embarking on a critical activity to create Digital Object Identifiers (DOI) for all its holdings. The data services offered by ACADIS focus on meeting the needs of the data providers, providing dynamic search capabilities across the ACADIS and related cryospheric data repositories, efficient data download, and some special services including dataset reformatting and visualization. The service is built around the following key technical elements: the ACADIS Gateway, housed at NCAR, has been developed to support NSF Arctic data coming from AON and now more broadly from across PLR/ARC and related archives; the Arctic Data Explorer (ADE), developed at NSIDC, is an integral service of ACADIS bringing the rich archive from NSIDC together with catalogs from ACADIS and international partners in Arctic research; and Rosetta and the Digital Object Identifier (DOI) generation scheme are tools available to the community to help publish and utilize datasets in integration, synthesis and publication.

  20. Development and Evaluation of a Compartmental Picture Archiving and Communications System Model for Integration and Visualization of Multidisciplinary Biomedical Data to Facilitate Student Learning in an Integrative Health Clinic

    ERIC Educational Resources Information Center

    Chow, Meyrick; Chan, Lawrence

    2010-01-01

    Information technology (IT) has the potential to improve the clinical learning environment. The extent to which IT enhances or detracts from healthcare professionals' role performance can be expected to affect both student learning and patient outcomes. This study evaluated nursing students' satisfaction with a novel compartmental Picture…

  1. Archival Information Management System.

    DTIC Science & Technology

    1995-02-01

    management system named Archival Information Management System (AIMS), designed to meet the audit trail requirement for studies completed under the...are to be archived to the extent that future reproducibility and interrogation of results will exist. This report presents a prototype information

  2. Cassini/Huygens Program Archive Plan for Science Data

    NASA Technical Reports Server (NTRS)

    Conners, D.

    2000-01-01

    The purpose of this document is to describe the Cassini/Huygens science data archive system which includes policy, roles and responsibilities, description of science and supplementary data products or data sets, metadata, documentation, software, and archive schedule and methods for archive transfer to the NASA Planetary Data System (PDS).

  3. The COROT ground-based archive and access system

    NASA Astrophysics Data System (ADS)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted Spectra) Archive.

  4. Online resources for news about toxicology and other environmental topics.

    PubMed

    South, J C

    2001-01-12

    Technology has revolutionized researchers' ability to find and retrieve news stories and press releases. Thanks to electronic library systems and telecommunications--notably the Internet--computer users in seconds can sift through millions of articles to locate mainstream articles about toxicology and other environmental topics. But that does not mean it is easy to find what one is looking for. There is a confusing array of databases and services that archive news articles and press releases: (1) some are free; others cost thousands of dollars a year to access, (2) some include hundreds of newspaper and magazine titles; others cover only one publication, (3) some contain archives going back decades; others have just the latest news, (4) some offer only journalistically balanced reports from mainstream news sources; others mix news with opinions and advocacy and include reports from obscure or biased sources. This article explores ways to find news online - particularly news about toxicology, hazardous chemicals, environmental health and the environment in general. The article covers web sites devoted to environmental news; sites and search engines for general-interest news; newspaper archives; commercial information services; press release distribution services and archives; and other resources and strategies for finding articles in the popular press about toxicology and the environment.

  5. The European HST Science Data Archive. [and Data Management Facility (DMF)

    NASA Technical Reports Server (NTRS)

    Pasian, F.; Pirenne, B.; Albrecht, R.; Russo, G.

    1993-01-01

    The paper describes the European HST Science Data Archive. Particular attention is given to the flow from the HST spacecraft to the Science Data Archive at the Space Telescope European Coordinating Facility (ST-ECF); the archiving system at the ST-ECF, including the hardware and software system structure; the operations at the ST-ECF and differences with the Data Management Facility; and the current developments. A diagram of the logical structure and data flow of the system managing the European HST Science Data Archive is included.

  6. U.S. Spacesuit Knowledge Capture

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2010-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lend themselves rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of individuals and groups working in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. Recently, the effort has been more concentrated and formalized, whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. Now with video archiving, all these avenues of learning can be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a rather closed-loop spacesuit knowledge capture system which includes specific attention to spacesuit system artifacts as well. A NASM report has recently been created that allows the cross-reference of history to the artifacts and the artifacts to the history, including spacesuit manufacturing details with current condition and location. NASA has examined spacesuits in the NASM collection for evidence of wear during their operational life. NASA's formal spacesuit knowledge capture efforts now make use of both the NASM spacesuit preservation collection and report to enhance its efforts to educate NASA personnel and contribute to spacesuit history. Be it archiving of human knowledge or archiving of the actual spacesuit legacy hardware with its rich history, the joining together of spacesuit system artifact history with that of development and use during past programs will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs.

  7. NASA's Earth Observing System Data and Information System - Many Mechanisms for On-Going Evolution

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H. K.

    2012-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been serving a broad user community since August 1994. As a long-lived multi-mission system serving multiple scientific disciplines and a diverse user community, EOSDIS has been evolving continuously. It has had, and continues to have, many forms of community input to help with this evolution. Early in its history, it had inputs from the EOSDIS Advisory Panel, benefited from the reviews by various external committees, and evolved into the present distributed architecture with discipline-based Distributed Active Archive Centers (DAACs), Science Investigator-led Processing Systems, and a cross-DAAC search and data access capability. EOSDIS evolution has been helped by advances in computer technology, moving from an initially planned supercomputing environment to SGI workstations to Linux clusters for computation, and from near-line archives of robotic silos with tape cassettes to RAID-disk-based on-line archives for storage. The network capacities have increased steadily over the years, making delivery of data on media almost obsolete. The advances in information systems technologies have been having an even greater impact on the evolution of EOSDIS. In the early days, the advent of the World Wide Web came as a game-changer in the operation of EOSDIS. The metadata model developed for the EOSDIS Core System for representing metadata from EOS standard data products has had an influence on the Federal Geographic Data Committee's metadata content standard and the ISO metadata standards. The influence works both ways: as the ISO 19115 metadata standard has developed in recent years, EOSDIS is reviewing its metadata to ensure compliance with the standard. Improvements have been made in the cross-DAAC search and access of data using the centralized metadata clearing house (EOS Clearing House - ECHO) and the client Reverb. Given the diversity of the Earth science disciplines served by the DAACs, the DAACs have developed a number of software tools tailored to their respective user communities. Web services play an important part in improved access to data products, including some basic analysis and visualization capabilities. A coherent view into all capabilities available from EOSDIS is evolving through the "Coherent Web" effort. Data are being made available in near real-time for scientific research as well as time-critical applications. On-going community inputs for infusion and for maintaining the vitality of EOSDIS come from technology developments by NASA-sponsored community data system programs - Advancing Collaborative Connections for Earth System Science (ACCESS), Making Earth System Data Records for Use in Research Environments (MEaSUREs) and Applied Information System Technology (AIST) - as well as participation in Earth Science Data System Working Groups, the Earth Science Information Partners Federation and other interagency/international activities. An important source of community needs is the annual American Customer Satisfaction Index survey of EOSDIS users. Some of the key areas in which improvements are required and incremental progress is being made are: ease of discovery and access; cross-organizational interoperability; data inter-use; ease of collaboration; ease of citation of datasets; and preservation of provenance and context and making them conveniently available to users.

  8. An Archive of Downscaled WCRP CMIP3 Climate Projections for Planning Applications in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.

    2007-12-01

    Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not to spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at the LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e., 16 GCMs and their multiple simulations of the SRES A2, A1B, and B1 emissions pathways).
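
    The "contemporary statistical method" referenced above is a bias-correction and spatial-disaggregation approach; as a hedged illustration of the bias-correction step only (not the archive's production code), the sketch below applies empirical quantile mapping to synthetic data.

      # Minimal sketch: empirical quantile mapping for bias correction.
      import numpy as np

      def quantile_map(model, obs, projected):
          """Map projected model values onto the observed distribution by
          matching empirical quantiles of the historical records."""
          q = np.linspace(0.01, 0.99, 99)
          model_q = np.quantile(model, q)
          obs_q = np.quantile(obs, q)
          # Find each projected value's quantile in the model climatology,
          # then read off the observed value at that same quantile.
          ranks = np.interp(projected, model_q, q)
          return np.interp(ranks, q, obs_q)

      rng = np.random.default_rng(1)
      obs = rng.normal(15.0, 3.0, 600)      # observed historical record
      model = rng.normal(17.0, 4.0, 600)    # biased model, same period
      future = rng.normal(19.0, 4.0, 600)   # projected model values
      print(quantile_map(model, obs, future).mean())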

  9. Improving accessibility and discovery of ESA planetary data through the new planetary science archive

    NASA Astrophysics Data System (ADS)

    Macfarlane, A. J.; Docasal, R.; Rios, C.; Barbarisi, I.; Saiz, J.; Vallejo, F.; Besse, S.; Arviset, C.; Barthelemy, M.; De Marchi, G.; Fraga, D.; Grotheer, E.; Heather, D.; Lim, T.; Martinez, S.; Vallat, C.

    2018-01-01

    The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific data sets through various interfaces at http://psa.esa.int. Driven mostly by the evolution of the PDS standards, which all new ESA planetary missions shall follow, and by the need to update the interfaces to the archive, the PSA has undergone an important re-engineering. In order to maximise the scientific exploitation of ESA's planetary data holdings, significant improvements have been made by utilising the latest technologies and implementing widely recognised open standards. To facilitate users in handling and visualising the many products stored in the archive that have associated spatial data, the new PSA supports Geographical Information Systems (GIS) by implementing the standards approved by the Open Geospatial Consortium (OGC). The modernised PSA also attempts to increase interoperability with the international community by implementing recognised planetary-science-specific protocols such as the PDAP (Planetary Data Access Protocol) and EPN-TAP (EuroPlanet-Table Access Protocol). In this paper we describe some of the methods by which the archive may be accessed, and present the challenges being faced in consolidating data sets of the older PDS3 version of the standards with the new PDS4 deliveries into a single data model mapping, to ensure transparent access to the data for users and services whilst maintaining high performance.
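
    Since EPN-TAP is a profile of the standard Table Access Protocol, a minimal client-side query sketch is possible with the pyvo library; the service URL and schema name below are placeholders rather than the actual PSA endpoint, while granule_uid, dataproduct_type and target_name are standard epn_core columns.

      # Minimal sketch: an ADQL query against a (hypothetical) EPN-TAP
      # service using pyvo. Endpoint and schema name are placeholders.
      import pyvo

      service = pyvo.dal.TAPService("https://example.org/tap")
      results = service.search(
          "SELECT TOP 5 granule_uid, dataproduct_type, target_name "
          "FROM demo.epn_core "            # epn_core: standard EPN-TAP table
          "WHERE target_name = 'Mars'"
      )
      for row in results:
          print(row["granule_uid"], row["dataproduct_type"])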

  10. 21 CFR 892.2050 - Picture archiving and communications system.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 892.2050, Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES -- Picture archiving and communications system. (a) Identification. A picture archiving and communications system is a device that...

  11. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc.), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs. proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system, which would benefit archive user and supplier alike, is proposed.
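
    The paper defines no concrete API, but as a hedged illustration of what a uniform client-side contract over the primary services it lists (name resolution, catalog browsing, dataset retrieval) could look like, the sketch below uses names that are entirely invented.

      # Illustrative sketch only: a uniform client interface over
      # heterogeneous archive systems. All names are invented.
      from abc import ABC, abstractmethod

      class ArchiveClient(ABC):
          """Client-side contract for the framework's primary services."""

          @abstractmethod
          def resolve_name(self, object_name: str) -> tuple[float, float]:
              """Name resolution: object name -> (ra, dec) coordinates."""

          @abstractmethod
          def browse_catalog(self, query: str) -> list[dict]:
              """Catalog browsing: return matching catalog records."""

          @abstractmethod
          def retrieve_dataset(self, dataset_id: str, dest: str) -> None:
              """Dataset retrieval: fetch a dataset to a local path."""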

  12. Electronic publishing: opportunities and challenges for clinical linguistics and phonetics.

    PubMed

    Powell, Thomas W; Müller, Nicole; Ball, Martin J

    2003-01-01

    This paper discusses the contributions of informatics technology to the field of clinical linguistics and phonetics. The electronic publication of research reports and books has facilitated both the dissemination and the retrieval of scientific information. Electronic archives of speech and language corpora, too, stimulate research efforts. Although technology provides many opportunities, there remain significant challenges. Establishment and maintenance of scientific archives is largely dependent upon volunteer efforts, and there are few standards to ensure long-term access. Coordinated efforts and peer review are necessary to ensure utility and quality.

  13. A reliable, low-cost picture archiving and communications system for small and medium veterinary practices built using open-source technology.

    PubMed

    Iotti, Bryan; Valazza, Alberto

    2014-10-01

    Picture Archiving and Communications Systems (PACS) are among the most needed systems in a modern hospital. As an integral part of the Digital Imaging and Communications in Medicine (DICOM) standard, they are charged with the responsibility for the secure storage and accessibility of diagnostic imaging data. These machines need to offer high performance, stability, and security while proving reliable and ergonomic in the day-to-day and long-term storage and retrieval of the data they safeguard. This paper reports the experience of the authors in developing and installing a compact and low-cost solution based on open-source technologies at the Veterinary Teaching Hospital of the University of Torino, Italy, during the summer of 2012. The PACS server was built on low-cost x86-based hardware and uses an open-source operating system derived from Oracle OpenSolaris (Oracle Corporation, Redwood City, CA, USA) to host the DCM4CHEE PACS DICOM server (http://www.dcm4che.org). This solution features very high data security and an ergonomic interface providing easy access to a large amount of imaging data. The system has been in active use for almost 2 years now and has proven to be a scalable, cost-effective solution for practices ranging from small to very large, where the use of different hardware combinations allows scaling to the different deployments, while the use of paravirtualization allows increased security and easy migrations and upgrades.
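
    As a minimal connectivity sketch against a DICOM server such as the dcm4chee instance described above, the snippet below performs a C-ECHO (the standard DICOM verification request) with the pynetdicom library; the host, port, and AE titles are placeholders.

      # Minimal sketch: DICOM C-ECHO against a PACS with pynetdicom.
      # Host, port, and AE titles are hypothetical.
      from pynetdicom import AE

      ae = AE(ae_title="TEST_SCU")
      # 1.2.840.10008.1.1 is the Verification SOP Class used by C-ECHO.
      ae.add_requested_context("1.2.840.10008.1.1")

      assoc = ae.associate("127.0.0.1", 11112, ae_title="DCM4CHEE")
      if assoc.is_established:
          status = assoc.send_c_echo()
          print("C-ECHO status: 0x%04X" % status.Status)
          assoc.release()
      else:
          print("Association with the PACS failed")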

  14. Earth imaging and scientific observations by SSTI ``Clark'' a NASA technology demonstration spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1997-01-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of ``Clark,'' a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  15. Factors influencing the adoption of health information technologies: a systematic review

    PubMed Central

    Garavand, Ali; Mohseni, Mohammah; Asadi, Heshmatollah; Etemadi, Manal; Moradi-Joo, Mohammad; Moosavi, Ahmad

    2016-01-01

    Introduction The successful implementation of health information technologies requires investigating the factors affecting the acceptance and use of them. The aim of this study was to determine the most important factors affecting the adoption of health information technologies by doing a systematic review on the factors affecting the acceptance of health information technology. Methods This systematic review was conducted by searching the major databases, such as Google Scholar, Emerald, Science Direct, Web of Science, Pubmed, and Scopus. We used various keywords, such as adoption, use, acceptance of IT in medicine, hospitals, and IT theories in health services, and we also searched on the basis of several important technologies, such as Electronic Health Records (HER), Electronic Patient Records (EPR), Electronic Medical Records (EMR), Computerized Physician Order Entry (CPOE), Hospital Information System (HIS), Picture Archiving and Communication System (PACS), and others in the 2004–2014 period. Results The technology acceptance model (TAM) is the most important model used to identify the factors influencing the adoption of information technologies in the health system; also, the unified theory of acceptance and use of technology (UTAUT) model has had a lot of applications in recent years in the health system. Ease of use, usefulness, social impact, facilitating conditions, attitudes and behavior of users are effective in the adoption of health information technologies. Conclusion By considering various factors, including ease of use, usefulness, and social impact, the rate of the adoption of health information technology can be increased. PMID:27757179

  16. Factors influencing the adoption of health information technologies: a systematic review.

    PubMed

    Garavand, Ali; Mohseni, Mohammah; Asadi, Heshmatollah; Etemadi, Manal; Moradi-Joo, Mohammad; Moosavi, Ahmad

    2016-08-01

    The successful implementation of health information technologies requires investigating the factors affecting the acceptance and use of them. The aim of this study was to determine the most important factors affecting the adoption of health information technologies by doing a systematic review on the factors affecting the acceptance of health information technology. This systematic review was conducted by searching the major databases, such as Google Scholar, Emerald, Science Direct, Web of Science, Pubmed, and Scopus. We used various keywords, such as adoption, use, acceptance of IT in medicine, hospitals, and IT theories in health services, and we also searched on the basis of several important technologies, such as Electronic Health Records (HER), Electronic Patient Records (EPR), Electronic Medical Records (EMR), Computerized Physician Order Entry (CPOE), Hospital Information System (HIS), Picture Archiving and Communication System (PACS), and others in the 2004-2014 period. The technology acceptance model (TAM) is the most important model used to identify the factors influencing the adoption of information technologies in the health system; also, the unified theory of acceptance and use of technology (UTAUT) model has had a lot of applications in recent years in the health system. Ease of use, usefulness, social impact, facilitating conditions, attitudes and behavior of users are effective in the adoption of health information technologies. By considering various factors, including ease of use, usefulness, and social impact, the rate of the adoption of health information technology can be increased.

  17. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
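
    The patent's log-structured middleware is not reproduced here, but as a hedged sketch of the basic file-to-object conversion step it describes -- archived files becoming cloud objects -- the snippet below uploads checkpoint files to S3-compatible storage with boto3; the bucket name and paths are placeholders.

      # Minimal sketch: uploading archived files (e.g., checkpoints) as
      # cloud objects with boto3. Bucket name and paths are hypothetical.
      import os
      import boto3

      s3 = boto3.client("s3")

      def archive_to_object_store(files, bucket, prefix="checkpoints/"):
          """Store each archived file as an object keyed under a prefix."""
          for path in files:
              key = prefix + os.path.basename(path)
              with open(path, "rb") as f:
                  s3.put_object(Bucket=bucket, Key=key, Body=f.read())

      archive_to_object_store(["/scratch/ckpt.0001"], bucket="hpc-archive")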

  18. Improving Patient Safety in Hospitals through Usage of Cloud Supported Video Surveillance.

    PubMed

    Dašić, Predrag; Dašić, Jovan; Crvenković, Bojan

    2017-04-15

    Patient safety in hospitals is as important as providing treatments and urgent healthcare. With the development of cloud technologies and Big Data analytics, it is possible to employ Video Surveillance as a Service (VSaaS) technology virtually anywhere, for any given security purpose. Given these benefits, in this paper we give an overview of the existing cloud surveillance technologies that can be implemented to improve patient safety. Modern VSaaS systems provide higher elasticity and project scalability in dealing with real-time information processing. Modern surveillance technologies can prove to be an effective tool for the prevention of patient falls, undesired movement, and tampering with attached life-supporting devices. Given the large number of patients who require constant supervision, a cloud-based monitoring system can dramatically reduce the associated costs. It provides continuous real-time monitoring, increased overall security and safety, improved staff productivity, prevention of dishonest claims, and long-term digital archiving. Patient safety is a growing issue which can be improved with the usage of high-end centralised surveillance systems, allowing the staff to focus more on treating health issues rather than keeping a watchful eye on potential incidents.

  19. The Italian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Michelini, Alberto

    2016-04-01

    The Italian National Seismic Network is composed of about 400 stations, mainly broadband, installed in the country and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center, CNT) has installed and operates most of these stations, although a considerable number of stations contributing to the INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or even by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite links, wireless connections and wired lines. The SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near-real-time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real-time, hypocenter parameters which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution and specialised products.
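
    Since INGV distributes quality-checked continuous data through EIDA and standard FDSN web services, a minimal retrieval sketch with ObsPy follows; the network, station, and channel codes are illustrative assumptions, and any station of the network could be substituted.

      # Minimal sketch: fetching waveforms from the INGV FDSN web service
      # with ObsPy. Station and channel codes are assumptions.
      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("INGV")
      t0 = UTCDateTime("2016-04-01T00:00:00")
      stream = client.get_waveforms(network="IV", station="SALO",
                                    location="*", channel="HH?",
                                    starttime=t0, endtime=t0 + 600)
      print(stream)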

  20. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  1. User Acceptance of Picture Archiving and Communication System in the Emergency Department.

    PubMed

    Goodarzi, Hassan; Khatami, Seyed-Masoud; Javadzadeh, Hammidreza; Mahmoudi, Sadrollah; Khajehpour, Hojjatollah; Heidari, Soleiman; Khodaparast, Morteza; Ebrahimi, Ali; Rasouli, Hamidreza; Ghane, Mohammadreza; Faraji, Mehrdad; Hassanpour, Kasra

    2016-04-01

    Picture archiving and communication systems (PACS) allow medical images to be transmitted, stored, retrieved, and displayed in different locations of a hospital or health system. Using PACS in the emergency department will eventually result in improved efficiency and patient care. In spite of the abundant benefits of employing PACS, there are some challenges in implementing this technology, such as users' resistance to accepting the technology, which has a critical role in PACS success. In this study, we assess and compare user acceptance of PACS in the emergency departments of three different hospitals and investigate the effect of socio-demographic factors on this acceptance. A variant of the technology acceptance model (TAM) was used in order to measure the acceptance level of PACS in the emergency departments of three educational hospitals in Iran. A previously used questionnaire was validated and utilized to collect the study data. A stepwise multiple regression model was used to predict factors influencing the acceptance score as the dependent variable. The mean age of participants was 32.9 years (standard deviation [SD] = 6.08). Participants with a specialty degree got a higher acceptance score than the three other groups (mean ± SD = 4.17 ± 0.20). Age, gender, degree of PACS usage and participant's occupation (profession) did not influence the acceptance score. In our multiple regression model, all three variables of perceived usefulness (PU), perceived ease of use (PEU) and the effect of PACS (change) had a significant effect in the prediction of acceptance. The most influential factor was change, with a beta of 0.22 (P value < 0.001). PACS is highly accepted in all three emergency departments, especially among specialists. PU, PEU and change are factors influencing PACS acceptance. Our study can be used as evidence of PACS acceptance in emergency wards.
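
    As a hedged illustration of the regression methodology described above (not the study's data or code), the sketch below fits a multiple regression of a synthetic acceptance score on PU, PEU, and change using statsmodels.

      # Illustrative sketch with synthetic data: regressing an acceptance
      # score on PU, PEU, and perceived change.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 120
      pu, peu, change = rng.normal(3.5, 0.6, (3, n))
      acceptance = (1.0 + 0.30 * pu + 0.25 * peu + 0.22 * change
                    + rng.normal(0.0, 0.2, n))

      X = sm.add_constant(np.column_stack([pu, peu, change]))
      fit = sm.OLS(acceptance, X).fit()
      print(fit.params)  # fitted coefficients: const, PU, PEU, change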

  2. U.S. Spacesuit Knowledge Capture Status and Initiatives

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen

    2011-01-01

    The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has been more concentrated and formalized, whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge, or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, aspects of program management, and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed, along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.

  3. U.S. Spacesuit Knowledge Capture Status and Initiatives

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Woods, Ron; Jairala, Juniper; Bitterly, Rose; McMann, Joe; Lewis, Cathleen

    2012-01-01

    The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) spacesuit knowledge capture since the beginning of space exploration via publication of reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. Recently, the effort has been more concentrated and formalized, whereby a new avenue of spacesuit knowledge capture has been added to the archives through which videotaping occurs, engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge, or being interviewed to archive their significance to NASA's history. Now with video archiving, all these avenues of learning are brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. U.S. spacesuit knowledge capture topics have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, aspects of program management, and personal interviews. These archives of actual spacesuit legacy now reflect its rich history and will provide a wealth of knowledge which will greatly enhance the chances for the success of future and more ambitious spacesuit system programs. In this paper, NASA's formal spacesuit knowledge capture efforts will be reviewed and a status will be provided to reveal initiatives and accomplishments since the inception of the more formal U.S. spacesuit knowledge program. A detailed itemization of the actual archives will be addressed, along with topics that are now available to the general NASA community and the public. Additionally, the latest developments in the archival relationship with the Smithsonian will be discussed.

  4. NASA planetary data: applying planetary satellite remote sensing data in the classroom

    NASA Technical Reports Server (NTRS)

    Liggett, P.; Dobinson, E.; Sword, B.; Hughes, D.; Martin, M.; Martin, D.

    2002-01-01

    NASA supports several data archiving and distribution mechanisms that provide a means whereby scientists can participate in education and outreach through the use of technology for data and information dissemination. The Planetary Data System (PDS) is sponsored by NASA's Office of Space Science. Its purpose is to ensure the long-term usability of NASA data and to stimulate advanced research. In addition, the NASA Regional Planetary Image Facility (RPIF), an international system of planetary image libraries, maintains photographic and digital data as well as mission documentation and cartographic data.

  5. Third International Symposium on Space Mission Operations and Ground Data Systems, part 1

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Editor)

    1994-01-01

    Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The papers focus on improvements in the efficiency, effectiveness, productivity, and quality of data acquisition, ground systems, and mission operations. New technology, techniques, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations.

  6. Building a virtual archive using brain architecture and Web 3D to deliver neuropsychopharmacology content over the Internet.

    PubMed

    Mongeau, R; Casu, M A; Pani, L; Pillolla, G; Lianas, L; Giachetti, A

    2008-05-01

    The vast amount of heterogeneous data generated in various fields of the neurosciences, such as neuropsychopharmacology, can hardly be classified using traditional databases. We present here the concept of a virtual archive, spatially referenced over a simplified 3D brain map and accessible over the Internet. A simple prototype (available at http://aquatics.crs4.it/neuropsydat3d) has been realized using current Web-based virtual reality standards and technologies. It illustrates how primary literature or summary information can easily be retrieved through hyperlinks mapped onto a 3D schema while navigating through neuroanatomy. Furthermore, 3D navigation and visualization techniques are used to enhance the representation of the brain's neurotransmitters and pathways and the involvement of specific brain areas in particular physiological or behavioral functions. The system proposed shows how the use of a schematic spatial organization of data, widely exploited in other fields (e.g., Geographical Information Systems), can be extremely useful for developing efficient tools for research and teaching in the neurosciences.

  7. System Requirement Analyses for Ubiquitous Environment Management System

    NASA Astrophysics Data System (ADS)

    Lim, Sang Boem; Gil, Kyung Jun; Choe, Ho Rim; Eo, Yang Dam

    We are living in a new stage of society. The U-City introduces a new paradigm for the future city that cannot be achieved in traditional cities. Korea is one of the most active countries constructing U-Cities, building on advances in IT technologies, especially the high-speed network deployed throughout the country [1]. People are realizing that ubiquitous services are a key factor in the success of a U-City, and among these U-services, U-security is one of the most important. Nowadays we must be concerned not only with traditional threats but also with personal information. Since the apartment complex is the most common residence type in Korea, we are developing security rules and a system based on analyses of the apartment complex and its assets. Based on these analyses, we are developing apartment-complex security using various technologies, including the home network system. We also discuss a basic home network security architecture.

  8. CDDIS: NASA's Archive of Space Geodesy Data and Products Supporting GGOS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick

    2016-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data and products in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. The CDDIS data system and its archive are a key component in several of the geometric services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the IGS, the International DORIS Service (IDS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). The CDDIS provides on-line access to over 17 TB of data and derived products in support of the IAG services and GGOS. The system's archive continues to grow and improve as new activities are supported and enhancements are implemented. Recently, the CDDIS has established a real-time streaming capability for GNSS data and products. Furthermore, enhancements to the metadata describing the contents of the archive have been developed to facilitate data discovery. This poster will provide a review of the improvements in the system infrastructure that CDDIS has made over the past year for the geodetic community and describe future plans for the system.

  9. The Social and Organizational Life Data Archive (SOLDA).

    ERIC Educational Resources Information Center

    Reed, Ken; Blunsdon, Betsy; Rimme, Malcolm

    2000-01-01

    Outlines the rationale and design of the Social and Organizational Life Data Archive (SOLDA), an on-line collection of survey and other statistical data relevant to research in the fields of management, organizational studies, industrial relations, marketing, and related social sciences. The database uses CD-ROM technology and the World Wide Web…

  10. 76 FR 73506 - Adoption of Updated EDGAR Filer Manual

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    ... Mackintosh, Office of Information Technology, at (202) 551-3600; in the Division of Trading and Markets for... is http://www.sec.gov/info/edgar.shtml . You can also inspect the document at the National Archives... (202) 741-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr...

  11. What Is A Picture Archiving And Communication System (PACS)?

    NASA Astrophysics Data System (ADS)

    Marceau, Carla

    1982-01-01

    A PACS is a digital system for acquiring, storing, moving and displaying picture or image information. It is an alternative to film jackets that has been made possible by recent breakthroughs in computer technology: telecommunications, local area nets and optical disks. The fundamental concept of the digital representation of image information is introduced. It is shown that freeing images from a material representation on film or paper leads to a dramatic increase in flexibility in our use of the images. The ultimate goal of a medical PACS system is a radiology department without film jackets. The inherent nature of digital images and the power of the computer allow instant free "copies" of images to be made and thrown away. These copies can be transmitted to distant sites in seconds, without the "original" ever leaving the archives of the radiology department. The result is a radiology department with much freer access to patient images and greater protection against lost or misplaced image information. Finally, images in digital form can be treated as data for the computer in image processing, which includes enhancement, reconstruction and even computer-aided analysis.
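    The abstract's central claim, that digital "copies" are free and exact, unlike film duplication, can be made concrete in a few lines: a copied image is bit-for-bit identical to the original, which a cryptographic hash verifies. The snippet below is purely illustrative.

```python
# Illustration of the point above: unlike a film duplicate, a copied digital
# image is bit-for-bit identical to the original, and a cryptographic hash
# can verify that there is no generational loss. Purely illustrative.
import hashlib

original = bytes(range(256)) * 1024      # stand-in for pixel data
copy = bytes(original)                   # an instant "free" copy

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

assert digest(original) == digest(copy)  # no generational loss
print("copy verified identical:", digest(copy)[:16], "...")
```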

  12. Recognition techniques for extracting information from semistructured documents

    NASA Astrophysics Data System (ADS)

    Della Ventura, Anna; Gagliardi, Isabella; Zonta, Bruna

    2000-12-01

    Archives of optical documents are ever more widely employed, with demand driven also by new norms sanctioning the legal value of digital documents, provided they are stored on physically unalterable supports. On the supply side there is now a vast and technologically advanced market, where optical memories have solved the problem of the duration and permanence of data at costs comparable to those of magnetic memories. The remaining bottleneck in these systems is the indexing. The indexing of documents with a variable structure, while still not completely automated, can be machine-supported to a large degree, with evident advantages both in the organization of the work and in the extraction of information, providing data that is much more detailed and potentially significant for the user. We present here a system for the automatic registration of correspondence to and from a public office. The system is based on a general methodology for the extraction, indexing, archiving, and retrieval of significant information from semi-structured documents. In our prototype application, this information is distributed among the database fields of sender, addressee, subject, date, and body of the document.
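    To make the indexing step concrete, the sketch below extracts the same five fields from a letter with labeled header lines. The regular expressions and the sample letter are illustrative assumptions; the paper's recognition techniques handle far more variable document layouts than this.

```python
# Minimal sketch of field extraction from semi-structured correspondence.
# Assumes labeled header lines; the actual recognition techniques handle far
# more variable layouts than this regex illustration.
import re

LETTER = """\
From: Ufficio Protocollo
To: Direzione Generale
Subject: Richiesta documentazione
Date: 2000-12-01

Si trasmette in allegato la documentazione richiesta.
"""

FIELD_PATTERNS = {
    "sender":    re.compile(r"^From:\s*(.+)$", re.MULTILINE),
    "addressee": re.compile(r"^To:\s*(.+)$", re.MULTILINE),
    "subject":   re.compile(r"^Subject:\s*(.+)$", re.MULTILINE),
    "date":      re.compile(r"^Date:\s*(\d{4}-\d{2}-\d{2})$", re.MULTILINE),
}

def index_letter(text):
    """Build a database record (sender, addressee, subject, date, body)."""
    record = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = pattern.search(text)
        record[name] = match.group(1) if match else None
    # Everything after the first blank line is treated as the document body.
    record["body"] = text.split("\n\n", 1)[-1].strip()
    return record

print(index_letter(LETTER))
```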

  13. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
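    As a concrete illustration of the three primary components named above (system hierarchy model, historical data archive, firmware codebase), here is a hedged sqlite sketch of what such a schema might look like. The table and column names are assumptions for illustration, not the actual HADS design.

```python
# Hedged sketch of a schema mirroring the three stated HADS components.
# Table and column names are assumptions for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- System hierarchy model: elements reference their physical parent.
    CREATE TABLE element (
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        parent_id INTEGER REFERENCES element(id)   -- NULL at the root
    );
    -- Historical data archive: time-stamped sensor readings per element.
    CREATE TABLE sensor_reading (
        element_id INTEGER REFERENCES element(id),
        ts         TEXT NOT NULL,                  -- ISO 8601 timestamp
        value      REAL NOT NULL
    );
    -- Firmware codebase: atomic units assembled into element firmware.
    CREATE TABLE firmware_unit (
        id         INTEGER PRIMARY KEY,
        element_id INTEGER REFERENCES element(id),
        version    TEXT NOT NULL,
        blob       BLOB
    );
""")

conn.execute("INSERT INTO element (id, name, parent_id) VALUES (1, 'test stand', NULL)")
conn.execute("INSERT INTO element (id, name, parent_id) VALUES (2, 'LOX valve', 1)")
conn.execute("INSERT INTO sensor_reading VALUES (2, '2007-06-01T12:00:00Z', 101.3)")

# Walk the hierarchy to give a reading its logical context:
row = conn.execute("""
    SELECT p.name, e.name, r.ts, r.value
    FROM sensor_reading r
    JOIN element e ON e.id = r.element_id
    JOIN element p ON p.id = e.parent_id
""").fetchone()
print(row)
```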

  14. 77 FR 60475 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-03

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Privacy Act of 1974, as Amended; System of Records Notices AGENCY: National Archives and Records Administration (NARA). ACTION: Notice of the establishment of new privacy system of record, NARA 44. SUMMARY: The National Archives and Records Administration...

  15. 76 FR 13671 - Privacy Act of 1974, as Amended; System of Records Notices

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-14

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Privacy Act of 1974, as Amended; System of Records Notices AGENCY: National Archives and Records Administration (NARA). ACTION: Notice of the establishment of new privacy system of record, NARA 41. SUMMARY: The National Archives and Records Administration...

  16. Planning for PACS: a comprehensive guide to nontechnical considerations.

    PubMed

    Cohen, Mervyn D; Rumreich, Lori L; Garriot, Kimberley M; Jennings, S Gregory

    2005-04-01

    A complete picture archiving and communication system (PACS) installation is one of the largest projects a radiology department will undertake. Although technology issues are important, they often draw focus away from many other significant issues. This paper describes in detail the other necessary components that must be addressed if a PACS installation is to be relatively trouble free, provides guidelines for successful PACS implementation, and details pitfalls to be avoided.

  17. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1993-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA.

  18. A Survey of Videodisc Technology.

    DTIC Science & Technology

    1985-12-01

    store images, and the microcomputer is used as an interactive and management tool, makes for a powerful teaching system. General Motors was the first...videodisc are used for archival storage of documents. IBM uses videodisc in over 180 branch offices, where they are used both as a presentation tool and to...provide reference material. IBM is also currently working on a videodisc project as a direct training tool for maintenance of their computers.

  19. The function of the earth observing system - Data information system Distributed Active Archive Centers

    NASA Technical Reports Server (NTRS)

    Lapenta, C. C.

    1992-01-01

    The functionality of the Distributed Active Archive Centers (DAACs), which are significant elements of the Earth Observing System Data and Information System (EOSDIS), is discussed. Each DAAC encompasses the information management system, the data archival and distribution system, and the product generation system. The EOSDIS DAACs are expected to improve access to the Earth science data sets needed for global change research.

  20. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive the astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control are the infrastructure for AstroCloud. Throughout the entire data life cycle, the archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and their metadata in a database. Quality control covers the whole process and all aspects of data archiving.
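    A toy sketch of such a pipeline is shown below: each granule passes through standardize/transfer/store stages, with a quality-control check wrapped around every stage. The stage functions, checks, and sample site name are illustrative assumptions, not ADAS internals.

```python
# Sketch of an archiving pipeline in the spirit of the description above:
# standardize, transfer, and store, with quality control covering every stage.
# Stage behavior and checks are illustrative assumptions.
import hashlib

def standardize(raw):
    return {"data": raw["payload"], "telescope": raw.get("site", "unknown")}

def transfer(record):
    record["checksum"] = hashlib.md5(record["data"]).hexdigest()
    return record

def store(record, archive):
    archive.append(record)
    return record

def qc(stage_name, record):
    # Quality control "covers the whole process": reject empty payloads
    # and checksum mismatches as soon as they appear.
    assert record.get("data"), f"QC failed at {stage_name}: empty payload"
    if "checksum" in record:
        assert hashlib.md5(record["data"]).hexdigest() == record["checksum"], \
            f"QC failed at {stage_name}: checksum mismatch"
    return record

archive = []
record = {"payload": b"\x00\x01FITS...", "site": "example-site"}
for name, stage in [("standardize", standardize), ("transfer", transfer),
                    ("store", lambda r: store(r, archive))]:
    record = qc(name, stage(record))
print(f"archived {len(archive)} granule(s), checksum {archive[0]['checksum'][:8]}...")
```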

  1. Building a COTS archive for satellite data

    NASA Technical Reports Server (NTRS)

    Singer, Ken; Terril, Dave; Kelly, Jack; Nichols, Cathy

    1994-01-01

    The goal of the NOAA/NESDIS Active Archive was to provide a method of access to an online archive of satellite data. The archive had to manage and store the data, let users interrogate the archive, and allow users to retrieve data from the archive. Practical issues of the system design, such as implementation time, cost, and operational support, were examined in addition to the technical issues. There was a fixed window of opportunity to create an operational system, along with budget and staffing constraints. Therefore, the technical solution had to be designed and implemented subject to the constraints imposed by the practical issues. The NOAA/NESDIS Active Archive came online in July of 1994, meeting all of its original objectives.

  2. Land Cover Change Community-based Processing and Analysis System (LC-ComPS): Lessons Learned from Technology Infusion

    NASA Astrophysics Data System (ADS)

    Masek, J.; Rao, A.; Gao, F.; Davis, P.; Jackson, G.; Huang, C.; Weinstein, B.

    2008-12-01

    The Land Cover Change Community-based Processing and Analysis System (LC-ComPS) combines grid technology, existing science modules, and dynamic workflows to enable users to complete advanced land data processing on data available from local and distributed archives. Changes in land cover represent a direct link between human activities and the global environment, and in turn affect Earth's climate. Characterizing land cover change has thus become a major goal for Earth observation science. Many science algorithms exist to generate new products (e.g., surface reflectance, change detection) used to study land cover change. The overall objective of LC-ComPS is to release a set of tools and services to the land science community that can be implemented as a flexible system to produce surface reflectance and land-cover change information with ground resolution on the order of Landsat-class instruments. This package includes software modules for pre-processing Landsat-type satellite imagery (calibration, atmospheric correction, orthorectification, precision registration, BRDF correction) and for performing land-cover change analysis, and it includes pre-built workflow chains to automatically generate surface reflectance and land-cover change products based on user input. In order to meet the project objectives, the team created the infrastructure (i.e., a client-server system with graphical and machine interfaces) to expand the use of these existing science algorithms in a community with distributed, large data archives and processing centers. Because of the distributed nature of the user community, grid technology was chosen to unite the dispersed community resources. At that time, grid computing was not used consistently and operationally within the Earth science research community, so there was a learning curve in configuring and implementing the underlying public key infrastructure (PKI) interfaces required for user authentication, secure file transfer, and remote job execution on the grid network of machines. In addition, science support was needed to verify that the grid technology did not have any adverse effects on the science module outputs. Other open-source, unproven technologies, such as a workflow package to manage jobs submitted by the user, were infused into the overall system with successful results. This presentation will discuss the basic capabilities of LC-ComPS, explain how the technology was infused, provide lessons learned from using and integrating the various technologies while developing and operating the system, and outline plans moving forward (maintenance and operations decisions) based on the experience to date.
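    The pre-built workflow chains mentioned above can be pictured as an ordered list of processing steps applied to a scene. The sketch below uses stub steps and a made-up scene identifier; the real chain invokes science modules, potentially as remote grid jobs.

```python
# Hedged sketch of a pre-built processing chain like those described for
# LC-ComPS: a fixed sequence of Landsat pre-processing steps. The steps here
# are stubs; the real modules are science codes run on distributed resources.
def make_step(name):
    def run(scene):
        scene["steps"].append(name)  # a real module would transform pixels here
        return scene
    return run

SURFACE_REFLECTANCE_CHAIN = [
    make_step("calibration"),
    make_step("atmospheric correction"),
    make_step("orthorectification"),
    make_step("precision registration"),
    make_step("BRDF correction"),
]

def run_chain(chain, scene_id):
    scene = {"id": scene_id, "steps": []}
    for step in chain:  # each step could equally be dispatched as a grid job
        scene = step(scene)
    return scene

result = run_chain(SURFACE_REFLECTANCE_CHAIN, "EXAMPLE_LANDSAT_SCENE")
print(result["id"], "->", " -> ".join(result["steps"]))
```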

  3. Storage, retrieval, and analysis of ST data

    NASA Technical Reports Server (NTRS)

    Albrecht, R.

    1984-01-01

    Space Telescope can generate multidimensional image data, very similar in nature to data produced with microdensitometers. An overview is presented of the ST science ground system, from carrying out the observations to the interactive analysis of preprocessed data. The ground system elements used in data archival and retrieval are described, and operational procedures are discussed. Emphasis is given to aspects of the ground system that are relevant to the science user and to general principles of system software development in a production environment. While the system being developed uses relatively conservative concepts for the launch baseline, concepts were also developed to enhance the ground system; these include networking, remote access, and the utilization of alternate data storage technologies.

  4. Archived Data User Service self evaluation report : FAST

    DOT National Transportation Integrated Search

    2000-11-01

    The Archived Data User Service (ADUS) is a recent addition to the National Intelligent Transportation System (ITS) Architecture. This user service requires ITS systems to have the capability to receive, collect, and archive ITS-generated operational...

  5. Archiving Mars Mission Data Sets with the Planetary Data System

    NASA Technical Reports Server (NTRS)

    Guinness, Edward A.

    2006-01-01

    This viewgraph presentation reviews the use of the Planetary Data System (PDS) to archive the datasets received from the Mars missions. It presents an overview of the archiving process and reviews lessons learned from the perspectives of the projects, the data producers, and the data users.

  6. Virtual microscopy and digital cytology: state of the art.

    PubMed

    Giansanti, Daniele; Grigioni, Mauro; D'Avenio, Giuseppe; Morelli, Sandra; Maccioni, Giovanni; Bondi, Arrigo; Giovagnoli, Maria Rosaria

    2010-01-01

    The paper addresses a new technological scenario relevant to the introduction of digital cytology (D-CYT) into the health service. A detailed analysis of the state of the art of the introduction of D-CYT in the hospital, and more generally across dispersed territory, was conducted. The analysis took the form of a review arranged in two parts: the first part focused on the technological tools needed to carry out a successful service (client-server architectures, e-learning, quality assurance issues); the second part focused on issues intended to help the introduction and evaluation of the technology (specific training in D-CYT, health technology assessment in routine application, data format standards and picture archiving and communication system (PACS) implementation, image quality assessment, strategies of navigation, and 3D virtual reality potentialities). The work highlights future scenarios of action relevant to the introduction of the technology.

  7. Adoption of PACS by the Department of Veterans Affairs: the past, the present, and future plans

    NASA Astrophysics Data System (ADS)

    Siegel, Eliot L.; Reiner, Bruce I.; Kuzmak, Peter M.

    2000-05-01

    The diffusion of PACS technology within the Department of Veterans Affairs has followed the 'S' curve transition originally described by Ryan and Gross in 1943, a paradigm describing the diffusion of a new technology into a community. However, the rate of adoption of filmless radiology by the VA has been much higher than that of the general healthcare system. This is likely because the VA and Department of Defense medical systems are somewhat isolated and independent from other health care systems and are subject to a different rate of diffusion of technology. The early introduction and success of PACS in the VA undoubtedly accelerated its acceptance throughout the system. An additional impetus to the growth of PACS in the VA has been the development of an image management system that has been incorporated into the electronic medical record. The universal use of the VISTA HIS and RIS throughout the VA, the fact that it was developed 'in-house', and its extensive support for DICOM functionality have also played a major role in facilitating the acceptance of picture archival and communication systems throughout the VA.

  8. NASA's EOSDIS, Trust and Certification

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.

    2017-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since August 1994, managing most of NASA's Earth science data from satellites, airborne sensors, field campaigns, and other activities. Having been designated by the Federal Government as a project responsible for the production, archiving, and distribution of these data through its Distributed Active Archive Centers (DAACs), the Earth Science Data and Information System (ESDIS) Project is responsible for EOSDIS and is legally bound by the Office of Management and Budget's Circular A-130 and the Federal Records Act. It must follow the regulations of the National Institute of Standards and Technology (NIST) and the National Archives and Records Administration (NARA). It must also follow NASA Procedural Requirement 7120.5 (NASA Space Flight Program and Project Management). All of these ensure that the data centers managed by ESDIS are trustworthy from the point of view of efficient and effective operations as well as preservation of valuable data from NASA's missions. Additional factors contributing to this trust are an extensive set of internal and external reviews throughout the history of EOSDIS, starting in the early 1990s; many of these reviews have involved external groups of scientific and technological experts. Also, independent annual surveys of user satisfaction that measure and publish the American Customer Satisfaction Index (ACSI), in which EOSDIS has scored consistently high marks since 2004, provide an additional measure of trustworthiness. In addition, through an effort initiated in 2012 at the request of NASA HQ, the ESDIS Project and 10 of the 12 DAACs have been certified by the International Council for Science (ICSU) World Data System (WDS) and are members of the ICSU-WDS. This presentation addresses questions such as the pros and cons of the certification process, key outcomes, and next steps regarding certification. Recently, the ICSU-WDS and Data Seal of Approval (DSA) organizations merged their Core Trustworthy Data Repositories Requirements and require that members be recertified every three years. Given the rigor with which NASA manages the ESDIS Project and the DAACs, recertification through WDS-DSA, while involving some additional work, is a relatively simple process.

  9. Depth-Charge in the Archive: The Documentation of Performance Revisited in the Digital Age

    ERIC Educational Resources Information Center

    Allen, Jess

    2010-01-01

    The debate surrounding the documentation of performance is principally concerned with the ephemerality of the live event, set against the stasis and "death" that the archive is conventionally believed to represent. The advent of digital technology in live performance has complexified this still further, by altering the architecture, space and…

  10. Storage media for computers in radiology.

    PubMed

    Dandu, Ravi Varma

    2008-11-01

    The introduction and wide acceptance of digital technology in medical imaging has resulted in an exponential increase in the amount of data produced by the radiology department. There is an insatiable need for storage space to archive this ever-growing volume of image data. Healthcare facilities should plan the type and size of the storage media they need based not just on the volume of data but also on considerations such as the speed and ease of access, redundancy, security, cost, and the longevity of the archival technology. This article reviews the various digital storage media and compares their merits and demerits.

  11. Cassini Archive Tracking System

    NASA Technical Reports Server (NTRS)

    Conner, Diane; Sayfi, Elias; Tinio, Adrian

    2006-01-01

    The Cassini Archive Tracking System (CATS) is a computer program that enables tracking of scientific data transfers from originators to the Planetary Data System (PDS) archives. Without CATS, there is no systematic means of locating products in the archive process or ensuring their completeness. By keeping a database of transfer communications and status, CATS enables the Cassini Project and the PDS to report efficiently and accurately on archive status. More importantly, problem areas are easily identified through customized reports that can be generated on the fly from any Web-enabled computer. A Web-browser interface and a clearly defined authorization scheme provide safe distributed access to the system, where users can perform functions such as creating customized reports, recording a transfer, and responding to a transfer. CATS ensures that Cassini provides complete science archives to the PDS on schedule and that those archives are made available to the science community through the PDS. The three-tier architecture is loosely coupled and designed for simple adaptation to multimission use. Written in the Java programming language, it is portable and can be run on any Java-enabled Web server.
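    The heart of such a tracker is a ledger of transfers and their statuses, from which status reports can be generated. A minimal sketch follows (in Python rather than CATS's Java); the table layout, status vocabulary, and product names are assumptions for illustration.

```python
# Sketch of the kind of transfer-tracking ledger described above: a database
# of transfer status from which archive-status reports can be generated.
# Schema, statuses, and product names are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE transfer (
        id      INTEGER PRIMARY KEY,
        product TEXT NOT NULL,   -- data product being delivered to the PDS
        status  TEXT NOT NULL,   -- e.g. 'submitted', 'accepted', 'rejected'
        updated TEXT NOT NULL
    );
""")
db.executemany("INSERT INTO transfer (product, status, updated) VALUES (?, ?, ?)", [
    ("example radar images, vol. 3", "accepted",  "2006-01-10"),
    ("example camera frames",        "submitted", "2006-01-12"),
    ("example spectra, batch 7",     "rejected",  "2006-01-15"),
])

# "Problem areas are easily identified through customized reports":
for product, updated in db.execute(
        "SELECT product, updated FROM transfer WHERE status = 'rejected'"):
    print(f"needs follow-up: {product} (last update {updated})")
```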

  12. NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data

    NASA Astrophysics Data System (ADS)

    Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.

    2011-12-01

    Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving, from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, service-based, message-oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow or to develop a new workflow to suit the provider's needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata, as well as supporting the ADAM (Airborne Data For Assessing Models) site.

  13. Stewardship of NASA's Earth Science Data and Ensuring Long-Term Active Archives

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; Behnke, Jeanne

    2016-01-01

    Program, NASA has followed an open data policy, with non-discriminatory access to data and no period of exclusive access. NASA has well-established processes for assigning or accepting datasets into one of the 12 Distributed Active Archive Centers (DAACs) that are part of EOSDIS. EOSDIS has been evolving through several information technology cycles, adapting to hardware and software changes in the commercial sector. NASA is responsible for maintaining Earth science data as long as users are interested in using them for research and applications, which is well beyond the life of the data-gathering missions. For science data to remain useful over long periods of time, steps must be taken to preserve: (1) the data bits, with no corruption; (2) discoverability and access; (3) readability; (4) understandability; (5) usability; and (6) reproducibility of results. NASA's Earth Science Data and Information System (ESDIS) Project, along with the 12 EOSDIS DAACs, has made significant progress in each of these areas over the last decade and continues to evolve its active archive capabilities. Particular attention is being paid in recent years to ensuring that the datasets are published in an easily accessible and citable manner through a unified metadata model, a common metadata repository (CMR), a coherent view through the earthdata.gov website, and the assignment of Digital Object Identifiers (DOIs) with well-designed landing pages of product information.

  14. Contents of the NASA ocean data system archive, version 11-90

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Ocean Data System (NODS) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and surface pigment concentration. NODS will become the Data Archive and Distribution Service of the JPL Distributed Active Archive Center for the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  15. The Planetary Archive

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César

    2014-11-01

    We are building the first system that will allow efficient data mining of the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built around fixed sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus, when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS, and GALEX, we were able to identify serendipitous observations of Solar System bodies in these three archives. Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from a search for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE, and Pan-STARRS.
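    The intercept test at the core of this pipeline can be sketched very simply. The code below replaces the HTM index with rectangular footprints and the orbit integration with a precomputed, time-tagged track; the image IDs and coordinates are invented for illustration.

```python
# Simplified sketch of the matching idea described above: an image's coverage
# is a (sky region, time interval) volume, an asteroid's integrated orbit is a
# time-tagged track, and an observation candidate is any track point falling
# inside a volume. HTM indexing and orbit integration are replaced here by
# rectangles and a precomputed ephemeris table; all values are invented.
IMAGES = [
    {"id": "sdss-123", "ra": (150.0, 150.5), "dec": (2.0, 2.5), "t": (51234.30, 51234.31)},
    {"id": "2mass-77", "ra": (151.0, 151.4), "dec": (2.1, 2.4), "t": (51300.10, 51300.12)},
]

# Time-tagged ephemeris for one asteroid, as (MJD, RA deg, Dec deg):
TRACK = [(51234.305, 150.2, 2.3), (51300.11, 151.2, 2.2), (51400.0, 160.0, 5.0)]

def candidate_observations(track, images):
    hits = []
    for mjd, ra, dec in track:
        for img in images:
            if (img["t"][0] <= mjd <= img["t"][1]
                    and img["ra"][0] <= ra <= img["ra"][1]
                    and img["dec"][0] <= dec <= img["dec"][1]):
                hits.append((img["id"], mjd))
    return hits

# Each hit would then be cross-matched against the archive's source catalog
# to retrieve photometry for the moving object:
print(candidate_observations(TRACK, IMAGES))
```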

  16. From Data to Knowledge in Earth Science, Planetary Science, and Astronomy

    NASA Technical Reports Server (NTRS)

    Dobinson, Elaine R.; Jacob, Joseph C.; Yunck, Thomas P.

    2004-01-01

    This paper examines three NASA science data archive systems from the Earth, planetary, and astronomy domains, and discusses the various efforts underway to provide their science communities with not only better access to their holdings, but also with the services they need to interpret the data and understand their physical meaning. The paper identifies problems common to all three domains and suggests ways that common standards, technologies, and even implementations be leveraged to benefit each other.

  17. Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages

    PubMed Central

    Trudel, Mathieu

    2005-01-01

    Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted “citing articles” (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already-published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724

  18. The data storage grid: the next generation of fault-tolerant storage for backup and disaster recovery of clinical images

    NASA Astrophysics Data System (ADS)

    King, Nelson E.; Liu, Brent; Zhou, Zheng; Documet, Jorge; Huang, H. K.

    2005-04-01

    Grid computing represents the latest and most exciting technology to evolve from the familiar realm of parallel, peer-to-peer, and client-server models, and it can address the problem of fault-tolerant storage for backup and recovery of clinical images. We have researched and developed a novel Data Grid testbed involving several federated PAC systems based on a grid architecture. By integrating a grid computing architecture into the DICOM environment, a failed PACS archive can recover its image data from others in the federation in a timely and seamless fashion. The design reflects the five-layer architecture of grid computing: the Fabric, Resource, Connectivity, Collective, and Application layers. The testbed Data Grid architecture, representing three federated PAC systems (the fault-tolerant PACS archive server at the Image Processing and Informatics Laboratory, Marina del Rey; the clinical PACS at Saint John's Health Center, Santa Monica; and the clinical PACS at the Healthcare Consultation Center II, USC Health Science Campus), will be presented. The successful demonstration of the Data Grid in the testbed will provide an understanding of the Data Grid concept in clinical image data backup, establish performance benchmarks against which future grid technology improvements can be measured, and serve as a road map for expanded research into large enterprise- and federation-level data grids that guarantee 99.999% uptime.
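    The recovery behavior described, a failed archive's studies being served by the surviving members of the federation, can be sketched as a simple failover loop. The site names, holdings, and fetch interface below are assumptions; a real implementation would use DICOM query/retrieve rather than these stubs.

```python
# Sketch of the failover behavior the Data Grid testbed is meant to provide:
# when one archive cannot serve a study, fall back to the other federated
# archives. Site names and the fetch interface are assumptions; a real system
# would speak DICOM, not these stubs.
class ArchiveUnavailable(Exception):
    pass

def make_archive(name, holdings, up=True):
    def fetch(study_uid):
        if not up:
            raise ArchiveUnavailable(name)
        if study_uid not in holdings:
            raise KeyError(study_uid)
        return f"{study_uid} from {name}"
    return fetch

FEDERATION = [
    make_archive("site A archive", {"1.2.840.1"}, up=False),       # failed site
    make_archive("site B archive", {"1.2.840.1", "1.2.840.2"}),
    make_archive("site C archive", {"1.2.840.2"}),
]

def retrieve(study_uid):
    for fetch in FEDERATION:          # try federated sites in priority order
        try:
            return fetch(study_uid)
        except (ArchiveUnavailable, KeyError):
            continue
    raise LookupError(f"study {study_uid} not recoverable from federation")

print(retrieve("1.2.840.1"))          # served by a surviving replica
```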

  19. ESO Receives Computerworld Honors Program 21st Century Achievement Award in Science Category

    NASA Astrophysics Data System (ADS)

    2005-06-01

In a ceremony held in Washington, D.C. (USA) on June 6, 2005, ESO, the European Organisation for Astronomical Research in the Southern Hemisphere, received the coveted 21st Century Achievement Award from the Computerworld Honors Program for its visionary use of information technology in the Science category. Sybase, a major database server vendor and member of the Chairmen's Committee, nominated ESO's Data Flow System in recognition of its contributions to the global information technology revolution and its positive impact on society. The citation reads: "ESO has revolutionized the operations of ground-based astronomical observatories with a new end-to-end data flow system, designed to improve the transmission and management of astronomical observations and data over transcontinental distances." This year's awards, in 10 categories, were presented at a gala event at the National Building Museum, attended by over 250 guests, including leaders of the information technology industry, former award recipients, judges, scholars, and diplomats representing many of the 54 countries from which the 17-year-old program's laureates have come. "The Computerworld Honors Program 21st Century Achievement Awards are presented to companies from around the world whose visionary use of information technology promotes positive social, economic and educational change," said Bob Carrigan, president and CEO of Computerworld and chairman of the Chairmen's Committee of the Computerworld Honors Program. "The recipients of these awards are the true heroes of the information age and have been appropriately recognized by the leading IT industry chairmen as true revolutionaries in their fields." ESO PR Photo 18/05: Receiving the Computerworld 21st Century Achievement Award for Science on behalf of ESO: Drs Preben Grosbøl, Michele Péron, Peter Quinn (Head of the ESO Data Management Division) and David Silva. Traditionally, ground-based astronomical observatories have been used as facilities where scientists apply for observing time, eventually travel to the remote sites where telescopes are located, carry out their observations by themselves, and finally take their data back to their home institutes to do the final scientific analysis. As observatories become more complex and are located in ever more remote locations (to reduce light pollution), this operational concept (coupled with the weather lottery effect [1]) becomes less and less effective. In particular, the lack of data re-use has been increasingly seen as scientifically unproductive. Such thoughts guided the design and implementation of the ESO Data Flow System (DFS). The DFS allows both traditional on-site observing as well as service observing, where data is collected by observatory staff on behalf of the ESO user community based on user-submitted descriptions and requirements [2]. In either case, the data is captured by DFS and saved in the ESO science archive [3]. After a one-year proprietary period during which the original investigators have private access to their data, researchers can access the data for their own use. ESO was the first ground-based observatory to implement these operational concepts and tools within a complete system.
It was also the first ground-based observatory to build and maintain such an extensive science archive, one that contains not only observational data but also auxiliary information describing the operation process. In both areas, ESO remains the world leader in end-to-end observatory operations on the ground. "The result of our strategy has been a significant increase in the scientific productivity of the ESO user community", said Peter Quinn, Head of ESO's Data Management and Operations Division, responsible for DFS. "As measured by the number of papers in peer-reviewed journals, ESO is now one of the leading astronomical facilities in the world. Coupled with cutting-edge optical telescopes and astronomical instruments at the Chile sites, the DFS has contributed to this success by providing the fundamental IT infrastructure for observation and data management." The case study about ESO, together with the case studies from the other winners and laureates of the 2005 Collection, is available on the Computerworld Honors Program Archives On-Line, www.cwheroes.org, and is also distributed to more than 134 members of the Computerworld Honors Global Archives. According to Dan Morrow, a founding director and chief historian for the Honors Program, "This year's award recipients exemplify the very best in the creative use of IT in service to mankind. Their work and their stories are outstanding contributions to the history of the information technology revolution in every sense of the word, and, for the archives we serve all over the world, they are, truly, priceless." From more than 250 nominations submitted this year by the industry chairmen and CEOs who serve on the program's Chairmen's Committee, 162 were honoured as laureates at ceremonies in San Francisco on April 3, 2005, when their case studies officially became part of the Computerworld Honors 2005 Collection. Of these, 48 finalists were chosen by an academy of distinguished judges to attend the June 6 gala in Washington, D.C., at which 10 were announced as recipients of the award, one in each of the following categories: Business and Related Services; Education and Academia; Environment, Energy and Agriculture; Finance, Insurance and Real Estate; Government and Non-Profit Organizations; Manufacturing; Media, Arts and Entertainment; Medicine; Science; and Transportation. Additional information about the 2005 Collection is available at www.cwheroes.org, where the entire collection is available to scholars, researchers and the general public. The ESO Data Management and Operations Division web page is at http://www.eso.org/org/dmd/. More information About the Computerworld Honors Program: Governed by the Computerworld Information Technology Awards Foundation, a Massachusetts not-for-profit corporation founded by International Data Group (IDG) in 1988, the Computerworld Honors Program searches for and recognizes individuals and organizations who have demonstrated vision and leadership as they strive to use information technology in innovative ways across 10 categories: Business and Related Services; Education and Academia; Environment, Energy and Agriculture; Finance, Insurance and Real Estate; Government and Non-Profit Organizations; Manufacturing; Media, Arts and Entertainment; Medicine; Science; and Transportation.
Each year, the Computerworld Honors Chairmen's Committee nominates organizations that are using information technology to improve society for inclusion in the Computerworld Honors Online Archive and the Collections of the Global Archives. The Global Archives comprise the 100-plus institutions from more than 30 countries that include the Computerworld Honors Collection in their archives and libraries.

  20. 77 FR 20104 - Privacy Act of 1974, as Amended; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... Archives and Records Administration regulations. System Manager and Address: Director, Enforcement and... retained in accordance with the OCC's records management policies and National Archives and Records... accordance with the OCC's records management policies and National Archives and Records Administration...

  1. Architecture and evolution of Goddard Space Flight Center Distributed Active Archive Center

    NASA Technical Reports Server (NTRS)

    Bedet, Jean-Jacques; Bodden, Lee; Rosen, Wayne; Sherman, Mark; Pease, Phil

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been developed to enhance Earth science research through improved access to remote-sensor Earth science data. Building and operating an archive, even one of moderate size (a few terabytes), is a challenging task. One of the critical components of this system is Unitree, the hierarchical file storage management system. Unitree, selected two years ago as the best available solution, requires constant system administrative support; it is not always suitable for an archive and distribution data center, and it has only moderate performance. The Data Archive and Distribution System (DADS) software, developed to monitor, manage, and automate the ingestion, archive, and distribution functions, turned out to be more challenging than anticipated. Having the software and tools is not sufficient to succeed: human interaction within the system must be fully understood to improve efficiency and to ensure that the right tools are developed. One of the lessons learned is that operability, reliability, and performance should be thoroughly addressed in the initial design. Nevertheless, the GSFC DAAC has demonstrated that it is capable of distributing over 40 GB per day. A backup system to archive a second copy of all ingested data is under development; it will be used not only for disaster recovery but will also replace the main archive when it is unavailable during maintenance or hardware replacement. The GSFC DAAC has put a strong emphasis on quality at all levels of its organization, and a Quality team has been formed to identify quality issues and to propose improvements. The DAAC has conducted numerous tests to benchmark the performance of the system. These tests proved extremely useful in identifying bottlenecks and deficiencies in operational procedures.

  2. MODIS land data at the EROS data center DAAC

    USGS Publications Warehouse

    Jenkerson, Calli B.; Reed, B.C.

    2001-01-01

    The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAACs) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services supports the use of MODIS land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.

  3. The ``One Archive'' for JWST

    NASA Astrophysics Data System (ADS)

    Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.

    2015-09-01

    The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.

  4. The Protein Data Bank archive as an open data resource

    DOE PAGES

    Berman, Helen M.; Kleywegt, Gerard J.; Nakamura, Haruki; ...

    2014-07-26

    The Protein Data Bank archive was established in 1971, and recently celebrated its 40th anniversary (Berman et al. in Structure 20:391, 2012). Here, an analysis of interrelationships of the science, technology and community leads to further insights into how this resource evolved into one of the oldest and most widely used open-access data resources in biology.

  5. The Protein Data Bank archive as an open data resource.

    PubMed

    Berman, Helen M; Kleywegt, Gerard J; Nakamura, Haruki; Markley, John L

    2014-10-01

    The Protein Data Bank archive was established in 1971, and recently celebrated its 40th anniversary (Berman et al. in Structure 20:391, 2012). An analysis of interrelationships of the science, technology and community leads to further insights into how this resource evolved into one of the oldest and most widely used open-access data resources in biology.

  6. Archival Administration in the Electronic Information Age: An Advanced Institute for Government Archivists (2nd, Pittsburgh, Pennsylvania, June 3-14, 1991).

    ERIC Educational Resources Information Center

    Pittsburgh Univ., PA. Graduate School of Library and Information Sciences.

    This report describes the first phase of an institute that was designed to provide technical information to the chief administrative officials of state archival agencies about new trends in information technology and to introduce them to management tools needed for operating in this environment. Background information on the first institute…

  7. The Protein Data Bank archive as an open data resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berman, Helen M.; Kleywegt, Gerard J.; Nakamura, Haruki

    The Protein Data Bank archive was established in 1971, and recently celebrated its 40th anniversary (Berman et al. in Structure 20:391, 2012). Here, an analysis of interrelationships of the science, technology and community leads to further insights into how this resource evolved into one of the oldest and most widely used open-access data resources in biology.

  8. James Webb Space Telescope

    NASA Image and Video Library

    2017-12-08

    When the James Webb Space Telescope (JWST) reaches its orbit about a million miles (1.5 million kilometers) from Earth and begins studying the distant reaches of the universe, the event will mark an unprecedented triumph on several technological fronts. Photo Credit: Chris Gunn For more information go to the Goddard Tech Trends Archive: Spring 2007 (http://gsfctechnology.gsfc.nasa.gov/TechTrendsArchive.html)

  9. Archiving Innovations Preserve Essential Historical Records

    NASA Technical Reports Server (NTRS)

    2013-01-01

    The Apollo 11 mission left on the Moon a silicon disc inscribed with microscopic recreations of messages from 73 countries. NanoArk Corporation of Fairport, New York, built on that NASA technology to develop a fire and water resistant archiving innovation that provides cost savings and security in preserving documents. Since its launch, NanoArk has grown from 2 to 10 employees.

  10. 78 FR 35617 - President's Council of Advisors on Science and Technology (PCAST)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-13

    ..., July 18, 2013; 9:00 a.m.-12:00 p.m. ADDRESSES: National Academy of Sciences (in the Lecture Room), 2101...://whitehouse.gov/ostp/pcast . A live video webcast and an archive of the webcast after the event are expected to be available at http://whitehouse.gov/ostp/pcast . The archived video will be available within one...

  11. Techno-Nationalism and the Construction of University Technology Transfer

    ERIC Educational Resources Information Center

    Sá, Creso; Kretz, Andrew; Sigurdson, Kristjan

    2013-01-01

    Our historical study of Canada's main research university illuminates the overlooked influence of national identities and interests as forces shaping the institutionalization of technology transfer. Through the use of archival sources we trace the rise and influence of Canadian technological nationalism--a response to Canada's perceived dependency…

  12. Adoption and resistance: reflections on human, organizational, and information technologies in picture archive and communication systems (PACS)

    NASA Astrophysics Data System (ADS)

    Sappington, Rodney W.

    2005-04-01

    In research conducted at academic and community hospitals in the United States since 2001, this paper examines the complex human and technological relationships employed to renegotiate behavior within hospital administrative and clinical cultures. In the planning and implementation of PACS in a four-facility hospital, we enter into what can be described as processes of "adoption" and "resistance", seemingly opposite approaches to system implementation, which I argue are in fact key responses to the planning, design, and customization of imaging and information systems in a context of convergence. In the larger context of convergence known as NBIC tools (nanotechnology, biotechnology, information technology, and cognitive sciences), it has become increasingly clear to leaders in the field that it is essential to redesign organizational technologies. A novel system has little chance of being fully utilized by conventional organizational structures in an era of system convergence. The challenge of embracing a larger systems perspective lies in opening untapped potential within the healthcare enterprise by preparing the ground for reflection on new approaches to training, and by bridging specialized knowledge across computer engineering, clinical decision making, and organizational perspectives for the benefit of patient care. Case studies will demonstrate how organizational and system design technologies are crucial in ensuring that PACS implementation strategies encourage the emergence of new levels of quality for patient care. The goal is to provide successful design-build models that are true to organizational specificity, persons, and clinical practices undergoing change and uncertainty.

  13. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1993-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The papers included in this document cover satellite tracking and ground-based navigation, spacecraft-ground communications, and optical communication systems for the Deep Space Network.

  14. Investigation of air transportation technology at Princeton University, 1992-1993

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1994-01-01

    The Air Transportation Research Program at Princeton University proceeded along five avenues during the past year: (1) Flight Control System Robustness; (2) Microburst Hazards to Aircraft; (3) Wind Rotor Hazards to Aircraft; (4) Intelligent Aircraft/Airspace Systems; and (5) Aerospace Optical Communications. This research resulted in a number of publications, including theses, archival papers, and conference papers. An annotated bibliography of publications that appeared between June 1992 and June 1993 is included. The research that these papers describe was supported in whole or in part by the Joint University Program, including work that was completed prior to the reporting period.

  15. BOREAS AFM-6 Boundary Layer Height Data

    NASA Technical Reports Server (NTRS)

    Wilczak, James; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Airborne Fluxes and Meteorology (AFM)-6 team from the National Oceanic and Atmospheric Administration/Environmental Technology Laboratory (NOAA/ETL) operated a 915-MHz wind/Radio Acoustic Sounding System (RASS) profiler in the Southern Study Area (SSA) near the Old Jack Pine (OJP) site. This data set provides boundary layer height information over the site. The data were collected from 21 May 1994 to 20 Sep 1994 and are stored in tabular ASCII files. The boundary layer height data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  16. Integration, acceptance testing, and clinical operation of the Medical Information, Communication and Archive System, phase II.

    PubMed

    Smith, E M; Wandtke, J; Robinson, A

    1999-05-01

    The Medical Information, Communication and Archive System (MICAS) is a multivendor, incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor, open-system, incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.

  17. Improvements in Space Geodesy Data Discovery at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, C.; Pollack, N.; Michael, P.

    2011-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. This activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integration into the Global Geodetic Observing System (GGOS) portal.

  18. A MYSQL-BASED DATA ARCHIVER: PRELIMINARY RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthew Bickley; Christopher Slominski

    2008-01-23

    Following an evaluation of the archival requirements of the Jefferson Laboratory accelerator’s user community, a prototyping effort was executed to determine if an archiver based on MySQL had sufficient functionality to meet those requirements. This approach was chosen because an archiver based on a relational database enables the development effort to focus on data acquisition and management, letting the database take care of storage, indexing, and data consistency. It was clear from the prototype effort that there were no performance impediments to successful implementation of a final system. With our performance concerns addressed, the lab undertook the design and development of an operational system. The system is in its operational testing phase now. This paper discusses the archiver system requirements, some of the design choices and their rationale, and presents the acquisition, storage, and retrieval performance.
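
    The division of labor described above, where acquisition code stays simple while the relational engine handles storage, indexing, and consistency, can be made concrete with a short sketch. The schema and table names below are invented for illustration, and SQLite stands in for MySQL so the example is self-contained; this is not the Jefferson Lab design.

    ```python
    # Minimal sketch of a database-backed channel archiver (hypothetical
    # schema; SQLite stands in for MySQL to keep the example self-contained).
    import sqlite3
    import time

    DDL = """
    CREATE TABLE IF NOT EXISTS sample (
        channel_id INTEGER NOT NULL,
        stamp      REAL    NOT NULL,  -- acquisition time, Unix seconds
        value      REAL    NOT NULL,
        PRIMARY KEY (channel_id, stamp)  -- composite key doubles as the index
    );
    """

    def store(conn, channel_id, value):
        # The acquisition layer only appends rows; storage, indexing, and
        # consistency are the database engine's job.
        conn.execute("INSERT INTO sample VALUES (?, ?, ?)",
                     (channel_id, time.time(), value))

    def retrieve(conn, channel_id, t0, t1):
        # Time-range retrieval is a plain indexed range scan.
        return conn.execute(
            "SELECT stamp, value FROM sample"
            " WHERE channel_id = ? AND stamp BETWEEN ? AND ? ORDER BY stamp",
            (channel_id, t0, t1)).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    store(conn, 42, 3.14)
    print(retrieve(conn, 42, 0, time.time()))
    ```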

  19. [Research and implementation of the TLS network transport security technology based on DICOM standard].

    PubMed

    Lu, Xiaoqi; Wang, Lei; Zhao, Jianfeng

    2012-02-01

    With the development of medical informatics, Picture Archiving and Communication Systems (PACS), Hospital Information Systems/Radiology Information Systems (HIS/RIS), and other medical information management systems have become widespread and increasingly interoperable. These formerly closed systems will inevitably be opened up and regionalized over networks, which makes the security of information transmission the first problem to be solved. Based on this need for network security, we investigated the Digital Imaging and Communications in Medicine (DICOM) standard and the Transport Layer Security (TLS) protocol, and implemented TLS transmission of DICOM medical information with the OpenSSL and DCMTK toolkits.
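
    The approach layers the ordinary DICOM upper-layer protocol over an encrypted TLS channel. As a rough illustration of that layering, using Python's standard ssl module rather than the OpenSSL and DCMTK toolkits the authors used, and with placeholder host and certificate paths, a client-side connection might be secured like this:

    ```python
    # Sketch of placing TLS beneath an application protocol such as a DICOM
    # association. Host name and certificate paths are placeholders.
    import socket
    import ssl

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.load_verify_locations("ca-cert.pem")  # trust anchor for the peer
    context.load_cert_chain("client-cert.pem", "client-key.pem")  # mutual auth

    # 2762 is the registered port for DICOM over TLS.
    with socket.create_connection(("pacs.example.org", 2762)) as raw:
        # wrap_socket performs the TLS handshake; the DICOM upper layer then
        # reads and writes its PDUs over the encrypted channel unchanged.
        with context.wrap_socket(raw, server_hostname="pacs.example.org") as tls:
            print("negotiated", tls.version(), tls.cipher())
            # tls.sendall(a_associate_rq_pdu)  # DICOM PDUs would flow here
    ```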

  20. NASA CDDIS: Next Generation System

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.; Woo, J. Y.; Limbacher, R. I.

    2017-12-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to make space geodesy and geodynamics related data and derived products available in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user based tools for the exploration and use of the archive. As the techniques and data volume have increased, the CDDIS has evolved to offer a broad range of data ingest services spanning data upload, quality control, documentation, metadata extraction, and ancillary information. As a major step taken to improve services, the CDDIS has transitioned to a new hardware system and implemented incremental upgrades to a new software system to meet these goals while increasing automation. This new system increases the ability of the CDDIS to consistently track errors and issues associated with data and derived product files uploaded to the system and to perform post-ingest checks on all files received for the archive. In addition, software has been implemented to process new data sets and to handle changes to existing data sets, covering new formats and any issues identified during the ingest process. In this poster, we will discuss the CDDIS archive in general as well as review and contrast the system structures and quality control measures employed before and after the system upgrade. We will also present information about new data sets and changes to existing data and derived products archived at the CDDIS.

  1. Ark and Archive: Making a Place for Long-Term Research on Barro Colorado Island, Panama.

    PubMed

    Raby, Megan

    2015-12-01

    Barro Colorado Island (BCI), Panama, may be the most studied tropical forest in the world. A 1,560-hectare island created by the flooding of the Panama Canal, BCI became a nature reserve and biological research station in 1923. Contemporaries saw the island as an "ark" preserving a sample of primeval tropical nature for scientific study. BCI was not simply "set aside," however. The project of making it a place for science significantly reshaped the island through the twentieth century. This essay demonstrates that BCI was constructed specifically to allow long-term observation of tropical organisms--their complex behaviors, life histories, population dynamics, and changing species composition. An evolving system of monitoring and information technology transformed the island into a living scientific "archive," in which the landscape became both an object and a repository of scientific knowledge. As a research site, BCI enabled a long-term, place-based form of collective empiricism, focused on the study of the ecology of a single tropical island. This essay articulates tropical ecology as a "science of the archive" in order to examine the origins of practices of environmental surveillance that have become central to debates about global change and conservation.

  2. Digital data preservation for scholarly publications in astronomy

    NASA Astrophysics Data System (ADS)

    Choudhury, Sayeed; di Lauro, Tim; Szalay, Alex; Vishniac, Ethan; Hanisch, Robert; Steffen, Julie; Milkey, Robert; Ehling, Teresa; Plante, Ray

    2007-11-01

    Astronomy is similar to other scientific disciplines in that scholarly publication relies on the presentation and interpretation of data. But although astronomy now has archives for its primary research telescopes and associated surveys, the highly processed data that is presented in the peer-reviewed journals and is the basis for final analysis and interpretation is generally not archived and has no permanent repository. We have initiated a project whose goal is to implement an end-to-end prototype system which, through a partnership of a professional society, that society's scholarly publications/publishers, research libraries, and an information technology substrate provided by the Virtual Observatory, will capture high-level digital data as part of the publication process and establish a distributed network of curated, permanent data repositories. The data in this network will be accessible through the research journals, astronomy data centers, and Virtual Observatory data discovery portals.

  3. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    PubMed

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
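
    As a minimal illustration of the reduction step such a toolkit automates (written in Python rather than LabVIEW, with made-up numbers), R-peak times are converted to heart period and instantaneous heart rate, and beats are screened for the kind of artifacts that commonly force trial rejection:

    ```python
    # Toy reduction of R-peak times to heart period and heart rate series;
    # the 25% deviation screen is one common heuristic, not the toolkit's.
    import numpy as np

    r_peaks = np.array([0.00, 0.81, 1.63, 2.40, 3.22])  # R-wave times, seconds

    ibi = np.diff(r_peaks)   # interbeat intervals (heart period), seconds
    hr = 60.0 / ibi          # instantaneous heart rate, beats per minute

    # Flag beats whose period deviates more than 25% from the local median,
    # a typical artifact screen applied before phasic analysis.
    median_ibi = np.median(ibi)
    suspect = np.abs(ibi - median_ibi) > 0.25 * median_ibi
    print(hr, suspect)
    ```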

  4. Miniaturized GPS Tags Identify Non-breeding Territories of a Small Breeding Migratory Songbird.

    PubMed

    Hallworth, Michael T; Marra, Peter P

    2015-06-09

    For the first time, we use a small archival global positioning system (GPS) tag to identify and characterize non-breeding territories, quantify migratory connectivity, and identify population boundaries of Ovenbirds (Seiurus aurocapilla), a small migratory songbird, captured at two widely separated breeding locations. We recovered 15 (31%) GPS tags with data and located the non-breeding territories of breeding Ovenbirds from Maryland and New Hampshire, USA (0.50 ± 0.15 ha, mean ± SE). All non-breeding territories had similar environmental attributes despite being distributed across parts of Florida, Cuba and Hispaniola. New Hampshire and Maryland breeding populations had non-overlapping non-breeding population boundaries that encompassed 114,803 and 169,233 km², respectively. Archival GPS tags provided unprecedented pinpoint locations and associated environmental information of tropical non-breeding territories. This technology is an important step forward in understanding seasonal interactions and ultimately population dynamics of populations throughout the annual cycle.

  5. Building a Digital Library for Multibeam Data, Images and Documents

    NASA Astrophysics Data System (ADS)

    Miller, S. P.; Staudigel, H.; Koppers, A.; Johnson, C.; Cande, S.; Sandwell, D.; Peckman, U.; Becker, J. J.; Helly, J.; Zaslavsky, I.; Schottlaender, B. E.; Starr, S.; Montoya, G.

    2001-12-01

    The Scripps Institution of Oceanography, the UCSD Libraries and the San Diego Supercomputer Center have joined forces to establish a digital library providing access to a wide range of multibeam and marine geophysical data for a community that ranges from the MGG researcher to K-12 outreach clients. This digital library collection will include 233 multibeam cruises with grids, plots, photographs, station data, technical reports, planning documents and publications, drawn from the holdings of the Geological Data Center and the SIO Archives. Inquiries will be made through an Ocean Exploration Console, reminiscent of a cockpit display where a multitude of data may be displayed individually or in two- or three-dimensional projections. These displays will provide access to cruise data as well as global databases such as Global Topography, crustal age, and sediment thickness, thus meeting the day-to-day needs of researchers as well as educators, students, and the public. The prototype contains a few selected expeditions, and a review of the initial approach will be solicited from the user community during the poster session. The search process can be focused by a variety of constraints: geospatial (lat-lon box), temporal (e.g., since 1996), keyword (e.g., cruise, place name, PI, etc.), or expert-level (e.g., K-6 or researcher). The Storage Resource Broker (SRB) software from the SDSC manages the evolving collection as a series of distributed but related archives in various media, from shipboard data through processing and final archiving. The latest version of MB-System provides for the systematic creation of standard metadata, and for the harvesting of metadata from multibeam files. Automated scripts will be used to load the metadata catalog to enable queries with an Oracle database management system. These new efforts to bridge the gap between libraries and data archives are supported by the NSF Information Technology and National Science Digital Library (NSDL) programs, augmented by UC funds, and closely coordinated with Digital Library for Earth System Education (DLESE) activities.
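
    The search constraints described above map naturally onto an indexed metadata catalog. The sketch below shows the general shape of such a query, combining geospatial box, temporal, and keyword constraints, with SQLite standing in for the Oracle system; the table layout and column names are invented for illustration, not the project's actual schema.

    ```python
    # Hypothetical cruise-metadata catalog query (invented schema; SQLite
    # stands in for the Oracle database named in the abstract).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE cruise_meta (
        cruise TEXT, pi TEXT, year INTEGER,
        lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
    conn.execute("INSERT INTO cruise_meta VALUES "
                 "('EW9602', 'Smith', 1996, -10, 5, -130, -110)")

    # Geospatial (lat-lon box), temporal, and keyword constraints combined:
    rows = conn.execute("""
        SELECT cruise, pi FROM cruise_meta
        WHERE year >= ? AND lat_max >= ? AND lat_min <= ?
          AND lon_max >= ? AND lon_min <= ? AND pi LIKE ?
        """, (1996, -5, 5, -125, -115, '%Smith%')).fetchall()
    print(rows)
    ```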

  6. Third International Symposium on Space Mission Operations and Ground Data Systems, part 2

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Editor)

    1994-01-01

    Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The symposium papers focus on improvements in the efficiency, effectiveness, and quality of data acquisition, ground systems, and mission operations. New technology, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations. This volume covers expert systems, systems development tools and approaches, and systems engineering issues.

  7. Improving Patient Safety in Hospitals through Usage of Cloud Supported Video Surveillance

    PubMed Central

    Dašić, Predrag; Dašić, Jovan; Crvenković, Bojan

    2017-01-01

    BACKGROUND: Patient safety in hospitals is of equal importance as providing treatments and urgent healthcare. With the development of Cloud technologies and Big Data analytics, it is possible to employ VSaaS technology virtually anywhere, for any given security purpose. AIM: Given these benefits, in this paper we give an overview of the existing cloud surveillance technologies that can be implemented for improving patient safety. MATERIAL AND METHODS: Modern VSaaS systems provide higher elasticity and project scalability in dealing with real-time information processing. Modern surveillance technologies can prove to be an effective tool for the prevention of patient falls, undesired movement, and tampering with attached life-supporting devices. Given a large number of patients who require constant supervision, a cloud-based monitoring system can dramatically reduce the costs incurred. It provides continuous real-time monitoring, increased overall security and safety, improved staff productivity, prevention of dishonest claims, and long-term digital archiving. CONCLUSION: Patient safety is a growing issue that can be improved with the usage of high-end centralised surveillance systems, allowing the staff to focus more on treating health issues rather than keeping a watchful eye on potential incidents. PMID:28507610

  8. The role of the Department of Defense in PACS and telemedicine research and development.

    PubMed

    Mogel, Greg T

    2003-01-01

    The United States Department of Defense (DOD) has played a leading role in moving digital imaging, picture archiving and communications systems, and more recently telemedicine with its associated technologies into the mainstream of healthcare. Beginning in the 1980s with domestic implementations, and followed in the 1990s by both small and large-scale military deployments, these technologies have been put into action with varying degrees of success. These efforts, however, have always served as a guidepost for similar civilian efforts and the establishment of a marketplace for the technologies. This paper examines the history of the DOD's role in these areas and the projects and programs established, assessing their current state of development and identifying the future direction of the DOD's research and implementation efforts in telemedicine and advanced medical technologies.

  9. The NAS Computational Aerosciences Archive

    NASA Technical Reports Server (NTRS)

    Miceli, Kristina D.; Globus, Al; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    In order to further the state-of-the-art in computational aerosciences (CAS) technology, researchers must be able to gather and understand existing work in the field. One aspect of this information gathering is studying published work available in scientific journals and conference proceedings. However, current scientific publications are very limited in the type and amount of information that they can disseminate. Information is typically restricted to text, a few images, and a bibliography list. Additional information that might be useful to the researcher, such as additional visual results, referenced papers, and datasets, are not available. New forms of electronic publication, such as the World Wide Web (WWW), limit publication size only by available disk space and data transmission bandwidth, both of which are improving rapidly. The Numerical Aerodynamic Simulation (NAS) Systems Division at NASA Ames Research Center is in the process of creating an archive of CAS information on the WWW. This archive will be based on the large amount of information produced by researchers associated with the NAS facility. The archive will contain technical summaries and reports of research performed on NAS supercomputers, visual results (images, animations, visualization system scripts), datasets, and any other supporting meta-information. This information will be available via the WWW through the NAS homepage, located at http://www.nas.nasa.gov/, fully indexed for searching. The main components of the archive are technical summaries and reports, visual results, and datasets. Technical summaries are gathered every year by researchers who have been allotted resources on NAS supercomputers. These summaries, together with supporting visual results and references, are browsable by interested researchers. Referenced papers made available by researchers can be accessed through hypertext links. Technical reports are in-depth accounts of tools and applications research projects performed by NAS staff members and collaborators. Visual results, which may be available in the form of images, animations, and/or visualization scripts, are generated by researchers with respect to a certain research project, depicting dataset features that were determined important by the investigating researcher. For example, script files for visualization systems (e.g. FAST, PLOT3D, AVS) are provided to create visualizations on the user's local workstation to elucidate the key points of the numerical study. Users can then interact with the data starting where the investigator left off. Datasets are intended to give researchers an opportunity to understand previous work, 'mine' solutions for new information (for example, have you ever read a paper thinking "I wonder what the helicity density looks like?"), compare new techniques with older results, collaborate with remote colleagues, and perform validation. Supporting meta-information associated with the research projects is also important to provide additional context for research projects. This may include information such as the software used in the simulation (e.g. grid generators, flow solvers, visualization). In addition to serving the CAS research community, the information archive will also be helpful to students, visualization system developers and researchers, and management. Students (of any age) can use the data to study fluid dynamics, compare results from different flow solvers, learn about meshing techniques, etc., leading to better informed individuals. 
For these users it is particularly important that visualization be integrated into dataset archives. Visualization researchers can use dataset archives to test algorithms and techniques, leading to better visualization systems. Management can use the data to figure out what is really going on behind the viewgraphs. All users will benefit from fast, easy, and convenient access to CFD datasets. The CAS information archive hopes to serve as a useful resource to those interested in computational sciences. At present, only information that may be distributed internationally is made available via the archive. Studies are underway to determine security requirements and solutions to make additional information available. By providing access to the archive via the WWW, the process of information gathering can be more productive and fruitful due to ease of access and the ability to manage many different types of information. As the archive grows, additional resources from outside NAS will be added, providing a dynamic source of research results.

  10. Storage media for computers in radiology

    PubMed Central

    Dandu, Ravi Varma

    2008-01-01

    The introduction and wide acceptance of digital technology in medical imaging has resulted in an exponential increase in the amount of data produced by the radiology department. There is an insatiable need for storage space to archive this ever-growing volume of image data. Healthcare facilities should plan the type and size of the storage media they need based not just on the volume of data but also on considerations such as the speed and ease of access, redundancy, security, costs, and the longevity of the archival technology. This article reviews the various digital storage media and compares their merits and demerits. PMID:19774182

  11. Development of public science archive system of Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatochi; Watanabe, Masaru; Okumura, Shin-Ichiro; Ozawa, Tomohiko; Yamamoto, Naotaka; Hamabe, Masaru

    2002-09-01

    We have developed a public science archive system, the Subaru-Mitaka-Okayama-Kiso Archive system (SMOKA), as a successor to the Mitaka-Okayama-Kiso Archive (MOKA) system. SMOKA provides access to the public data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory of the University of Tokyo. Since 1997, we have worked to compile a dictionary of FITS header keywords. The completion of the dictionary enabled us to construct a unified public archive of the data obtained with various instruments at the telescopes. SMOKA has two kinds of user interfaces: Simple Search and Advanced Search. Novices can search data by simply selecting the name of the target with the Simple Search interface. Experts would prefer to set detailed constraints on the query, using the Advanced Search interface. In order to improve the efficiency of searching, several new features are implemented, such as archive status plots, calibration data search, an annotation system, and an improved Quick Look Image browsing system. We can efficiently develop and operate SMOKA by adopting a three-tier model for the system. Java servlets and JavaServer Pages (JSP) are useful to separate the front-end presentation from the middle and back-end tiers.
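
    A keyword dictionary of this kind is essentially a mapping from each instrument's FITS header keywords onto common archive fields. The sketch below shows the harvesting step under that assumption, using the astropy library; the mapping entries and field names are illustrative, not SMOKA's actual dictionary.

    ```python
    # Harvest archive metadata from FITS headers via a per-instrument
    # keyword dictionary (illustrative mapping, not SMOKA's dictionary).
    from astropy.io import fits

    # One mapping per instrument: archive field -> FITS header keyword.
    KEYWORD_DICT = {
        "SUPRIME-CAM": {"object": "OBJECT", "date": "DATE-OBS", "band": "FILTER01"},
        "GENERIC":     {"object": "OBJECT", "date": "DATE-OBS", "band": "FILTER"},
    }

    def harvest(path, instrument="GENERIC"):
        mapping = KEYWORD_DICT.get(instrument, KEYWORD_DICT["GENERIC"])
        with fits.open(path) as hdul:
            header = hdul[0].header
            # Missing keywords become None instead of aborting the ingest.
            return {field: header.get(key) for field, key in mapping.items()}

    # record = harvest("SUPA00123456.fits", "SUPRIME-CAM")
    ```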

  12. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, Edward C. (Editor)

    1991-01-01

    This quarterly publication provides archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL's) Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on the activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data, information systems, and reimbursable DSN work performed for other space agencies through NASA.

  13. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL) Office of Telecommunications and Data Acquisition (TDA) are given. Space communications, radio navigation, radio science, and ground-based radio and radar astronomy, activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations are reported. Also included is TDA-funded activity at JPL on data and information systems and reimbursable Deep Space Network (DSN) work performed for other space agencies through NASA.

  14. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, Edward C. (Editor)

    1992-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).

  15. Software Technology for Adaptable, Reliable Systems (STARS). Software Architecture Seminar Report: Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-01-29

    other processes, but that he arrived at his results in a different manner. Batory didn’t start with idioms; he performed a domain analysis and...abstracted idioms. Through domain analysis and domain modeling, new idioms can be found and the form of architecture can be the same. It was also questioned...Programming 5. Consensus Definition of Architecture 6. Inductive Analysis of Current Exemplars 7. VHDL (Bailor) 8. Ontological Structuring 3.3.3

  16. Implementation of a Campuswide Distributed Mass Storage Service: the Dream Versus Reality

    NASA Technical Reports Server (NTRS)

    Prahst, Stephen; Armstead, Betty Jo

    1996-01-01

    In 1990, a technical team at NASA Lewis Research Center, Cleveland, Ohio, began defining a Mass Storage Service to provide long-term archival storage, short-term storage for very large files, distributed Network File System access, and backup services for critical data that resides on workstations and personal computers. Because of software availability and budgets, the total service was phased in over several years. During the process of building the service from the commercial technologies available, our Mass Storage Team refined the original vision and learned from the problems and mistakes that occurred. We also enhanced some technologies to better meet the needs of users and system administrators. This report describes our team's journey from dream to reality, outlines some of the problem areas that still exist, and suggests some solutions.

  17. Optical Fiber Transmission In A Picture Archiving And Communication System For Medical Applications

    NASA Astrophysics Data System (ADS)

    Aaron, Gilles; Bonnard, Rene

    1984-03-01

    In a hospital, the need for an electronic communication network is increasing along with the digitization of pictures. This local area network is intended to link picture sources such as digital radiography, computed tomography, nuclear magnetic resonance, and ultrasound with an archiving system. Interactive displays can be used in examination rooms, physicians' offices, and clinics. In such a system, three major requirements must be considered: bit rate, cable length, and number of devices. The bit rate is very important because a maximum response time of a few seconds must be guaranteed for pictures of several megabits. The distance between nodes may be a few kilometers in some large hospitals. The number of devices connected to the network is never greater than a few tens, because picture sources and computers represent substantial hardware, and simple displays can be concentrated. All these conditions are fulfilled by optical fiber transmission. Depending on the topology and the access protocol, two solutions are to be considered: an active ring, or an active or passive star. Finally, Thomson-CSF developments of optical transmission devices for large TV-distribution networks bring us technological support and mass production that will cut down hardware costs.

  18. Lessons learned from planetary science archiving

    NASA Astrophysics Data System (ADS)

    Zender, J.; Grayzeck, E.

    2006-01-01

    The need for scientific archiving of past, current, and future planetary scientific missions, laboratory data, and modeling efforts is indisputable. To quote the words of G. Santayana carved over the entrance of the US National Archives in Washington, DC: “Those who cannot remember the past are doomed to repeat it.” The design, implementation, maintenance, and validation of planetary science archives are, however, disputed by the involved parties. The inclusion of the archives into the scientific heritage is problematic. For example, there is the imbalance between space agency requirements and institutional and national interests. The disparity between long-term archive requirements and immediate data analysis requests is significant. The discrepancy between a space mission's archive budget and the effort required to design and build the data archive is large. An imbalance exists between new instrument development and existing, well-proven archive standards. The authors present their view on the problems and risk areas in the archiving concepts based on their experience acquired within NASA’s Planetary Data System (PDS) and ESA’s Planetary Science Archive (PSA). Individual risks and potential problem areas are discussed based on a model derived from a system analysis done upfront. The major risk for a planetary mission science archive is seen in the combination of minimal involvement by Mission Scientists and inadequate funding. The authors outline how the risks can be reduced. The paper ends with the authors' view on future planetary archive implementations, including the archive interoperability aspect.

  19. JPL Big Data Technologies for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Jones, Dayton L.; D'Addario, L. R.; De Jong, E. M.; Mattmann, C. A.; Rebbapragada, U. D.; Thompson, D. R.; Wagstaff, K.

    2014-04-01

    During the past three years the Jet Propulsion Laboratory has been working on several technologies to deal with big data challenges facing next-generation radio arrays, among other applications. This program has focused on the following four areas: 1) We are investigating high-level ASIC architectures that reduce power consumption for cross-correlation of data from large interferometer arrays by one to two orders of magnitude. The cost of operations for the Square Kilometre Array (SKA), which may be dominated by the cost of power for data processing, is a serious concern. A large improvement in correlator power efficiency could have a major positive impact. 2) Data-adaptive algorithms (machine learning) for real-time detection and classification of fast transient signals in high volume data streams are being developed and demonstrated. Studies of the dynamic universe, particularly searches for fast (<< 1 second) transient events, require that data be analyzed rapidly and with robust RFI rejection. JPL, in collaboration with the International Center for Radio Astronomy Research in Australia, has developed a fast transient search system for eventual deployment on ASKAP. In addition, a real-time transient detection experiment is now running continuously and commensally on NRAO's Very Long Baseline Array. 3) Scalable frameworks for data archiving, mining, and distribution are being applied to radio astronomy. A set of powerful open-source Object Oriented Data Technology (OODT) tools is now available through Apache. OODT was developed at JPL for Earth science data archives, but it is proving to be useful for radio astronomy, planetary science, health care, Earth climate, and other large-scale archives. 4) We are creating automated, event-driven data visualization tools that can be used to extract information from a wide range of complex data sets. Visualization of complex data can be improved through algorithms that detect events or features of interest and autonomously generate images or video to display those features. This work has been carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
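
    As a toy illustration of item 2 (and not JPL's actual algorithm), a streaming detector can flag samples that stand out from a robust running baseline; this is the basic shape of a real-time fast-transient search over a high-volume data stream:

    ```python
    # Toy streaming transient detector: flag samples exceeding `threshold`
    # robust sigmas above a trailing-window baseline (median/MAD keeps the
    # baseline resistant to the very spikes being searched for).
    import numpy as np

    def detect_transients(stream, window=256, threshold=6.0):
        buf = []
        for i, x in enumerate(stream):
            if len(buf) == window:
                med = np.median(buf)
                mad = np.median(np.abs(np.array(buf) - med)) or 1e-12
                if (x - med) / (1.4826 * mad) > threshold:
                    yield i
                buf.pop(0)
            buf.append(x)

    noise = np.random.default_rng(0).normal(0, 1, 5000)
    noise[3000] += 12.0  # injected fast transient
    print(list(detect_transients(noise)))  # -> [3000]
    ```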

  20. Mass storage technology in networks

    NASA Astrophysics Data System (ADS)

    Ishii, Katsunori; Takeda, Toru; Itao, Kiyoshi; Kaneko, Reizo

    1990-08-01

    Trends and features of mass storage subsystems in networks are surveyed and their key technologies spotlighted. Storage subsystems are becoming increasingly important in new network systems in which communications and data processing are systematically combined. These systems require a new class of high-performance mass-information storage in order to effectively utilize their processing power. The requirements of high transfer rates, high transaction rates, and large storage capacities, coupled with high functionality, fault tolerance, and flexibility in configuration, are major challenges in storage subsystems. Recent progress in optical disk technology has improved the performance of on-line external memories, and optical disk drives now compete with mid-range magnetic disks. Optical disks are more effective than magnetic disks for low-traffic, random-access files storing multimedia data that require large capacity, such as archival use and information distribution by ROM disks. Finally, image-coded document file servers for local area network use, employing 130 mm rewritable magneto-optical disk subsystems, are demonstrated.

  1. Rapid Multiplex PCR Assay To Identify Respiratory Viral Pathogens: Moving Forward Diagnosing The Common Cold

    PubMed Central

    Gordon, Sarah M; Elegino-Steffens, Diane U; Agee, Willie; Barnhill, Jason; Hsue, Gunther

    2013-01-01

    Upper respiratory tract infections (URIs) can be a serious burden to the healthcare system. The majority of URIs are viral in etiology, but definitive diagnosis can prove difficult due to the frequently overlapping clinical presentations of viral and bacterial infections and the variable sensitivity and lengthy turn-around time of viral culture. We tested a new automated nested multiplex PCR technology, the FilmArray® system, in the TAMC department of clinical investigations to determine the feasibility of replacing standard viral culture with a rapid turn-around system. We conducted a single-blinded feasibility study comparing PCR results with archived viral culture results from a convenience sample of cryopreserved archived nasopharyngeal swabs from acutely ill ED patients who presented with complaints of URI symptoms. A total of 61 archived samples were processed. Viral culture had previously identified 31 positive specimens from these samples. The automated nested multiplex PCR detected 38 positive samples. In total, PCR was 94.5% concordant with the previously positive viral culture results. However, PCR was only 63.4% concordant with the negative viral culture results, owing to PCR detection of 11 additional viral pathogens not recovered on viral culture. The average time to process a sample was 75 minutes. We determined that an automated nested multiplex PCR is a feasible alternative to viral culture in an acute clinical setting. We were able to detect at least 94.5% as many viral pathogens as viral culture can identify, with a faster turn-around time. PMID:24052914

  2. MIMIC II: a massive temporal ICU patient database to support research in intelligent patient monitoring

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.

    2002-01-01

    Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data is continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records including over 120 gigabytes of data and is growing. A customized archiving system was used to store continuously up to four waveforms and 30 different parameters from ICU patient monitors. An integrated user-friendly relational database was developed for browsing of patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge-discovery.
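
    One thing such an integrated database enables is temporal joins between intermittent clinical events and continuously archived monitor parameters. The sketch below illustrates that idea with pandas; the field names and values are invented for illustration and are not MIMIC II's schema.

    ```python
    # Hypothetical temporal join: attach the most recent archived monitor
    # sample to each lab draw (invented fields, not the MIMIC II schema).
    import pandas as pd

    vitals = pd.DataFrame({
        "time": pd.to_datetime(["2002-01-01 08:00", "2002-01-01 08:01",
                                "2002-01-01 08:02"]),
        "heart_rate": [88, 92, 131],
    })
    labs = pd.DataFrame({
        "time": pd.to_datetime(["2002-01-01 08:01:30"]),
        "lactate": [4.2],
    })

    # For each lab result, take the monitor sample at or just before it.
    aligned = pd.merge_asof(labs, vitals, on="time", direction="backward")
    print(aligned)
    ```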

  3. Ensemble LUT classification for degraded document enhancement

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Frieder, Ophir

    2008-01-01

    The fast evolution of scanning and computing technologies has led to the creation of large collections of scanned paper documents. Examples of such collections include historical collections, legal depositories, medical archives, and business archives. Moreover, in many situations, such as legal litigation and security investigations, scanned collections are being used to facilitate systematic exploration of the data. It is almost always the case that scanned documents suffer from some form of degradation. Large degradations make documents hard to read and substantially deteriorate the performance of automated document processing systems. Enhancement of degraded document images is normally performed assuming global degradation models. When the degradation is large, global degradation models do not perform well. In contrast, we propose to estimate local degradation models and use them in enhancing degraded document images. Using a semi-automated enhancement system, we have labeled a subset of the Frieder diaries collection. This labeled subset was then used to train an ensemble classifier. The component classifiers are based on lookup tables (LUT) in conjunction with the approximate nearest neighbor algorithm. The resulting algorithm is highly efficient. Experimental evaluation results are provided using the Frieder diaries collection.
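
    To make the lookup-table idea concrete, the sketch below shows a bare-bones variant of the general technique (my reading of the approach, not the authors' exact classifier or their approximate-nearest-neighbor extension): each 3x3 binary neighborhood of a degraded image indexes one of 512 table entries, and each entry learns, from labeled degraded/clean pairs, the majority clean value of the center pixel.

    ```python
    # Bare-bones LUT enhancement for binary document images (a generic
    # variant of the technique, not the paper's ensemble classifier).
    import numpy as np

    def neighborhood_codes(img):
        """Encode every interior 3x3 neighborhood as a 9-bit integer."""
        h, w = img.shape
        codes = np.zeros((h - 2, w - 2), dtype=np.int32)
        bit = 0
        for dy in range(3):
            for dx in range(3):
                codes |= img[dy:dy + h - 2, dx:dx + w - 2].astype(np.int32) << bit
                bit += 1
        return codes

    def train_lut(degraded, clean):
        votes = np.zeros((512, 2), dtype=np.int64)
        codes = neighborhood_codes(degraded).ravel()
        labels = clean[1:-1, 1:-1].ravel()
        np.add.at(votes, (codes, labels), 1)  # tally clean labels per pattern
        return votes.argmax(axis=1)           # majority clean value per pattern

    def enhance(degraded, lut):
        out = degraded.copy()
        out[1:-1, 1:-1] = lut[neighborhood_codes(degraded)]
        return out

    rng = np.random.default_rng(1)
    clean = (rng.random((64, 64)) > 0.5).astype(np.uint8)
    degraded = clean ^ (rng.random((64, 64)) > 0.95)  # 5% salt-and-pepper noise
    restored = enhance(degraded, train_lut(degraded, clean))
    ```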

  4. Performance of asynchronous transfer mode (ATM) local area and wide area networks for medical imaging transmission in clinical environment.

    PubMed

    Huang, H K; Wong, A W; Zhu, X

    1997-01-01

    Asynchronous transfer mode (ATM) technology emerges as a leading candidate for medical image transmission in both local area network (LAN) and wide area network (WAN) applications. This paper describes the performance of an ATM LAN and WAN network at the University of California, San Francisco. The measurements were obtained using an intensive care unit (ICU) server connecting to four image workstations (WS) at four different locations of a hospital-integrated picture archiving and communication system (HI-PACS) in a daily regular clinical environment. Four types of performance were evaluated: magnetic disk-to-disk, disk-to-redundant array of inexpensive disks (RAID), RAID-to-memory, and memory-to-memory. Results demonstrate that the transmission rate between two workstations can reach 5-6 Mbytes/s from RAID-to-memory, and 8-10 Mbytes/s from memory-to-memory. When the server has to send images to all four workstations simultaneously, the transmission rate to each WS is about 4 Mbytes/s. Both situations are adequate for radiologic image communications for picture archiving and communication systems (PACS) and teleradiology applications.

  5. Monitoring and Correcting Autonomic Function Aboard Mir: NASA Technology Used in Space and on Earth to Facilitate Adaptation

    NASA Technical Reports Server (NTRS)

    Cowings, P.; Toscano, W.; Taylor, B.; DeRoshia, C.; Kornilova, L.; Koslovskaya, I.; Miller, N.

    1999-01-01

    The broad objective of the research was to study individual characteristics of human adaptation to long-duration spaceflight and the possibilities for correcting them using autonomic conditioning. The changes in autonomic state during adaptation to microgravity can have profound effects on the operational efficiency of crewmembers and may result in debilitating biomedical symptoms. Ground-based and inflight experiment results showed that certain responses of the autonomic nervous system were correlated with, or consistently preceded, reports of performance decrements or symptoms. Autogenic-Feedback-Training Exercise (AFTE) is a physiological conditioning method that has been used to train people to voluntarily control several of their own physiological responses. The specific objectives were: 1) to study human autonomic nervous system (ANS) responses to sustained exposure to microgravity; 2) to study human behavior/performance changes related to physiology; 3) to evaluate the effectiveness of preflight autonomic conditioning (AFTE) for facilitating adaptation to space and readaptation to Earth; and 4) to archive these data in the NASA Life Sciences Data Archive and thereby make this information available to the international scientific community.

  6. Life Sciences Data Archives (LSDA) in the Post-Shuttle Era

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre

    2010-01-01

    Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management (the systematic use and reuse of information, experience, and expertise to achieve a specific goal) are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types, and geographical locations. These data were largely unknown or unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival, and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and be responsive to inquiries from the science communities. Information about experiments and data, as well as non-attributable human data and data from other species, is available on our public Web site http://lsda.jsc.nasa.gov. The Web site also includes a repository for biospecimens and a utilization process. NASA has undertaken an initiative to develop a Shuttle Data Archive repository. The Shuttle program is nearing its end in 2010, and it is critical that the medical and research data related to the Shuttle program be captured, retained, and usable for research, lessons learned, and future mission planning. Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly. LSDA works with the HRP community of practice to ensure that we are preserving the relevant research and data they need in the LSDA repository. An evidence-based approach to risk management is required in space life sciences, and evidence changes over time. LSDA has a pilot project with Collexis, a new type of Web-based search engine. Collexis differentiates itself from full-text search engines by making use of thesauri for information retrieval. The high-quality search is based on semantics defined in a life sciences ontology. Additionally, Collexis' matching technology is unique, allowing discovery of partially matching documents. Users do not have to construct a complicated (Boolean) search query, but can simply enter a free-text search without the risk of getting "no results". Collexis may address these issues by virtue of its retrieval and discovery capabilities across multiple repositories.

  7. Applying Task-Technology Fit Model to the Healthcare Sector: a Case Study of Hospitals' Computed Tomography Patient-Referral Mechanism.

    PubMed

    Chen, Ping-Shun; Yu, Chun-Jen; Chen, Gary Yu-Hsin

    2015-08-01

    With the growth in the number of elderly people and people with chronic diseases, the volume of hospital services will need to increase in the near future. With myriad information technologies utilized daily and crucial information-sharing tasks performed at hospitals, understanding the relationship between task performance and information systems has become a critical topic. This research explored the resource pooling of hospital management and considered a computed tomography (CT) patient-referral mechanism between two hospitals using the information system theory framework of the Task-Technology Fit (TTF) model. The TTF model can be used to assess the 'match' between task and technology characteristics. The patient-referral process involved an integrated information framework consisting of a hospital information system (HIS), a radiology information system (RIS), and a picture archiving and communication system (PACS). A formal interview was conducted with the director of the case image center on the applicable characteristics of the TTF model. Next, the ICAM DEFinition (IDEF0) method was utilized to depict the As-Is and To-Be models for CT patient-referral medical operational processes. Further, the study used the 'leagility' concept to remove non-value-added activities and increase the agility of hospitals. The results indicated that hospital information systems could support the CT patient-referral mechanism, increase hospital performance, reduce patient wait time, and enhance the quality of care for patients.

  8. The JPL ASTER Volcano Archive: the development and capabilities of a 15 year global high resolution archive of volcano data.

    NASA Astrophysics Data System (ADS)

    Linick, J. P.; Pieri, D. C.; Sanchez, R. M.

    2014-12-01

    The physical and temporal systematics of the world's volcanic activity is a compelling and productive arena for the exercise of orbital remote sensing techniques, informing studies ranging from basic volcanology to societal risk. Comprising over 160,000 frames and spanning 15 years of the Terra platform mission, the ASTER Volcano Archive (AVA: http://ava.jpl.nasa.gov) is the world's largest (100+ TB) high spatial resolution (15/30/90 m/pixel), multispectral (visible-SWIR-TIR), downloadable (KML-enabled) dedicated archive of volcano imagery. We will discuss the development of the AVA and describe its growing capability to provide easy public access to ASTER global volcano remote sensing data. The AVA system architecture is designed to facilitate parameter-based data mining and the implementation of archive-wide data analysis algorithms. Such search and analysis capabilities exploit AVA's unprecedented time-series data compilations for over 1,550 volcanoes worldwide (Smithsonian Holocene catalog). Results include thermal anomaly detection and mapping, as well as detection of SO2 plumes from explosive eruptions and passive SO2 emissions confined to the troposphere. We are also implementing retrospective ASTER image retrievals based on volcanic activity reports from Volcanic Ash Advisory Centers (VAACs) and the US Air Force Weather Agency (AFWA). A major planned expansion of the AVA is currently underway, with the ingest of the full 1972-present Landsat and NASA EO-1 volcano imagery for comparison and integration with ASTER data. Work described here is carried out under contract to NASA at the Jet Propulsion Laboratory as part of the California Institute of Technology.

  9. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data and Information System (EOSDIS) doubtlessly propelled the interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, we obviously still feel wanting two decades later. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its own preferences in data management practices as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center", interoperability is almost guaranteed because data, analysis, and results can all be readily shared and reused. Effectively, with the establishment of "distributed active analysis centers", interoperation turns from a many-to-many problem into a less complicated few-to-few problem and becomes easier to solve.

  10. Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output

    NASA Astrophysics Data System (ADS)

    Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.

    2017-12-01

    Weather-related research often requires synthesizing vast amounts of data, which calls for archival solutions that are both economical and viable during and past the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) and private clouds managed by research institutions provide object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, for assessing the accuracy of HRRR forecasts, and for providing initial and boundary conditions for research simulations. The archive is accessible to researchers at other institutions, both interactively and through automated download procedures that users can tailor to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system and tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
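
    The grid-extraction workflow the authors describe can be approximated with plain HTTP range requests, assuming the object store publishes wgrib2-style ".idx" sidecar files giving the byte offset of each GRIB2 field; the URL and variable below are illustrative, not the archive's actual layout.

    ```python
    # Fetch a single 2-D field (2 m temperature) from a remote GRIB2 file
    # by reading its .idx sidecar and issuing an HTTP Range request.
    import requests

    GRIB = "https://archive.example.edu/hrrr/20170701/hrrr.t00z.wrfsfcf00.grib2"

    idx = requests.get(GRIB + ".idx").text.splitlines()
    offsets = [int(line.split(":")[1]) for line in idx]  # "n:offset:date:VAR:..."
    for i, line in enumerate(idx):
        if ":TMP:2 m above ground:" in line:
            start = offsets[i]
            end = offsets[i + 1] - 1 if i + 1 < len(idx) else ""  # "" = to EOF
            r = requests.get(GRIB, headers={"Range": f"bytes={start}-{end}"})
            with open("tmp_2m.grib2", "wb") as f:
                f.write(r.content)  # one field, not the whole file
            break
    ```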

  11. How to Get Data from NOAA Environmental Satellites: An Overview of Operations, Products, Access and Archive

    NASA Astrophysics Data System (ADS)

    Donoho, N.; Graumann, A.; McNamara, D. P.

    2015-12-01

    In this presentation we will highlight the access and availability of NOAA satellite data for near-real-time (NRT) and retrospective product users. The presentation includes an overview of the current fleet of NOAA satellites and the methods of distribution of, and access to, the hundreds of imagery and data products offered by the Environmental Satellite Processing Center (ESPC) and the Comprehensive Large Array-data Stewardship System (CLASS). In particular, emphasis is placed on the various levels of service for current and past observations. The National Environmental Satellite, Data, and Information Service (NESDIS) is dedicated to providing timely access to global environmental data from satellites and other sources. In special cases, users are authorized direct access to NESDIS data distribution systems for environmental satellite data and products. Other means of access include publicly available distribution services such as the Global Telecommunication System (GTS), NOAA satellite direct broadcast services, and various NOAA websites and FTP servers, including CLASS. CLASS is NOAA's information technology system designed to support long-term, secure preservation of, and standards-based access to, environmental data collections and information. The National Centers for Environmental Information (NCEI) is responsible for the ingest, quality control, stewardship, and archiving of, and access to, data and science information. This work will also show the latest technology improvements, the enterprise approach, and future plans for distribution of the exponentially increasing data volumes from future NOAA missions. A primer on access to NOAA operational satellite products and services is available at http://www.ospo.noaa.gov/Organization/About/access.html. Access to post-operational satellite data and assorted products is available at http://www.class.noaa.gov

  12. Software for Managing an Archive of Images

    NASA Technical Reports Server (NTRS)

    Hallai, Charles; Jones, Helene; Callac, Chris

    2003-01-01

    This is a revised draft by the innovators of the report on Software for Managing an Archive of Images. The SSC Multimedia Archive is an automated electronic system to manage images, acquired both by film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken both digitally and with a traditional film-based camera, along with metadata about each image.

  13. TokSearch: A search engine for fusion experimental data

    DOE PAGES

    Sammuli, Brian S.; Barr, Jayson L.; Eidietis, Nicholas W.; ...

    2018-04-01

    At a typical fusion research site, experimental data is stored using archive technologies that treat each discharge as an independent set of data. These technologies (e.g., MDSplus or HDF5) are typically supplemented with a database that aggregates metadata for multiple shots to allow for efficient querying of certain predefined quantities. Often, however, a researcher will need to extract information from the archives, possibly for many shots, that is not available in the metadata store or otherwise indexed for quick retrieval. To address this need, a new search tool called TokSearch has been added to the General Atomics TokSys control design and analysis suite [1]. This tool provides the ability to rapidly perform arbitrary, parallelized queries of archived tokamak shot data (both raw and analyzed) over large numbers of shots. The TokSearch query API borrows concepts from SQL, and users can choose to implement queries in either MATLAB or Python.
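
    The abstract does not reproduce the TokSearch API itself, so the following is only a hypothetical Python sketch of the pattern it describes: map an extraction function over many shots in parallel, then filter the results the way a SQL SELECT ... WHERE would. The archive read is a synthetic stub standing in for MDSplus or HDF5 access.

    ```python
    import random
    from concurrent.futures import ProcessPoolExecutor

    def read_signal(shot, name):
        # Stand-in for a site-specific archive read; returns synthetic
        # samples so the sketch runs end to end.
        rng = random.Random((shot, name))
        return [rng.uniform(0.0, 2.0e6) for _ in range(100)]

    def max_current(shot):
        return shot, max(read_signal(shot, "plasma_current"))

    if __name__ == "__main__":
        shots = range(160000, 160100)
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(max_current, shots))
        # In SQL terms: SELECT shot WHERE max(plasma_current) > 1.9e6
        print([shot for shot, peak in results if peak > 1.9e6])
    ```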

  14. TokSearch: A search engine for fusion experimental data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sammuli, Brian S.; Barr, Jayson L.; Eidietis, Nicholas W.

    At a typical fusion research site, experimental data is stored using archive technologies that treat each discharge as an independent set of data. These technologies (e.g., MDSplus or HDF5) are typically supplemented with a database that aggregates metadata for multiple shots to allow for efficient querying of certain predefined quantities. Often, however, a researcher will need to extract information from the archives, possibly for many shots, that is not available in the metadata store or otherwise indexed for quick retrieval. To address this need, a new search tool called TokSearch has been added to the General Atomics TokSys control design and analysis suite [1]. This tool provides the ability to rapidly perform arbitrary, parallelized queries of archived tokamak shot data (both raw and analyzed) over large numbers of shots. The TokSearch query API borrows concepts from SQL, and users can choose to implement queries in either MATLAB or Python.

  15. Project MICAS: a multivendor open-system incremental approach to implementing an integrated enterprise-wide PACS: works in progress

    NASA Astrophysics Data System (ADS)

    Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.

    1998-07-01

    The Medical Information, Communication and Archive System (MICAS) is a multi-vendor, incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID), with future 'hooks' to other hospital databases. Even though this approach to PACS is riskier than a single-vendor turnkey approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc., and Digital Equipment Corp. (DEC). The network architecture operates at 100 Mbit/s, except between the modalities and the stackable intelligent switch that is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive, and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow, and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT, and a Howtek film digitizer, with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations using ImageLabs Shared Vision viewer software: one each in the MRI, CT, Nuclear Medicine, and musculoskeletal reading areas, and two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation, and that the archive vendor, rather than the RIS vendor, should have been responsible for image acquisition. An archive vendor is currently being selected who will be responsible for management of the archive plus the HIS/RIS interface, image acquisition, the modality worklist manager, and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside the University of Rochester Medical Center (URMC) campus, and the teaching archive.

  16. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information used to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows effective management of the informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model, and how an information model-driven architecture exhibits the characteristics of agile curation: early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  17. NASA Space Technology Draft Roadmap Area 13: Ground and Launch Systems Processing

    NASA Technical Reports Server (NTRS)

    Clements, Greg

    2011-01-01

    This slide presentation reviews the technology development roadmap for the area of ground and launch systems processing. The scope of this technology area includes: (1) Assembly, integration, and processing of the launch vehicle, spacecraft, and payload hardware (2) Supply chain management (3) Transportation of hardware to the launch site (4) Transportation to and operations at the launch pad (5) Launch processing infrastructure and its ability to support future operations (6) Range, personnel, and facility safety capabilities (7) Launch and landing weather (8) Environmental impact mitigations for ground and launch operations (9) Launch control center operations and infrastructure (10) Mission integration and planning (11) Mission training for both ground and flight crew personnel (12) Mission control center operations and infrastructure (13) Telemetry and command processing and archiving (14) Recovery operations for flight crews, flight hardware, and returned samples. This technology roadmap also identifies ground, launch, and mission technologies that will: (1) Dramatically transform future space operations, with significant improvement in life-cycle costs (2) Improve the quality of life on Earth, while exploring in co-existence with the environment (3) Increase reliability and mission availability using low/zero-maintenance materials and systems, comprehensive capabilities to ascertain and forecast system health/configuration, data integration, and the use of advanced/expert software systems (4) Enhance methods to assess safety and mission risk posture, which would allow for timely and better decision making. Several key technologies are identified, with a couple of slides devoted to one of them (corrosion detection and prevention). Development of these technologies can enhance life on Earth and have a major impact on how we access space, eventually making commercial space access routine and improving, for example, construction, manufacturing, and weather forecasting.

  18. Integrating medical imaging analyses through a high-throughput bundled resource imaging system

    NASA Astrophysics Data System (ADS)

    Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.

    2011-03-01

    Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data provenance, visualization, and clinical collaboration technologies. However, pursuing integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate, and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST), so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform, from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies the evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.
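
    Because JIST couples tools through a system shell, an analysis can be written in any language that can be launched as a process. A minimal sketch of that coupling pattern follows; "smooth_tool" and its JSON-on-stdin interface are hypothetical, invented to illustrate handing parameters to an external tool and collecting its output.

    ```python
    import json
    import subprocess

    params = {"input": "scan_001.nii", "kernel_mm": 3.0}
    proc = subprocess.run(
        ["smooth_tool", "--config", "-"],  # hypothetical external tool
        input=json.dumps(params), capture_output=True, text=True, check=True,
    )
    result = json.loads(proc.stdout)  # e.g., path of the generated image
    print(result)
    ```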

  19. An Introduction to Archival Automation: A RAMP Study with Guidelines.

    ERIC Educational Resources Information Center

    Cook, Michael

    Developed under a contract with the International Council on Archives, these guidelines are designed to emphasize the role of automation techniques in archives and records services, provide an indication of existing computer systems used in different archives services and of specific computer applications at various stages of archives…

  20. High performance compression of science data

    NASA Technical Reports Server (NTRS)

    Storer, James A.; Cohn, Martin

    1992-01-01

    In the future, NASA expects to gather over a terabyte per day of data requiring multiple levels of archival storage. Data compression will be a key component in systems that store these data (e.g., optical disk and tape) as well as in communications systems (both between space and Earth and between scientific locations on Earth). We propose to develop algorithms that can form the basis of software and hardware systems that compress a wide variety of scientific data under differing criteria for fidelity/bandwidth tradeoffs. The algorithmic approaches we consider are specifically targeted at parallel computation, where data rates of over 1 billion bits per second are achievable with current technology.

  1. High performance compression of science data

    NASA Technical Reports Server (NTRS)

    Storer, James A.; Cohn, Martin

    1993-01-01

    In the future, NASA expects to gather over a terabyte per day of data requiring multiple levels of archival storage. Data compression will be a key component in systems that store these data (e.g., optical disk and tape) as well as in communications systems (both between space and Earth and between scientific locations on Earth). We propose to develop algorithms that can form the basis of software and hardware systems that compress a wide variety of scientific data under differing criteria for fidelity/bandwidth tradeoffs. The algorithmic approaches we consider are specifically targeted at parallel computation, where data rates of over 1 billion bits per second are achievable with current technology.

  2. Grid Application Meta-Repository System: Repository Interconnectivity and Cross-domain Application Usage in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Tudose, Alexandru; Terstyansky, Gabor; Kacsuk, Peter; Winter, Stephen

    Grid Application Repositories vary greatly in terms of access interface, security system, implementation technology, communication protocols, and repository model. This diversity has become a significant limitation in terms of interoperability and inter-repository access. This paper presents the Grid Application Meta-Repository System (GAMRS) as a solution that offers better options for the management of Grid applications. GAMRS proposes a generic repository architecture that allows any Grid Application Repository (GAR) to be connected to the system independently of its underlying technology. It also presents applications in a uniform manner and makes applications from all connected repositories visible to web search engines, OGSI/WSRF Grid Services, and other OAI (Open Archives Initiative)-compliant repositories. GAMRS can also function as a repository in its own right and can store applications under a new repository model. With the help of this model, applications can be presented as embedded in virtual machines (VMs) and therefore can be run in their native environments and easily deployed on virtualized infrastructures, allowing interoperability with new-generation technologies such as cloud computing, application-on-demand, automatic service/application deployment, and automatic VM generation.
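
    Making holdings visible to OAI-compliant repositories means answering standard OAI-PMH harvesting requests. The sketch below harvests record titles from a hypothetical GAMRS endpoint; the base URL is invented, but the verb, parameters, and namespaces are the standard OAI-PMH ones.

    ```python
    import requests
    import xml.etree.ElementTree as ET

    BASE = "https://gamrs.example.org/oai"  # hypothetical endpoint
    resp = requests.get(BASE, params={"verb": "ListRecords",
                                      "metadataPrefix": "oai_dc"})
    root = ET.fromstring(resp.content)
    DC = {"dc": "http://purl.org/dc/elements/1.1/"}
    for rec in root.iter("{http://www.openarchives.org/OAI/2.0/}record"):
        title = rec.find(".//dc:title", DC)
        if title is not None:
            print(title.text)
    ```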

  3. Evaluating the Benefits of Providing Archived Online Lectures to In-Class Math Students

    ERIC Educational Resources Information Center

    Cascaval, Radu C.; Fogler, Kethera A.; Abrams, Gene D.; Durham, Robert L.

    2008-01-01

    The present study examines the impact of a novel online video lecture archiving system on in-class students enrolled in traditional math courses at a mid-sized, primarily undergraduate, university in the West. The archiving system allows in-class students web access to complete video recordings of the actual classroom lectures, and sometimes of…

  4. The Internet as a Medium of Training for Picture Archival and Communication Systems (PACS).

    ERIC Educational Resources Information Center

    Majid, Shaheen; Misra, Ramesh Kumar

    2002-01-01

    Explores the potential of Web-based training for PACS (Picture Archival and Communication Systems) used in radiology departments for the storage and archiving of patients' medical images. Reports results of studies in three hospitals in Malaysia, Singapore and the Philippines that showed that the Internet can be used effectively for training.…

  5. Archival Services and the Concept of the User: A RAMP Study.

    ERIC Educational Resources Information Center

    Taylor, Hugh A.

    Prepared under contract with the International Council on Archives (ICA), this study is intended to assist archivists and information specialists in creating, developing, and evaluating modern archival systems and services, particularly with reference to the concept and the role of the user in such systems and services. It ranges over a wide field…

  6. Data management in NOAA

    NASA Technical Reports Server (NTRS)

    Callicott, William M.

    1993-01-01

    The NOAA archives contain 150 terabytes of data in digital form, most of which are high-volume GOES satellite image data. There are 630 databases containing 2,350 environmental variables, plus 375 million film records and 90 million paper records in addition to the digital holdings. The current data accession rate is 10 percent per year, and the number of users is increasing at a 10 percent annual rate. NOAA publishes 5,000 publications and distributes over one million copies to almost 41,000 paying customers. Each year, over six million records are key-entered from manuscript documents, and about 13,000 computer tapes and 40,000 satellite hardcopy images are entered into the archive. Early digital data were stored on punched cards and open-reel computer tapes. In the late seventies, an advanced helical-scan technology (AMPEX TBM) was implemented. Now, punched cards have disappeared, the TBM system has been abandoned, most data stored on open-reel tapes have been migrated to 3480 cartridges, many specialized data sets have been distributed on CD-ROMs, special archives are being copied to 12-inch optical WORM disks, 5.25-inch magneto-optical disks are employed for workstation applications, and 8 mm EXABYTE tapes are planned for major data collection programs. The rapid expansion of new data sets, some of which constitute large volumes of data, coupled with the need for vastly improved access mechanisms, portability, and longevity, are factors that will influence NOAA's future systems approaches to data management.

  7. Digital time stamping system based on open source technologies.

    PubMed

    Miskinis, Rimantas; Smirnov, Dmitrij; Urba, Emilis; Burokas, Andrius; Malysko, Bogdan; Laud, Peeter; Zuliani, Francesco

    2010-03-01

    A digital time stamping system based on open source technologies (Linux Ubuntu, OpenTSA, OpenSSL, MySQL) is described in detail, including all important testing results. The system, called BALTICTIME, was developed under a project sponsored by the European Commission under the FP6 programme. It was designed to meet the requirements imposed on systems for legal and accountable time stamping and to be applicable to the hardware commonly used by national time metrology laboratories. The BALTICTIME system is intended for use by governmental and other institutions as well as by private individuals. Testing results demonstrate that the time stamps issued to the user by BALTICTIME and saved in BALTICTIME's archives (which implies that the time stamps are accountable) meet all the regulatory requirements. Moreover, BALTICTIME in its present implementation is able to issue more than 10 digital time stamps per second, and the system can be scaled up if needed. The test version of the BALTICTIME service is free and available at http://baltictime.pfi.lt:8080/btws/ and http://baltictime.lnmc.lv:8080/btws/.
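
    The OpenTSA/OpenSSL stack the system builds on implements RFC 3161 time stamping, for which OpenSSL's standard "ts" subcommands can generate queries. A minimal sketch of a client round trip follows; the TSA URL is hypothetical, while the openssl invocation and the application/timestamp-query media type are standard.

    ```python
    import subprocess
    import requests

    # Create a time-stamp query (a hash of the document, not the document).
    subprocess.run(["openssl", "ts", "-query", "-data", "contract.pdf",
                    "-sha256", "-cert", "-out", "contract.tsq"], check=True)

    with open("contract.tsq", "rb") as f:
        reply = requests.post(
            "http://tsa.example.lt:8080/btws/tsa",  # hypothetical endpoint
            data=f.read(),
            headers={"Content-Type": "application/timestamp-query"},
        )
    with open("contract.tsr", "wb") as f:
        f.write(reply.content)  # verify later with: openssl ts -verify
    ```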

  8. Determinant Factors in Applying Picture Archiving and Communication Systems (PACS) in Healthcare.

    PubMed

    Abdekhoda, Mohammadhiwa; Salih, Kawa Mirza

    2017-01-01

    Meaningful use of picture archiving and communication systems (PACS) can change the workflow for accessing digital images, lead to faster turnaround times, reduce tests and examinations, and increase patient throughput. This study was carried out to identify determinant factors that affect the adoption of PACS by physicians. This was a cross-sectional study in which 190 physicians working in a teaching hospital affiliated with Tehran University of Medical Sciences were randomly selected. Physicians' perceptions concerning the adoption of PACS were assessed with the conceptual path model of the Unified Theory of Acceptance and Use of Technology (UTAUT). Collected data were analyzed with regression analysis, and structural equation modeling was applied to test the final model that was developed. The results show that the UTAUT model can explain about 61 percent of the variance in the adoption of PACS (R2 = 0.61). The findings also showed that performance expectancy, effort expectancy, social influence, and behavioral intention have a direct and significant effect on the adoption of PACS. However, facilitating conditions were shown to have no significant effect on physicians' behavioral intentions. Implementation of new technology such as PACS in the healthcare sector is unavoidable. Our study clearly identified the significant and nonsignificant factors that may affect the adoption of PACS. This study also found that physicians' perception is a key factor in managing the implementation of PACS optimally, a fact that should be considered by healthcare managers and policy makers.
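
    The regression step of such a study can be illustrated with ordinary least squares on synthetic survey scores; the sketch below is not the authors' analysis, and the data and coefficients are invented, but the variable names mirror the UTAUT constructs.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 190  # same sample size as the study
    PE, EE, SI, BI = (rng.normal(size=n) for _ in range(4))
    acceptance = 0.4*PE + 0.3*EE + 0.2*SI + 0.3*BI + rng.normal(0, 0.5, n)

    # Fit acceptance ~ PE + EE + SI + BI and report R^2.
    X = np.column_stack([np.ones(n), PE, EE, SI, BI])
    beta, *_ = np.linalg.lstsq(X, acceptance, rcond=None)
    resid = acceptance - X @ beta
    r2 = 1 - (resid**2).sum() / ((acceptance - acceptance.mean())**2).sum()
    print("coefficients:", beta[1:].round(2), "R^2:", round(r2, 2))
    ```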

  9. Determinant Factors in Applying Picture Archiving and Communication Systems (PACS) in Healthcare

    PubMed Central

    Abdekhoda, Mohammadhiwa; Salih, Kawa Mirza

    2017-01-01

    Objectives Meaningful use of picture archiving and communication systems (PACS) can change the workflow for accessing digital images, lead to faster turnaround times, reduce tests and examinations, and increase patient throughput. This study was carried out to identify determinant factors that affect the adoption of PACS by physicians. Methods This was a cross-sectional study in which 190 physicians working in a teaching hospital affiliated with Tehran University of Medical Sciences were randomly selected. Physicians' perceptions concerning the adoption of PACS were assessed with the conceptual path model of the Unified Theory of Acceptance and Use of Technology (UTAUT). Collected data were analyzed with regression analysis, and structural equation modeling was applied to test the final model that was developed. Results The results show that the UTAUT model can explain about 61 percent of the variance in the adoption of PACS (R2 = 0.61). The findings also showed that performance expectancy, effort expectancy, social influence, and behavioral intention have a direct and significant effect on the adoption of PACS. However, facilitating conditions were shown to have no significant effect on physicians' behavioral intentions. Conclusions Implementation of new technology such as PACS in the healthcare sector is unavoidable. Our study clearly identified the significant and nonsignificant factors that may affect the adoption of PACS. This study also found that physicians' perception is a key factor in managing the implementation of PACS optimally, a fact that should be considered by healthcare managers and policy makers. PMID:28855856

  10. Optimizing digital 8mm drive performance

    NASA Technical Reports Server (NTRS)

    Schadegg, Gerry

    1993-01-01

    The experience of attaching over 350,000 digital 8mm drives to 85-plus system platforms has uncovered many factors that can reduce cartridge capacity or drive throughput, reduce reliability, affect cartridge archivability, and actually shorten drive life. Some are unique to an installation. Others result from how the system is set up to talk to the drive. Many stem from how applications use the drive, the workload that is present, the kind of media used, and, very importantly, the kind of cleaning program in place. Digital 8mm drives record data at densities that rival those of disk technology. Even with technology this advanced, they are extremely robust and, given proper usage, care, and media, should reward the user with a long, productive life. The 8mm drive will give its best performance using high-quality 'data grade' media. Even though it costs more, good 'data grade' media can sustain the reliability and rigorous needs of a data storage environment and, with proper care, give users an archival life of 30 years or more. Various factors, taken individually, may not necessarily produce performance or reliability problems; taken in combination, their effects can compound, resulting in rapid reductions in a drive's serviceable life, cartridge capacity, or performance. The key to managing media is determining the importance one places upon the recorded data and, subsequently, setting media usage guidelines that can deliver data reliability. Various options one can implement to optimize digital 8mm drive performance are explored.

  11. User Acceptance of Picture Archiving and Communication System in the Emergency Department

    PubMed Central

    Goodarzi, Hassan; Khatami, Seyed-Masoud; Javadzadeh, Hammidreza; Mahmoudi, Sadrollah; Khajehpour, Hojjatollah; Heidari, Soleiman; Khodaparast, Morteza; Ebrahimi, Ali; Rasouli, Hamidreza; Ghane, Mohammadreza; Faraji, Mehrdad; Hassanpour, Kasra

    2016-01-01

    Background Picture archiving and communication systems (PACS) allow medical images to be transmitted, stored, retrieved, and displayed in different locations of a hospital or health system. Using PACS in the emergency department will eventually result in improved efficiency and patient care. In spite of the abundant benefits of employing PACS, there are some challenges in implementing this technology, such as users' resistance to accepting it, which plays a critical role in PACS success. Objectives In this study, we assess and compare user acceptance of PACS in the emergency departments of three different hospitals and investigate the effect of socio-demographic factors on this acceptance. Materials and Methods A variant of the technology acceptance model (TAM) was used to measure the acceptance level of PACS in the emergency departments of three educational hospitals in Iran. A previously used questionnaire was validated and utilized to collect the study data. A stepwise multiple regression model was used to predict factors influencing the acceptance score as the dependent variable. Results The mean age of participants was 32.9 years (standard deviation [SD] = 6.08). Participants with a specialty degree had a higher acceptance score than the other three groups (mean ± SD = 4.17 ± 0.20). Age, gender, degree of PACS usage, and participant's occupation (profession) did not influence the acceptance score. In our multiple regression model, all three variables of perceived usefulness (PU), perceived ease of use (PEU), and the effect of PACS (change) had a significant effect in the prediction of acceptance. The most influential factor was change, with a beta of 0.22 (P value < 0.001). Conclusion PACS is highly accepted in all three emergency departments, especially among specialists. PU, PEU, and change are the factors influencing PACS acceptance. Our study can be used as evidence of PACS acceptance in emergency wards. PMID:27679692

  12. The Role of Advanced Information System Technology in Remote Sensing for NASA's Earth Science Enterprise in the 21st Century

    NASA Technical Reports Server (NTRS)

    Prescott, Glenn; Komar, George (Technical Monitor)

    2001-01-01

    Future NASA Earth observing satellites will carry high-precision instruments capable of producing large amounts of scientific data. The strategy will be to network these instrument-laden satellites into a web-like array of sensors to facilitate the collection, processing, transmission, storage, and distribution of data and data products - the essential elements of what we refer to as "Information Technology." Many of these information technologies will enable the satellite and ground information systems to function effectively in real time, providing scientists with the capability of customizing data collection activities on a satellite or group of satellites directly from the ground. In future systems, extremely large quantities of data collected by scientific instruments will require the fastest processors, the highest communication channel transfer rates, and the largest data storage capacity to ensure that data flow smoothly from the satellite-based instrument to the ground-based archive. Autonomous systems will control all essential processes and play a key role in coordinating the data flow through space-based communication networks. In this paper, we discuss those critical information technologies for Earth observing satellites that will support the next generation of space-based scientific measurements of planet Earth, and ensure that data and data products provided by these systems will be accessible to scientists and the user community in general.

  13. A flexible, open, decentralized system for digital pathology networks.

    PubMed

    Schuler, Robert; Smith, David E; Kumaraguruparan, Gowri; Chervenak, Ann; Lewis, Anne D; Hyde, Dallas M; Kesselman, Carl

    2012-01-01

    High-resolution digital imaging is enabling digital archiving and sharing of digitized microscopy slides and new methods for digital pathology. Collaborative research centers, outsourced medical services, and multi-site organizations stand to benefit from sharing pathology data in a digital pathology network. Yet significant technological challenges remain due to the large size and volume of digitized whole slide images. While information systems do exist for managing local pathology laboratories, they tend to be oriented toward narrow clinical use cases or offer closed ecosystems around proprietary formats. Few solutions exist for networking digital pathology operations. Here we present a system architecture and implementation of a digital pathology network and share results from a production system that federates major research centers.

  14. A Flexible, Open, Decentralized System for Digital Pathology Networks

    PubMed Central

    SMITH, David E.; KUMARAGURUPARAN, Gowri; CHERVENAK, Ann; LEWIS, Anne D.; HYDE, Dallas M.; KESSELMAN, Carl

    2014-01-01

    High-resolution digital imaging is enabling digital archiving and sharing of digitized microscopy slides and new methods for digital pathology. Collaborative research centers, outsourced medical services, and multi-site organizations stand to benefit from sharing pathology data in a digital pathology network. Yet significant technological challenges remain due to the large size and volume of digitized whole slide images. While information systems do exist for managing local pathology laboratories, they tend to be oriented toward narrow clinical use cases or offer closed ecosystems around proprietary formats. Few solutions exist for networking digital pathology operations. Here we present a system architecture and implementation of a digital pathology network and share results from a production system that federates major research centers. PMID:22941985

  15. ROSETTA: How to archive more than 10 years of mission

    NASA Astrophysics Data System (ADS)

    Barthelemy, Maud; Heather, D.; Grotheer, E.; Besse, S.; Andres, R.; Vallejo, F.; Barnes, T.; Kolokolova, L.; O'Rourke, L.; Fraga, D.; A'Hearn, M. F.; Martin, P.; Taylor, M. G. G. T.

    2018-01-01

    The Rosetta spacecraft was launched in 2004 and, after several planetary and two asteroid fly-bys, arrived at comet 67P/Churyumov-Gerasimenko in August 2014. After escorting the comet for two years and executing its scientific observations, the mission ended on 30 September 2016 with a touchdown on the comet surface. This paper describes how the Planetary Science Archive (PSA) and the Planetary Data System - Small Bodies Node (PDS-SBN) worked with the Rosetta instrument teams to prepare the science data collected over the course of the Rosetta mission for inclusion in the science archive. As Rosetta is an international mission conducted in collaboration between ESA and NASA, all science data from the mission are fully archived within both the PSA and the PDS. The Rosetta archiving process, supporting tools, and archiving systems, and their evolution throughout the mission, are described, along with a discussion of a number of the challenges faced during the Rosetta implementation. The paper then presents the current status of the archive for each of the science instruments, before looking at the improvements planned both for the archive itself and for the Rosetta data content. Finally, the lessons learned from the first 13 years of archiving on Rosetta are discussed, with the aim of helping future missions plan and implement their science archives.

  16. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. In 2002, the Consultative Committee for Space Data Systems (CCSDS) released the first version of its Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and on developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc., for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is its interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  17. Issues and recommendations associated with distributed computation and data management systems for the space sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The primary purpose of this report is to explore management approaches and technology developments for computation and data management systems designed to meet future needs in the space sciences. The report builds on work presented in previous solar-terrestrial and planetary reports, broadening the outlook to all of the space sciences and considering policy aspects related to coordination between data centers, missions, and ongoing research activities, because it is perceived that the rapid growth of data and the wide geographic distribution of relevant facilities will present especially troublesome problems for data archiving, distribution, and analysis.

  18. Reference Model for an Open Archival Information System

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinitely long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context, and it should promote greater vendor awareness of, and support for, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification of this document may occur. This report is therefore subject to CCSDS document management and change control procedures.

  19. Contents of the JPL Distributed Active Archive Center (DAAC) archive, version 2-91

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1991-01-01

    The Distributed Active Archive Center (DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea surface height, surface wind vector, sea surface temperature, atmospheric liquid water, and surface pigment concentration. The Jet Propulsion Laboratory DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for the Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  20. Archives of Transformation: A Case Study of the International Women's Network against Militarism's Archival System

    ERIC Educational Resources Information Center

    Cachola, Ellen-Rae Cabebe

    2014-01-01

    This dissertation describes the International Women's Network Against Militarism's (IWNAM) political epistemology of security from an archival perspective, and how they create community archives to evidence this epistemology. This research examines records created by Women for Genuine Security (WGS) and Women's Voices Women Speak (WVWS), U.S. and…

  1. The Role of Archives and Records Management in National Information Systems: A RAMP Study.

    ERIC Educational Resources Information Center

    Rhoads, James B.

    Produced as part of the United Nations Educational, Scientific, and Cultural Organization (UNESCO) Records and Archives Management Programme (RAMP), this publication provides information about the essential character and value of archives and about the procedures and programs that should govern the management of both archives and current records,…

  2. Development and Validation of an Ultradeep Next-Generation Sequencing Assay for Testing of Plasma Cell-Free DNA from Patients with Advanced Cancer.

    PubMed

    Janku, Filip; Zhang, Shile; Waters, Jill; Liu, Li; Huang, Helen J; Subbiah, Vivek; Hong, David S; Karp, Daniel D; Fu, Siqing; Cai, Xuyu; Ramzanali, Nishma M; Madwani, Kiran; Cabrilo, Goran; Andrews, Debra L; Zhao, Yue; Javle, Milind; Kopetz, E Scott; Luthra, Rajyalakshmi; Kim, Hyunsung J; Gnerre, Sante; Satya, Ravi Vijaya; Chuang, Han-Yu; Kruglyak, Kristina M; Toung, Jonathan; Zhao, Chen; Shen, Richard; Heymach, John V; Meric-Bernstam, Funda; Mills, Gordon B; Fan, Jian-Bing; Salathia, Neeraj S

    2017-09-15

    Purpose: Tumor-derived cell-free DNA (cfDNA) in plasma can be used for molecular testing and provides an attractive alternative to tumor tissue. Commonly used PCR-based technologies can test for only a limited number of alterations at a time. Therefore, novel ultrasensitive technologies capable of testing for a broad spectrum of molecular alterations are needed to further personalize cancer therapy. Experimental Design: We developed a highly sensitive ultradeep next-generation sequencing (NGS) assay using reagents from TruSeq Nano library preparation and Nextera Rapid Capture target enrichment kits to generate plasma cfDNA sequencing libraries for mutational analysis in 61 cancer-related genes using common bioinformatics tools. The results were retrospectively compared with molecular testing of archival primary or metastatic tumor tissue obtained at different points of clinical care. Results: In a study of 55 patients with advanced cancer, the ultradeep NGS assay detected 82% (complete detection) to 87% (complete and partial detection) of the aberrations identified in discordantly collected corresponding archival tumor tissue. Patients with a low variant allele frequency (VAF) of mutant cfDNA survived longer than those with a high VAF (P = 0.018). In patients undergoing systemic therapy, radiological response was positively associated with changes in cfDNA VAF (P = 0.02), and compared with unchanged/increased mutant cfDNA VAF, decreased cfDNA VAF was associated with longer time to treatment failure (TTF; P = 0.03). Conclusions: The ultradeep NGS assay has good sensitivity compared with conventional clinical mutation testing of archival specimens. A high VAF in mutant cfDNA corresponded with shorter survival, and changes in VAF of mutated cfDNA were associated with TTF. Clin Cancer Res; 23(18); 5648-56. ©2017 American Association for Cancer Research.

  3. An automated, web-enabled and searchable database system for archiving electrogram and related data from implantable cardioverter defibrillators.

    PubMed

    Zong, W; Wang, P; Leung, B; Moody, G B; Mark, R G

    2002-01-01

    The advent of implantable cardioverter defibrillators (ICDs) has resulted in significant reductions in mortality in patients at high risk for sudden cardiac death, and extensive related basic research and clinical investigation continue. ICDs typically record intracardiac electrograms and inter-beat intervals, along with device settings, during episodes of device-delivered therapy. Researchers wishing to study these data further have until now been limited to viewing paper plots. In support of multi-center clinical studies of patients with ICDs, we have developed a web-based, searchable ICD data archiving system that allows users to upload ICD data from diskettes through a web browser to a server, where the data are automatically processed and archived. Users can view and download the archived ICD data directly via the web. The entire system is built from open source software. At present, more than 500 patient ICD data sets have been uploaded to and archived in the system. This project will be of value not only to those who wish to conduct research using ICD data, but also to clinicians who need to archive and review ICD data collected from their patients.
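
    The browser-upload ingest path described here is a common pattern; the sketch below shows a minimal version of it using Flask, which is an assumption — the paper says only that the system is built from open source software, without naming components.

    ```python
    import os
    from flask import Flask, request

    app = Flask(__name__)
    ARCHIVE_DIR = "/srv/icd-archive"  # hypothetical location

    @app.route("/upload", methods=["POST"])
    def upload():
        f = request.files["icd_data"]
        f.save(os.path.join(ARCHIVE_DIR, f.filename))
        # A real system would parse the electrograms and index their
        # metadata in a database so uploads become searchable via the web.
        return {"archived": f.filename}, 201

    if __name__ == "__main__":
        app.run()
    ```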

  4. A Waveform Archiving System for the GE Solar 8000i Bedside Monitor.

    PubMed

    Fanelli, Andrea; Jaishankar, Rohan; Filippidis, Aristotelis; Holsapple, James; Heldt, Thomas

    2018-01-01

    Our objective was to develop, deploy, and test a data-acquisition system for the reliable and robust archiving of high-resolution physiological waveform data from a variety of bedside monitoring devices, including the GE Solar 8000i patient monitor, and for the logging of ancillary clinical and demographic information. The data-acquisition system consists of a computer-based archiving unit and a GE Tram Rac 4A that connects to the GE Solar 8000i monitor. Standard physiological front-end sensors connect directly to the Tram Rac, which serves as a port replicator for the GE monitor and provides access to these waveform signals through an analog data interface. Together with the GE monitoring data streams, we simultaneously collect the cerebral blood flow velocity envelope from a transcranial Doppler ultrasound system and a non-invasive arterial blood pressure waveform along a common time axis. All waveform signals are digitized and archived through a LabView-controlled interface that also allows for the logging of relevant meta-data such as clinical and patient demographic information. The acquisition system was certified for hospital use by the clinical engineering team at Boston Medical Center, Boston, MA, USA. Over a 12-month period, we collected 57 datasets from 11 neuro-ICU patients. The system provided reliable and failure-free waveform archiving. We measured an average temporal drift between waveforms from different monitoring devices of 1 ms every 66 min of recorded data. The waveform acquisition system allows for robust real-time data acquisition, processing, and archiving of waveforms. The temporal drift between waveforms archived from different devices is entirely negligible, even for long-term recording.

  5. JavaScript Access to DICOM Network and Objects in Web Browser.

    PubMed

    Drnasin, Ivan; Grgić, Mislav; Gogić, Goran

    2017-10-01

    The digital imaging and communications in medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media initiated demand for non-DICOM access to PACS systems. The ever-increasing utilization of web browsers, laptops, and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies, which the DICOM standards body subsequently accepted as alternative means of access. This paper provides an overview of the current state of development of web access technology for DICOM repositories. It presents a different approach: using the HTML5 features of web browsers, through the JavaScript language and the WebSocket protocol, to enable real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy, and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
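
    A DICOM-to-WebSocket proxy lets any WebSocket client relay DICOM messages to an archive. The paper's client side is JavaScript in the browser; the sketch below shows the same round trip from Python for illustration, with a hypothetical proxy URL and placeholder payload bytes.

    ```python
    import asyncio
    import websockets

    async def relay_once():
        uri = "ws://pacs-proxy.example.org:8080/dicom"  # hypothetical proxy
        async with websockets.connect(uri) as ws:
            # Send a raw DICOM PDU as binary; the proxy forwards it to the
            # archive over plain TCP and returns the archive's response.
            await ws.send(b"...raw DICOM PDU bytes...")
            reply = await ws.recv()
            print(len(reply), "bytes received from the archive")

    asyncio.run(relay_once())
    ```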

  6. A File Archival System

    NASA Technical Reports Server (NTRS)

    Fanselow, J. L.; Vavrus, J. L.

    1984-01-01

    ARCH, a file archival system for the DEC VAX, provides for easy offline storage and retrieval of arbitrary files on a DEC VAX system. The system is designed to eliminate situations that tie up disk space and lead to confusion when different programmers develop different versions of the same programs and associated files.

  7. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network (SCSN) and, since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive include earthquake locations, magnitudes, moment-tensor solutions, and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings from 155 broadband stations (20 samples per second or less) and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface, which enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw-counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared, and new data sets will be available for download in near real time following major events. The parametric data from 1981 to the present have been loaded into the Oracle 9.2.0.1 database system, and the waveforms for that time period have been converted to miniSEED format and are accessible through the STP interface. The DISC optical-disk system (the "jukebox") that currently serves as the mass storage for the SCEDC is being replaced with a series of inexpensive, high-capacity (1.6 TB) magnetic-disk RAIDs. These systems are built with PC-technology components, using 16 120-GB IDE disks, hot-swappable disk trays, two RAID controllers, dual redundant power supplies, and a Linux operating system. The system is configured over a private gigabit network that connects to the two Data Center servers and spans the Seismological Lab and the USGS. To ensure data integrity, each RAID disk system constantly checks itself against its twin and verifies file integrity using 128-bit MD5 file checksums that are stored separately from the system. The final level of data protection is a Sony AIT-3 tape backup of the files. The primary advantage of the magnetic-disk approach is faster data access, because magnetic disk drives have almost no access latency; the SCEDC can thus provide better on-demand, interactive delivery of the seismograms in the archive.
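
    The integrity check described, recomputing 128-bit MD5 digests and comparing them against checksums stored apart from the data, looks roughly like the following sketch; the manifest path and its one-digest-per-line format are assumptions.

    ```python
    import hashlib
    from pathlib import Path

    def md5sum(path, chunk=1 << 20):
        # Stream the file so multi-gigabyte seismogram files fit in memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    # Assumed manifest format: "<hex digest> <relative path>" per line,
    # kept on storage separate from the data it describes.
    for line in Path("checksums.md5").read_text().splitlines():
        expected, name = line.split(maxsplit=1)
        status = "OK" if md5sum(name) == expected else "CORRUPT"
        print(name, status)
    ```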

  8. Technology insertion of a COTS RAID server as an image buffer in the image chain of the Defense Mapping Agency's Digital Production System

    NASA Astrophysics Data System (ADS)

    Mehring, James W.; Thomas, Scott D.

    1995-11-01

    The Data Services Segment of the Defense Mapping Agency's Digital Production System provides a digital archive of imagery source data for use by DMA's cartographic users. This system was developed in the mid-1980s and is currently undergoing modernization. This paper addresses the modernization of the image buffer function, which was performed by custom hardware in the baseline system and is being replaced by a RAID server based on commercial off-the-shelf (COTS) hardware. The paper briefly describes the baseline DMA image system and the modernization program currently under way. Throughput benchmark measurements were made to support design configuration decisions for a COTS RAID server performing as the system image buffer. The test program began with performance measurements of RAID read and write operations between the RAID arrays and the server CPU for RAID levels 0, 5, and 0+1. Interface throughput measurements were made for the HiPPI interface between the RAID server and the image archive and processing system, as well as for the client-side interface between a custom interface board bridging the internal bus of the RAID server and the Input-Output Processor (IOP) external wideband network currently in place in the DMA system to service client workstations. End-to-end measurements were taken from the HiPPI interface through the RAID write and read operations to the IOP output interface.

  9. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1992-01-01

    Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information. In the search for extraterrestrial intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. Topics covered include tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; network upgrade and sustaining; network operations and operations support; and TDA program management and analysis.

  10. Getting the Bigger Picture With Digital Surveillance

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Through a Space Act Agreement, Diebold, Inc., acquired the exclusive rights to Glenn Research Center's patented video observation technology, originally designed to accelerate video image analysis for various ongoing and future space applications. Diebold implemented the technology in its AccuTrack digital color video recorder, a state-of-the-art surveillance product that uses motion detection for around-the-clock monitoring. AccuTrack captures digitally signed images and transaction data in real time. This process replaces the onerous tasks involved in operating a VCR-based surveillance system and eliminates the need for central viewing and tape archiving locations altogether. AccuTrack can monitor an entire bank facility, including four automated teller machines, multiple teller lines, and new account areas, all from one central location.

  11. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that depend directly on their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists can research are often limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  12. Operating a petabyte class archive at ESO

    NASA Astrophysics Data System (ADS)

    Suchar, Dieter; Lockhart, John S.; Burrows, Andrew

    2008-07-01

    The challenges of setting up and operating a Petabyte Class Archive will be described in terms of computer systems within a complex Data Centre environment. The computer systems, including the ESO Primary and Secondary Archive and the associated computational environments such as relational databases will be explained. This encompasses the entire system project cycle, including the technical specifications, procurement process, equipment installation and all further operational phases. The ESO Data Centre construction and the complexity of managing the environment will be presented. Many factors had to be considered during the construction phase, such as power consumption, targeted cooling and the accumulated load on the building structure to enable the smooth running of a Petabyte class Archive.

  13. [A new concept for integration of image databanks into a comprehensive patient documentation].

    PubMed

    Schöll, E; Holm, J; Eggli, S

    2001-05-01

    Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.

  14. Australian DefenceScience. Volume 15, Number 2, Winter

    DTIC Science & Technology

    2007-01-01

sources, including the original builders, Vickers Shipyards, the Royal Australian Navy Archives, the Australian National Archives, the British...South Australia. The trial team also included personnel from the Army 3rd/9th Light Horse as well as the Royal Melbourne Institute of Technology and Vision...the Austeyr and AK47 weapons. The two torsos presented for testing were made of a 20% strength solution of gelatine jelly, which, at a temperature of

  15. Improvements to a Major Digital Archive of Seismic Waveforms from Nuclear Explosions: Borovoye Seismogram Archive

    DTIC Science & Technology

    2008-09-30

coda) meet expectations. We are also interpreting absolute amplitudes, for those underground nuclear explosions at the Semipalatinsk Test Site (STS...waves, coda) meet expectations. We are also interpreting absolute amplitudes, for those underground nuclear explosions at the Semipalatinsk Test Site...Monitoring Research Review: Ground-Based Nuclear Explosion Monitoring Technologies...Balapan Subregion, Semipalatinsk Test Site

  16. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo

    2014-05-01

The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real time; longer jobs will be queued. When jobs are queued, the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown, and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
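
    The CDO operators that such a WPS backend wraps can equally be run directly. Below is a minimal sketch of invoking one standard CDO operator (timmean) from Python; the file names are placeholders, and the actual operator set exposed by the CEDA WPS is not specified here.

```python
import subprocess

def cdo_time_mean(infile: str, outfile: str) -> None:
    """Reduce a NetCDF time series to its time mean with the CDO
    'timmean' operator; assumes the cdo binary is on PATH."""
    subprocess.run(["cdo", "timmean", infile, outfile], check=True)

# Hypothetical CMIP5-style file names, for illustration only.
cdo_time_mean("tas_Amon_example_historical_r1i1p1_185001-200512.nc",
              "tas_Amon_example_historical_timmean.nc")
```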

  17. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, M. N.; Stephens, A.; da Costa, E. D.

    2013-12-01

The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real time; longer jobs will be queued. When jobs are queued, the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown, and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.

  18. Trends in computer hardware and software.

    PubMed

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  19. Application-driven strategies for efficient transfer of medical images over very high speed networks

    NASA Astrophysics Data System (ADS)

    Alsafadi, Yasser H.; McNeill, Kevin M.; Martinez, Ralph

    1993-09-01

The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) formed the ACR-NEMA committee in 1982 to develop a standard enabling equipment from different vendors to communicate and participate in a picture archiving and communication system (PACS). The standard focused mostly on interconnectivity issues and the communication needs of PACS. It was patterned after the International Standards Organization Open Systems Interconnection (ISO/OSI) reference model. Three versions of the standard appeared, evolving from a simple point-to-point specification of the connection between two medical devices to a complex standard for a network environment. However, rapid changes in network software and hardware technologies make it difficult for the standard to keep pace. This paper compares two versions of the ACR-NEMA standard and then describes a system that is used at the University of Arizona Intensive Care Unit. In this system, the application specifies the interface to network services and the grade of service required. These provisions are intended to make the application independent of evolving network technology and to support true open systems.

  20. [Application of electronic fence technology based on GIS in Oncomelania hupensis snail monitoring].

    PubMed

    Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang

    2017-07-27

To study the application of Geographic Information System (GIS) electronic fence technology in Oncomelania hupensis snail monitoring, electronic fences were set around historical and existing snail environments on an electronic map, information about snail monitoring and control was linked to each fence, and a snail monitoring information system was established on this basis. The monitoring information was entered through computers and smartphones. Electronic fences around historical and existing snail environments were set on the electronic map (Baidu Map), and a snail monitoring information system and a smartphone app were established. The monitoring information was entered and uploaded in real time, and the snail monitoring information was displayed in real time on the Baidu map. By using electronic fence technology based on GIS, a unique "environment electronic archive" for each monitored snail environment can be established on the electronic map, and real-time, dynamic monitoring and visual management can be realized.
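
    The electronic fence reduces to a geofencing test: each monitoring record is checked against the polygon drawn around a snail habitat. Below is a minimal sketch using the shapely library; the coordinates are invented for illustration.

```python
from shapely.geometry import Point, Polygon

# Hypothetical fence around one historical snail environment;
# the longitude/latitude pairs are invented for illustration.
fence = Polygon([
    (120.301, 31.502),
    (120.305, 31.502),
    (120.305, 31.506),
    (120.301, 31.506),
])

def inside_fence(lon: float, lat: float) -> bool:
    """Return True if a monitoring record falls inside the fenced habitat."""
    return fence.contains(Point(lon, lat))

print(inside_fence(120.303, 31.504))  # True: link record to this habitat
print(inside_fence(120.320, 31.504))  # False: outside the fence
```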

  1. Visual Systems for Interactive Exploration and Mining of Large-Scale Neuroimaging Data Archives

    PubMed Central

    Bowman, Ian; Joshi, Shantanu H.; Van Horn, John D.

    2012-01-01

While technological advancements in neuroimaging scanner engineering have improved the efficiency of data acquisition, electronic data capture methods will likewise significantly expedite the populating of large-scale neuroimaging databases. As these archives grow in size, a particular challenge lies in examining and interacting with the information they contain through compelling, user-driven approaches for data exploration and mining. In this article, we introduce the informatics visualization for neuroimaging (INVIZIAN) framework for the graphical rendering of, and dynamic interaction with, the contents of large-scale neuroimaging data sets. We describe the rationale behind INVIZIAN, detail its development, and demonstrate its usage in examining a collection of over 900 T1-anatomical magnetic resonance imaging (MRI) image volumes from across a diverse set of clinical neuroimaging studies drawn from a leading neuroimaging database. Using a collection of cortical surface metrics and means for examining brain similarity, INVIZIAN graphically displays brain surfaces as points in a coordinate space and enables classification of clusters of neuroanatomically similar MRI images and data mining. As an initial step toward addressing the need for such user-friendly tools, INVIZIAN provides a unique means to interact with large quantities of electronic brain imaging archives in ways suitable for hypothesis generation and data mining. PMID:22536181

  2. On detecting variables using ROTSE-IIId archival data

    NASA Astrophysics Data System (ADS)

    Yesilyaprak, C.; Yerli, S. K.; Aksaker, N.; Gucsav, B. B.; Kiziloglu, U.; Dikicioglu, E.; Coker, D.; Aydin, E.; Ozeren, F. F.

ROTSE (Robotic Optical Transient Search Experiment) telescopes can also be used for variable star detection. As explained in the system description (Akerlof et al. 2003, PASP, 115, 132), they have good sky coverage and allow fast data acquisition. The optical magnitude range extends from 7^m to 19^m. Thirty percent of the telescope time of the north-eastern leg of the network, namely ROTSE-IIId (located at TUBITAK National Observatory, Bakirlitepe, Turkey, http://www.tug.tubitak.gov.tr/), is owned by Turkish researchers. Since its first light (May 2004), a considerably large amount of data (around 2 TB) has been collected during the Turkish time, and roughly one million objects have been identified from the reduced data. A robust pipeline has been constructed to discover new variables, transients, and planetary nebulae from this archival data. In the detection process, different statistical methods were applied to the archive. We have detected thousands of variable stars by applying roughly four different tests to the light curve of each star. In this work a summary of the pipeline is presented. It uses a high performance computing (HPC) algorithm which performs inhomogeneous ensemble photometry of the data on a 36-core cluster. This study is supported by TUBITAK (Scientific and Technological Research Council of Turkey) under grant number TBAG-108T475.
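
    The abstract does not name the four statistical tests, but a reduced chi-squared test against a constant-brightness model is a common first cut for variability detection. Below is a minimal numpy sketch under that assumption, run on a synthetic light curve.

```python
import numpy as np

def reduced_chi2(mag: np.ndarray, err: np.ndarray) -> float:
    """Reduced chi-squared of a light curve against a constant model;
    values well above 1 suggest intrinsic variability."""
    weights = 1.0 / err**2
    mean_mag = np.average(mag, weights=weights)  # error-weighted mean
    chi2 = np.sum(((mag - mean_mag) / err) ** 2)
    return chi2 / (mag.size - 1)                 # one fitted parameter

# Synthetic example: a 0.3 mag sinusoidal variable with 0.05 mag noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 30.0, 200)
mag = 15.0 + 0.3 * np.sin(2 * np.pi * t / 3.7) + rng.normal(0, 0.05, t.size)
err = np.full(t.size, 0.05)
print(f"reduced chi^2 = {reduced_chi2(mag, err):.1f}")  # far above 1
```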

  3. Proceedings from the Texas ITS data uses and archiving workshop

    DOT National Transportation Integrated Search

    1999-03-01

    The "Texas ITS Data Uses and Archiving Workshop" was held November 10, 1998, in Austin, Texas, to : discuss issues and opportunities related to archiving data from intelligent transportation systems (ITS). The : workshop participants represented seve...

  4. Tropical Storm Hermine in the Gulf of Mexico

    NASA Image and Video Library

    2017-12-08

NASA image acquired Sept 6, 2010 at 16:45 UTC. Tropical Storm Hermine (10L) in the Gulf of Mexico. Satellite: Terra. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team. To learn more go to: www.nasa.gov/mission_pages/hurricanes/archives/2010/h2010... NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  5. The MSG Central Facility - A Mission Control System for Windows NT

    NASA Astrophysics Data System (ADS)

    Thompson, R.

The MSG Central Facility, being developed by Science Systems for EUMETSAT, represents the first of a new generation of satellite mission control systems based on the Windows NT operating system. The system makes use of a range of new technologies to provide an integrated environment for the planning, scheduling, control and monitoring of the entire Meteosat Second Generation mission. It supports packetised TM/TC and uses Science Systems' Space UNiT product to provide automated operations support at both Schedule (Timeline) and Procedure levels. Flexible access to historical data is provided through an operations archive based on ORACLE Enterprise Server, hosted on a large RAID array and an off-line tape jukebox. Event-driven real-time data distribution is based on the CORBA standard. Operations preparation and configuration control tools form a fully integrated element of the system.

  6. An Update on the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick; Dube, Maurice P.; Pollack, N.

    2012-01-01

The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. The CDDIS data system and its archive have become increasingly important to many national and international science communities, in particular several of the operational services within the International Association of Geodesy (IAG) and its project the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation Service (IERS). The CDDIS has recently expanded its archive to support the IGS Multi-GNSS Experiment (MGEX). The archive now contains daily and hourly 30-second and subhourly 1-second data from an additional 35+ stations in RINEX V3 format. The CDDIS will soon install an Ntrip broadcast relay to support the activities of the IGS Real-Time Pilot Project (RTPP) and the future Real-Time IGS Service. The CDDIS has also developed a new web-based application to aid users in data discovery, both within the current community and beyond. To enable this data discovery application, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. This poster will include background information about the system and its user communities, archive contents and updates, enhancements for data discovery, new system architecture, and future plans.

  7. Implementation of system intelligence in a 3-tier telemedicine/PACS hierarchical storage management system

    NASA Astrophysics Data System (ADS)

    Chao, Woodrew; Ho, Bruce K. T.; Chao, John T.; Sadri, Reza M.; Huang, Lu J.; Taira, Ricky K.

    1995-05-01

Our telemedicine/PACS archive system is based on a three-tier distributed hierarchical architecture, including magnetic disk farms, an optical jukebox, and tape jukebox subsystems. The hierarchical storage management (HSM) architecture, built around a low-cost, high-performance platform [personal computers (PCs) and Microsoft Windows NT], presents a very scalable and distributed solution ideal for meeting the needs of client/server environments such as telemedicine, teleradiology, and PACS. These image-based systems typically require storage capacities mirroring those of film-based technology (multi-terabyte, with 10+ years of storage) and patient data retrieval times at near on-line performance as demanded by radiologists. With the scalable architecture, storage requirements can be easily configured to meet the needs of the small clinic (multi-gigabyte) or those of a major hospital (multi-terabyte). The patient data retrieval performance requirement was achieved by employing system intelligence to manage migration and caching of archived data. Relevant information from HIS/RIS triggers prefetching of data whenever possible, based on simple rules. System intelligence embedded in the migration manager allows the clustering of patient data onto a single tape during data migration from optical to tape media. Clustering of patient data on the same tape eliminates multiple tape loads and the associated seek time during patient data retrieval. Optimal tape performance can then be achieved by utilizing the tape drives' high-performance data streaming capabilities, thereby reducing the data retrieval delays typical of streaming tape devices.
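
    The tape-clustering rule described above (keep all of a patient's studies on one cassette during optical-to-tape migration) can be sketched as a grouping step in the migration manager. The tape capacity and study records below are illustrative assumptions, not values from the deployed system.

```python
from collections import defaultdict

TAPE_CAPACITY = 40 * 10**9  # bytes per cassette (illustrative)

def plan_migration(studies):
    """Pack studies onto tapes, keeping each patient's studies together.

    `studies` is an iterable of (patient_id, study_id, size_bytes).
    Clustering a patient's studies on one cassette avoids multiple tape
    loads when the full record is retrieved later; groups larger than a
    single tape are ignored here for simplicity.
    """
    by_patient = defaultdict(list)
    for patient_id, study_id, size in studies:
        by_patient[patient_id].append((study_id, size))

    tapes, current, used = [], [], 0
    for patient_id, group in by_patient.items():
        group_size = sum(size for _, size in group)
        if used + group_size > TAPE_CAPACITY and current:
            tapes.append(current)   # close the cassette, start a new one
            current, used = [], 0
        current.extend((patient_id, study_id) for study_id, _ in group)
        used += group_size
    if current:
        tapes.append(current)
    return tapes

# Two patients; all of patient "P1"'s studies land on the same cassette.
print(plan_migration([("P1", "CT-1", 25 * 10**9), ("P2", "MR-1", 20 * 10**9),
                      ("P1", "CT-2", 10 * 10**9)]))
```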

  8. A Testbed Demonstration of an Intelligent Archive in a Knowledge Building System

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Morse, Steve; Yang, Wenli; Bonnlander, Brian; McConaughy, Gail; Di, Liping; Danks, David

    2005-01-01

The last decade's influx of raw data and derived geophysical parameters from several Earth observing satellites to NASA data centers has created a data-rich environment for Earth science research and applications. While advances in hardware and information management have made it possible to archive petabytes of data and distribute terabytes of data daily to a broad community of users, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications, in order to realize the full potential of these valuable datasets. In examining what is needed to enable this progress in the data provider environment that exists today and is expected to evolve in the next several years, we arrived at the concept of an Intelligent Archive in the context of a Knowledge Building System (IA/KBS). Our prior work and associated papers investigated usage scenarios, required capabilities, system architecture, data volume issues, and supporting technologies. We identified six key capabilities of an IA/KBS: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting. Among these capabilities, large-scale data mining is perceived by many in the community to be an area of technical risk. One of the main reasons for this is that standard data mining research and algorithms operate on datasets that are several orders of magnitude smaller than the actual sizes of datasets maintained by realistic Earth science data archives. Therefore, we defined a testbed activity to implement a large-scale data mining algorithm in a pseudo-operational scale environment and to examine any issues involved. The application chosen for the data mining algorithm is wildfire prediction over the continental U.S. This paper reports a number of observations based on our experience with this testbed. While proof-of-concept for data mining scalability and utility has been a major goal for the research reported here, it was not the only one. The other five capabilities of an IA/KBS named above have been considered as well, and an assessment of the implications of our experience for these other areas will also be presented. The lessons learned through the testbed effort and presented in this paper will benefit technologists, scientists, and system operators as they consider introducing IA/KBS capabilities into production systems.

  9. ASF archive issues: Current status, past history, and questions for the future

    NASA Technical Reports Server (NTRS)

    Goula, Crystal A.; Wales, Carl

    1994-01-01

The Alaska SAR Facility (ASF) collects, processes, archives, and distributes data from synthetic aperture radar (SAR) satellites in support of scientific research. ASF has been in operation since 1991 and presently has an archive of over 100 terabytes of data. ASF is performing an analysis of its magnetic tape storage system to ensure long-term preservation of this archive. Future satellite missions could double or triple the amount of data that ASF acquires. ASF is examining the current data systems and the high-volume storage, and exploring future concerns and solutions.

  10. LVFS: A Big Data File Storage Bridge for the HPC Community

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.

    2015-12-01

Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and to accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture needed for the future HPC community. First, it allows for the seamless integration of new and emerging hardware technologies which are significantly different from existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts into GeoTIFF, for visualization, the HDF data produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite to derive CO2 surface fluxes.
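
    The LVFS plugin API itself is not published in this abstract; a plugin architecture of this kind typically reduces to a small abstract backend interface plus a registry, as in the hypothetical sketch below.

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Hypothetical backend interface: each storage technology (local
    disk, remote archive, Kinetic drive, ...) implements these calls."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

_BACKENDS: dict[str, type] = {}

def register(scheme: str):
    """Class decorator registering a backend under a URL-like scheme."""
    def wrap(cls):
        _BACKENDS[scheme] = cls
        return cls
    return wrap

@register("local")
class LocalDisk(StorageBackend):
    def read(self, key: str) -> bytes:
        with open(key, "rb") as f:
            return f.read()

    def write(self, key: str, data: bytes) -> None:
        with open(key, "wb") as f:
            f.write(data)

def open_backend(scheme: str) -> StorageBackend:
    # New storage technologies plug in by registering a new scheme;
    # callers and existing workflows are unchanged.
    return _BACKENDS[scheme]()
```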

  11. Proceedings from the Texas ITS data uses and archiving workshop : draft

    DOT National Transportation Integrated Search

    1999-03-01

    The "Texas ITS Data Uses and Archiving Workshop" was held November 10, 1998, in Austin, Texas, to discuss issues and opportunities related to archiving data from intelligent transportation systems (ITS). The workshop participants represented several ...

  12. Use of film digitizers to assist radiology image management

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Frost, Meryll M.; Staab, Edward V.

    1996-05-01

The purpose of this development effort was to evaluate the possibility of using digital technologies to solve image management problems in the Department of Radiology at the University of Florida. The three problem areas investigated were local interpretation of images produced in remote locations, distribution of images to areas outside of radiology, and film handling. In all cases the use of a laser film digitizer interfaced to an existing Picture Archiving and Communication System (PACS) was investigated as a solution to the problem. In each case the volume of studies involved was evaluated to estimate the impact of the solution on the network, archive, and workstations. Communications were stressed in the analysis of the needs for all image transmission. The operational aspects of the solution were examined to determine the needs for training, service, and maintenance. The remote sites requiring local interpretation were a rural hospital needing coverage for after-hours studies, the University of Florida student infirmary, and the emergency room. Distribution of images to the intensive care units was studied to improve image access and patient care. Handling of films originating from remote sites and of those requiring urgent reporting was evaluated to improve management functions. The results of our analysis and the decisions that were made based on the analysis are described below. In the cases where systems were installed, a description of the system and its integration into the PACS is included. For all three problem areas, although we could move images via a digitizer to the archive and a workstation, there was no way to inform the radiologist that a study needed attention. In the case of outside films, the patient did not always have a medical record number that matched one in our Radiology Information System (RIS). In order to incorporate all studies for a patient, we needed common locations for orders, reports, and images. RIS orders were generated for each outside study to be interpreted, and a medical record number was assigned if none existed. All digitized outside films were archived in the PACS archive for later review or comparison use. The request generated by the RIS for a diagnostic interpretation was placed at the PACS workstation to alert the radiologists that unread images had arrived, and a box was added to the workstation user interface that could be checked by the radiologist to indicate that a report had been dictated. The digitizer system solved several problems: unavailable films in the emergency room, teleradiology, and archiving of outside studies that had been read by University of Florida radiologists. In addition to saving time for outside film management, we now store the studies for comparison purposes, no longer lose emergency room films, generate diagnostic reports on emergency room films in a timely manner (important for billing and reimbursement), and can handle the distributed nature of our business. As changes in health care drive management changes, existing tools can be used in new ways to help make the transition easier. In this case, adding digitizers to an existing PACS network helped solve several image management problems.

  13. XDS-I Gateway Development for HIE Connectivity with Legacy PACS at Gil Hospital.

    PubMed

    Simalango, Mikael Fernandus; Kim, Youngchul; Seo, Young Tae; Choi, Young Hwan; Cho, Yong Kyun

    2013-12-01

    The ability to support healthcare document sharing is imperative in a health information exchange (HIE). Sharing imaging documents or images, however, can be challenging, especially when they are stored in a picture archiving and communication system (PACS) archive that does not support document sharing via standard HIE protocols. This research proposes a standard-compliant imaging gateway that enables connectivity between a legacy PACS and the entire HIE. Investigation of the PACS solutions used at Gil Hospital was conducted. An imaging gateway application was then developed using a Java technology stack. Imaging document sharing capability enabled by the gateway was tested by integrating it into Gil Hospital's order communication system and its HIE infrastructure. The gateway can acquire radiology images from a PACS storage system, provide and register the images to Gil Hospital's HIE for document sharing purposes, and make the images retrievable by a cross-enterprise document sharing document viewer. Development of an imaging gateway that mediates communication between a PACS and an HIE can be considered a viable option when the PACS does not support the standard protocol for cross-enterprise document sharing for imaging. Furthermore, the availability of common HIE standards expedites the development and integration of the imaging gateway with an HIE.

  14. XDS-I Gateway Development for HIE Connectivity with Legacy PACS at Gil Hospital

    PubMed Central

    Simalango, Mikael Fernandus; Kim, Youngchul; Seo, Young Tae; Cho, Yong Kyun

    2013-01-01

    Objectives The ability to support healthcare document sharing is imperative in a health information exchange (HIE). Sharing imaging documents or images, however, can be challenging, especially when they are stored in a picture archiving and communication system (PACS) archive that does not support document sharing via standard HIE protocols. This research proposes a standard-compliant imaging gateway that enables connectivity between a legacy PACS and the entire HIE. Methods Investigation of the PACS solutions used at Gil Hospital was conducted. An imaging gateway application was then developed using a Java technology stack. Imaging document sharing capability enabled by the gateway was tested by integrating it into Gil Hospital's order communication system and its HIE infrastructure. Results The gateway can acquire radiology images from a PACS storage system, provide and register the images to Gil Hospital's HIE for document sharing purposes, and make the images retrievable by a cross-enterprise document sharing document viewer. Conclusions Development of an imaging gateway that mediates communication between a PACS and an HIE can be considered a viable option when the PACS does not support the standard protocol for cross-enterprise document sharing for imaging. Furthermore, the availability of common HIE standards expedites the development and integration of the imaging gateway with an HIE. PMID:24523994

  15. Buckets: A New Digital Library Technology for Preserving NASA Research.

    ERIC Educational Resources Information Center

    Nelson, Michael L.

    2001-01-01

    Discusses the need for preserving and disseminating scientific and technical information through digital libraries and describes buckets, an intelligent construct for publishing that contains data and metadata and methods for accessing them. Explains SODA (Smart Object, Dumb Archive) and discusses experiences using these technologies in NASA and…

  16. The Evolution of Communication from Hieroglyphics to DVDs

    ERIC Educational Resources Information Center

    Fitzgerald, Mike

    2006-01-01

    Communication spreads knowledge worldwide. It could be argued that "communication" is one of the greatest human achievements. In this article, the author briefly describes some key technologies associated with the archiving and communication of information. Since communication technologies continue to evolve at a fast pace, he simply focuses on…

  17. High Performance Computing and Networking for Science--Background Paper.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…

  18. Photo CD and Other Digital Imaging Technologies: What's out There and What's It For?

    ERIC Educational Resources Information Center

    Chen, Ching-Chih

    1993-01-01

    Describes Kodak's Photo CD technology and its impact on digital imaging. Color desktop publishing, image processing and preservation, image archival storage, and interactive multimedia development, as well as the equipment, software, and services that make these applications possible, are described. Contact information for developers and…

  19. Making Technology Work for Scholarship: Investing in the Data.

    ERIC Educational Resources Information Center

    Hockey, Susan

    This paper examines issues related to how providers and consumers can make the best use of electronic information, focusing on the humanities. Topics include: new technology or old; electronic text and data formats; Standard Generalized Markup Language (SGML); text encoding initiative; encoded archival description (EAD); other applications of…

  20. Optimisation of solar synoptic observations

    NASA Astrophysics Data System (ADS)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

The development of instrumental and computer technologies is connected with steadily increasing needs for archiving large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by means of an optimisation of the archiving that consists in selecting data without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not conserve the uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
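
    The selection criterion (archive a frame only when it contains new information) can be approximated by thresholding the difference between consecutive frames. Below is a minimal numpy sketch; the threshold stands in for the change-detection sensitivity setting mentioned above, and a real implementation would first reject fictitious changes such as clouds.

```python
import numpy as np

def worth_archiving(prev, curr, threshold=0.02):
    """Keep `curr` only if it differs enough from the last archived frame.

    Mean absolute difference of normalized intensities; `threshold` plays
    the role of the change-detection sensitivity (illustrative value)."""
    diff = np.mean(np.abs(curr - prev)) / max(float(curr.max()), 1e-9)
    return diff > threshold

# Synthetic 10-frame sequence with one genuine change at frame 5.
rng = np.random.default_rng(1)
frames = [0.5 + (0.3 if i >= 5 else 0.0)
          + 0.005 * rng.standard_normal((64, 64))
          for i in range(10)]

archive = []
for frame in frames:
    if not archive or worth_archiving(archive[-1], frame):
        archive.append(frame)
print(f"kept {len(archive)} of {len(frames)} frames")  # 2 with these settings
```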

  1. Using Modern Technologies to Capture and Share Indigenous Astronomical Knowledge

    NASA Astrophysics Data System (ADS)

    Nakata, Martin; Hamacher, Duane W.; Warren, John; Byrne, Alex; Pagnucco, Maurice; Harley, Ross; Venugopal, Srikumar; Thorpe, Kirsten; Neville, Richard; Bolt, Reuben

    2014-06-01

    Indigenous Knowledge is important for Indigenous communities across the globe and for the advancement of our general scientific knowledge. In particular, Indigenous astronomical knowledge integrates many aspects of Indigenous Knowledge, including seasonal calendars, navigation, food economics, law, ceremony, and social structure. Capturing, managing, and disseminating this knowledge in the digital environment poses a number of challenges, which we aim to address using a collaborative project emerging between experts in the higher education, library, archive and industry sectors. Using Microsoft's WorldWide Telescope and Rich Interactive Narratives technologies, we propose to develop software, media design, and archival management solutions to allow Indigenous communities to share their astronomical knowledge with the world on their terms and in a culturally sensitive manner.

  2. Simple, Script-Based Science Processing Archive

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Hegde, Mahabaleshwara; Barth, C. Wrandle

    2007-01-01

The Simple, Scalable, Script-based Science Processing (S4P) Archive (S4PA) is a disk-based archival system for remote sensing data. It is based on the data-driven framework of S4P and is used for data transfer, data preprocessing, metadata generation, data archive, and data distribution. New data are automatically detected by the system. S4P provides services such as data access control, data subscription, metadata publication, data replication, and data recovery. It comprises scripts that control the data flow. The system detects the availability of data on an FTP (file transfer protocol) server, initiates data transfer, preprocesses data if necessary, and archives it on readily available disk drives with FTP and HTTP (Hypertext Transfer Protocol) access, allowing instantaneous data access. There are options for plug-ins for data preprocessing before storage. Publication of metadata to external applications such as the Earth Observing System Clearinghouse (ECHO) is also supported. S4PA includes a graphical user interface for monitoring the system operation and a tool for deploying the system. To ensure reliability, S4P continuously checks stored data for integrity. Further reliability is provided by tape backups of disks, made once a disk partition is full and closed. The system is designed for low maintenance, requiring minimal operator oversight.
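
    The data-driven behavior (detect new files on an FTP server, transfer them, then archive) can be sketched as a polling loop. S4PA itself is script-based, and the sketch below only mirrors the idea in Python; the host, directory, and archive root are placeholders.

```python
import ftplib
import os
import time

FTP_HOST = "ftp.example.org"   # placeholder data provider
REMOTE_DIR = "/incoming"
ARCHIVE_ROOT = "/archive"      # disk-based archive with FTP/HTTP access
seen = set()

def poll_once() -> None:
    """Detect new files on the FTP server and pull them into the archive."""
    with ftplib.FTP(FTP_HOST) as ftp:
        ftp.login()            # anonymous login assumed
        ftp.cwd(REMOTE_DIR)
        for name in ftp.nlst():
            if name in seen:
                continue
            local = os.path.join(ARCHIVE_ROOT, name)
            with open(local, "wb") as f:
                ftp.retrbinary(f"RETR {name}", f.write)
            seen.add(name)     # hook: preprocessing / metadata publication

if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(300)        # poll every five minutes
```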

  3. Design and implementation of scalable tape archiver

    NASA Technical Reports Server (NTRS)

    Nemoto, Toshihiro; Kitsuregawa, Masaru; Takagi, Mikio

    1996-01-01

In order to reduce costs, computer manufacturers try to use commodity parts as much as possible. Mainframes using proprietary processors are being replaced by high performance RISC microprocessor-based workstations, which are in turn being replaced by the commodity microprocessors used in personal computers. Highly reliable disks for mainframes are also being replaced by disk arrays, which are complexes of disk drives. In this paper we try to clarify the feasibility of a large scale tertiary storage system composed of 8-mm tape archivers utilizing robotics. In the near future, the 8-mm tape archiver will be widely used and become a commodity part, since the recent rapid growth of multimedia applications requires much larger storage than disk drives can provide. We designed a scalable tape archiver which connects as many 8-mm tape archivers (element archivers) as possible. In the scalable archiver, robotics can exchange a cassette tape between two adjacent element archivers mechanically. Thus, we can build a large scalable archiver inexpensively. In addition, a sophisticated migration mechanism distributes frequently accessed tapes (hot tapes) evenly among all of the element archivers, which improves the throughput considerably. Even with failures of some tape drives, the system dynamically redistributes hot tapes to the other element archivers which have live tape drives. Several kinds of specially tailored huge archivers are on the market; the 8-mm tape scalable archiver could replace them. To maintain high performance in spite of high access locality when a large number of archivers are attached to the scalable archiver, it is necessary to scatter frequently accessed cassettes among the element archivers and to use the tape drives efficiently. For this purpose, we introduce two cassette migration algorithms, foreground migration and background migration. Background migration transfers cassettes between element archivers to redistribute frequently accessed cassettes, thus balancing the load of each archiver; it occurs while the robotics are idle. Both migration algorithms are based on the access frequency and space utilization of each element archiver. By normalizing these parameters according to the number of drives in each element archiver, it is possible to maintain high performance even if some tape drives fail. We found that foreground migration is efficient at reducing access response time. Besides foreground migration, background migration makes it possible to track the transition of spatial access locality quickly.
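
    The background migration policy (redistribute hot cassettes evenly, normalizing load by the number of live drives in each element archiver) can be sketched as a simple rebalancing pass; the data structures below are invented for illustration.

```python
def background_migration(archivers):
    """One rebalancing pass over the scalable archiver.

    `archivers` is a list of dicts with keys:
      'hot'    - list of (cassette_id, access_count), hottest first
      'drives' - number of live tape drives in the element archiver
    Moves one hot cassette from the most- to the least-loaded unit,
    where load is access frequency normalized by live drives; a real
    system repeats this whenever the robotics are idle.
    """
    def load(a):
        return sum(count for _, count in a["hot"]) / max(a["drives"], 1)

    src = max(archivers, key=load)
    dst = min(archivers, key=load)
    if src is dst or not src["hot"]:
        return None                     # already balanced
    cassette = src["hot"].pop(0)        # robotics pass it to a neighbor
    dst["hot"].append(cassette)
    return cassette

archivers = [
    {"hot": [("A1", 90), ("A2", 40)], "drives": 2},
    {"hot": [("B1", 10)], "drives": 2},
    {"hot": [], "drives": 1},           # unit with a failed drive
]
print(background_migration(archivers))  # moves the hottest cassette
```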

  4. A Case in Pointe: Romance and Regimentation at the New York City Ballet.

    PubMed

    Laemmli, Whitney E

    2015-01-01

    This article analyzes the ballet dancer's pointe shoe as a technology of artistic production and bodily discipline. Drawing on oral histories, memoirs, dance journals, advertisements, and other archival materials, it demonstrates that the shoe utilized by dancers at George Balanchine's New York City Ballet was not the quintessentially Romantic entity it is so often presumed to be. Instead, it emerged from uniquely twentieth-century systems of labor and production, and it was used to alter dancers' bodies and professional lives in particularly modern ways. The article explores not only the substance of these changes but also the ways in which Balanchine's artistic oeuvre was inextricably intertwined with the material technologies he employed and, more broadly, how the history of technology and the history of dance can productively inform one another. Fundamentally, this article recasts Balanchine, seeing him not as a disconnected artist but as an eager participant in the twentieth-century national romance with American technology.

  5. Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Knowlton, Robert C.; Hoo, Kent S.; Huang, H. K.

    1995-05-01

Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to visualize the location of physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques studies different biomedical aspects of the brain, but an effective means to quantify and correlate the disparate imaging datasets, in order to improve clinical decision-making processes, has been lacking. This paper describes several techniques developed in a UNIX-based neurodiagnostic workstation to aid the noninvasive presurgical evaluation of epilepsy patients. These techniques include online access to the picture archiving and communication system (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantitation of structural and functional information contained in the registered images. For illustration, we describe the use of these techniques in a patient case of nonlesional neocortical epilepsy. We also present our future work based on preliminary studies.

  6. The Diesel Combustion Collaboratory: Combustion Researchers Collaborating over the Internet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. M. Pancerella; L. A. Rahn; C. Yang

    2000-02-01

The Diesel Combustion Collaboratory (DCC) is a pilot project to develop and deploy collaborative technologies to combustion researchers distributed throughout the DOE national laboratories, academia, and industry. The result is a problem-solving environment for combustion research. Researchers collaborate over the Internet using DCC tools, which include: a distributed execution management system for running combustion models on widely distributed computers, including supercomputers; web-accessible data archiving capabilities for sharing graphical experimental or modeling data; electronic notebooks and shared workspaces for facilitating collaboration; visualization of combustion data; and video-conferencing and data-conferencing among researchers at remote sites. Security is a key aspect of the collaborative tools. In many cases, the authors have integrated these tools to allow data, including large combustion data sets, to flow seamlessly, for example, from modeling tools to data archives. In this paper the authors describe the work of a larger collaborative effort to design, implement and deploy the DCC.

  7. Satellite and earth science data management activities at the U.S. geological survey's EROS data center

    USGS Publications Warehouse

    Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.

    1991-01-01

The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development, and archiving; data set preservation, maintenance, and conversion to more durable and accessible archive media; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.

  8. OneDep: Unified wwPDB System for Deposition, Biocuration, and Validation of Macromolecular Structures in the PDB Archive.

    PubMed

    Young, Jasmine Y; Westbrook, John D; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R; Berrisford, John M; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter M S; Hudson, Brian P; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R; Shao, Chenghua; Swaminathan, G Jawahar; Tan, Lihua; Ulrich, Eldon L; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A; Quesada, Martha; Kleywegt, Gerard J; Berman, Helen M; Markley, John L; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K

    2017-03-07

    OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the PDB archive, has been developed as a global collaboration by the worldwide PDB (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. Published by Elsevier Ltd.

  9. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1995-01-01

This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The Orbital Debris Radar Program, funded by the Office of Space Systems Development, makes use of the planetary radar capability when the antennas are configured as science instruments making direct observations of planets, their satellites, and asteroids of our solar system.

  10. Experiences with ATM in a multivendor pilot system at Forschungszentrum Julich

    NASA Astrophysics Data System (ADS)

    Kleines, H.; Ziemons, K.; Zwoll, K.

    1998-08-01

The ATM technology for high speed serial transmission provides a new quality of communication by introducing novel features in a LAN environment, especially support of real-time communication, of both LAN and WAN communication, and of multimedia streams. In order to evaluate ATM for future DAQ systems and remote control systems, as well as for a high speed picture archiving and communications system for medical images, Forschungszentrum Julich has built up a pilot system for the evaluation of ATM and standard low-cost multimedia systems. It is a heterogeneous multivendor system containing a variety of switches and desktop solutions, employing different protocol options of ATM. The tests conducted in the pilot system revealed major difficulties regarding stability, interoperability and performance. The paper presents the motivations, layout and results of the pilot system. Discussion of results concentrates on performance issues relevant for realistic applications, e.g., connection to a RAID system via NFS over ATM.

  11. Development of public science archive system of Subaru Telescope. 2

    NASA Astrophysics Data System (ADS)

    Yamamoto, Naotaka; Noda, Sachiyo; Taga, Masatoshi; Ozawa, Tomohiko; Horaguchi, Toshihiro; Okumura, Shin-Ichiro; Furusho, Reiko; Baba, Hajime; Yagi, Masafumi; Yasuda, Naoki; Takata, Tadafumi; Ichikawa, Shin-Ichi

    2003-09-01

We report various improvements in the public science archive system SMOKA (Subaru-Mitaka-Okayama-Kiso Archive system). We have developed a new interface to search observational data of minor bodies in the solar system. In addition, the other improvements, (1) searching frames by specifying wavelength directly, (2) finding calibration data sets automatically, (3) browsing data on weather, humidity, and temperature, which provide information on image quality, (4) providing quick-look images of OHS/CISCO and IRCS, and (5) including the data from the OAO HIDES (HIgh Dispersion Echelle Spectrograph), are also summarized.

  12. PACS technologies and reliability: are we making things better or worse?

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Redfern, Regina O.; Kundel, Harold L.; Nodine, Calvin F.

    2002-05-01

In the process of installing picture archiving and communications (PACS) and speech recognition equipment, upgrading it, and working with previously stored digital image information, the authors encountered a number of problems. Examination of these difficulties illustrated the complex nature of our existing systems and how difficult it is, in many cases, to predict the behavior of these systems. This was found to be true even for our relatively small number of interconnected systems. The purpose of this paper is to illustrate some of the principles of understanding complex system interaction through examples from our experience. The work for this paper grew out of a number of studies we had carried out on our PACS over several years. The complex nature of our systems was evaluated through comparison of our operations with known examples of systems in other industries. Three scenarios (a network failure, a system software upgrade, and an attempt to read media from an old archive) showed that the major systems used in the radiology departments of many healthcare facilities (HIS, RIS, PACS, and speech recognition) are likely to interact in complex and often unpredictable ways. These interactions may be very difficult or impossible to predict, so plans should be made to overcome the negative aspects of the problems that result. Failures and problems, often unpredictable ones, are a likely side effect of having multiple information handling and processing systems interconnected and interoperating. Planning to avoid, or at least be less vulnerable to, such difficulties is an important aspect of systems planning.

  13. Investment alternative: the status quo or PACS?

    NASA Astrophysics Data System (ADS)

    Vanden Brink, John A.; Cywinski, Jozef K.

    1990-08-01

    While the cost of Picture Archiving and Communication Systems (PACS) can be substantial, the cost of continuing with present manual methods may become prohibitive in growing departments, as the need for additional space and personnel (both technical and professional) to meet the increasing requirements of all image management activities continues to grow. This will occur simultaneously with increasing pressure from the problems of the present system, i.e., lost films, lost revenues, delayed reporting, and longer diagnostic cycle times. Present methods of image archiving, communication, and management, i.e., the relationship of procedure volume to FTE requirements for professional and technical personnel, costs of film, film storage space, and other performance factors, are analyzed based on the database created by the Technology Marketing Group (TMG) computerized cost analysis model applied to over 50 US hospitals. The model is also used to project the cost of present methods of film management for an average US 400+ bed hospital based on ten-year growth rate assumptions. TMG PACS Tracking data confirms the correlation of staffing patterns to procedure volume. The data presented in the paper provide a basis for comparing the investment in maintaining the status quo to an investment in PACS.
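
    To make the comparison concrete, here is a minimal sketch of the kind of ten-year status quo projection such a cost model produces. All parameters (procedure volume, growth rate, unit cost, inflation) are illustrative assumptions, not figures from the paper, and Python stands in for the TMG model.

    ```python
    # Hypothetical ten-year film-management cost projection; every parameter
    # below is an assumed value for illustration only.
    def film_cost_projection(years=10,
                             annual_procedures=120_000,  # assumed volume, 400+ bed hospital
                             growth_rate=0.05,           # assumed annual procedure growth
                             cost_per_procedure=12.50,   # assumed film + storage + staffing cost (USD)
                             inflation=0.03):            # assumed annual cost inflation
        """Return (year, cost) pairs and the cumulative total."""
        costs = []
        procedures, unit_cost = annual_procedures, cost_per_procedure
        for year in range(1, years + 1):
            costs.append((year, procedures * unit_cost))
            procedures *= 1 + growth_rate          # volume grows each year
            unit_cost *= 1 + inflation             # unit cost inflates each year
        return costs, sum(c for _, c in costs)

    costs, total = film_cost_projection()
    for year, cost in costs:
        print(f"Year {year:2d}: ${cost:,.0f}")
    print(f"Ten-year status quo total: ${total:,.0f}")
    ```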

  14. The Path from Large Earth Science Datasets to Information

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.

    2013-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Science Mission Directorate (SMD) centers for archiving and distributing Earth science remote sensing data, products, and services. This virtual portal provides convenient access to Atmospheric Composition and Dynamics, Hydrology, Precipitation, Ozone, and model-derived datasets (generated by GSFC's Global Modeling and Assimilation Office), as well as the North American Land Data Assimilation System (NLDAS) and Global Land Data Assimilation System (GLDAS) data products (both generated by GSFC's Hydrological Sciences Branch). This presentation demonstrates various tools and computational technologies developed at the GES DISC to manage the huge volume of data and products acquired from various missions and programs over the years. It explores approaches to archive, document, distribute, access, and analyze Earth science data and information, and addresses the technical and scientific issues, governance, and user support problems faced by scientists in need of multi-disciplinary datasets. It also discusses data and product metrics, user distribution profiles, and lessons learned through interactions with the science communities around the world. Finally, it demonstrates some of the most used data and product visualization and analysis tools developed and maintained by the GES DISC.

  15. Concept and design engineering: endourology operating room.

    PubMed

    Sabnis, Ravindra; Ganesamoni, Raguram; Mishra, Shashikant; Sinha, Lokesh; Desai, Mahesh R

    2013-03-01

    A dedicated operating room with fluoroscopic imaging capability and adequate data connectivity is important to the success of any endourology program. A proper understanding of recent developments in operating room technology is necessary before planning an endourology operating room. An endourology operating room is a fluorocompatible operating room with enough space to accommodate equipment such as multiple flat monitors to display video, a C-arm with its monitor, an ultrasonography machine, a laser machine, an intracorporeal lithotripsy unit, irrigation pumps, and two large trolleys with instruments. This operating room is integrated with devices to continuously record and archive data from endovision and surface cameras, ultrasound, and fluoroscopy. Moreover, advances made in data relay systems have created seamless two-way communication between the operating room and electronic medical records, the radiological picture archiving and communication system, the classroom, the auditorium, and literally anywhere in the world. A dedicated endourology operating room is required for any hospital that has a significant volume of endourology procedures. A custom-made integrated endourology operating room will facilitate endourology procedures, smooth the workflow in the operating room, and improve patient outcomes. Meticulous planning and the involvement of experts in the field are critical for the success of the project.

  16. AGU Council makes major commitment to the future of electronic publication

    NASA Astrophysics Data System (ADS)

    The establishment of a "perpetual care trust fund for AGU's electronic archives," approved at the December 1996 meeting, is perhaps one of the most far-reaching actions ever taken by the AGU Council. The Union recognizes that it has a responsibility to the scientific community to assure that AGU publications are available in the future. Without the careful protection and upgrading of the files developed for AGU's electronic publications, there could be a hiatus in the archive of the body of knowledge. The costs associated with the maintenance of the electronic archive for these publications will be a continuing obligation of the Union, one that may be too large to be absorbed in the annual operating budget. Thus, the Council, at the recommendation of the Publications Committee, set up a trust fund to help ensure that AGU has the financial resources to maintain an archive of the material included in its electronic publications, to refresh these files on a regular basis, and to migrate the material in the archive to new formats and media as the technology for electronic publishing changes.

  17. LBT Distributed Archive: Status and Features

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Smareglia, R.; Thompson, D.; Grede, G.

    2011-07-01

    After the first release of the LBT Distributed Archive, this successful collaboration is continuing within the LBT corporation. The IA2 (Italian Center for Astronomical Archive) team has updated the LBT DA with new features in order to facilitate user data retrieval while abiding by VO standards. To facilitate the integration of data from any new instruments, we have migrated to a new database, developed new data distribution software, and enhanced features in the LBT User Interface. The DBMS engine has been changed to MySQL. Consequently, the data handling software now uses Java thread technology to update and synchronize the main storage archives on Mt. Graham and in Tucson, as well as archives in Trieste and Heidelberg, with all metadata and proprietary data. The LBT UI has been updated with additional features allowing users to search by instrument and by some of the more important characteristics of the images. Finally, instead of a simple cone search service over all LBT image data, new instrument-specific SIAP and cone search services have been developed. They will be published in the IVOA framework later this fall.
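
    As a concrete illustration of the cone search services mentioned above, here is a minimal sketch of an IVOA simple cone search call in Python. The service URL is a hypothetical placeholder; RA, DEC, and SR (all in degrees) are the standard cone search parameters, and the response is a VOTable document.

    ```python
    import requests

    SERVICE = "https://archive.example.org/lbt/conesearch"  # hypothetical endpoint

    # Standard IVOA cone search parameters: position and search radius in degrees
    params = {"RA": 187.70593, "DEC": 12.39112, "SR": 0.1}
    response = requests.get(SERVICE, params=params, timeout=30)
    response.raise_for_status()

    # The payload is a VOTable (XML) that tools such as astropy can parse
    with open("result.vot", "wb") as f:
        f.write(response.content)
    ```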

  18. BOREAS AFM-06 Mean Wind Profile Data

    NASA Technical Reports Server (NTRS)

    Wilczak, James; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Airborne Fluxes and Meteorology (AFM)-6 team from the National Oceanic and Atmospheric Administration/Environmental Technology Laboratory (NOAA/ETL) operated a 915-MHz wind/Radio Acoustic Sounding System (RASS) profiler system in the Southern Study Area (SSA) near the Old Jack Pine (OJP) tower from 21 May 1994 to 20 Sep 1994. The data set provides wind profiles at 38 heights, containing the variables of wind speed; wind direction; and the u-, v-, and w-components of the total wind. The data are stored in tabular ASCII files. The mean wind profile data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  19. BOREAS AFM-06 Mean Temperature Profile Data

    NASA Technical Reports Server (NTRS)

    Wilczak, James; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Airborne Fluxes and Meteorology (AFM)-6 team from the National Oceanic and Atmospheric Administration/Environmental Technology Laboratory (NOAA/ETL) operated a 915-MHz wind/Radio Acoustic Sounding System (RASS) profiler system in the Southern Study Area (SSA) near the Old Jack Pine (OJP) tower from 21 May 1994 to 20 Sep 1994. The data set provides temperature profiles at 15 heights, containing the variables of virtual temperature, vertical velocity, the speed of sound, and w-bar. The data are stored in tabular ASCII files. The mean temperature profile data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).
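
    Since both AFM-06 datasets are distributed as tabular ASCII files, reading them is straightforward; the sketch below is a hedged example, with the file name and column layout assumed for illustration (the actual layout is given in the dataset documentation on the CD-ROM).

    ```python
    import pandas as pd

    # Assumed file name and column order; consult the DAAC documentation
    # for the real layout of the profile files.
    columns = ["height_m", "virtual_temp_C", "vertical_velocity_ms",
               "speed_of_sound_ms", "w_bar_ms"]
    profiles = pd.read_csv("boreas_afm06_temp_profile.txt",
                           sep=r"\s+", comment="#", names=columns)

    # e.g., mean virtual temperature at each of the 15 measurement heights
    print(profiles.groupby("height_m")["virtual_temp_C"].mean())
    ```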

  20. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

    Based on the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and kept in electronic archives. The workflow of medical equipment support was analyzed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and an SQL Server database, the system implements process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automatic, efficient, orderly, and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
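
    The core pattern described here, parsing a portable XML work record and persisting it to a relational database, can be sketched as follows. The element names and table schema are hypothetical, and Python's built-in sqlite3 stands in for the SQL Server database used in the paper.

    ```python
    import sqlite3
    import xml.etree.ElementTree as ET

    # Hypothetical portable work record; real element names will differ.
    record_xml = """<maintenance_record>
      <equipment_id>EQ-0042</equipment_id>
      <action>preventive maintenance</action>
      <technician>Zhang</technician>
      <record_date>2010-05-17</record_date>
    </maintenance_record>"""

    root = ET.fromstring(record_xml)
    row = {child.tag: child.text for child in root}   # tag -> text mapping

    conn = sqlite3.connect("support.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS maintenance
                    (equipment_id TEXT, action TEXT,
                     technician TEXT, record_date TEXT)""")
    conn.execute("INSERT INTO maintenance VALUES "
                 "(:equipment_id, :action, :technician, :record_date)", row)
    conn.commit()
    conn.close()
    ```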

  1. Environmental System Science Data Infrastructure for a Virtual Ecosystem (ESS-DIVE) - A New U.S. DOE Data Archive

    NASA Astrophysics Data System (ADS)

    Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.

    2017-12-01

    The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE-sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) focus on implementation of data standards in collaboration with the community; (4) support for community-built tools for data search, interpretation, analysis, and visualization; (5) a data fusion database to support search across data extracted from submitted packages and data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility, with replicas at DataONE nodes.

  2. Transfer and utilization of government technology assets to the private sector in the fields of health care and information technologies

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1995-10-01

    During the first Health Care Technology Policy conference last year, held during health care reform, four major issues were raised regarding the efforts underway to develop a computer-based patient record (CBPR), the National Information Infrastructure (NII) as part of the High Performance Computing and Communications (HPCC) program, and the so-called 'patient card.' More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: (1) constructing a national information infrastructure (NII); (2) building a computer-based patient record system; (3) bringing the collective resources of our national laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. This year, a section of the conference entitled 'Health Care Technology Assets of the Federal Government' addresses the benefits of the technology transfer that should occur to maximize already-developed resources. This section, 'Transfer and Utilization of Government Technology Assets to the Private Sector,' will look at both health care and non-health care technologies, since many areas of information technology (e.g., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases) already exist within our national labs and/or other federal agencies such as ARPA. Although these technologies are not labeled as health care programs, they could provide enormous value in addressing technical needs. An additional issue concerns the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost-effective solutions.

  3. A review of radio frequency identification technology for the anatomic pathology or biorepository laboratory: Much promise, some progress, and more work needed.

    PubMed

    Lou, Jerry J; Andrechak, Gary; Riben, Michael; Yong, William H

    2011-01-01

    Patient safety initiatives throughout the anatomic laboratory and in biorepository laboratories have mandated increasing emphasis on the need for accurately identifying and tracking biospecimen assets throughout their production lifecycle and for archiving/retrieval purposes. However, increasing production volume, complex workflow characteristics, reliance on manual production processes, and required asset movement to disparate destinations throughout asset lifecycles continue to challenge laboratory efforts. Radio Frequency Identification (RFID) technology, the use of radio waves to communicate data between electronic tags attached to objects and a reader, shows significant potential to overcome these hurdles. Advantages over traditional barcode labeling include readability without direct line-of-sight alignment to the reader, the ability to read multiple tags simultaneously, higher data storage capacity, a faster data transmission rate, and the capacity to perform multiple read-writes of data to the tag. Most importantly, the use of radio waves decreases the need to manually scan each asset at each step where an identification or tracking event is needed. Temperature monitoring by on-board sensors and three-dimensional position tracking are additional potential benefits of RFID technology. To date, barriers to implementation of RFID systems in the anatomic laboratory include the increased costs of tags, readers, and system software; data security concerns; the lack of specific data standards for stored information; and the potential for technological obsolescence during decades of specimen storage. Novel RFID production techniques and increased production capacity are projected to lower the costs of some tags to a few cents each. Information security concerns can potentially be addressed by techniques such as shielding, data encryption, and tag pseudonyms. Commitment by stakeholder groups to develop RFID tag data standards for anatomic pathology and biorepository laboratories could avoid or mitigate the "islands of data" dilemma presented by barcode usage, where there are innumerable standards and a consequent paucity of hardware or software "plug and play" interoperability. Work remains to be done to establish the durability and appropriate shielding of individual tag types for use in harsh laboratory environmental conditions and for long-term archival storage. Finally, given the requirements for long-term storage of biospecimen assets, consideration should be given to ways of mitigating data isolation due to the eventual technological obsolescence of a particular RFID technology or software.

  4. A review of radio frequency identification technology for the anatomic pathology or biorepository laboratory: Much promise, some progress, and more work needed

    PubMed Central

    Lou, Jerry J.; Andrechak, Gary; Riben, Michael; Yong, William H.

    2011-01-01

    Patient safety initiatives throughout the anatomic laboratory and in biorepository laboratories have mandated increasing emphasis on the need for accurately identifying and tracking biospecimen assets throughout their production lifecycle and for archiving/retrieval purposes. However, increasing production volume, complex workflow characteristics, reliance on manual production processes, and required asset movement to disparate destinations throughout asset lifecycles continue to challenge laboratory efforts. Radio Frequency Identification (RFID) technology, the use of radio waves to communicate data between electronic tags attached to objects and a reader, shows significant potential to overcome these hurdles. Advantages over traditional barcode labeling include readability without direct line-of-sight alignment to the reader, the ability to read multiple tags simultaneously, higher data storage capacity, a faster data transmission rate, and the capacity to perform multiple read-writes of data to the tag. Most importantly, the use of radio waves decreases the need to manually scan each asset at each step where an identification or tracking event is needed. Temperature monitoring by on-board sensors and three-dimensional position tracking are additional potential benefits of RFID technology. To date, barriers to implementation of RFID systems in the anatomic laboratory include the increased costs of tags, readers, and system software; data security concerns; the lack of specific data standards for stored information; and the potential for technological obsolescence during decades of specimen storage. Novel RFID production techniques and increased production capacity are projected to lower the costs of some tags to a few cents each. Information security concerns can potentially be addressed by techniques such as shielding, data encryption, and tag pseudonyms. Commitment by stakeholder groups to develop RFID tag data standards for anatomic pathology and biorepository laboratories could avoid or mitigate the “islands of data” dilemma presented by barcode usage, where there are innumerable standards and a consequent paucity of hardware or software “plug and play” interoperability. Work remains to be done to establish the durability and appropriate shielding of individual tag types for use in harsh laboratory environmental conditions and for long-term archival storage. Finally, given the requirements for long-term storage of biospecimen assets, consideration should be given to ways of mitigating data isolation due to the eventual technological obsolescence of a particular RFID technology or software. PMID:21886890

  5. Image dissemination and archiving.

    PubMed

    Robertson, Ian

    2007-08-01

    Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and COmmunications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in the DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archive and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated whole, allowing for efficient electronic distribution and storage of medical images and access to medical record data.
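
    As a small illustration of what the DICOM standard buys in practice, the sketch below reads the identifying header attributes a PACS archive typically indexes on, using the pydicom library; the file name is a placeholder.

    ```python
    from pydicom import dcmread

    ds = dcmread("ultrasound_image.dcm")   # placeholder file name

    # Standard DICOM attributes used for routing, querying, and archiving
    print("Patient:  ", ds.PatientName)
    print("Study UID:", ds.StudyInstanceUID)
    print("Modality: ", ds.Modality)       # 'US' for ultrasound
    print("SOP Class:", ds.SOPClassUID)
    ```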

  6. Data catalog for JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC)

    NASA Technical Reports Server (NTRS)

    Digby, Susan

    1995-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory contains satellite data sets and ancillary in-situ data for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Geophysical parameters available from the archive include sea-surface height, surface-wind vector, surface-wind speed, surface-wind stress vector, sea-surface temperature, atmospheric liquid water, integrated water vapor, phytoplankton pigment concentration, heat flux, and in-situ data. PO.DAAC is an element of the Earth Observing System Data and Information System and is the United States distribution site for TOPEX/POSEIDON data and metadata.

  7. Direct Data Distribution From Low-Earth Orbit

    NASA Technical Reports Server (NTRS)

    Budinger, James M.; Fujikawa, Gene; Kunath, Richard R.; Nguyen, Nam T.; Romanofsky, Robert R.; Spence, Rodney L.

    1997-01-01

    NASA Lewis Research Center (LeRC) is developing the space and ground segment technologies necessary to demonstrate a direct data distribution (D3) system for use in space-to-ground communication links from spacecraft in low-Earth orbit (LEO) to strategically located tracking ground terminals. The key space segment technologies include a K-band (19 GHz) MMIC-based transmit phased array antenna and a multichannel, bandwidth- and power-efficient digital encoder/modulator with an aggregate data rate of 622 Mb/s. Along with small (1.8 meter), low-cost tracking terminals on the ground, the D3 system enables affordable distribution of data to the end user or archive facility through interoperability with commercial terrestrial telecommunications networks. The D3 system is applicable to both government and commercial science and communications spacecraft in LEO. The features and benefits of the D3 system concept are described. Starting with typical orbital characteristics, a set of baseline requirements for representative applications is developed, including requirements for onboard storage and tracking terminals, and sample link budgets are presented. Characteristics of the transmit array antenna and digital encoder/modulator are described. The architecture and components of the tracking terminal are described, including technologies for the next generation terminal. Candidate flights of opportunity for risk mitigation and space demonstration of the D3 features are identified.
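
    To give a feel for the sample link budgets mentioned above, here is a worked free-space calculation for a 19-GHz LEO downlink. All values (EIRP, terminal G/T, losses, slant range) are illustrative assumptions, not figures from the paper; only the free-space path loss formula is standard.

    ```python
    import math

    def fspl_db(distance_km: float, freq_ghz: float) -> float:
        """Free-space path loss in dB for d in km and f in GHz."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    slant_range_km = 2000.0   # assumed low-elevation LEO slant range
    eirp_dbw = 40.0           # assumed spacecraft EIRP (transmit phased array)
    gt_db_per_k = 25.0        # assumed 1.8-m terminal figure of merit, dB/K
    losses_db = 3.0           # assumed pointing and atmospheric losses
    k_dbw = -228.6            # Boltzmann's constant in dBW/(Hz*K)

    cn0_dbhz = (eirp_dbw + gt_db_per_k - fspl_db(slant_range_km, 19.0)
                - losses_db - k_dbw)
    print(f"C/N0 = {cn0_dbhz:.1f} dB-Hz")   # margin then follows from the data rate
    ```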

  8. Synergy with HST and JWST Data Management Systems

    NASA Astrophysics Data System (ADS)

    Greene, Gretchen; Space Telescope Data Management Team

    2014-01-01

    The data processing and archive systems for the JWST will contain a petabyte of science data and the best news is that users will have fast access to the latest calibrations through a variety of new services. With a synergistic approach currently underway with the STScI science operations between the Hubble Space Telescope and James Webb Space Telescope data management subsystems (DMS), operational verification is right around the corner. Next year the HST archive will provide scientists on-demand fully calibrated data products via the Mikulski Archive for Space Telescopes (MAST), which takes advantage of an upgraded DMS. This enhanced system, developed jointly with the JWST DMS is based on a new CONDOR distributed processing system capable of reprocessing data using a prioritization queue which runs in the background. A Calibration Reference Data System manages the latest optimal configuration for each scientific instrument pipeline. Science users will be able to search and discover the growing MAST archive calibrated datasets from these missions along with the other multiple mission holdings both local to MAST and available through the Virtual Observatory. JWST data systems will build upon the successes and lessons learned from the HST legacy and move us forward into the next generation of multi-wavelength archive research.
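
    For readers who want to try the on-demand MAST access described here, the sketch below uses the astroquery package's MAST module; the program ID and filters are illustrative, not tied to any particular dataset in the abstract.

    ```python
    from astroquery.mast import Observations

    # Query MAST for HST observations from an example program
    obs = Observations.query_criteria(obs_collection="HST",
                                      instrument_name="WFC3/IR",
                                      proposal_id="11189")   # illustrative program ID

    # List the products for the first observation and keep calibrated science files
    products = Observations.get_product_list(obs[:1])
    science = Observations.filter_products(products, productType="SCIENCE")
    Observations.download_products(science[:5])
    ```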

  9. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to traditional research scientists as well as to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects, both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
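
    As an illustration of the OGC interfaces discussed here, the sketch below issues a standard WMS 1.1.1 GetMap request; the endpoint and layer name are placeholders, while the query parameters are the ones the WMS specification defines.

    ```python
    import requests

    WMS_ENDPOINT = "https://daac.example.nasa.gov/wms"   # hypothetical endpoint

    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "aerosol_optical_depth",               # hypothetical layer name
        "STYLES": "",                                    # server default styling
        "SRS": "EPSG:4326", "BBOX": "-180,-90,180,90",   # global extent, lon/lat
        "WIDTH": 1024, "HEIGHT": 512, "FORMAT": "image/png",
    }
    resp = requests.get(WMS_ENDPOINT, params=params, timeout=60)
    resp.raise_for_status()
    with open("map.png", "wb") as f:
        f.write(resp.content)
    ```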

  10. The NASA Astrophysics Data System joins the Revolution

    NASA Astrophysics Data System (ADS)

    Accomazzi, Alberto; Kurtz, Michael J.; Henneken, Edwin; Grant, Carolyn S.; Thompson, Donna M.; Chyla, Roman; Holachek, Alexandra; Sudilovsky, Vladimir; Elliott, Jonathan; Murray, Stephen S.

    2015-08-01

    Whether or not scholarly publications are going through an evolution or a revolution, one comforting certainty remains: the NASA Astrophysics Data System (ADS) is here to help the working astronomer and librarian navigate the increasingly complex communication environment we find ourselves in. Born as a bibliographic database, today's ADS is best described as an "aggregator" of scholarly resources relevant to the needs of researchers in astronomy and physics. In addition to indexing content from a variety of publishers and data and software archives, the ADS enriches its records by text-mining and indexing the full-text articles, extracting citations and acknowledgments, and ingesting bibliographies and data links maintained by astronomy institutions and data archives. In addition, ADS generates and maintains citation and co-readership networks to support discovery and bibliometric analysis. In this talk I will summarize new and ongoing curation activities and technology developments of the ADS in the face of the ever-changing world of scholarly publishing and the trends in information-sharing behavior of astronomers. Recent curation efforts include the indexing of non-standard scholarly content (such as software packages, IVOA documents and standards, and NASA award proposals); the indexing of additional content (full text of articles, acknowledgments, affiliations, ORCID ids); and enhanced support for bibliographic groups and data links. Recent technology developments include a new Application Programming Interface which provides access to a variety of ADS microservices, a new user interface featuring a variety of visualizations and bibliometric analyses, and integration with ORCID services to support paper claiming.
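
    The Application Programming Interface mentioned in the talk is publicly documented; a minimal query sketch follows. The token is a placeholder issued with an ADS account, and the query string and field list are illustrative.

    ```python
    import requests

    TOKEN = "YOUR_ADS_API_TOKEN"   # placeholder; obtain one from your ADS account

    resp = requests.get(
        "https://api.adsabs.harvard.edu/v1/search/query",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"q": 'abs:"data archive" year:2015',
                "fl": "bibcode,title,citation_count", "rows": 5},
        timeout=30,
    )
    resp.raise_for_status()
    for doc in resp.json()["response"]["docs"]:
        print(doc["bibcode"], doc.get("citation_count"), doc["title"][0])
    ```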

  11. Investigation of air transportation technology at Princeton University, 1991-1992

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1993-01-01

    The Air Transportation Research Program at Princeton University proceeded along six avenues during the past year: (1) intelligent flight control; (2) computer-aided control system design; (3) neural networks for flight control; (4) stochastic robustness of flight control systems; (5) microburst hazards to aircraft; and (6) fundamental dynamics of atmospheric flight. This research has resulted in a number of publications, including archival papers and conference papers. An annotated bibliography of publications that appeared between June 1991 and June 1992 appears at the end of this report. The research that these papers describe was supported in whole or in part by the Joint University Program, including work that was completed prior to the reporting period.

  12. Naval sensor data database (NSDD)

    NASA Astrophysics Data System (ADS)

    Robertson, Candace J.; Tubridy, Lisa H.

    1999-08-01

    The Naval Sensor Data Database (NSDD) is a multi-year effort to archive, catalogue, and disseminate data from all types of sensors to the mine warfare, signal and image processing, and sensor development communities. The purpose is to improve and accelerate research and technology: providing performers with the data required to develop and validate improvements in hardware, simulation, and processing will foster advances in sensor and system performance. The NSDD will provide a centralized source of sensor data and its associated ground truth, which will benefit the areas of signal processing, computer-aided detection and classification, data compression, data fusion, and geo-referencing, as well as sensor and sensor system design.

  13. Should radiology IT be owned by the chief information officer?

    PubMed

    Channin, David S; Bowers, George; Nagy, Paul

    2009-06-01

    Considerable debate within the medical community has focused on the optimal location of information technology (IT) support groups on the organizational chart. The challenge has been to marry local accountability and physician acceptance of IT with the benefits gained by the economies of scale achieved by centralized knowledge and system best practices. In the picture archiving and communication systems (PACS) industry, a slight shift has recently occurred toward centralized control. Radiology departments, however, have begun to realize that no physicians in any other discipline are as dependent on IT as radiologists are on their PACS. The potential strengths and weaknesses of centralized control of the PACS is the topic of discussion for this month's Point/Counterpoint.

  14. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1990-01-01

    Archival reports are given on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA), including space communications, radio navigation, radio science, ground-based radio and radar astronomy, and the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations. Also included is TDA-funded activity at JPL on data and information systems and reimbursable DSN work performed for other space agencies through NASA. In the search for extraterrestrial intelligence (SETI), implementation and operations for searching the microwave spectrum are reported. Use of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets are discussed.

  15. Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.

    PubMed

    Lear, J L; Pratt, J P; Trujillo, N

    1996-02-01

    While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer, and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
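
    The fast on-line retrieval described here comes down to keeping a small index from patient and study date to a partition and path on the array. A toy sketch of that lookup structure, with entirely hypothetical identifiers, is shown below.

    ```python
    # Toy index: (patient, study date) -> (RAID partition, file path).
    # All identifiers here are hypothetical.
    index = {
        ("SMITH^JOHN", "1995-03-14"): ("partition_07", "/studies/19950314/smith_john.img"),
        ("DOE^JANE",   "1994-11-02"): ("partition_02", "/studies/19941102/doe_jane.img"),
    }

    def locate(patient_id: str, study_date: str):
        """Return (partition, path) for an archived study, or None."""
        return index.get((patient_id, study_date))

    print(locate("SMITH^JOHN", "1995-03-14"))
    ```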

  16. The EOSDIS Version 0 Distributed Active Archive Center for physical oceanography and air-sea interaction

    NASA Technical Reports Server (NTRS)

    Hilland, Jeffrey E.; Collins, Donald J.; Nichols, David A.

    1991-01-01

    The Distributed Active Archive Center (DAAC) at the Jet Propulsion Laboratory will support scientists specializing in physical oceanography and air-sea interaction. As part of the NASA Earth Observing System Data and Information System Version 0, the DAAC will build on existing capabilities to provide services for data product generation, archiving, distribution, and management of information about data. To meet scientists' immediate needs for data, existing data sets from missions such as Seasat, Geosat, the NOAA series of satellites, and the Global Positioning Satellite system will be distributed to investigators upon request. In 1992, ocean topography, wave, and surface roughness data from the Topex/Poseidon radar altimeter mission will be archived and distributed. New data products will be derived from Topex/Poseidon and other sensor systems based on recommendations of the science community. In 1995, ocean wind field measurements from the NASA Scatterometer will be supported by the DAAC.

  17. Data Management in the Euclid Science Archive System

    NASA Astrophysics Data System (ADS)

    de Teodoro, P.; Nieto, S.; Altieri, B.

    2017-06-01

    Euclid is the ESA M2 mission and a milestone in the understanding of the geometry of the Universe. In total, Euclid will produce up to 26 PB of observations per year. The Science Archive System (SAS) belongs to the Euclid Archive System (EAS), which sits at the core of the Euclid Science Ground Segment (SGS). The SAS is being built at the ESAC Science Data Centre (ESDC), which is responsible for the development and operations of the scientific archives for the Astronomy, Planetary, and Heliophysics missions of ESA. The SAS is focused on the needs of the scientific community and is intended to provide access to the most valuable scientific metadata from the Euclid mission. In this paper we describe the architectural design of the system, the implementation progress, and the main challenges, from the data management point of view, in building the SAS.

  18. OneDep: Unified wwPDB System for Deposition, Biocuration, and Validation of Macromolecular Structures in the PDB Archive

    PubMed Central

    Young, Jasmine Y.; Westbrook, John D.; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J.; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R.; Berrisford, John M.; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter MS; Hudson, Brian P.; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L.; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M. Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R.; Shao, Chenghua; Swaminathan, G. Jawahar; Tan, Lihua; Ulrich, Eldon L.; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A.; Quesada, Martha; Kleywegt, Gerard J.; Berman, Helen M.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K.

    2017-01-01

    OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the Protein Data Bank (PDB) archive, has been developed as a global collaboration by the Worldwide Protein Data Bank (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. PMID:28190782

  19. Cryogenic hydrogen-induced air liquefaction technologies

    NASA Technical Reports Server (NTRS)

    Escher, William J. D.

    1990-01-01

    Extensively utilizing a special advanced airbreathing propulsion archives database, as well as direct contacts with individuals who were active in the field in previous years, a technical assessment of cryogenic hydrogen-induced air liquefaction, as a prospective onboard aerospace vehicle process, was performed and documented. The resulting assessment report is summarized. Technical findings are presented relating the status of air liquefaction technology, both as a singular technical area, and also that of a cluster of collateral technical areas including: compact lightweight cryogenic heat exchangers; heat exchanger atmospheric constituents fouling alleviation; para/ortho hydrogen shift conversion catalysts; hydrogen turbine expanders, cryogenic air compressors and liquid air pumps; hydrogen recycling using slush hydrogen as heat sink; liquid hydrogen/liquid air rocket-type combustion devices; air collection and enrichment systems (ACES); and technically related engine concepts.

  20. The Evolution of NSF Arctic Data Management: Challenges and Lessons Learned after Two Decades of Support

    NASA Astrophysics Data System (ADS)

    Moore, J. A.; Serreze, M. C.; Williams, S.; Ramamurthy, M. K.; Middleton, D.

    2014-12-01

    The U.S. National Science Foundation has been providing data management support to the Arctic research community through UCAR/NCAR since late 1995. Support began during the early planning phase of the Surface Heat Budget of the Arctic (SHEBA) project and continues today with a major collaboration involving the NCAR Earth Observing Laboratory (EOL), the NCAR Computational and Information Systems Laboratory (CISL), the UCAR Unidata Program, and the National Snow and Ice Data Center (NSIDC) in the Advanced Cooperative Arctic Data and Information System (ACADIS). These groups have managed thousands of datasets for hundreds of principal investigators. The datasets, including the metadata and documentation held in the archives, vary in size from less than 30 kilobytes to tens of gigabytes and represent dozens of research disciplines; the ACADIS holdings alone span more than 50 scientific disciplines as defined by the NASA/GCMD keywords. The data formats vary from simple ASCII text to proprietary complex binary and imagery. A lot has changed in the way data are collected, owing to improved data collection technologies, real-time processing, and wide-bandwidth communications. There have also been changes to data management best practices, especially related to metadata, flexible formatting, DOIs, and interoperability with other archives, to take advantage of new technologies, software, and related support capabilities. ACADIS has spent more than 7 years working on these issues and implementing an agile service approach. There are some very interesting challenges that we have confronted and overcome during the past 20 years. With all those improvements, however, certain guiding principles for data managers remain robust and important even after 20 years of experience: the provision of evolving standards and complete metadata records to describe each dataset, international data exchange and easy access to the archived data, and the inclusion of comprehensive documentation to foster the long-term reuse potential of the data. The authors will provide details on the handling of these specific issues and also consider some other, more subtle situations that continue to require serious consideration and problem solving.

  1. The SciELO Brazilian Scientific Journal Gateway and Open Archives; Usability of Hypermedia Educational e-Books; Building Upon the MyLibrary Concept To Better Meet the Information Needs of College Students; Open Archives and UK Institutions; The Utah Digital Newspapers Project; Examples of Practical Digital Libraries.

    ERIC Educational Resources Information Center

    Marcondes, Carlos Henrique; Sayao, Luis Fernando; Diaz, Paloma; Gibbons, Susan; Pinfield, Stephen; Kenning, Arlitsch; Edge, Karen; Yapp, L.; Witten, Ian H.

    2003-01-01

    Includes six articles that focus on practical uses of technologies developed from digital library research in the areas of education and scholarship reflecting the international impact of digital library research initiatives. Includes the Scientific Electronic Library Online (SciELO) (Brazil); the National Science Foundation (NSF) (US); the Joint…

  2. 2006 Environmental Scan. ACAATO Archive Document

    ERIC Educational Resources Information Center

    Colleges Ontario, 2006

    2006-01-01

    The Association of Colleges of Applied Arts and Technology of Ontario (ACAATO) is pleased to present this report. The 2006 Environmental Scan provides an aggregate synopsis of the key trends which will impact on Ontario's Colleges of Applied Arts and Technology in the future and will assist colleges in their advocacy and strategic planning…

  3. 2005 Environmental Scan. ACAATO Archive Document

    ERIC Educational Resources Information Center

    Colleges Ontario, 2005

    2005-01-01

    The Association of Colleges of Applied Arts and Technology of Ontario (ACAATO) is pleased to present this report. The 2005 Environmental Scan provides an aggregate synopsis of the key trends which will impact on Ontario's Colleges of Applied Arts and Technology in the future and will assist colleges in their advocacy and strategic planning…

  4. Improved technique that allows the performance of large-scale SNP genotyping on DNA immobilized by FTA technology.

    PubMed

    He, Hongbin; Argiro, Laurent; Dessein, Helia; Chevillard, Christophe

    2007-01-01

    FTA technology is a novel method designed to simplify the collection, shipment, archiving, and purification of nucleic acids from a wide variety of biological sources. However, the number of punches that can normally be obtained from a single specimen card is often insufficient for testing the large numbers of loci required to identify genetic factors that control human susceptibility or resistance to multifactorial diseases. In this study, we propose an improved technique for performing large-scale SNP genotyping. We applied a whole genome amplification method to amplify DNA from buccal cell samples stabilized using FTA technology. The results show that, using the improved technique, it is possible to perform up to 15,000 genotypes from one buccal cell sample. Furthermore, the procedure is simple. We consider this improved technique a promising method for performing large-scale SNP genotyping, because FTA technology simplifies the collection, shipment, archiving, and purification of DNA, while whole genome amplification of FTA card-bound DNA produces sufficient material for the determination of thousands of SNP genotypes.

  5. Archive & Data Management Activities for ISRO Science Archives

    NASA Astrophysics Data System (ADS)

    Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.

    2012-07-01

    ISRO has taken a step ahead by extending its remote sensing missions to planetary and astronomical exploration. It started with Chandrayaan-1, which successfully completed moon imaging during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive, and disseminate the acquired science data to the user community for analysis and scientific use. The missions will last from a few months to a few years, but the data received must be archived to specified standards, interoperable, and seamlessly accessible to the user community into the future. ISRO has laid out definite plans to archive these data sets and to develop relevant access tools to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore; it is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 was the first of the planetary missions launched by ISRO, and we took up the challenge of developing a system for the archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer according to Planetary Data System (PDS 3.0) standards through an automated pipeline. A stored dataset is of little use unless it is made public, which requires a Web-based dissemination system accessible to all the planetary scientists and data users working in this field. Towards this, a Web-based browse and dissemination system has been developed, wherein users can register, search their area of interest, and view the data archived for TMC & HYSI with relevant browse chips and metadata. Users can also order the data and receive it on their desktop in the PDS format. For the other AO payloads, users can view the metadata, and the data are available through an FTP site. The same archival and dissemination strategy will be extended to the next moon mission, Chandrayaan-2. ASTROSAT will be the first multi-wavelength astronomical mission for which the data are archived at ISSDC. It consists of five astronomical payloads that allow simultaneous multi-wavelength observations, from X-ray to ultraviolet (UV), of astronomical objects. It is planned to archive the data sets in FITS format. The archiving of ASTROSAT data will be done in the archive layer at ISSDC, and the browse of the archive will be available through the ISDA (Indian Science Data Archive) web site. The browse will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on request, via an FTP site, after the lock-in period is over. It is also planned that the Level-2 pipeline software and various modules for processing the data sets will be made available on the web site. This paper describes the archival procedure for Chandrayaan-1 and the archive plans for ASTROSAT, Chandrayaan-2, and other future ISRO missions, including a discussion of data management activities.
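
    Since the ASTROSAT archive is planned around FITS, a minimal sketch of reading such a product with the astropy library is shown below; the file name and extension layout are assumptions for illustration.

    ```python
    from astropy.io import fits

    with fits.open("astrosat_observation.fits") as hdul:
        hdul.info()                       # list the primary HDU and extensions
        header = hdul[0].header           # primary header keywords
        print(header.get("OBJECT"), header.get("DATE-OBS"))
        data = hdul[1].data               # assumed image or binary-table extension
    ```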

  6. Archive data base and handling system for the Orbiter flying qualities experiment program

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Dimarco, R.; Magdaleno, R. E.; Aponso, B. L.

    1986-01-01

    The OFQ archives data base and handling system assembled as part of the Orbiter Flying Qualities (OFQ) research of the Orbiter Experiments Program (OEX) are described. The purpose of the OFQ archives is to preserve and document shuttle flight data relevant to vehicle dynamics, flight control, and flying qualities in a form that permits maximum use by qualified users. In their complete form, the OFQ archives contain descriptive text (general information about the flight, signal descriptions and units) as well as numerical time history data. Since the shuttle program is so complex, the official data base contains thousands of signals, and very complex entries are required to obtain data. The OFQ archives are intended to provide flight-phase-oriented data subsets with relevant signals which are easily identified for flying qualities research.

  7. European distributed seismological data archives infrastructure: EIDA

    NASA Astrophysics Data System (ADS)

    Clinton, John; Hanka, Winfried; Mazza, Salvatore; Pederson, Helle; Sleeman, Reinoud; Stammler, Klaus; Strollo, Angelo

    2014-05-01

    The European Integrated waveform Data Archive (EIDA) is a distributed data center system within ORFEUS that (a) securely archives seismic waveform data and related metadata gathered by European research infrastructures and (b) provides transparent access to the archives for the geosciences research communities. EIDA was founded in 2013 by the ORFEUS Data Center, GFZ, RESIF, ETH, INGV, and BGR to ensure the sustainability of a distributed archive system, the implementation of standards (e.g., FDSN StationXML, FDSN web services), and the coordination of new developments. Under the mandate of the ORFEUS Board of Directors and Executive Committee, the founding group is responsible for steering and maintaining the technical developments and organization of the European distributed seismic waveform data archive and its integration within broader multidisciplinary frameworks like EPOS. EIDA currently offers uniform access to unrestricted data from 8 European archives (www.orfeus-eu.org/eida), linked by the Arclink protocol and hosting data from 75 permanent networks (1800+ stations) and 33 temporary networks (1200+ stations). Moreover, each archive may also provide unique, restricted datasets. A web interface, developed at GFZ, offers interactive access to different catalogues (EMSC, GFZ, USGS) and to EIDA waveform data. Clients and toolboxes like arclink_fetch and ObsPy can connect directly to any EIDA node to collect data. Current developments are directed at the implementation of data quality parameters and strong motion parameters.
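
    The FDSN web services EIDA implements can be exercised directly from ObsPy, as mentioned above; the sketch below requests ten minutes of waveforms from one node, with the station codes and time window chosen purely for illustration.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("ETH")   # any EIDA node exposing FDSN web services

    t0 = UTCDateTime("2013-05-24T05:44:00")          # illustrative start time
    stream = client.get_waveforms(network="CH", station="DAVOX",
                                  location="*", channel="HH?",
                                  starttime=t0, endtime=t0 + 600)
    print(stream)
    stream.plot()   # quick look at the retrieved traces
    ```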

  8. Educational Labeling System for Atmospheres (ELSA): Python Tool Development for Archiving Under the PDS4 Standard

    NASA Astrophysics Data System (ADS)

    Neakrase, Lynn; Hornung, Danae; Sweebe, Kathrine; Huber, Lyle; Chanover, Nancy J.; Stevenson, Zena; Berdis, Jodi; Johnson, Joni J.; Beebe, Reta F.

    2017-10-01

    The Research and Analysis programs within NASA’s Planetary Science Division now require archiving of resultant data with the Planetary Data System (PDS) or an equivalent archive. The PDS Atmospheres Node is developing an online environment for assisting data providers with this task. The Educational Labeling System for Atmospheres (ELSA) is being designed in Django/Python to facilitate not only communication with the PDS node but also the process of learning, developing, submitting, and reviewing archive bundles under the new PDS4 archiving standard. Under the PDS4 standard, data are archived in bundles, collections, and basic products that form an organizational hierarchy of interconnected labels describing the data and the relationships between the data and its documentation. PDS4 labels are implemented using Extensible Markup Language (XML), an international standard for managing metadata. Potential data providers entering the ELSA environment can learn more about PDS4, plan and develop label templates, and build their archive bundles. ELSA provides an interface to tailor label templates, aiding in the creation of required internal Logical Identifiers (URNs - Uniform Resource Names) and Context References (missions, instruments, targets, facilities, etc.). The underlying structure of ELSA uses Django/Python code that makes maintaining and updating the interface easy for our undergraduate and graduate students. The ELSA environment will soon provide an interface for using the tailored templates in a pipeline to produce entire collections of labeled products, essentially building the user's archive bundle. Once the pieces of the archive bundle are assembled, ELSA provides options for queuing the completed bundle for peer review. The peer review process has also been streamlined for online access and tracking, to help make the archiving process with PDS as transparent as possible. We discuss the current status of ELSA and provide examples of its implementation.
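
    To make the label hierarchy concrete, the sketch below generates the skeleton of a PDS4-style XML label in Python. It is schematic only: a real Product_Observational label carries many more required elements and must validate against the PDS4 schema, and the logical identifier here is hypothetical.

    ```python
    import xml.etree.ElementTree as ET

    NS = "http://pds.nasa.gov/pds4/pds/v1"   # PDS4 common namespace
    ET.register_namespace("", NS)

    product = ET.Element(f"{{{NS}}}Product_Observational")
    ident = ET.SubElement(product, f"{{{NS}}}Identification_Area")
    ET.SubElement(ident, f"{{{NS}}}logical_identifier").text = (
        "urn:nasa:pds:example_bundle:data:obs_001")   # hypothetical LID
    ET.SubElement(ident, f"{{{NS}}}version_id").text = "1.0"
    ET.SubElement(ident, f"{{{NS}}}title").text = "Example observational product"

    ET.ElementTree(product).write("obs_001.xml",
                                  encoding="utf-8", xml_declaration=True)
    ```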

  9. Use of a wiki as a radiology departmental knowledge management system.

    PubMed

    Meenan, Christopher; King, Antoinette; Toland, Christopher; Daly, Mark; Nagy, Paul

    2010-04-01

    Information technology teams in health care are tasked with maintaining a variety of information systems with complex support requirements. In radiology, this includes picture archive and communication systems, radiology information systems, speech recognition systems, and other ancillary systems. Hospital information technology (IT) departments are required to provide 24 x 7 support for these mission-critical systems that directly support patient care in emergency and other critical care departments. The practical know-how to keep these systems operational and diagnose problems promptly is difficult to maintain around the clock. Specific details on infrequent failure modes or advanced troubleshooting strategies may reside with only a few senior staff members. Our goal was to reduce diagnosis and recovery times for issues with our mission-critical systems. We created a knowledge base for building and quickly disseminating technical expertise to our entire support staff. We used an open source, wiki-based, collaborative authoring system internally within our IT department to improve our ability to deliver a high level of service to our customers. In this paper, we describe our evaluation of the wiki and the ways in which we used it to organize our support knowledge. We found the wiki to be an effective tool for knowledge management and for improving our ability to provide mission-critical support for health care IT systems.

  10. Measurements of CO2 Mole Fraction and δ13C in Archived Air Samples from Cape Meares, Oregon (USA) 1977 - 1998

    NASA Astrophysics Data System (ADS)

    Clark, O.; Rice, A. L.

    2017-12-01

    Carbon dioxide (CO2) is the most abundant anthropogenically forced greenhouse gas (GHG) in the global atmosphere. Emissions of CO2 account for approximately 75% of the world's total GHG emissions. Atmospheric concentrations of CO2 are higher now than they have been at any other time in the past 800,000 years; currently, the global mean concentration exceeds 400 ppm. Today, global networks regularly monitor CO2 concentrations and isotopic composition (δ13C and δ18O), but past data are sparse. Over 200 ambient air samples from Cape Meares, Oregon (45.5°N, 124.0°W), a coastal site in the western United States, were obtained by researchers at the Oregon Graduate Institute of Science and Technology (OGI, now Oregon Health & Science University) between 1977 and 1998, as part of a global monitoring program of six sites in the polar, middle, and tropical latitudes of the Northern and Southern Hemispheres. Air liquefaction was used to compress approximately 1000 L of air (STP) to 30 bar in 33 L electropolished (SUMMA) stainless steel canisters. Select archived air samples from the original network are maintained at the Portland State University (PSU) Department of Physics. These archived samples provide a valuable record of changing atmospheric concentrations of CO2 and δ13C, which can contribute to a better understanding of changes in sources during this time. CO2 concentrations and δ13C of CO2 were measured at PSU with a Picarro cavity ringdown spectrometer, model G1101-i. This study presents the analytical methods used, calibration techniques, precision, and reproducibility. Measurements of select samples from the archive show rising CO2 concentrations and falling δ13C over the 1977 to 1998 period, consistent with previous observations and rising anthropogenic sources of CO2. The resulting data set was statistically analyzed in MATLAB, and results of preliminary seasonal and secular trends from the archive samples are presented.
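
    For context, the secular and seasonal structure in a record like this is commonly summarized by least-squares fitting a linear trend plus an annual harmonic. The sketch below shows the idea in Python (the study itself used MATLAB); the sample values are synthetic, invented purely for illustration.

        import numpy as np

        # Synthetic monthly series standing in for the Cape Meares record.
        t = np.arange(0.0, 21.0, 1.0 / 12.0)              # years since 1977
        co2 = (334 + 1.5 * t + 3.0 * np.sin(2 * np.pi * t)
               + np.random.normal(0.0, 0.5, t.size))

        # Design matrix: intercept, secular trend, annual sine/cosine terms.
        X = np.column_stack([np.ones_like(t), t,
                             np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        coef, *_ = np.linalg.lstsq(X, co2, rcond=None)
        print("secular trend: %.2f ppm/yr" % coef[1])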

  11. The screwworm eradication data system archives

    NASA Technical Reports Server (NTRS)

    Barnes, C. M.; Spaulding, R. R.; Giddings, L. E.

    1977-01-01

    The archives accumulated during 1 year of operation of the Satellite Temperature-Monitoring System during development of the Screwworm Eradication Data System are reported. Brief descriptions of all the kinds of tapes, as well as their potential uses, are presented. Reference is made to other documents that explain the generation of these data.

  12. Metadata Design in the New PDS4 Standards - Something for Everybody

    NASA Astrophysics Data System (ADS)

    Raugh, Anne C.; Hughes, John S.

    2015-11-01

    The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata - how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation in label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded into those attributes will be lost - putting the data at risk. The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually) and, more importantly, to delegate stewardship of a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. Finally, within a single namespace - even the core PDS namespace - existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users, and data preparers.
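
    To make the namespace-delegation idea concrete, the sketch below shows, with hypothetical namespace URIs and an invented attribute name, how a mission-defined attribute can live in its own XML namespace alongside the core namespace of a label:

        import xml.etree.ElementTree as ET

        # Hypothetical URIs; actual PDS4 namespaces are registered with PDS.
        PDS = "http://pds.nasa.gov/pds4/pds/v1"
        MSN = "http://pds.nasa.gov/pds4/example_mission/v1"
        ET.register_namespace("", PDS)
        ET.register_namespace("msn", MSN)

        root = ET.Element("{%s}Product_Observational" % PDS)
        area = ET.SubElement(root, "{%s}Mission_Area" % PDS)
        # The mission steward defines and documents this attribute in its own
        # dictionary, without any change to the core model.
        ET.SubElement(area, "{%s}observation_mode" % MSN).text = "survey"
        print(ET.tostring(root, encoding="unicode"))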

  13. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, and the inventory, archive and delivery capabilities were upgraded based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.

  14. Kellogg Library and Archive Retrieval System (KLARS) Document Capture Manual. Draft Version.

    ERIC Educational Resources Information Center

    Hugo, Jane

    This manual is designed to supply background information for Kellogg Library and Archive Retrieval System (KLARS) processors and others who might work with the system, outline detailed policies and procedures for processors who prepare and enter data into the adult education database on KLARS, and inform general readers about the system. KLARS is…

  15. Streaming the Archives: Repurposing Systems to Advance a Small Media Digitization and Dissemination Program

    ERIC Educational Resources Information Center

    Anderson, Talea

    2015-01-01

    In 2013-2014, Brooks Library at Central Washington University (CWU) launched library content in three systems: a digital asset-management system, an institutional repository (IR), and a web-based discovery layer. In early 2014, the archives at the library began to use these systems to disseminate media recently digitized from legacy formats. As…

  16. Design and implementation of a prototype data system for earth radiation budget, cloud, aerosol, and chemistry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baum, B.A.; Barkstrom, B.R.

    1993-04-01

    The Earth Observing System (EOS) will collect data from a large number of satellite-borne instruments beginning later in this decade. To make these data accessible to the scientific community, NASA will build an EOS Data and Information System (EOSDIS). As an initial effort to accelerate the development of EOSDIS and to gain experience with such an information system, NASA and other agencies are working on a prototype system called Version 0 (V0). This effort will provide improved access to pre-EOS earth science data throughout the early EOSDIS period. Based on recommendations from the EOSDIS Science Advisory Panel, EOSDIS will have several distributed active archive centers (DAACs). Each DAAC will specialize in particular data sets. This paper describes work at the NASA Langley Research Center's (LaRC) DAAC. The Version 0 Langley DAAC began archiving and distributing existing data sets pertaining to the earth's radiation budget, clouds, aerosols, and tropospheric chemistry in late 1992. The primary goals of the LaRC V0 effort are the following: (1) enhance scientific use of existing data; (2) develop institutional expertise in maintaining and distributing data; (3) use institutional capability for processing data from previous missions, such as the Earth Radiation Budget Experiment and the Stratospheric Aerosol and Gas Experiment, to prepare for processing future EOS satellite data; (4) encourage cooperative interagency and international involvement with data sets and research; and (5) incorporate technological hardware and software advances quickly.

  17. Assessment of Self-Archiving in Institutional Repositories: Across Disciplines

    ERIC Educational Resources Information Center

    Xia, Jingfeng

    2007-01-01

    This research examined self-archiving practices in four disciplines across seven institutional repositories. By checking each individual item for its metadata and deposition status, the research found that a disciplinary culture is not obviously present. Rather, self-archiving is regulated by a liaison system and a mandate policy.

  18. The Archival Photograph and Its Meaning: Formalisms for Modeling Images

    ERIC Educational Resources Information Center

    Benson, Allen C.

    2009-01-01

    This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…

  19. Painless File Extraction: The A(rc)--Z(oo) of Internet Archive Formats.

    ERIC Educational Resources Information Center

    Simmonds, Curtis

    1993-01-01

    Discusses extraction programs needed to postprocess software downloaded from the Internet that has been archived and compressed for the purposes of storage and file transfer. Archiving formats for DOS, Macintosh, and UNIX operating systems are described; and cross-platform compression utilities are explained. (LRW)

  20. Archiving Derived Data with the PDS Atmospheres Node: The Educational Labeling System for Atmospheres (ELSA)

    NASA Astrophysics Data System (ADS)

    Neakrase, L. D. V.; Hornung, D.; Chanover, N.; Huber, L.; Beebe, R.; Johnson, J.; Sweebe, K.; Stevenson, Z.

    2017-06-01

    The PDS Atmospheres Node is developing an online tool, the Educational Labeling System for Atmospheres (ELSA), to aid in planning and creation of PDS4 bundles and associated labels for archiving derived data.

  1. Archived data management systems : a cross-cutting study : linking operations and planning data

    DOT National Transportation Integrated Search

    2005-12-01

    This report examines five transportation agencies that have established and are operating successful ADMSs (Archived Data Management Systems), and one that is on the verge of becoming fully operational. This study discusses the design choices, operat...

  2. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community is discussed. Several integrated retrieval, processing and distribution capabilities are explained, the value of these functions to users is described, and potential future improvements are laid out. Users want the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.

  3. Digital imaging and electronic patient records in pathology using an integrated department information system with PACS.

    PubMed

    Kalinski, Thomas; Hofmann, Harald; Franke, Dagmar-Sybilla; Roessner, Albert

    2002-01-01

    Picture archiving and communication systems (PACS) have thus far been widely used in radiology. Owing to progress in digital photo technology, their use in medicine opens up further opportunities. In the field of pathology, digital imaging offers new possibilities for the documentation of macroscopic and microscopic findings. Digital imaging has the advantage that the data are permanently and readily available, independent of conventional archives. In the past, PACS was a separate entity; it has since been integrated with the department information system (DIS), which formerly also ran separately. The combination of these two systems makes the administration of patient data, findings, and images easier. Moreover, thanks to the introduction of special communication standards, data exchange between different department information systems and hospital information systems (HIS) is possible. This provides the basis for a communication platform in medicine, constituting an electronic patient record (EPR) that permits interdisciplinary treatment of patients by providing findings and image data from all clinics treating the same patient. As the pathologic diagnosis is a central and often therapy-determining component, it is of utmost importance to add pathologic diagnoses to the EPR. Furthermore, the pathologist's work is considerably facilitated when he or she is able to retrieve additional data from the patient file. In this article, we describe our experience with the combined PACS and DIS recently installed at the Department of Pathology, University of Magdeburg. Moreover, we evaluate the current situation and future prospects for PACS in pathology.

  4. Using Best Practices to Extract, Organize, and Reuse Embedded Decision Support Content Knowledge Rules from Mature Clinical Systems.

    PubMed

    DesAutels, Spencer J; Fox, Zachary E; Giuse, Dario A; Williams, Annette M; Kou, Qing-Hua; Weitkamp, Asli; Patel, Neal R; Bettinsoli Giuse, Nunzia

    2016-01-01

    Clinical decision support (CDS) knowledge, embedded over time in mature medical systems, presents an interesting and complex opportunity for information organization, maintenance, and reuse. To have a holistic view of all decision support requires an in-depth understanding of each clinical system as well as expert knowledge of the latest evidence. This approach to clinical decision support presents an opportunity to unify and externalize the knowledge within rules-based decision support. Driven by an institutional need to prioritize decision support content for migration to new clinical systems, the Center for Knowledge Management and Health Information Technology teams applied their unique expertise to extract content from individual systems, organize it through a single extensible schema, and present it for discovery and reuse through a newly created Clinical Support Knowledge Acquisition and Archival Tool (CS-KAAT). CS-KAAT can build and maintain the underlying knowledge infrastructure needed by clinical systems.

  5. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1994-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).

  6. Mergers, Acquisitions, and Access: STM Publishing Today

    NASA Astrophysics Data System (ADS)

    Robertson, Kathleen

    Electronic publishing is changing the fundamentals of the entire printing/delivery/archive system that has served as the distribution mechanism for scientific research over the last century and a half. The merger-mania of the last 20 years, preprint pools, and publishers' licensing and journals-bundling plans are among the phenomena impacting the scientific information field. Science-Technology-Medical (STM) publishing is experiencing a period of intense consolidation and reorganization. This paper gives an overview of the economic factors fueling these trends, the major STM publishers, and the government regulatory bodies that referee this industry in Europe, Canada, and the USA.

  7. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Yuen, Joseph H. (Editor)

    1995-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).

  8. Uncoupling File System Components for Bridging Legacy and Modern Storage Architectures

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Halem, M.; Tilmes, C.; Prathapan, S.; Earp, D. N.; Ashkar, J. S.

    2016-12-01

    Long-running Earth science projects can span decades of architectural changes in both processing and storage environments. As storage architecture designs change over the decades, such projects need to adjust their tools, systems, and expertise to properly integrate new technologies with their legacy systems. Traditional file systems lack the support necessary to accommodate such hybrid storage infrastructures, resulting in more complex tool development to encompass all the storage architectures used by a project. The MODIS Adaptive Processing System (MODAPS) and the Level 1 and Atmospheres Archive and Distribution System (LAADS) are an example of a project spanning several decades that has evolved into a hybrid storage architecture. MODAPS/LAADS has developed the Lightweight Virtual File System (LVFS), which ensures seamless integration of all the different storage architectures, from standard block-based, POSIX-compliant storage disks to object-based architectures such as the S3-compliant HGST Active Archive System and the Seagate Kinetic disks utilizing the Kinetic protocol. With LVFS, all analysis and processing tools used for the project continue to function unmodified regardless of the underlying storage architecture, enabling MODAPS/LAADS to easily integrate any new storage architecture without the costly need to modify existing tools. Most file systems are designed as a single application responsible for organizing the data into a tree using metadata, determining the location for data storage, and providing a method of data retrieval. We will show how LVFS' unique approach of treating these components in a loosely coupled fashion enables it to merge different storage architectures into a single uniform storage system that bridges the underlying hybrid architecture.
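
    One way to picture such a loosely coupled design is as a thin namespace layer that maps logical paths to pluggable storage backends. The Python sketch below is a hypothetical simplification for illustration, not the LVFS code; the class and method names are invented.

        from abc import ABC, abstractmethod

        class StorageBackend(ABC):
            """One backend per storage architecture (POSIX block storage,
            S3-style object store, Kinetic disks, ...)."""
            @abstractmethod
            def read(self, key: str) -> bytes: ...

        class PosixBackend(StorageBackend):
            def read(self, key: str) -> bytes:
                with open(key, "rb") as f:
                    return f.read()

        class S3Backend(StorageBackend):
            def __init__(self, client, bucket):   # e.g. a boto3 S3 client
                self.client, self.bucket = client, bucket
            def read(self, key: str) -> bytes:
                return self.client.get_object(Bucket=self.bucket, Key=key)["Body"].read()

        class VirtualFS:
            """The tree (metadata), placement, and retrieval stay separate, so
            tools see one namespace regardless of where the bytes live."""
            def __init__(self):
                self.table = {}                   # logical path -> (backend, key)
            def mount(self, path, backend, key):
                self.table[path] = (backend, key)
            def read(self, path):
                backend, key = self.table[path]
                return backend.read(key)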

  9. Fingerprint verification on medical image reporting system.

    PubMed

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

    The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology in the form of electronic medical records (EMR). A major concern with EMR, however, is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes in order to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is only by certified parties.
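
    Of the techniques combined here, the digital envelope is the simplest to sketch: encrypt the image data with a fresh symmetric session key, then encrypt that key with the recipient's public key so only the certified party can open it. The Python sketch below uses the cryptography package; the key sizes and variable names are illustrative assumptions, and the paper's actual DICOM/PACS integration is more involved.

        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)
        recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)

        def seal(data: bytes, public_key):
            session_key = Fernet.generate_key()             # fresh symmetric key
            ciphertext = Fernet(session_key).encrypt(data)
            wrapped = public_key.encrypt(session_key, OAEP) # the "envelope"
            return ciphertext, wrapped

        def unseal(ciphertext: bytes, wrapped: bytes, private_key):
            session_key = private_key.decrypt(wrapped, OAEP)
            return Fernet(session_key).decrypt(ciphertext)

        ct, wk = seal(b"pixel data", recipient.public_key())
        assert unseal(ct, wk, recipient) == b"pixel data"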

  10. Health care information infrastructure: what will it be and how will we get there?

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1996-02-01

    During the first Health Care Technology Policy (HCTP) conference last year, during health care reform, four major issues were raised regarding the efforts under way to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the High Performance Computing and Communications (HPCC) program, and the so-called "Patient Card". More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: constructing a National Information Infrastructure (NII); building a Computer Based Patient Record system; bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and utilizing government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. During the second HCTP conference, in mid-1995, a section entitled "Health Care Technology Assets of the Federal Government" addressed the benefits of the technology transfer that should occur to maximize already developed resources. Another section, entitled "Transfer and Utilization of Government Technology Assets to the Private Sector", looked at both health-care and non-health-care technologies, since many areas of information technology (imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases, etc.) already exist within our National Laboratories and other federal agencies such as ARPA. Although these technologies are not labeled under "Health Care" programs, they could provide enormous value in addressing technical needs. An additional issue concerns both the technical (hardware, software) and human expertise residing within these laboratories and their possible role in creating cost-effective solutions.

  11. Radio data archiving system

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Zanichelli, A.; Dovgan, E.; Nanni, M.; Stagni, M.; Righini, S.; Sponza, M.; Bedosti, F.; Orlati, A.; Smareglia, R.

    2016-07-01

    Radio astronomical data models are becoming very complex because of the huge range of instrumental configurations available with modern radio telescopes. What in the past were the last frontiers of data formats in terms of efficiency and flexibility are now evolving with new strategies and methodologies enabling the persistence of very complex, hierarchical and multi-purpose information. Such an evolution of data models and data formats requires new data archiving techniques in order to guarantee data preservation following the directives of the Open Archival Information System and the International Virtual Observatory Alliance for data sharing and publication. Currently, various formats (FITS, MBFITS, VLBI XML description files and ancillary files) of data acquired with the Medicina and Noto Radio Telescopes can be stored and handled by a common Radio Archive, which is planned to be released to the (inter)national community by the end of 2016. This state-of-the-art archiving system for radio astronomical data aims at delegating as much as possible to the software the decisions of how and where the descriptors (metadata) are saved, while users perform user-friendly queries that the web interface translates into complex interrogations of the database to retrieve data. In such a way, the Archive is ready to be Virtual Observatory compliant and as user-friendly as possible.

  12. Implementation of a filmless mini picture archiving and communication system in ultrasonography: experience after one year of use.

    PubMed

    Henri, C J; Cox, R D; Bret, P M

    1997-08-01

    This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co., Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server, where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has now been filmless for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet); the same quantity of data is now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).

  13. The ERESE Project: Interfacing with the ERDA Digital Archive and ERR Reference Database in EarthRef.org

    NASA Astrophysics Data System (ADS)

    Koppers, A. A.; Staudigel, H.; Mills, H.; Keller, M.; Wallace, A.; Bachman, N.; Helly, J.; Helly, M.; Miller, S. P.; Massell Symons, C.

    2004-12-01

    To bridge the gap between Earth science teachers, librarians, scientists and data archive managers, we have started the ERESE project, which will create, archive and make available "Enduring Resources in Earth Science Education" through information technology (IT) portals. In the first phase of this National Science Digital Library (NSDL) project, we are focusing on the development of ERESE resources for middle and high school teachers, to be used in lesson plans with "plate tectonics" and "magnetics" as their main themes. In this presentation, we will show how these new ERESE resources are being generated, how they can be uploaded via online web wizards, how they are archived, how we make them available via the EarthRef.org Digital Archive (ERDA) and Reference Database (ERR), and how they relate to the SIOExplorer database containing data objects for all seagoing cruises carried out by the Scripps Institution of Oceanography. The EarthRef.org web resource uses the vision of a "general description" of the Earth as a geological system to provide an IT infrastructure for the Earth sciences. This emphasizes the marriage of the "scientific process" (and its results) with an educational cyber-infrastructure for teaching Earth sciences at any level, from middle school to college and graduate levels. Eight different databases reside under EarthRef.org, of which ERDA holds any digital object uploaded free of charge by scientists, teachers and students, while the ERR holds more than 80,000 publications. For more than 1,500 of these publications, the latter database makes available for downloading JPG/PDF images of the abstracts, data tables, methods and appendices, together with their digitized contents in Microsoft Word and Excel format. Both holdings are being used to store the ERESE objects generated by a group of undergraduate students in the Environmental Systems (ESYS) program at UCSD with an emphasis on the Earth sciences. These students perform library and internet research in order to design and generate these "Enduring Resources in Earth Science Education", which they test by interacting closely with the research faculty at the Scripps Institution of Oceanography. Typical ERESE resources include diagrams, model cartoons, maps, data sets for analysis, and glossary items and essays explaining Earth science concepts, ready to be used in the classroom.

  14. Chief Technology Officer Act of 2009

    THOMAS, 111th Congress

    Rep. Connolly, Gerald E. [D-VA-11

    2009-04-02

    House - 05/04/2009 Referred to the Subcommittee on Information Policy, Census, and National Archives.

  15. Sentinel-1 Interferometry from the Cloud to the Scientist

    NASA Astrophysics Data System (ADS)

    Garron, J.; Stoner, C.; Johnston, A.; Arko, S. A.

    2017-12-01

    Big data problems and solutions are growing in the technological and scientific sectors daily. Cloud computing is a vertically and horizontally scalable solution, available now, for archiving and processing large volumes of data quickly without significant on-site computing hardware costs. Even so, the conversion of scientific data processors to these powerful platforms requires not only proof of concept but also demonstrated credibility in an operational setting. The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC), in partnership with NASA's Jet Propulsion Laboratory, is exploring the functional architecture of the Amazon Web Services (AWS) cloud computing environment for the processing, distribution and archival of Synthetic Aperture Radar data in preparation for the NASA-ISRO Synthetic Aperture Radar (NISAR) mission. Leveraging built-in AWS services for logging, monitoring and dashboarding, the GRFN (Getting Ready for NISAR) team has built a scalable processing, distribution and archival system for Sentinel-1 L2 interferograms produced using the ISCE algorithm. This cloud-based functional prototype provides interferograms over selected global land deformation features (volcanoes, land subsidence, seismic zones), which are accessible to scientists for direct download via NASA's EarthData Search client and the ASF DAAC's primary SAR interface, Vertex. The interferograms are produced using nearest-neighbor logic for identifying pairs of granules for interferometric processing, creating deep stacks of BETA products from almost every satellite orbit for scientists to explore. This presentation highlights the functional lessons learned to date from this exercise, including a cost analysis of various data lifecycle policies as implemented through AWS. While demonstrating the architecture choices in support of efficient big science data management, we invite feedback and questions about the process and products from the InSAR community.
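
    The nearest-neighbor pairing logic mentioned above is easy to illustrate: within one orbit/frame stack, each granule is paired with the temporally closest earlier acquisition. A minimal Python sketch with invented acquisition dates:

        from datetime import date

        # Hypothetical acquisition dates for a single orbit/frame stack.
        acquisitions = [date(2017, 1, 3), date(2017, 1, 15),
                        date(2017, 1, 27), date(2017, 2, 8)]

        def nearest_neighbor_pairs(dates):
            """Pair each granule with its closest predecessor in time."""
            ordered = sorted(dates)
            return list(zip(ordered, ordered[1:]))

        for reference, secondary in nearest_neighbor_pairs(acquisitions):
            print("interferogram:", reference, "->", secondary)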

  16. A WWW-Based Archive and Retrieval System for Multimedia

    NASA Technical Reports Server (NTRS)

    Hyon, J.; Sorensen, S.; Martin, M.; Kawasaki, K.; Takacs, M.

    1996-01-01

    This paper describes the Data Distribution Laboratory (DDL) and discusses issues involved in building multimedia CD-ROMs. It describes the modeling philosophy for cataloging multimedia products and the World Wide Web (WWW)-based multimedia archive and retrieval system (Webcat) built on that model.

  17. Managing an Archive of Images

    NASA Technical Reports Server (NTRS)

    Andres, Vince; Walter, David; Hallal, Charles; Jones, Helene; Callac, Chris

    2004-01-01

    The SSC Multimedia Archive is an automated electronic system to manage images, acquired with both film and digital cameras, for the Public Affairs Office (PAO) at Stennis Space Center (SSC). Previously, the image archive was based on film photography and utilized a manual system that, by today's standards, had become inefficient and expensive. Now, the SSC Multimedia Archive, based on a server at SSC, contains both catalogs and images for pictures taken digitally and with traditional film-based cameras, along with metadata about each image. After a "shoot," a photographer downloads the images into the database. Members of the PAO can use a Web-based application to search, view and retrieve images, approve images for publication, and view and edit metadata associated with the images. Approved images are archived and cross-referenced with appropriate descriptions and information. Security is provided by allowing administrators to explicitly grant personnel access privileges only to the components of the system they need (e.g., only photographers may upload images, and only designated PAO employees may approve them).
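
    The access model described, in which administrators explicitly grant privileges per role, is essentially role-based access control. A toy Python sketch (the role and action names are invented, not the system's actual schema):

        # Hypothetical role/permission mapping in the spirit of the described system.
        PERMISSIONS = {
            "photographer": {"upload"},
            "pao_editor": {"search", "view", "approve", "edit_metadata"},
            "admin": {"upload", "search", "view", "approve", "edit_metadata", "grant"},
        }

        def allowed(role: str, action: str) -> bool:
            return action in PERMISSIONS.get(role, set())

        assert allowed("photographer", "upload")
        assert not allowed("photographer", "approve")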

  18. The focus of Langenbeck's Archives of Surgery in the 21st century.

    PubMed

    Schneider, Martin; Weitz, Jürgen; Büchler, Markus W

    2010-04-01

    Langenbeck's Archives of Surgery has been serving as a publication platform for clinical and scientific progress in the field of surgery for 150 years. In order to maintain this long-standing tradition throughout the coming decades, it will be mandatory to face the challenges posed by the increasing specialization of surgical subdisciplines, modern technologies, and interdisciplinary treatment options. Continued efforts need to be directed at minimizing surgical trauma, not least with respect to current demographic developments. Adoption of progressive technologies from the fields of biophysics, mechatronics, and biomedical imaging will likely have a major impact on the further development of surgical operation techniques. Expanding insight from genomic and molecular medicine will facilitate personalized, interdisciplinary treatment concepts for malignant disease, into which surgical resection techniques will need to be integrated. The introduction of novel diagnostic and treatment concepts will mandate solid evaluation of their clinical effectiveness and safety, which can only be achieved by randomized, controlled trials in the field of surgery. Extracting study ideas from the contributions of clinicians and basic scientists and promoting the conduct of clinical trials will therefore rank among the most important tasks for Langenbeck's Archives of Surgery in the 21st century.

  19. Mega-precovery and data mining of near-Earth asteroids and other Solar System objects

    NASA Astrophysics Data System (ADS)

    Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team

    2014-07-01

    The vast collection of CCD images and photographic plates available from archives and telescopes worldwide is still insufficiently exploited. Within the EURONEAR project we designed two data-mining tools to search very large collections of archives for images that serendipitously include known asteroids or comets in their fields, with the main aims of extending observed arcs and improving orbits. In this sense, "Precovery" (published in 2008, aiming to search all known NEAs in a few archives via IMCCE's SkyBoT server) and "Mega-Precovery" (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery aims to search one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including the VLT), NVO (8 instruments served by the U.S. NVO Archive), and CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam and AAT-WFI, adding up to 39 instruments and 4.3 million images (March 2014); and our Mega-Archive is growing. Here we present some of the most important results obtained with our data-mining software and some newly planned search options for Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included, and new search options (such as query by orbital elements and by observations) will be made available to be able to target new Solar System objects such as virtual impactors, bolides, planetary satellites and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the "AOmegaA" factor (archival etendue), proportional to the AOmega (etendue) and the number of images in an archive. With the aim of enlarging the Mega-Archive database, we invite observatories (particularly those storing their images online, and also those owning plate archives that could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. We intend in the future to synchronise our service with the Virtual Observatory.
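
    Taking the stated proportionality at face value (the abstract gives no normalization, so a unit constant is assumed here), the archival etendue can be written as

        $A\Omega_A \propto A\Omega \times N_{\mathrm{images}}$

    where $A$ is the telescope collecting area, $\Omega$ the instrument field of view, and $N_{\mathrm{images}}$ the number of images in the archive.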

  20. Structuring Broadcast Audio for Information Access

    NASA Astrophysics Data System (ADS)

    Gauvain, Jean-Luc; Lamel, Lori

    2003-12-01

    One rapidly expanding application area for state-of-the-art speech recognition technology is the automatic processing of broadcast audiovisual data for information access. Since much of the linguistic information is found in the audio channel, speech recognition is a key enabling technology which, when combined with information retrieval techniques, can be used for searching large audiovisual document collections. Audio indexing must take into account the specificities of audio data such as needing to deal with the continuous data stream and an imperfect word transcription. Other important considerations are dealing with language specificities and facilitating language portability. At Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur (LIMSI), broadcast news transcription systems have been developed for seven languages: English, French, German, Mandarin, Portuguese, Spanish, and Arabic. The transcription systems have been integrated into prototype demonstrators for several application areas such as audio data mining, structuring audiovisual archives, selective dissemination of information, and topic tracking for media monitoring. As examples, this paper addresses the spoken document retrieval and topic tracking tasks.

  1. Detailed description of the Mayo/IBM PACS

    NASA Astrophysics Data System (ADS)

    Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.

    1991-07-01

    The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging system's vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PAC system operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management and multiple PS/2-based image display systems and their image servers.

  2. The Phases Differential Astrometry Data Archive. 2. Updated Binary Star Orbits and a Long Period Eclipsing Binary

    DTIC Science & Technology

    2010-12-01

    PHASES is funded in part by the California Institute of Technology Astronomy Department and by the National Aeronautics and Space Administration.

  3. Meta-manager: a requirements analysis.

    PubMed

    Cook, J F; Rozenblit, J W; Chacko, A K; Martinez, R; Timboe, H L

    1999-05-01

    The digital imaging network picture archiving and communications system (DIN-PACS) will be implemented at ten sites within the Great Plains Regional Medical Command (GPRMC). This network of PACS and teleradiology technology over a shared T1 network has opened the door for round-the-clock radiology coverage of all sites. However, the concept of a virtual radiology environment poses new issues for military medicine, and a new workflow management system must be developed. This workflow management system will allow us to efficiently resolve these issues, including quality of care, availability, severe capitation, and quality of the workforce. The design process for this management system must employ existing technology, operate over various telecommunication networks and protocols, be independent of platform operating systems, be flexible and scalable, and involve the end users at the outset of the design. Using the Unified Modeling Language (UML), the specifications for this new business management system were created jointly by the University of Arizona and the GPRMC. These specifications describe a management system operating in a Common Object Request Broker Architecture (CORBA) environment. In this presentation, we characterize the Meta-Manager management system, including aspects of intelligence, interfacility routing, fail-safe operations, and expected improvements in patient care and efficiency.

  4. An Electronic Finding Aid Using Extensible Markup Language (XML) and Encoded Archival Description (EAD).

    ERIC Educational Resources Information Center

    Chang, May

    2000-01-01

    Describes the development of electronic finding aids for archives at the University of Illinois, Urbana-Champaign that used XML (extensible markup language) and EAD (encoded archival description) to enable more flexible information management and retrieval than using MARC or a relational database management system. EAD template is appended.…

  5. Acquisition, use, and archiving of real-time data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leach, M.J.; Bernstein, H.J.; Tichler, J.L.

    Meteorological information is needed by scientific personnel at Brookhaven National Laboratory (BNL) for various purposes. An automated system used to acquire, archive, and provide users with weather data is described. The hardware, the software, and some examples of uses of the system are detailed.

  6. Don't leave data unattended at any time!

    NASA Astrophysics Data System (ADS)

    Fleischer, D.; Czerniak, A.; Schirnick, C.

    2013-12-01

    The architecture of the Kiel Data Management Infrastructure (KDMI) is set up to serve the full data life cycle, from the data creation process all the way to the data publication procedure. Accordingly, the KDMI manages data from the very beginning of the data life cycle and does not leave data unattended at this crucial time. Starting from the chosen working procedure through handwritten protocols or lab notes, the provenance of the resulting research data is captured within the KDMI. The provenance definition system is the fundamental capturing tool for working procedures. The provenance definition is used to enable data input by file import, web client, or handwriting recognition. The provenance system for data takes care of unpublished in-house research data created directly on site. This system serves as a master for research data systems with more degrees of freedom in regard to technology, design, or performance (e.g., graph databases). Such research systems can be regarded as compilations of unpublished data and public-domain data, e.g., from World Data Centers or archives. These compilations can be used to run statistical data mining and pattern-finding algorithms on specially designed platforms. The architecture of the KDMI ensures that a technical solution for data correction from the slave systems back to the master system is possible, improving the quality of the data stored in the provenance system. After the research phase is over and the interpretation is finished, the provenance system is used by a workflow-based publication system called PubFlow. Within PubFlow it is possible to create repeatable workflows to publish data into various external long-term archives or World Data Centers. The KDMI relies on persistent identifiers for samples and person identities to support this automated publication process. The publication process is the final step of the KDMI, and management responsibility for the long-term part of the data life cycle is handed over to the chosen archive. Nevertheless, the provenance information remains in the KDMI, and the definitions may serve future datasets again. Unattended data may get lost or be destroyed.
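
    As a toy illustration of the persistent-identifier idea (the field names are invented, not the KDMI schema), a provenance record can tie sample and person identifiers to each captured dataset so that a later publication workflow can resolve them automatically:

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ProvenanceRecord:
            dataset_id: str              # local identifier in the master system
            sample_pids: List[str]       # e.g. IGSN-style sample identifiers
            operator_pid: str            # e.g. an ORCID for the person measuring
            procedure: str               # which provenance definition was used
            notes: List[str] = field(default_factory=list)

        rec = ProvenanceRecord("KDMI-2013-0042",
                               sample_pids=["IGSN:XX12345"],
                               operator_pid="ORCID:0000-0002-1825-0097",
                               procedure="CTD cast, handwritten protocol v2")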

  7. Image acquisition unit for the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Reardon, Frank J.; Salutz, James R.

    1991-07-01

    The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate with the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.

  8. Performance of the Mayo-IBM PAC system

    NASA Astrophysics Data System (ADS)

    Persons, Kenneth R.; Reardon, Frank J.; Gehring, Dale G.; Hangiandreou, Nicholas J.

    1994-05-01

    The Mayo Clinic and IBM (in Rochester, Minnesota) have jointly developed a picture archiving system for use with Mayo's MRI and CT imaging modalities. This PACS is made up of over 50 computers that work cooperatively to provide archival, retrieval and image distribution services for Mayo's Department of Radiology. This paper will examine the performance characteristics of the system.

  9. Throughfall Displacement Experiment (TDE) Ecosystem Model Intercomparison Project Data Archive

    DOE Data Explorer

    Hanson, Paul J. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Amthor, Jeffrey S. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Baldocchi, Dennis D. [University of California, Berkeley; Grant, Robert F. [University of Alberta, Canada; Hartley, Anne E. [Ohio State University; Hui, Dafeng [University of Oklahoma; Hunt, Jr., E. Raymond [Agricultural Research Service, U.S. Department of Agriculture; Johnson, Dale W. [University of Nevada, Reno; Kimball, John S. [University of Montana; King, Anthony W. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Luo, Yiqi [University of Oklahoma; McNulty, Steven G. [Southern Global Change Program, U.S. Forest Service, U.S. Department of Agriculture; Sun, Ge [North Carolina State University, Raleigh, NC (USA); Thornton, Peter E. [University of Montana; Wang, Shusen [Geomatics Canada - Canada Centre for Remote Sensing Natural Resources, Canada; Williams, Matthew [University of Edinburgh, United Kingdom; Wilson, Kell B. [National Oceanic and Atmospheric Administration, U.S. Department of Commerce; Wullschleger, Stanley D. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA)

    2002-08-01

    This archive provides and documents data from a project whose purpose is to compare the output of various ecosystem models when they are run with the data from the Throughfall Displacement Experiment (TDE) at Walker Branch Watershed, Oak Ridge, Tennessee. The project is not designed to determine which models are "best" for diagnosis (i.e., explaining the current functioning of the system) or prognosis (i.e., predicting the response of the system to future conditions), but, rather, to clarify similarities and differences among the models and their components, so that all models can be improved. Data archive: ftp://cdiac.ornl.gov/ftp/tdemodel/. TDE data archive web site: http://cdiac.ess-dive.lbl.gov/epubs/ndp/ndp078a/ndp078a.html.

  10. Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea

    2014-05-01

    National space agencies, under the umbrella of the European Space Agency, are working intensively on solutions for Big Data and related knowledge (metadata, software tools and services) management and exploitation. The continuously increasing amount of long-term and historic data held in EO facilities in the form of online datasets and archives, the incoming satellite observation platforms that will generate an impressive amount of new data, and the new EU approach to data distribution policy make it necessary to address technologies for the long-term management of these data sets, including their consolidation, preservation, distribution, continuation and curation across multiple missions. The management of long EO data time series from continuing or historic missions - with more than 20 years of data available already today - requires technical solutions and technologies which differ considerably from those exploited by existing systems. Several tools, both open source and commercial, already provide technologies to handle data and metadata preparation, access and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the data curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.

  11. Concepts for image management and communication system for space vehicle health management

    NASA Astrophysics Data System (ADS)

    Alsafadi, Yasser; Martinez, Ralph

    On a space vehicle, the Crew Health Care System will handle minor accidents or illnesses immediately, thereby eliminating the necessity of early mission termination or emergency rescue. For practical reasons, only trained personnel with limited medical experience can be available on space vehicles to render preliminary health care. There is the need to communicate with medical experts at different locations on earth. Interplanetary Image Management and Communication System (IIMACS) will be a bridge between worlds and deliver medical images acquired in space to physicians at different medical centers on earth. This paper discusses the implementation of IIMACS by extending the Global Picture Archiving and Communication System (GPACS) being developed to interconnect medical centers on earth. Furthermore, this paper explores system requirements of IIMACS and different user scenarios. Our conclusion is that IIMACS is feasible using the maturing technology base of GPACS.

  12. Cryogenic hydrogen-induced air-liquefaction technologies

    NASA Technical Reports Server (NTRS)

    Escher, William J. D.

    1990-01-01

    A technical assessment of cryogenic hydrogen-induced air liquefaction, as a prospective onboard aerospace vehicle process, was performed and documented in 1986, drawing extensively on a special advanced airbreathing propulsion archives database as well as on direct contacts with individuals who were active in the field in previous years. The resulting assessment report is summarized. Technical findings on the status of air liquefaction technology are presented, both for air liquefaction as a singular technical area and for a cluster of collateral technical areas, including: compact lightweight cryogenic heat exchangers; alleviation of heat-exchanger fouling by atmospheric constituents; para/ortho hydrogen shift conversion catalysts; hydrogen turbine expanders, cryogenic air compressors and liquid air pumps; hydrogen recycling using slush hydrogen as a heat sink; liquid hydrogen/liquid air rocket-type combustion devices; the Air Collection and Enrichment System (ACES); and technically related engine concepts.

  13. Knowledge Capture and Management for Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Goodman, John L.

    2005-01-01

    The incorporation of knowledge capture and knowledge management strategies early in the development phase of an exploration program is necessary for safe and successful missions of human and robotic exploration vehicles over the life of a program. Following the transition from the development to the flight phase, loss of the underlying theory and rationale governing design and requirements occurs through a number of mechanisms. This degrades the quality of engineering work, resulting in increased life cycle costs and risk to mission success and safety of flight. Due to budget constraints, concerned personnel in legacy programs often have to improvise methods for knowledge capture and management using existing, but often sub-optimal, information technology and archival resources. Application of advanced information technology to perform knowledge capture and management would be most effective if program-wide requirements are defined at the beginning of a program.

  14. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, Edward C. (Editor)

    1993-01-01

    This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) in the following areas: space communications, radio navigation, radio science, and ground-based radio and radar astronomy. This document also reports on the activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC). The TDA Office also performs work funded by another NASA program office through and with the cooperation of OSC. This is the Orbital Debris Radar Program with the Office of Space Systems Development.

  15. GEND planning report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    The Three Mile Island (TMI) Unit 2 accident on March 28, 1979 was and is of great concern to the nuclear industry: electric power generating companies and their customers, regulatory and other government agencies, the entire nuclear community, and the country as a whole. While the accident resulted in only limited external plant radiation exposure, the plant itself suffered extensive damage, with high radiation contamination within the reactor and auxiliary system facilities. The GEND Planning Report for cleanup activities at TMI-2 covers the areas of: instrumentation and electrical equipment survivability; fission product transport; decontamination/radiation dose reduction technology; data bank organization and sample archive facility; characterization of the primary system pressure boundary and mechanical components; core damage assessment; and fuel handling, removal, examination, and disposal.

  16. NSSDC: Preserving time with technological advances

    NASA Technical Reports Server (NTRS)

    Perry, Charleen

    1990-01-01

    The National Space Science Data Center (NSSDC) has always striven to use the best methods and media currently available for data accessibility and archive management. The advantages and disadvantages of four different media that NSSDC has used over 12 years of archiving and distributing International Ultraviolet Explorer (IUE) data are discussed: nine-track magnetic tape, IBM 3850 mass storage, the Memorex 3480 18-track tape cartridge, and the Sony 6.5-Gbyte Century optical disk. The CD-ROM medium is also discussed.

  17. Software for Optical Archive and Retrieval (SOAR) user's guide, version 4.2

    NASA Technical Reports Server (NTRS)

    Davis, Charles

    1991-01-01

    The optical disk is an emerging technology. Because it is not a magnetic medium, it offers a number of distinct advantages over established forms of storage that make it extremely attractive: (1) the ability to store much more data within the same space; (2) the random-access characteristics of the Write Once Read Many (WORM) optical disk; (3) a much longer life than that of traditional storage media; and (4) a much greater data access rate. The Software for Optical Archive and Retrieval (SOAR) user's guide is presented.
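
    As a minimal sketch of the WORM discipline the abstract highlights (illustrative only; this is not SOAR's actual interface, and all names are hypothetical), an archive layer can enforce write-once semantics by refusing in-place updates:

```python
# Minimal sketch (not SOAR itself) of write-once-read-many (WORM)
# semantics: records may be appended and read, never rewritten in place.
class WormArchive:
    def __init__(self):
        self._records = []  # append-only storage
        self._index = {}    # key -> record position

    def write(self, key: str, data: bytes) -> None:
        if key in self._index:
            raise PermissionError(f"{key!r} already written; WORM media "
                                  "cannot be modified in place")
        self._index[key] = len(self._records)
        self._records.append(data)

    def read(self, key: str) -> bytes:
        return self._records[self._index[key]]

arc = WormArchive()
arc.write("frame-0001", b"telemetry")
print(arc.read("frame-0001"))
try:
    arc.write("frame-0001", b"oops")  # rejected: write-once discipline
except PermissionError as e:
    print("rejected:", e)
```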

  18. The Challenges Facing Science Data Archiving on Current Mass Storage Systems

    NASA Technical Reports Server (NTRS)

    Peavey, Bernard; Behnke, Jeanne (Editor)

    1996-01-01

    This paper discusses the desired characteristics of a tape-based petabyte science data archive and retrieval system required to store and distribute several terabytes (TB) of data per day over an extended period of time, probably more than 15 years, in support of programs such as the Earth Observing System Data and Information System (EOSDIS). These characteristics encompass not only cost-effective and affordable storage capacity, but also rapid access to selected files and the reading rates needed to satisfy thousands of retrieval transactions per day. Where rapid random access to files is not crucial, the tape medium, magnetic or optical, continues to offer cost-effective data storage and retrieval solutions, and is likely to do so for many years to come. However, in environments like EOS these tape-based archive solutions provide less than full user satisfaction. The objective of this paper is therefore to describe the performance and operational enhancements that must be made to current tape-based archival systems in order to achieve greater acceptance by the EOS and similar user communities.
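
    To see why this is a petabyte problem, a quick sizing sketch is enough (the ingest, workload, and drive-rate figures below are assumptions chosen for illustration, not the paper's):

```python
# Rough sizing arithmetic (assumed figures, not the paper's) behind the
# "petabyte archive" claim: several TB/day sustained over an EOS-class
# mission lifetime lands squarely in petabyte territory.
ingest_tb_per_day = 3   # "several terabytes per day" (assumed 3)
years             = 15  # assumed EOS-class operational period
total_pb = ingest_tb_per_day * 365 * years / 1000
print(f"archive grows to ~{total_pb:.0f} PB over {years} years")

# Retrieval side: thousands of transactions/day sets the drive budget.
transactions_per_day = 5000  # assumed
avg_file_gb          = 0.5   # assumed mean retrieval size
drive_mb_per_s       = 15    # mid-1990s tape streaming rate (approx.)
busy_hours = transactions_per_day * avg_file_gb * 1000 / drive_mb_per_s / 3600
print(f"~{busy_hours:.0f} drive-hours/day of pure streaming "
      "(before any mount/seek latency) -- i.e., several drives at minimum")
```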

  19. Engineering Design of ITER Prototype Fast Plant System Controller

    NASA Astrophysics Data System (ADS)

    Goncalves, B.; Sousa, J.; Carvalho, B.; Rodrigues, A. P.; Correia, M.; Batista, A.; Vega, J.; Ruiz, M.; Lopez, J. M.; Rojo, R. Castro; Wallander, A.; Utzel, N.; Neto, A.; Alves, D.; Valcarcel, D.

    2011-08-01

    The ITER control, data access and communication (CODAC) design team identified the need for two types of plant systems: a slow control plant system based on industrial automation technology with maximum sampling rates below 100 Hz, and a fast control plant system based on embedded technology with higher sampling rates and more stringent real-time requirements than those of slow controllers. The latter is applicable to diagnostics and plant systems in closed control loops whose cycle times are below 1 ms. Fast controllers will be dedicated industrial controllers with the ability to supervise other fast and/or slow controllers and to interface to actuators, sensors and, if necessary, high-performance networks. Two prototypes of a fast plant system controller, specialized for data acquisition and constrained by ITER technological choices, are being built using two different form factors. This prototyping activity contributes to the standardization effort of the Plant Control Design Handbook, specifically regarding fast controller characteristics. Envisaging a general-purpose fast controller design, diagnostic use cases with specific requirements were analyzed and will be presented along with the interfaces to CODAC and to sensors. The requirements and constraints that real-time plasma control imposes on the design were also taken into consideration, as were the functional specifications and a technology-neutral architecture, together with their implications for the engineering design. The detailed engineering design, compliant with ITER standards, was performed and will be discussed in detail, with emphasis on the integration of the controller into the standard CODAC environment. Requirements for the EPICS IOC providing the interface to the outside world, the prototype decisions on form factor, real-time operating system, and high-performance networks will also be discussed, as well as the requirements for data streaming to CODAC for visualization and archiving.
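
    The slow/fast split above is essentially a classification rule. The following sketch (illustrative only, not ITER code; the Channel type and example channels are hypothetical) applies the two thresholds the abstract states, sampling below 100 Hz for slow controllers and sub-millisecond cycle times for fast closed loops:

```python
# Illustrative sketch (not ITER code) of the slow/fast plant system
# split described above. Thresholds follow the abstract; the Channel
# type and the example channels are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Channel:
    name: str
    sample_rate_hz: float
    cycle_time_s: Optional[float] = None  # None if not in a closed loop

def controller_type(ch: Channel) -> str:
    fast_loop = ch.cycle_time_s is not None and ch.cycle_time_s < 1e-3
    if ch.sample_rate_hz > 100 or fast_loop:
        return "fast controller (embedded, real-time)"
    return "slow controller (industrial automation, <100 Hz)"

for ch in (Channel("vessel temperature", 1.0),
           Channel("magnetics for plasma control", 1e4, cycle_time_s=1e-4)):
    print(f"{ch.name}: {controller_type(ch)}")
```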

  20. Application service provider (ASP) financial models for off-site PACS archiving

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Liu, Brent J.; McCoy, J. Michael; Enzmann, Dieter R.

    2003-05-01

    For the replacement of its legacy Picture Archiving and Communication System (with an annual workload of approximately 300,000 procedures), the UCLA Medical Center evaluated and adopted an off-site data-warehousing solution based on an ASP financial model with a one-time payment per study archived. Different financial models for long-term data archive services were compared to the traditional capital/operational costs of on-site digital archives. Total cost of ownership (TCO), including direct and indirect expenses and savings, was compared for each model. In addition to financial parameters, the logistic and operational advantages and disadvantages of ASP models versus traditional archiving systems were considered. Our initial analysis demonstrated that the traditional linear ASP business model for data storage is unsuitable for large institutions: its overall cost markedly exceeds the TCO of an in-house archive infrastructure (when support and maintenance costs are included). We demonstrated, however, that non-linear ASP pricing models can be cost-effective alternatives for large-scale data storage, particularly when they are based on a scalable off-site data-warehousing service and prices are adapted to the size of a given institution. The added value of ASP is that it does not require iterative data migrations from legacy media to new storage media at regular intervals.
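
    A toy model makes the linear versus non-linear distinction concrete (all dollar figures below are hypothetical and chosen only for illustration; only the 300,000-study annual workload comes from the abstract):

```python
# Toy comparison (all dollar figures hypothetical, not UCLA's) of the
# three archiving cost models discussed above. A flat per-study ASP fee
# grows without bound; a volume-discounted ASP fee and the in-house TCO
# behave very differently at institutional scale.
studies_per_year = 300_000  # annual workload cited in the abstract
years = 10

linear_fee = 8.0  # assumed flat $/study, every year

def nonlinear_fee(cumulative_studies: int) -> float:
    # Assumed tiered pricing: $/study falls once the archive is large.
    return 8.0 if cumulative_studies < 500_000 else 1.0

inhouse = (2_000_000          # assumed capital build-out
           + 400_000 * years  # assumed support/maintenance per year
           + 1_000_000)       # assumed one media migration (ASP avoids this)

linear = nonlinear = 0.0
cumulative = 0
for _ in range(years):
    linear += studies_per_year * linear_fee
    nonlinear += studies_per_year * nonlinear_fee(cumulative)
    cumulative += studies_per_year

print(f"linear ASP:     ${linear:>12,.0f}")    # ~$24,000,000
print(f"non-linear ASP: ${nonlinear:>12,.0f}") # ~ $7,200,000
print(f"in-house TCO:   ${inhouse:>12,.0f}")   # ~ $7,000,000
```

    Under these assumed numbers the linear model costs several times the in-house TCO, while the tiered model lands in the same range, which is the qualitative pattern the abstract reports.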
