Sample records for archival collecting processing

  1. Mars Observer data production, transfer, and archival: The data production assembly line

    NASA Technical Reports Server (NTRS)

    Childs, David B.

    1993-01-01

    This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.

  2. Ethics and Truth in Archival Research

    ERIC Educational Resources Information Center

    Tesar, Marek

    2015-01-01

    The complexities of the ethics and truth in archival research are often unrecognised or invisible in educational research. This paper complicates the process of collecting data in the archives, as it problematises notions of ethics and truth in the archives. The archival research took place in the former Czechoslovakia and its turbulent political…

  3. Facilities Requirements for Archives and Special Collections Department.

    ERIC Educational Resources Information Center

    Brown, Charlotte B.

    The program of the Archives and Special Collections Department at Franklin and Marshall College requires the following function areas to be located in the Shadek-Fackenthal Library: (1) Reading Room; (2) Conservation Laboratory; (3) Isolation Room; (4) storage for permanent collection; (5) storage for high security materials; (6) Processing Room;…

  4. Appraisal of the papers of biomedical scientists and physicians for a medical archives.

    PubMed Central

    Anderson, P G

    1985-01-01

    Numerous medical libraries house archival collections. This article discusses criteria for selecting personal papers of biomedical scientists and physicians for a medical archives and defines key terms, such as appraisal, manuscripts, papers, records, and series. Appraisal focuses on both collection and series levels. Collection-level criteria include the significance of a scientist's career and the uniqueness, coverage, and accessibility of the manuscripts. Series frequently found among medically related manuscripts are enumerated and discussed. Types of organizational records and the desirability of accessioning them along with manuscripts are considered. Advantages of direct communication with creators of manuscripts are described. The initial appraisal process is not the last word: reevaluation of materials must take place during processing and can be resumed long afterwards. PMID:4052673

  5. Getting from then to now: Sustaining the Lesbian Herstory Archives as a lesbian organization.

    PubMed

    Smith-Cruz, Shawn(ta); Rando, Flavia; Corbman, Rachel; Edel, Deborah; Gwenwald, Morgan; Nestle, Joan; Thistlethwaite, Polly

    2016-01-01

    This article is a compilation of six narratives written by collective members of the volunteer-run Lesbian Herstory Archives, the oldest and largest collection of lesbian material in the world. Narratives draw on a yearlong series of conversations, which culminated in a panel discussion at the 40th Anniversary celebration. Authors' narratives detail the significance of the Lesbian Herstory Archives as a successful and sustainable lesbian organization. Topics covered span four decades and include: the organization's history and practice, founding and activism, the acquisition of the current space, community engagement, and processing of special collections.

  6. Creating a web-based digital photographic archive: one hospital library's experience.

    PubMed

    Marshall, Caroline; Hobbs, Janet

    2017-04-01

    Cedars-Sinai Medical Center is a nonprofit community hospital based in Los Angeles. Its history spans over 100 years, and its growth and development from the merging of 2 Jewish hospitals, Mount Sinai and Cedars of Lebanon, is also part of the history of Los Angeles. The medical library collects and maintains the hospital's photographic archive, to which retiring physicians, nurses, and an active Community Relations Department have donated photographs over the years. The collection was growing rapidly, it was impossible to display all the materials, and much of the collection was inaccessible to patrons. The authors decided to make the photographic collection more accessible to medical staff and researchers by purchasing a web-based digital archival package, Omeka. We decided what material should be digitized by analyzing archival reference requests and considering the institution's plan to create a Timeline Wall documenting and celebrating the history of Cedars-Sinai. Within 8 months, we digitized and indexed over 500 photographs. The digital archive now allows patrons and researchers to access the history of the hospital and enables the library to process archival references more efficiently.

  7. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
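
    Several of the SEG-Y archives listed here note that the trace data may be downloaded and examined with free tools. As a minimal sketch (not part of any of these reports), the snippet below reads a few fields from the 400-byte binary file header of a SEG-Y rev 0 file using only the Python standard library; the byte offsets follow the published SEG-Y standard, and the file name is a placeholder.

    ```python
    import struct

    def read_segy_binary_header(path):
        """Read a few fields from the 400-byte SEG-Y binary file header.

        SEG-Y rev 0 layout: a 3200-byte EBCDIC textual header followed by a
        400-byte big-endian binary header. Offsets are relative to the start
        of the binary header (0-based).
        """
        with open(path, "rb") as f:
            f.seek(3200)               # skip the EBCDIC textual header
            binhdr = f.read(400)

        # 2-byte big-endian integers at the offsets given by the SEG-Y standard.
        sample_interval_us, = struct.unpack(">h", binhdr[16:18])  # file bytes 3217-3218
        samples_per_trace, = struct.unpack(">h", binhdr[20:22])   # file bytes 3221-3222
        format_code, = struct.unpack(">h", binhdr[24:26])         # file bytes 3225-3226

        return {
            "sample_interval_us": sample_interval_us,
            "samples_per_trace": samples_per_trace,
            "data_format_code": format_code,  # 1 = IBM float, 5 = IEEE float, ...
        }

    if __name__ == "__main__":
        # "line01.sgy" is a placeholder; substitute any SEG-Y file from the archive.
        print(read_segy_binary_header("line01.sgy"))
    ```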

  8. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  9. Land Processes Distributed Active Archive Center (LP DAAC) 25th Anniversary Recognition "A Model for Government Partnerships". LP DAAC "History and a Look Forward"

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Doescher, Chris

    2015-01-01

    This presentation discusses 25 years of interactions between NASA and the USGS to manage a Land Processes Distributed Active Archive Center (LPDAAC) for the purpose of providing users access to NASA's rich collection of Earth Science data. The presentation addresses challenges, efforts and metrics on the performance.

  10. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  11. Educational Labeling System for Atmospheres (ELSA): Python Tool Development for Archiving Under the PDS4 Standard

    NASA Astrophysics Data System (ADS)

    Neakrase, Lynn; Hornung, Danae; Sweebe, Kathrine; Huber, Lyle; Chanover, Nancy J.; Stevenson, Zena; Berdis, Jodi; Johnson, Joni J.; Beebe, Reta F.

    2017-10-01

    The Research and Analysis programs within NASA’s Planetary Science Division now require archiving of resultant data with the Planetary Data System (PDS) or an equivalent archive. The PDS Atmospheres Node is developing an online environment for assisting data providers with this task. The Educational Labeling System for Atmospheres (ELSA) is being designed in Django/Python to provide an environment that facilitates not only communication with the PDS node but also the process of learning, developing, submitting, and reviewing archive bundles under the new PDS4 archiving standard. Under the PDS4 standard, data are archived in bundles, collections, and basic products that form an organizational hierarchy of interconnected labels that describe the data and the relationships between the data and its documentation. PDS4 labels are implemented using Extensible Markup Language (XML), an international standard for managing metadata. Potential data providers entering the ELSA environment can learn more about PDS4, plan and develop label templates, and build their archive bundles. ELSA provides an interface to tailor label templates, aiding in the creation of required internal Logical Identifiers (URNs, Uniform Resource Names) and Context References (missions, instruments, targets, facilities, etc.). The underlying structure of ELSA uses Django/Python code that makes maintaining and updating the interface easy for our undergraduate and graduate students. The ELSA environment will soon provide an interface for using the tailored templates in a pipeline to produce entire collections of labeled products, essentially building the user’s archive bundle. Once the pieces of the archive bundle are assembled, ELSA provides options for queuing the completed bundle for peer review. The peer review process has also been streamlined for online access and tracking to help make the archiving process with PDS as transparent as possible. We discuss the current status of ELSA and provide examples of its implementation.
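
    To make the label hierarchy concrete, here is a heavily simplified sketch of a PDS4-style label built with Python's standard XML tooling. The element names and the logical identifier are illustrative placeholders only; they are not drawn from ELSA's templates or the governing PDS4 schemas.

    ```python
    import xml.etree.ElementTree as ET

    # Illustrative subset of a PDS4-style product label. Real labels are
    # governed by the PDS4 Information Model; names below are placeholders.
    root = ET.Element("Product_Observational")

    ident = ET.SubElement(root, "Identification_Area")
    ET.SubElement(ident, "logical_identifier").text = (
        "urn:nasa:pds:example_bundle:data_collection:obs_001"  # hypothetical LID
    )
    ET.SubElement(ident, "version_id").text = "1.0"
    ET.SubElement(ident, "title").text = "Example atmospheric observation"

    obs_area = ET.SubElement(root, "Observation_Area")
    time_coords = ET.SubElement(obs_area, "Time_Coordinates")
    ET.SubElement(time_coords, "start_date_time").text = "2017-10-01T00:00:00Z"
    ET.SubElement(time_coords, "stop_date_time").text = "2017-10-01T01:00:00Z"

    ET.ElementTree(root).write("obs_001.xml", encoding="utf-8", xml_declaration=True)
    ```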

  12. MODIS land data at the EROS data center DAAC

    USGS Publications Warehouse

    Jenkerson, Calli B.; Reed, B.C.

    2001-01-01

    The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) Land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS Land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services support the use of MODIS Land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.

  13. The imaging node for the Planetary Data System

    USGS Publications Warehouse

    Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.

    1996-01-01

    The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and on-line remote-access tools by way of Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies for virtually all the planetary systems in our solar system (except for Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.

  14. Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center

    USGS Publications Warehouse

    Jones, B.; Tolk, B.

    2002-01-01

    The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER provides broad spectral coverage with high spatial resolution, imaging at visible and near-infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.

  15. PDS, DOIs, and the Literature

    NASA Astrophysics Data System (ADS)

    Raugh, Anne; Henneken, Edwin

    The Planetary Data System (PDS) is actively involved in designing both metadata and interfaces to make the assignment of Digital Object Identifiers (DOIs) to archival data a part of the archiving process for all data creators. These DOIs will be registered through DataCite, a non-profit organization whose members are all deeply concerned with archival research data, provenance tracking through the literature, and proper acknowledgement of the various types of efforts that contribute to the creation of an archival reference data set. Making the collection of citation metadata and its ingestion into the DataCite DOI database easy - and easy to do correctly - is in the best interests of all stakeholders: the data creators; the curators; the indexing organizations like the Astrophysics Data System (ADS); and the data users. But in order to realize the promise of DOIs, there are three key issues to address: 1) How do we incorporate the metadata collection process simply and naturally into the PDS archive creation process; 2) How do we encourage journal editors to require references to previously published data with the same rigor with which they require references to previously published research and analysis; and finally, 3) How can we change the culture of academic and research employers to recognize that the effort required to prepare a PDS archival data set is a career achievement on par with contributing to a refereed article in the professional literature. Data archives and scholarly publications are the long-term return on investment that funding agencies and the science community expect in exchange for research spending. The traceability and reproducibility ensured by the integration of DOIs and their related metadata into indexing and search services is an essential part of providing and optimizing that return.
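
    As an illustration of the citation metadata involved, the sketch below assembles a minimal DataCite-style DOI payload. The field names loosely follow the DataCite metadata schema, but the prefix, creator, and landing page are placeholders and no registration request is made.

    ```python
    import json

    # Placeholder metadata only; a real registration would go through the
    # DataCite registration service with a registered prefix and credentials.
    doi_metadata = {
        "data": {
            "type": "dois",
            "attributes": {
                "doi": "10.99999/example-pds-bundle",            # hypothetical DOI
                "titles": [{"title": "Example PDS4 Archive Bundle"}],
                "creators": [{"name": "Example Instrument Team"}],
                "publisher": "NASA Planetary Data System",
                "publicationYear": 2018,
                "types": {"resourceTypeGeneral": "Dataset"},
                "url": "https://pds.nasa.gov/",                  # landing page placeholder
            },
        }
    }

    with open("doi_request.json", "w") as fp:
        json.dump(doi_metadata, fp, indent=2)
    ```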

  16. STARS 2.0: 2nd-generation open-source archiving and query software

    NASA Astrophysics Data System (ADS)

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in 1st-generation STARS archiving software: a complex and inflexible table structure and uncoordinated system administration for our business model: taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools and sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can only be transferred once, in-house, right now, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
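
    The "simplified and normalized table structure with passive keyword collection" can be sketched in a few lines. The schema below is invented for illustration (sqlite3 stands in for the MySQL database STARS 2 actually uses) and does not reflect the real STARS 2 tables.

    ```python
    import sqlite3

    conn = sqlite3.connect("stars_sketch.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS frames (
        frame_id   TEXT PRIMARY KEY,   -- exposure identifier
        instrument TEXT NOT NULL,
        obs_date   TEXT NOT NULL,
        file_path  TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS keywords (
        frame_id TEXT NOT NULL REFERENCES frames(frame_id),
        keyword  TEXT NOT NULL,
        value    TEXT,
        PRIMARY KEY (frame_id, keyword)
    );
    """)

    def archive_frame(frame_id, instrument, obs_date, file_path, header):
        """Store one frame plus whatever header keywords arrive ("passive" collection)."""
        conn.execute("INSERT OR REPLACE INTO frames VALUES (?, ?, ?, ?)",
                     (frame_id, instrument, obs_date, file_path))
        conn.executemany("INSERT OR REPLACE INTO keywords VALUES (?, ?, ?)",
                         [(frame_id, k, str(v)) for k, v in header.items()])
        conn.commit()

    # Hypothetical frame and header values for illustration.
    archive_frame("SUPA0001", "Suprime-Cam", "2008-07-01",
                  "/archive/2008/SUPA0001.fits",
                  {"EXPTIME": 300, "FILTER": "W-J-V", "AIRMASS": 1.12})
    ```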

  17. Restoration of Apollo Data by the Lunar Data Project/PDS Lunar Data Node: An Update

    NASA Technical Reports Server (NTRS)

    Williams, David R.; Hills, H. Kent; Taylor, Patrick T.; Grayzeck, Edwin J.; Guinness, Edward A.

    2016-01-01

    The Apollo 11, 12, and 14 through 17 missions orbited and landed on the Moon, carrying scientific instruments that returned data from all phases of the missions, including long-lived Apollo Lunar Surface Experiments Packages (ALSEPs) deployed by the astronauts on the lunar surface. Much of these data were never archived, and some of the archived data were on media and in formats that are outmoded, or were deposited with little or no useful documentation to aid outside users. This is particularly true of the ALSEP data returned autonomously for many years after the Apollo missions ended. The purpose of the Lunar Data Project and the Planetary Data System (PDS) Lunar Data Node is to take data collections already archived at the NASA Space Science Data Coordinated Archive (NSSDCA) and prepare them for archiving through PDS, and to locate lunar data that were never archived, bring them into NSSDCA, and then archive them through PDS. Preparing these data for archiving involves reading the data from the original media, be it magnetic tape, microfilm, microfiche, or hard-copy document; converting the outmoded, often binary, formats when necessary; putting them into a standard digital form accepted by PDS; collecting the necessary ancillary data and documentation (metadata) to ensure that the data are usable and well-described; summarizing the metadata in documentation to be included in the data set; adding other information such as references, mission and instrument descriptions, contact information, and related documentation; and packaging the results in a PDS-compliant data set. The data set is then validated and reviewed by a group of external scientists as part of the PDS final archive process. We present a status report on some of the data sets that we are processing.

  18. Rendering an archive in three dimensions

    NASA Astrophysics Data System (ADS)

    Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.

    2003-05-01

    We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including those yielded by radiological processes such as MRI, ultrasound and others. Intended as a repository and distribution mechanism for such medical data, we created the National Online Volumetric Archive (NOVA) as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss such factors as the current legal and health information privacy policy affecting the collection of human medical images, retrieval and management of information and technical implementation. This project culminated in the launching of a website that includes downloadable datasets and a prototype data submission system.

  19. Extending the role of a healthcare digital library environment to support orthopaedic research.

    PubMed

    Miles-Board, Timothy; Carr, Leslie; Wills, Gary; Power, Guillermo; Bailey, Christopher; Hall, Wendy; Stenning, Matthew; Grange, Simon

    2006-06-01

    A digital archive, together with its users and its contents, does not exist in isolation; there is a cycle of activities which provides the context for the archive's existence. In arguing for the broadening of the traditional view of digital libraries as merely collections towards the processes of collecting and deploying, we have developed an extended digital library environment for orthopaedic surgeons which bridges the gap between the undertaking of experimental work and the dissemination of its results through electronic publication.

  20. 76 FR 46855 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-03

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... original archival records in a National Archives and Records Administration facility. The public is invited...

  1. ASF archive issues: Current status, past history, and questions for the future

    NASA Technical Reports Server (NTRS)

    Goula, Crystal A.; Wales, Carl

    1994-01-01

    The Alaska SAR Facility (ASF) collects, processes, archives, and distributes data from synthetic aperture radar (SAR) satellites in support of scientific research. ASF has been in operation since 1991 and presently has an archive of over 100 terabytes of data. ASF is performing an analysis of its magnetic tape storage system to ensure long-term preservation of this archive. Future satellite missions could double or triple the amount of data that ASF acquires. ASF is examining the current data systems and high-volume storage, and exploring future concerns and solutions.

  2. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    PubMed

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.

  3. Digital Archival Image Collections: Who Are the Users?

    ERIC Educational Resources Information Center

    Herold, Irene M. H.

    2010-01-01

    Archival digital image collections are a relatively new phenomenon in college library archives. Digitizing archival image collections may make them accessible to users worldwide. There has been no study to explore whether collections on the Internet lead to users who are beyond the institution or a comparison of users to a national or…

  4. Archive of single beam and swath bathymetry data collected nearshore of the Gulf Islands National Seashore, Mississippi, from West Ship Island, Mississippi, to Dauphin Island, Alabama: Methods and data report for USGS Cruises 08CCT01 and 08CCT02, July 2008, and 09CCT03 and 09CCT04, June 2009

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pendleton, Elizabeth A.; Hansen, Mark E.; Reynolds, B.J.; Kelso, Kyle W.; Wiese, Dana S.; Worley, Charles R.

    2012-01-01

    See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets are stored digitally at the USGS St. Petersburg Coastal and Marine Science Center and processed systematically using Novatel's GrafNav version 7.6, SANDS version 3.7, SEA SWATHplus version 3.06.04.03, CARIS HIPS AND SIPS version 3.6, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during these surveys and are archived separately.

  5. User Impact on Selection, Digitization, and the Development of Digital Special Collections

    ERIC Educational Resources Information Center

    Mills, Alexandra

    2015-01-01

    Libraries and archives digitize their special collections in an effort to increase access to rare and unique items. To ensure that resulting digital collections meet user needs, institutions have made assessment, consultation, and user participation integral to digitization initiatives and the selection process. Institutions must also build…

  6. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archiving, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal-tide-gauge data from the National Ocean Service (NOS) network and tide-gauge data from the two National Weather Service (NWS) Tsunami Warning Centers (TWCs) regional networks. Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013, Solomon Islands tsunami, and the June 13, 2013, meteotsunami along the U.S. East Coast.

  7. Raw and processed ground-penetrating radar and postprocessed differential global positioning system data collected from Assateague Island, Maryland, October 2014

    USGS Publications Warehouse

    Zaremba, Nicholas J.; Bernier, Julie C.; Forde, Arnell S.; Smith, Christopher G.

    2016-06-08

    This report serves as an archive of GPR and DGPS data collected from Assateague Island in October 2014. Data products, including raw GPR and processed DGPS data, elevation corrected GPR profiles, and accompanying Federal Geographic Data Committee metadata can be downloaded from the Data Downloads page.

  8. Evaluation of TxDOT'S traffic data collection and load forecasting process

    DOT National Transportation Integrated Search

    2001-01-01

    This study had two primary objectives: (1) compare current Texas Department of Transportation (TxDOT) procedures and protocols with the state-of-the-practice and the needs of its data customers; and (2) develop enhanced traffic collection, archival, ...

  9. 77 FR 36297 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Reduction Act Comments (NHP), Room 4400, National Archives and Records Administration, 8601 Adelphi Rd...

  10. 78 FR 78401 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-26

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2014-012] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA... biomedical statistical research in archival records containing highly personal information. The second is...

  11. 76 FR 72449 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-23

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Reduction Act Comments (ISP), Room 4400, National Archives and Records Administration, 8601 Adelphi Rd...

  12. Merging and Visualization of Archived Oceanographic Acoustic, Optical, and Sensor Data to Support Improved Access and Interpretation

    NASA Astrophysics Data System (ADS)

    Malik, M. A.; Cantwell, K. L.; Reser, B.; Gray, L. M.

    2016-02-01

    Marine researchers and managers routinely rely on interdisciplinary data sets collected using hull-mounted sonars, towed sensors, or submersible vehicles. These data sets can be broadly categorized into acoustic remote sensing, imagery-based observations, water property measurements, and physical samples. The resulting raw data sets are overwhelmingly large and complex, and often require specialized software and training to process. To address these challenges, NOAA's Office of Ocean Exploration and Research (OER) is developing tools to improve the discoverability of raw data sets and the integration of quality-controlled processed data in order to facilitate re-use of archived oceanographic data. The majority of recently collected OER raw oceanographic data can be retrieved from national data archives (e.g., NCEI and the NOAA Central Library). Merging of disparate data sets by scientists with diverse expertise, however, remains problematic. Initial efforts at OER have focused on merging geospatial acoustic remote sensing data with imagery and water property measurements that typically lack direct geo-referencing. OER has developed 'smart' ship and submersible tracks that can provide a synopsis of the geospatial coverage of various data sets. Tools under development enable scientists to quickly assess the relevance of archived OER data to their respective research or management interests, and enable quick access to the desired raw and processed data sets. Pre-processing of the data and visualization to combine various data sets also offer benefits in streamlining data quality assurance and quality control efforts.
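
    One common merging step for such data is attaching the nearest navigation fix to sensor readings that lack direct geo-referencing. The sketch below uses a time-based nearest-neighbor join in pandas; the column names, values, and 5-second tolerance are assumptions for illustration, not OER's actual tooling.

    ```python
    import pandas as pd

    # Hypothetical navigation track (carries position) and an un-georeferenced
    # sensor stream; both keyed by time.
    nav = pd.DataFrame({
        "time": pd.to_datetime(["2016-02-01 00:00:00", "2016-02-01 00:00:10",
                                "2016-02-01 00:00:20"]),
        "lat": [27.001, 27.002, 27.003],
        "lon": [-83.001, -83.002, -83.003],
    })
    ctd = pd.DataFrame({
        "time": pd.to_datetime(["2016-02-01 00:00:03", "2016-02-01 00:00:18"]),
        "temperature_c": [18.4, 18.1],
    })

    # Attach the nearest fix within 5 seconds to each sensor reading.
    merged = pd.merge_asof(
        ctd.sort_values("time"), nav.sort_values("time"),
        on="time", direction="nearest",
        tolerance=pd.Timedelta(seconds=5),
    )
    print(merged)
    ```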

  13. 75 FR 69474 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... research in archival records containing highly personal information. The second is an application that is...

  14. 75 FR 66802 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice...: NATF 81, National Archives Order for Copies of Ship Passenger Arrival Records; NATF 82, National...

  15. The what, why, and how of born-open data.

    PubMed

    Rouder, Jeffrey N

    2016-09-01

    Although many researchers agree that scientific data should be open to scrutiny to ferret out poor analyses and outright fraud, most raw data sets are not available on demand. There are many reasons researchers do not open their data, and one is technical. It is often time consuming to prepare and archive data. In response, my laboratory has automated the process such that our data are archived the night they are created without any human approval or action. All data are versioned, logged, time stamped, and uploaded including aborted runs and data from pilot subjects. The archive is GitHub, github.com, the world's largest collection of open-source materials. Data archived in this manner are called born open. In this paper, I discuss the benefits of born-open data and provide a brief technical overview of the process. I also address some of the common concerns about opening data before publication.
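
    A minimal sketch of such an unattended nightly job follows: stage the day's data files, commit with a timestamped message, and push to a public repository. The directory and branch name are placeholders, git must be installed with credentials configured, and the laboratory pipeline described in the article is more elaborate.

    ```python
    import subprocess
    from datetime import date
    from pathlib import Path

    DATA_DIR = Path("/lab/data")  # placeholder: local clone of the public repository

    def run(*args):
        subprocess.run(args, cwd=DATA_DIR, check=True)

    def archive_tonight():
        run("git", "add", "--all")
        # `git diff --cached --quiet` exits nonzero when changes are staged.
        staged = subprocess.run(["git", "diff", "--cached", "--quiet"], cwd=DATA_DIR)
        if staged.returncode != 0:
            run("git", "commit", "-m", f"Nightly data archive {date.today().isoformat()}")
            run("git", "push", "origin", "main")

    if __name__ == "__main__":
        archive_tonight()  # typically invoked from cron or another scheduler
    ```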

  16. Archive of side scan sonar and swath bathymetry data collected during USGS cruise 10CCT03 offshore of the Gulf Islands National Seashore, Mississippi, from East Ship Island, Mississippi, to Dauphin Island, Alabama, April 2010

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Gibson, James N.; Wiese, Dana S.

    2012-01-01

    Data were collected aboard the U.S. Army Corps of Engineers (USACE) SV Irvington, a 56-foot (ft) Kvichak Marine Industries, Inc., catamaran (fig. 2). Side scan sonar and multibeam bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the starboard side just slightly behind the vessel, close to the seafloor. The multibeam transducer was attached to a retractable strut-arm lowered between the catamaran hulls. Navigation was acquired with an Applanix POS MV and differentially corrected using the broadcast signal from a local National Geodetic Survey (NGS) Continuously Operating Reference Station (CORS) beacon. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed using HYPACK Inc., HYSWEEP software at the USACE Mobile, Ala., District office. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during this survey and are archived separately.

  17. Shared Governance and Regional Accreditation: Institutional Processes and Perceptions

    ERIC Educational Resources Information Center

    McGrane, Wendy L.

    2013-01-01

    This qualitative single-case research study was conducted to gain deeper understanding of the institutional processes to address shared governance accreditation criteria and to determine whether institutional processes altered stakeholder perceptions of shared governance. The data collection strategies were archival records and personal…

  18. Archive of digital Boomer seismic reflection data collected during USGS Cruises 94CCT01 and 95CCT01, eastern Texas and western Louisiana, 1994 and 1995

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.

    2004-01-01

    In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  19. Archive of digital Boomer and Chirp seismic reflection data collected during USGS Cruises 01RCE05 and 02RCE01 in the Lower Atchafalaya River, Mississippi River Delta, and offshore southeastern Louisiana, October 23-30, 2001, and August 18-19, 2002

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.

    2004-01-01

    In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  20. Content Platforms Meet Data Storage, Retrieval Needs

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Earth is under a constant barrage of information from space. Whether from satellites orbiting our planet, spacecraft circling Mars, or probes streaking toward the far reaches of the Solar System, NASA collects massive amounts of data from its spacefaring missions each day. NASA's Earth Observing System (EOS) satellites, for example, provide daily imagery and measurements of Earth's atmosphere, oceans, vegetation, and more. The Earth Observing System Data and Information System (EOSDIS) collects all of that science data and processes, archives, and distributes it to researchers around the globe; EOSDIS recently reached a total archive volume of 4.5 petabytes. Try to store that amount of information in your standard, four-drawer file cabinet, and you would need 90 million to get the job done. To manage the flood of information, NASA has explored technologies to efficiently collect, archive, and provide access to EOS data for scientists today and for years to come. One such technology is now providing similar capabilities to businesses and organizations worldwide.

  1. 76 FR 4737 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... concerning the following information collections: 1. Title: Statistical Research in Archival Records...

  2. 78 FR 50451 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2013-041] Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration... following information collection: Title: National Archives and Records Administration Training and Event...

  3. Web Archiving for the Rest of Us: How to Collect and Manage Websites Using Free and Easy Software

    ERIC Educational Resources Information Center

    Dunn, Katharine; Szydlowski, Nick

    2009-01-01

    Large-scale projects such as the Internet Archive (www.archive.org) send out crawlers to gather snapshots of much of the web. This massive collection of archived websites may include content of interest to one's patrons. But if librarians want to control exactly when and what is archived, relying on someone else to do the archiving is not ideal.…

  4. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Astrophysics Data System (ADS)

    Pilone, D.

    2016-12-01

    As new, high-data-rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20-fold, to over 300 petabytes, by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: (1) highly scalable and extensible ingest and archive of EOSDIS data; (2) going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in; (3) rethinking data distribution and approaches to analysis in a cloud environment; and (4) incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6,000 data products spanning a variety of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning the multi-petabyte-scale archive of Earth science data products.
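
    As a generic illustration of a "serverless" ingest step (not NASA's actual pipeline code), the sketch below shows a plain Python handler that validates an incoming granule notification and emits a record for the next stage; the event shape and field names are assumptions.

    ```python
    import json

    def ingest_handler(event, context=None):
        """Validate a granule notification and return a record for the next stage.

        The event shape (bucket/key/checksum) is assumed for illustration;
        a real pipeline defines its own message formats.
        """
        granule = event.get("granule", {})
        missing = [f for f in ("bucket", "key", "checksum") if f not in granule]
        if missing:
            return {"status": "rejected", "missing_fields": missing}

        return {
            "status": "accepted",
            "archive_location": f"s3://{granule['bucket']}/{granule['key']}",
            "checksum": granule["checksum"],
        }

    if __name__ == "__main__":
        sample = {"granule": {"bucket": "ingest-staging",
                              "key": "example/2016/001/granule.h5",
                              "checksum": "abc123"}}
        print(json.dumps(ingest_handler(sample), indent=2))
    ```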

  5. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    NASA Astrophysics Data System (ADS)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate; this can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
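
    One small piece of the "appropriate metadata" argued for here is a machine-readable record of the technology stack a program expects. The sketch below captures such a manifest with the Python standard library; the output file name is arbitrary, and container or virtual-machine images would capture far more of the stack.

    ```python
    import json
    import platform
    import sys
    from importlib import metadata

    def describe_environment():
        """Record the host platform, interpreter, and installed packages."""
        return {
            "os": platform.platform(),
            "machine": platform.machine(),
            "python": sys.version,
            "packages": sorted(
                f"{dist.metadata['Name']}=={dist.version}"
                for dist in metadata.distributions()
            ),
        }

    if __name__ == "__main__":
        # Store the manifest alongside the archived source code.
        with open("environment_manifest.json", "w") as fp:
            json.dump(describe_environment(), fp, indent=2)
    ```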

  6. The Element of Surprise: Preparing for the Possibility of Hazardous Materials within Archival Collections

    ERIC Educational Resources Information Center

    Wiener, Judith A.

    2007-01-01

    Unprocessed archival collections can contain unknown and potentially hazardous materials that can be harmful to other collections and staff. Archival literature largely focuses on collection and personnel dangers posed by environmental hazards such as mold and insect infestation but not on pharmaceutical and chemical hazards. In this article, the…

  7. Archive of digital boomer seismic reflection data collected during USGS field activities 95LCA03 and 96LCA02 in the Peace River of West-Central Florida, 1995 and 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.

    2006-01-01

    In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  8. Recovery of Missing Apollo Lunar ALSEP Data

    NASA Astrophysics Data System (ADS)

    Taylor, P. T.; Nagihara, S.; Nakamura, Y.; Williams, D. R.; Kiefer, W. S.

    2016-12-01

    Apollo astronauts on missions 12, 14, 15, 16, and 17 installed instruments on the lunar surface as part of the Apollo Lunar Surface Experiments Package (ALSEP). The last astronauts departed from the Moon in December 1972; however, ALSEP instruments continued to send data until 1977. These long-term in-situ data, along with data from orbital satellites launched from the Command Module, are some of the best information on the Moon's environment, surface, and interior. Much of these data were archived at what is now the NASA Space Science Data Coordinated Archive (NSSDCA) in the 1970s and 1980s, but some were never submitted. This is particularly true of the ALSEP data returned autonomously after the last Apollo astronauts departed. The data that were archived were generally on microfilm, microfiche, or magnetic tape in now obsolete formats, making them difficult to use. Some of the documentation and metadata are insufficient for current use. The Lunar Data Node at Goddard Space Flight Center, under the auspices of the Planetary Data System (PDS) Geosciences Node, is attempting to collect and restore the original data that were never archived, in addition to much of the archived data that were on media and in formats that are outmoded. A total of 440 original archival data tapes for the ALSEP experiments were found at the Washington National Records Center. We have recently completed extraction of binary files from these tapes, filling a number of gaps in the current ALSEP data collection at NSSDCA. Some of these experiments include: Solar Wind Spectrometer (Apollo 12, 15); Cold Cathode Ion Gage (14, 15); Heat Flow (15, 17); Dust Detector (11, 12, 14, 15); Lunar Ejecta and Meteorites (17); Lunar Atmospheric Composition Experiment (17); Suprathermal Ion Detector (12, 14, 15); Lunar Surface Magnetometer (12, 15, 16). The purpose of the Lunar Data Project is to take data collections already archived at the NSSDCA and prepare them for archiving through PDS, and to locate lunar data that were never archived into NSSDCA and then archive them through PDS. In addition, recent re-analyses of some of these data with advanced data processing algorithms have yielded more detailed interpretations (e.g., of seismicity data). We expect that more techniques will be developed in the future.

  9. Archival policies and collections database for the Woods Hole Science Center's marine sediment samples

    USGS Publications Warehouse

    Buczkowski, Brian J.; Kelsey, Sarah A.

    2007-01-01

    The Woods Hole Science Center of the U.S. Geological Survey (USGS) has been an active member of the Woods Hole research community, Woods Hole, Massachusetts, for over 40 years. In that time there have been many projects that involved the collection of sediment samples conducted by USGS scientists and technicians for the research and study of seabed environments and processes. These samples were collected at sea or near shore and then brought back to the Woods Hole Science Center (WHSC) for analysis. While at the center, samples are stored in ambient temperature, refrigerated and freezing conditions ranging from +2°C to -18°C, depending on the best mode of preparation for the study being conducted or the duration of storage planned for the samples. Recently, storage methods and available storage space have become a major concern at the WHSC. The core and sediment archive program described herein has been initiated to set standards for the management, methods, and duration of sample storage. A need has arisen to maintain organizational consistency and define storage protocol. This handbook serves as a reference and guide to all parties interested in using and accessing the WHSC's sample archive and also defines all the steps necessary to construct and maintain an organized collection of geological samples. It answers many questions as to the way in which the archive functions.

  10. 75 FR 63141 - Information Collection; Research Data Archive Use Tracking

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ..., filing of petitions and applications and agency statements of organization and functions are examples... Information Collection; Research Data Archive Use Tracking AGENCY: Forest Service, USDA. ACTION: Notice... information collection, Research Data Archive Use Tracking. DATES: Comments must be received in writing on or...

  11. The Preservation of Paper Collections in Archives.

    ERIC Educational Resources Information Center

    Adams, Cynthia Ann

    The preservation methods used for paper collections in archives were studied through a survey of archives in the metropolitan Atlanta (Georgia) area. The preservation policy or program was studied, and the implications for conservators and preservation officers were noted. Twelve of 15 archives responded (response rate of 80 percent). Basic…

  12. Policies and Procedures for Accessing Archived NASA Lunar Data via the Web

    NASA Technical Reports Server (NTRS)

    James, Nathan L.; Williams, David R.

    2011-01-01

    The National Space Science Data Center (NSSDC) was established by NASA to provide for the preservation and dissemination of scientific data from NASA missions. This paper describes the policies specifically related to lunar science data. NSSDC presently archives 660 lunar data collections. Most of these data (423 units) are stored offline in analog format. The remainder of this collection consists of magnetic tapes and discs containing approximately 1.7 TB of digital lunar data. The active archive for NASA lunar data is the Planetary Data System (PDS). NSSDC has an agreement with the PDS Lunar Data Node to assist in the restoration and preparation of NSSDC-resident lunar data upon request for access and distribution via the PDS archival system. Though much of NSSDC's digital store also resides in PDS, NSSDC has many analog data collections and some digital lunar data sets that are not in PDS. NSSDC stands ready to make these archived lunar data accessible to both the research community and the general public upon request as resources allow. Newly requested offline lunar data are digitized and moved to near-line storage devices called digital linear tape jukeboxes. The data are then packaged and made network-accessible via FTP for the convenience of a growing segment of the user community. This publication will 1) discuss the NSSDC processes and policies that govern how NASA lunar data is preserved, restored, and made accessible via the web and 2) highlight examples of special lunar data requests.

  13. Archive of digital and digitized analog boomer seismic reflection data collected during USGS cruise 96CCT02 in Copano, Corpus Christi, and Nueces Bays and Corpus Christi Bayou, Texas, July 1996

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.

    2007-01-01

    In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
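
    For readers who want to inspect such files without dedicated software, the Python sketch below reads the standard SEG-Y rev 1 headers: the 3200-byte EBCDIC textual header and a few fields of the 400-byte binary header. The file name is a placeholder and a standard-conforming file is assumed; this is not a substitute for the SU scripts or USGS viewers mentioned above.

        # Minimal sketch of inspecting a SEG-Y file, assuming the standard SEG-Y rev 1 layout
        # (3200-byte EBCDIC textual header followed by a 400-byte binary header).
        import struct

        def read_segy_headers(path):
            with open(path, "rb") as f:
                textual = f.read(3200).decode("cp037", errors="replace")  # EBCDIC text header
                binary = f.read(400)
            # Offsets within the binary header (0-based), per SEG-Y rev 1:
            sample_interval_us, = struct.unpack(">h", binary[16:18])   # bytes 3217-3218
            samples_per_trace,  = struct.unpack(">h", binary[20:22])   # bytes 3221-3222
            format_code,        = struct.unpack(">h", binary[24:26])   # bytes 3225-3226
            return textual, sample_interval_us, samples_per_trace, format_code

        text, dt_us, ns, fmt = read_segy_headers("line01.sgy")   # placeholder file name
        print(text[:80])   # first card of the textual header
        print(dt_us, "microseconds per sample;", ns, "samples per trace; format", fmt)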

  14. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activity 08LCA04 in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, Central Florida, September 2008

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2009-01-01

    From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  15. Lessons Learned While Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Technical Reports Server (NTRS)

    Pilone, Dan; Mclaughlin, Brett; Plofchan, Peter

    2017-01-01

    NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. NASA's Earth Observing System Data and Information System (EOSDIS) is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, the EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth science missions, representing over 6,000 data products across a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning the multi-petabyte-scale archive of Earth science data products. Reviewed and approved by Chris Lynnes.

  16. ROSETTA: How to archive more than 10 years of mission

    NASA Astrophysics Data System (ADS)

    Barthelemy, Maud; Heather, D.; Grotheer, E.; Besse, S.; Andres, R.; Vallejo, F.; Barnes, T.; Kolokolova, L.; O'Rourke, L.; Fraga, D.; A'Hearn, M. F.; Martin, P.; Taylor, M. G. G. T.

    2018-01-01

    The Rosetta spacecraft was launched in 2004 and, after several planetary and two asteroid fly-bys, arrived at comet 67P/Churyumov-Gerasimenko in August 2014. After escorting the comet for two years and executing its scientific observations, the mission ended on 30 September 2016 with a touchdown on the comet's surface. This paper describes how the Planetary Science Archive (PSA) and the Planetary Data System - Small Bodies Node (PDS-SBN) worked with the Rosetta instrument teams to prepare the science data collected over the course of the Rosetta mission for inclusion in the science archive. As Rosetta is an international mission in collaboration between ESA and NASA, all science data from the mission are fully archived within both the PSA and the PDS. The Rosetta archiving process, supporting tools, archiving systems, and their evolution throughout the mission are described, along with a discussion of a number of the challenges faced during the Rosetta implementation. The paper then presents the current status of the archive for each of the science instruments, before looking to the improvements planned both for the archive itself and for the Rosetta data content. The lessons learned from the first 13 years of archiving on Rosetta are finally discussed with the aim of helping future missions plan and implement their science archives.

  17. Operational environmental satellite archives in the 21st Century

    NASA Astrophysics Data System (ADS)

    Barkstrom, Bruce R.; Bates, John J.; Privette, Jeff; Vizbulis, Rick

    2007-09-01

    NASA, NOAA, and USGS collections of Earth science data are large, federated, and have active user communities and collections. Our experience raises five categories of issues for long-term archival: (1) organization of the data in the collections is not well-described by text-based categorization principles; (2) metadata organization for these data is not well-described by Dublin Core and needs attention to data access and data use patterns; (3) long-term archival requires risk management approaches to dealing with the unique threats to knowledge preservation specific to digital information; (4) long-term archival requires careful attention to archival cost management; and (5) professional data stewards for these collections may require special training. This paper suggests three mechanisms for improving the quality of long-term archival: (1) using a maturity model to assess the readiness of data for accession, for preservation, and for future data usefulness; (2) developing a risk management strategy for systematically dealing with threats of data loss; and (3) developing a life-cycle cost model for continuously evolving the collections and the data centers that house them.

  18. jade: An End-To-End Data Transfer and Catalog Tool

    NASA Astrophysics Data System (ADS)

    Meade, P.

    2017-10-01

    The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.

  19. Stories, skulls, and colonial collections.

    PubMed

    Roque, Ricardo

    2011-01-01

    The essay explores the hypothesis of colonial collecting processes involving the active addition of the colonial context and historical past to museum objects through the production of short stories. It examines the emergent historicity of collections through a focus on the "histories" that museum workers and colonial agents have been attaching to scientific collections of human skulls. Drawing on the notions of collection trajectory and historiographical work, it offers an alternative perspective from which to approach the creation of singular histories and individual archives for objects in collections.

  20. Global positioning system survey data for active seismic and volcanic areas of eastern Sicily, 1994 to 2013

    PubMed Central

    Bonforte, Alessandro; Fagone, Sonia; Giardina, Carmelo; Genovese, Simone; Aiesi, Gianpiero; Calvagna, Francesco; Cantarero, Massimo; Consoli, Orazio; Consoli, Salvatore; Guglielmino, Francesco; Puglisi, Biagio; Puglisi, Giuseppe; Saraceno, Benedetto

    2016-01-01

    This work presents and describes a 20-year-long database of GPS data collected by geodetic surveys over seismically and volcanically active eastern Sicily, for a total of more than 6300 measurements. Raw data were initially collected from the various archives at the Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania-Osservatorio Etneo and organized in a single repository. Here, quality and completeness checks were performed, while all necessary supplementary information was searched for, collected, validated, and organized together with the relevant data. Once all data and information collections were completed, raw binary data were converted into the universal ASCII RINEX format; all data are provided in this format with the necessary information for precise processing. In order to make the data archive readily consultable, we developed software allowing the user to easily search and obtain the needed data by simple alphanumeric and geographic queries. PMID:27479914

  1. Global positioning system survey data for active seismic and volcanic areas of eastern Sicily, 1994 to 2013.

    PubMed

    Bonforte, Alessandro; Fagone, Sonia; Giardina, Carmelo; Genovese, Simone; Aiesi, Gianpiero; Calvagna, Francesco; Cantarero, Massimo; Consoli, Orazio; Consoli, Salvatore; Guglielmino, Francesco; Puglisi, Biagio; Puglisi, Giuseppe; Saraceno, Benedetto

    2016-08-01

    This work presents and describes a 20-year-long database of GPS data collected by geodetic surveys over seismically and volcanically active eastern Sicily, for a total of more than 6300 measurements. Raw data were initially collected from the various archives at the Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Catania-Osservatorio Etneo and organized in a single repository. Here, quality and completeness checks were performed, while all necessary supplementary information was searched for, collected, validated, and organized together with the relevant data. Once all data and information collections were completed, raw binary data were converted into the universal ASCII RINEX format; all data are provided in this format with the necessary information for precise processing. In order to make the data archive readily consultable, we developed software allowing the user to easily search and obtain the needed data by simple alphanumeric and geographic queries.
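
    The query software itself is not reproduced here; as a minimal sketch of the kind of alphanumeric and geographic query described, the following Python filters a hypothetical in-memory catalog by station-name substring, bounding box, and date range. Field names and example records are illustrative only, not the actual schema.

        # Sketch of a simple alphanumeric + geographic query over a hypothetical survey catalog.
        from datetime import date

        catalog = [
            {"station": "EPLU", "lat": 37.75, "lon": 15.00, "obs_date": date(2001, 7, 14)},
            {"station": "SERR", "lat": 37.70, "lon": 14.99, "obs_date": date(2009, 9, 2)},
        ]

        def query(catalog, name_part="", bbox=None, start=None, end=None):
            """Filter by station-name substring, lat/lon bounding box, and date range."""
            out = []
            for rec in catalog:
                if name_part and name_part.upper() not in rec["station"]:
                    continue
                if bbox:
                    lat_min, lat_max, lon_min, lon_max = bbox
                    if not (lat_min <= rec["lat"] <= lat_max and lon_min <= rec["lon"] <= lon_max):
                        continue
                if start and rec["obs_date"] < start:
                    continue
                if end and rec["obs_date"] > end:
                    continue
                out.append(rec)
            return out

        print(query(catalog, bbox=(37.6, 37.8, 14.9, 15.1), start=date(2000, 1, 1)))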

  2. Detection and Characterization of Circulating Tumour Cells from Frozen Peripheral Blood Mononuclear Cells

    PubMed Central

    Lu, David; Graf, Ryon P.; Harvey, Melissa; Madan, Ravi A.; Heery, Christopher; Marte, Jennifer; Beasley, Sharon; Tsang, Kwong Y.; Krupa, Rachel; Louw, Jessica; Wahl, Justin; Bales, Natalee; Landers, Mark; Marrinucci, Dena; Schlom, Jeffrey; Gulley, James L.; Dittamore, Ryan

    2015-01-01

    Retrospective analysis of patient tumour samples is a cornerstone of clinical research. CTC biomarker characterization offers a non-invasive method to analyse patient samples. However, current CTC technologies require prospective blood collection, thereby reducing the ability to utilize archived clinical cohorts with long-term outcome data. We sought to investigate CTC recovery from frozen, archived patient PBMC pellets. Matched samples from both mCRPC patients and mock samples, which were prepared by spiking healthy donor blood with cultured prostate cancer cell line cells, were processed “fresh” via the Epic CTC Platform or from “frozen” PBMC pellets. Samples were analysed for CTC enumeration and biomarker characterization via immunofluorescent (IF) biomarkers, fluorescence in-situ hybridization (FISH), and CTC morphology. In the frozen patient PBMC samples, the median CTC recovery was 18%, compared to the freshly processed blood. However, abundance and localization of cytokeratin (CK) and androgen receptor (AR) protein, as measured by IF, were largely concordant between the fresh and frozen CTCs. Furthermore, a FISH analysis of PTEN loss showed high concordance in fresh vs. frozen. The observed data indicate that CTC biomarker characterization from frozen archival samples is feasible and representative of prospectively collected samples. PMID:28936240

  3. National Aeronautics and Space Administration Biological Specimen Repository

    NASA Technical Reports Server (NTRS)

    McMonigal, Kathleen A.; Pietrzyk, Robert a.; Johnson, Mary Anne

    2008-01-01

    The National Aeronautics and Space Administration Biological Specimen Repository (Repository) is a storage bank that is used to maintain biological specimens over extended periods of time and under well-controlled conditions. Samples from the International Space Station (ISS), including blood and urine, will be collected, processed and archived during the preflight, inflight and postflight phases of ISS missions. This investigation has been developed to archive biosamples for use as a resource for future space flight related research. The International Space Station (ISS) provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration class missions. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can study space flight related changes and investigate physiological markers. The development of the National Aeronautics and Space Administration Biological Specimen Repository will allow for the collection, processing, storage, maintenance, and ethical distribution of biosamples to meet goals of scientific and programmatic relevance to the space program. Archiving of the biosamples will provide future research opportunities including investigating patterns of physiological changes, analysis of components unknown at this time or analyses performed by new methodologies.

  4. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  5. 78 FR 35273 - Agency Information Collection Activities; Proposed Collection; Comment Request; General Licensing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-12

    ..., to FDA for approval to market a product in interstate commerce. The container and package labeling... be in electronic format and in a form that FDA can process, review, and archive. This requirement is... 356h ``Application to Market a New Drug, Biologic, or an Antibiotic Drug for Human Use'' to harmonize...

  6. Preservation and Access to Manuscript Collections of the Czech National Library.

    ERIC Educational Resources Information Center

    Karen, Vladimir; Psohlavec, Stanislav

    In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. HTML coded description is added to the entire…

  7. 76 FR 29012 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-19

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... Personnel Records Center (NPRC) of the National Archives and Records Administration (NARA) administers...

  8. The development of a digitising service centre for natural history collections

    PubMed Central

    Tegelberg, Riitta; Haapala, Jaana; Mononen, Tero; Pajari, Mika; Saarenmaa, Hannu

    2012-01-01

    Digitarium is a joint initiative of the Finnish Museum of Natural History and the University of Eastern Finland. It was established in 2010 as a dedicated shop for the large-scale digitisation of natural history collections. Digitarium offers service packages based on the digitisation process, including tagging, imaging, data entry, georeferencing, filtering, and validation. During the process, all specimens are imaged, and distance workers take care of the data entry from the images. The customer receives the data in Darwin Core Archive format, as well as images of the specimens and their labels. Digitarium also offers the option of publishing images through Morphbank, sharing data through GBIF, and archiving data for long-term storage. Service packages can also be designed on demand to respond to the specific needs of the customer. The paper also discusses logistics, costs, and intellectual property rights (IPR) issues related to the work that Digitarium undertakes. PMID:22859879

  9. Kepler Data Release 4 Notes

    NASA Technical Reports Server (NTRS)

    Van Cleve, Jeffrey (Editor); Jenkins, Jon; Caldwell, Doug; Allen, Christopher L.; Batalha, Natalie; Bryson, Stephen T.; Chandrasekaran, Hema; Clarke, Bruce D.; Cote, Miles T.; Dotson, Jessie L.

    2010-01-01

    The Data Analysis Working Group has released long and short cadence materials, including FFIs and dropped targets, to the public. The Kepler Science Office considers Data Release 4 to provide "browse quality" data. These notes have been prepared to give Kepler users of the Multimission Archive at STScI (MAST) a summary of how the data were collected and prepared, and how well the data processing pipeline is functioning on flight data. They will be updated for each release of data to the public archive and placed on MAST along with other Kepler documentation, at http://archive.stsci.edu/kepler/documents.html. Data release 3 is meant to give users the opportunity to examine the data for possibly interesting science and to involve the users in improving the pipeline for future data releases. To perform the latter service, users are encouraged to notice and document artifacts, either in the raw or processed data, and report them to the Science Office.

  10. 77 FR 56234 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... not be viewed or advertised as an endorsement by the National Archives and Records Administration...

  11. AVIRIS and TIMS data processing and distribution at the land processes distributed active archive center

    NASA Technical Reports Server (NTRS)

    Mah, G. R.; Myers, J.

    1993-01-01

    The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAACs) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65,000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California; Cuprite, Nevada; and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEM) for the topographic information.
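
    A simplified way to picture the band-simulation step is to average the narrow imaging-spectrometer channels falling within each broad ASTER band pass. The Python sketch below does this for ASTER-like VNIR bands; the band edges are approximate and the data cube is a synthetic placeholder, so this is an illustration of the idea rather than the DAAC's actual procedure.

        # Sketch of simulating broad ASTER-like VNIR bands from narrow spectrometer channels
        # by averaging all channels whose centers fall inside each band pass.
        import numpy as np

        # Approximate ASTER VNIR band passes in micrometers (illustrative values).
        ASTER_VNIR = {"band1": (0.52, 0.60), "band2": (0.63, 0.69), "band3": (0.76, 0.86)}

        def simulate_bands(cube, wavelengths_um, band_passes):
            """cube: (rows, cols, channels) reflectance; wavelengths_um: channel centers."""
            out = {}
            for name, (lo, hi) in band_passes.items():
                sel = (wavelengths_um >= lo) & (wavelengths_um <= hi)
                out[name] = cube[:, :, sel].mean(axis=2)   # unweighted average over the pass
            return out

        # Synthetic stand-in for a 224-channel cube spanning 0.4-2.5 um.
        wl = np.linspace(0.4, 2.5, 224)
        cube = np.random.rand(100, 100, 224)
        sim = simulate_bands(cube, wl, ASTER_VNIR)
        print({k: v.shape for k, v in sim.items()})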

  12. [Historical heritage and medical progress: the destiny of the scientific collections in the 'modern' Hospital of Santa Maria Nuova in Florence].

    PubMed

    Diana, Esther

    2008-01-01

    The scientific collections of the Florentine hospital of Santa Maria Nuova stimulated new interest in the second half of the eighteenth century. Indeed, the modernization process of the hospital led to a steadily increasing alienation of its rich historical heritage, including the scientific collections. Archival documents attest to the sale, or the museum valorization, of a number of collections, including mathematical instruments and the anatomical, surgical, and obstetrical wax ones.

  13. Ensuring Credit to Data Creators: A Case Study for Geodesy

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Gorman, A.

    2011-12-01

    UNAVCO, the NSF and NASA-funded facility that supports and promotes Earth science by advancing high-precision techniques for the measurement of crustal deformation, has operated a Global Navigation Satellite System (GNSS) Data Archive since 1992. For the GNSS domain, the UNAVCO Archive has established best practices for data and metadata preservation, and provides tools for openly tracking data provenance. The GNSS data collection at the UNAVCO Archive represents the efforts of over 400 principal investigators and uncounted years of effort by these individuals and their students in globally distributed field installations, sometimes in situations of significant danger, whether from geologic hazards or political/civil unrest. Our investigators also expend considerable effort in following best practices for data and metadata management. UNAVCO, with the support of its consortium membership, has committed to an open data policy for data in the Archive. Once the data and metadata are archived by UNAVCO, they are distributed by anonymous access to thousands of users who cannot be accurately identified. Consequently, the UNAVCO commitment to open data access was reached with a degree of trepidation on the part of a segment of the principal investigators who contribute their data with no guarantee that their colleagues (or competitors) will follow a code of ethics in their research and publications with respect to the data they have downloaded from the UNAVCO Archive. The UNAVCO community has recognized the need to develop, adopt, and follow a data citation policy among themselves and to advocate for data citation more generally within the science publication arena. The role of the UNAVCO Archive in this process has been to provide data citation guidance and to develop and implement mechanisms to assign digital object identifiers (DOIs) to data sets within the UNAVCO Archive. The UNAVCO community is interested in digital object identifiers primarily as a means to facilitate citation for the purpose of ensuring credit to the data creators. UNAVCO's archiving and metadata management systems are generally well-suited to assigning and maintaining DOIs for two styles of logical collections of data: campaigns, which are spatially and temporally well-defined; and stations, which represent ongoing collection at a single spatial position at the Earth's surface. These two styles form the basis for implementing approximately 3,000 DOIs that can encompass the current holdings in the UNAVCO Archive. In addition, aggregations of DOIs into a superset DOI is advantageous for numerous cases where groupings of stations are naturally used in research studies. There are about 100 such natural collections of stations. However, research using GNSS data can also utilize several hundred or more stations in unique combinations, where tallying the individual DOIs within a reference list is cumbersome. We are grappling with the complexities that inevitably crop up when assigning DOIs, including subsetting, versioning, and aggregating. We also foresee the need for mechanisms for users to go beyond our predefined collections and/or aggregations to define their own ad-hoc collections. Our goal is to create a system for DOI assignment and utilization that succeeds in facilitating data citation within our community of geodesy scientists.

  14. 2015 Cataloging Hidden Special Collections and Archives Unconference and Symposium: Innovation, Collaboration, and Models. Proceedings of the CLIR Cataloging Hidden Special Collections and Archives Symposium (Philadelphia, Pennsylvania, March 12-13, 2015)

    ERIC Educational Resources Information Center

    Oestreicher, Cheryl, Ed.

    2015-01-01

    The 2015 CLIR Unconference & Symposium was the capstone event to seven years of grant funding through CLIR's Cataloging Hidden Special Collections and Archives program. These proceedings group presentations by theme. Collaborations provides examples of multi-institutional projects, including one international collaboration; Student and Faculty…

  15. The global Landsat archive: Status, consolidation, and direction

    USGS Publications Warehouse

    Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.

    2016-01-01

    New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility as with Landsat-8, a network of receiving stations, one operated by the U.S. government, the other operated by a community of International Cooperators (ICs), was utilized. ICs paid a fee for the right to receive and distribute Landsat data and, over time, more Landsat data were held outside the archive of the United States Geological Survey (USGS) than were held inside, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015, there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to that of having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.

  16. 75 FR 52992 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... records matter. The information will support adjustments in this offering that will improve the overall...

  17. 77 FR 50532 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-21

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... Personnel Folders or Employee Medical Folders from the National Personnel Records Center (NPRC) of the...

  18. Status of the TESS Science Processing Operations Center

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Twicken, Joseph D.; Campbell, Jennifer; Tenebaum, Peter; Sanderfer, Dwight; Davies, Misty D.; Smith, Jeffrey C.; Morris, Rob; Mansouri-Samani, Masoud; Girouardi, Forrest

    2017-01-01

    The Transiting Exoplanet Survey Satellite (TESS) science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler Mission science pipeline. Like the Kepler pipeline, the TESS science pipeline will provide calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars, observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline will search through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline will generate a suite of diagnostic metrics for each transit-like signature discovered, and extract planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search will be modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST.
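
    The core idea of the transit search can be illustrated with a box least squares (BLS) periodogram. The Python sketch below uses astropy's BoxLeastSquares on a synthetic light curve with an injected transit; the SPOC pipeline itself is far more elaborate (detrending, multiple-event statistics, limb-darkened model fits), so this is only a conceptual example.

        # Conceptual sketch of a transit search: a BLS periodogram over a synthetic light curve.
        import numpy as np
        from astropy.timeseries import BoxLeastSquares

        rng = np.random.default_rng(0)
        t = np.arange(0, 27.0, 0.02)                      # ~27 days of observations (days)
        flux = 1.0 + 1e-4 * rng.standard_normal(t.size)   # flat light curve with noise
        period_true, duration, depth = 3.7, 0.1, 5e-4
        in_transit = ((t % period_true) / period_true) < (duration / period_true)
        flux[in_transit] -= depth                          # inject a box-shaped transit

        bls = BoxLeastSquares(t, flux)
        result = bls.autopower(duration)                   # scan a grid of trial periods
        best = np.argmax(result.power)
        print("recovered period:", result.period[best], "depth:", result.depth[best])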

  19. Another New Frontier: Archives and Manuscripts in the National Park Service.

    ERIC Educational Resources Information Center

    Bowling, Mary B.

    1985-01-01

    Archival collections of Edison, Olmsted, Morristown, and Longfellow National Historic Sites offer examples of how documentary collections have been handled in the past, and of ways in which National Park Service is beginning to address cultural resource management issues (arrangement, preservation, cataloging, research use) of archives and…

  20. A biological survey on the Ottoman Archive papers and determination of the D10 value

    NASA Astrophysics Data System (ADS)

    Kantoğlu, Ömer; Ergun, Ece; Ozmen, Dilan; Halkman, Hilal B. D.

    2018-03-01

    The Ottoman Archives hold one of the richest archival collections in the world. However, not all the archived documents are well preserved, and some undergo biodeterioration. Therefore, a rapid and promising treatment method is necessary to preserve the collection for future generations as heritage. Irradiation offers an alternative for the treatment of archival materials for this purpose. In this study, we conducted a survey to determine the contaminating species and the D10 values of samples obtained from the shelves of the Ottoman Archives. The samples also included several insect pests collected using a pheromone trap placed in the archive storage room. With the exception of a few localized problems, no active pest presence was observed. The D10 values of the mold contamination and of the reference mold (A. niger) were found to be 1.0 and 0.68 kGy, respectively. Based on these results, it can be concluded that an absorbed dose of 6 kGy is required to remove the contamination from the materials stored in the Ottoman Archives.
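
    The conclusion follows from the standard log-reduction relation between the D10 value and the absorbed dose. A short worked form is given below in LaTeX notation, using the D10 of 1.0 kGy reported for the mold contamination and a conventional 6-log reduction target (the 6-log target is an assumption for illustration, not stated explicitly above):

        D = D_{10}\,\log_{10}\!\left(\frac{N_{0}}{N}\right), \qquad
        D = 1.0\ \mathrm{kGy}\times\log_{10}\!\left(10^{6}\right) = 6\ \mathrm{kGy}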

  1. Building an archives in a medical library.

    PubMed Central

    Sammis, S K

    1984-01-01

    In 1979 the University of Medicine and Dentistry of New Jersey established an archives to collect, preserve, and retrieve important documentation related to its history. This paper examines various steps in building an archives and the development of a coherent collection policy, including potential sources for archival material. Problems and possible solutions concerning what to preserve from the vast quantities of material generated by an institution are considered. The relationship between the archives and the medical library and the requirements of the physical plant are discussed, including the storage and preservation of materials. PMID:6743876

  2. 78 FR 45569 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-29

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2013-039] Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA... make inquiries on their behalf and to release information and records related to their Freedom of...

  3. NASA's astrophysics archives at the National Space Science Data Center

    NASA Technical Reports Server (NTRS)

    Vansteenberg, M. E.

    1992-01-01

    NASA maintains an archive facility for astronomical science data collected from NASA's missions at the National Space Science Data Center (NSSDC) at Goddard Space Flight Center. This archive was created to ensure that the science data collected by NASA would be preserved and usable in the future by the science community. Through 25 years of operation, many lessons have been learned about data collection procedures, archive preservation methods, and distribution to the community. This document presents some of these more important lessons, for example: KISS (Keep It Simple, Stupid) in system development. Also addressed are some of the myths of archiving, such as 'scientists always know everything about everything', or 'it cannot possibly be that hard, after all simple data tech's do it'. There are indeed good reasons that a proper archive capability is needed by the astronomical community; the important question is how to use the existing expertise as well as new, innovative ideas to do the best job of archiving these valuable science data.

  4. A Waveform Archiving System for the GE Solar 8000i Bedside Monitor.

    PubMed

    Fanelli, Andrea; Jaishankar, Rohan; Filippidis, Aristotelis; Holsapple, James; Heldt, Thomas

    2018-01-01

    Our objective was to develop, deploy, and test a data-acquisition system for the reliable and robust archiving of high-resolution physiological waveform data from a variety of bedside monitoring devices, including the GE Solar 8000i patient monitor, and for the logging of ancillary clinical and demographic information. The data-acquisition system consists of a computer-based archiving unit and a GE Tram Rac 4A that connects to the GE Solar 8000i monitor. Standard physiological front-end sensors connect directly to the Tram Rac, which serves as a port replicator for the GE monitor and provides access to these waveform signals through an analog data interface. Together with the GE monitoring data streams, we simultaneously collect the cerebral blood flow velocity envelope from a transcranial Doppler ultrasound system and a non-invasive arterial blood pressure waveform along a common time axis. All waveform signals are digitized and archived through a LabView-controlled interface that also allows for the logging of relevant meta-data such as clinical and patient demographic information. The acquisition system was certified for hospital use by the clinical engineering team at Boston Medical Center, Boston, MA, USA. Over a 12-month period, we collected 57 datasets from 11 neuro-ICU patients. The system provided reliable and failure-free waveform archiving. We measured an average temporal drift between waveforms from different monitoring devices of 1 ms every 66 min of recorded data. The waveform acquisition system allows for robust real-time data acquisition, processing, and archiving of waveforms. The temporal drift between waveforms archived from different devices is entirely negligible, even for long-term recording.
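
    One simple way to quantify the relative timing of two archived streams is to cross-correlate a signal common to both and read off the lag at the correlation peak. The Python sketch below illustrates this on synthetic data; it is an illustration of the concept, not the drift-measurement procedure used by the authors, and the sampling rate and signals are placeholders.

        # Sketch of estimating the time offset between two streams via cross-correlation.
        import numpy as np

        fs = 125.0                                    # assumed sampling rate, Hz
        t = np.arange(0, 60.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        reference = rng.standard_normal(t.size)       # stand-in for a shared waveform
        true_lag_samples = 3
        other = np.roll(reference, true_lag_samples) + 0.05 * rng.standard_normal(t.size)

        def estimate_lag(a, b):
            """Return the lag (in samples) of b relative to a at the correlation peak."""
            a = a - a.mean()
            b = b - b.mean()
            corr = np.correlate(b, a, mode="full")
            return np.argmax(corr) - (len(a) - 1)

        lag = estimate_lag(reference, other)
        print("estimated lag:", lag, "samples =", lag / fs * 1e3, "ms")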

  5. From Metric Image Archives to Point Cloud Reconstruction: Case Study of the Great Mosque of Aleppo in Syria

    NASA Astrophysics Data System (ADS)

    Grussenmeyer, P.; Khalil, O. Al

    2017-08-01

    The paper presents photogrammetric archives from Aleppo (Syria), collected between 1999 and 2002 by the Committee for maintenance and restoration of the Great Mosque in partnership with the Engineering Unit of the University of Aleppo. During that period, terrestrial photogrammetric data and geodetic surveys of the Great Omayyad mosque were recorded for documentation purposes and geotechnical studies. During the recent war in Syria, the Mosque has unfortunately been seriously damaged and its minaret has been completely destroyed. The paper presents a summary of the documentation available from the past projects as well as solutions of 3D reconstruction based on the processing of the photogrammetric archives with the latest 3D image-based techniques.

  6. Ensemble LUT classification for degraded document enhancement

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Frieder, Ophir

    2008-01-01

    The fast evolution of scanning and computing technologies has led to the creation of large collections of scanned paper documents. Examples of such collections include historical collections, legal depositories, medical archives, and business archives. Moreover, in many situations, such as legal litigation and security investigations, scanned collections are being used to facilitate systematic exploration of the data. It is almost always the case that scanned documents suffer from some form of degradation. Large degradations make documents hard to read and substantially deteriorate the performance of automated document processing systems. Enhancement of degraded document images is normally performed assuming global degradation models. When the degradation is large, global degradation models do not perform well. In contrast, we propose to estimate local degradation models and use them in enhancing degraded document images. Using a semi-automated enhancement system, we have labeled a subset of the Frieder diaries collection. This labeled subset was then used to train an ensemble classifier. The component classifiers are based on lookup tables (LUTs) in conjunction with an approximate nearest neighbor algorithm. The resulting algorithm is highly efficient. Experimental evaluation results are provided using the Frieder diaries collection.
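
    A minimal sketch of a single lookup-table component of this kind is shown below in Python: it maps each 3x3 binarized neighborhood in a degraded image to the most frequent clean center pixel seen in training pairs. The paper's system combines an ensemble of such classifiers with an approximate-nearest-neighbor fallback; the images here are synthetic placeholders.

        # Minimal single-LUT sketch: learn neighborhood -> clean-center-pixel mapping.
        import numpy as np
        from collections import Counter, defaultdict

        def neighborhood_key(img, r, c):
            """Pack the 3x3 binary neighborhood around (r, c) into a 9-bit integer."""
            patch = img[r - 1:r + 2, c - 1:c + 2].ravel()
            return int("".join("1" if v else "0" for v in patch), 2)

        def train_lut(degraded, clean):
            votes = defaultdict(Counter)
            for r in range(1, degraded.shape[0] - 1):
                for c in range(1, degraded.shape[1] - 1):
                    votes[neighborhood_key(degraded, r, c)][clean[r, c]] += 1
            return {k: v.most_common(1)[0][0] for k, v in votes.items()}

        def apply_lut(degraded, lut):
            out = degraded.copy()
            for r in range(1, degraded.shape[0] - 1):
                for c in range(1, degraded.shape[1] - 1):
                    out[r, c] = lut.get(neighborhood_key(degraded, r, c), degraded[r, c])
            return out

        rng = np.random.default_rng(1)
        clean = (rng.random((64, 64)) > 0.8).astype(np.uint8)                    # synthetic "clean" page
        degraded = np.where(rng.random(clean.shape) < 0.05, 1 - clean, clean)    # flip 5% of pixels
        lut = train_lut(degraded, clean)
        restored = apply_lut(degraded, lut)
        print("pixels changed:", int((restored != degraded).sum()))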

  7. The Heinz Electronic Library Interactive Online System (HELIOS): Building a Digital Archive Using Imaging, OCR, and Natural Language Processing Technologies.

    ERIC Educational Resources Information Center

    Galloway, Edward A.; Michalek, Gabrielle V.

    1995-01-01

    Discusses the conversion project of the congressional papers of Senator John Heinz into digital format and the provision of electronic access to these papers by Carnegie Mellon University. Topics include collection background, project team structure, document processing, scanning, use of optical character recognition software, verification…

  8. Security Considerations for Archives: Rare Book, Manuscript, and Other Special Collections.

    ERIC Educational Resources Information Center

    Cupp, Christian M.

    The first of six sections in this guide to security for special collections in archives and libraries discusses the importance of security and the difficulty of preventing theft of archival materials. The second section, which focuses on planning, recommends an inservice training program for staff, a planned communications network between library…

  9. Defense Advanced Research Projects Agency (DARPA) Network Archive (DNA)

    DTIC Science & Technology

    2008-12-01

    therefore decided for an iterative development process even within such a small project. The first iteration consisted of conducting specific...

  10. Quality-assurance plan for groundwater activities, U.S. Geological Survey, Washington Water Science Center

    USGS Publications Warehouse

    Kozar, Mark D.; Kahle, Sue C.

    2013-01-01

    This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.

  11. A Relevancy Algorithm for Curating Earth Science Data Around Phenomenon

    NASA Technical Reports Server (NTRS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-01-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship, causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology, including the assumptions, the process, and the relevancy ranking algorithm, are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a standalone web service that is utilized to augment search and usability of data in a variety of tools.

  12. A relevancy algorithm for curating earth science data around phenomenon

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.

    2017-09-01

    Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a stand-alone web service that is utilized to augment search and usability of data in a variety of tools.
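
    As a rough sketch of the general idea (not the paper's algorithm), the Python below scores metadata records against a phenomenon keyword set using term counts weighted by inverse document frequency; the records and keywords are placeholders.

        # Illustrative scorer: rank metadata records against a phenomenon keyword set.
        import math
        from collections import Counter

        records = {
            "rec1": "hurricane precipitation rates from passive microwave imager swaths",
            "rec2": "land surface temperature climatology derived from infrared sounder",
            "rec3": "tropical cyclone wind and rain retrievals merged multi-satellite",
        }
        phenomenon = ["hurricane", "tropical", "cyclone", "rain", "precipitation"]

        doc_tokens = {rid: Counter(txt.lower().split()) for rid, txt in records.items()}
        n_docs = len(records)

        def idf(term):
            # Terms common to all records count less than rare, discriminating terms.
            df = sum(1 for toks in doc_tokens.values() if term in toks)
            return math.log((1 + n_docs) / (1 + df)) + 1.0

        def score(rid):
            toks = doc_tokens[rid]
            return sum(toks[term] * idf(term) for term in phenomenon)

        for rid in sorted(records, key=score, reverse=True):
            print(rid, round(score(rid), 3))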

  13. Electron beam for preservation of biodeteriorated cultural heritage paper-based objects

    NASA Astrophysics Data System (ADS)

    Chmielewska-Śmietanko, Dagmara; Gryczka, Urszula; Migdał, Wojciech; Kopeć, Kamil

    2018-02-01

    Unsuitable storage conditions or accidents such as floods can present a serious threat to large quantities of books, making them prone to attack by harmful microorganisms. The microbiological degradation of archives and book collections can be efficiently inhibited with irradiation processing. Application of EB irradiation to book and archive collections can also be a very effective alternative to the commonly used ethylene oxide treatment, which is toxic to humans and the natural environment. In this study, the influence of EB irradiation used for the microbiological decontamination process on paper-based objects was evaluated. Three different kinds of paper (Whatman CHR 1, office paper, and newsprint paper) were treated with 0.4, 1, 2, 5, 10, and 25 kGy electron beam irradiation. Optical and mechanical properties of the different sorts of paper treated with e-beam were studied before and after the radiation process. These results, correlated with the absorbed radiation doses effective for the elimination of Aspergillus niger (A. niger), allowed us to determine that EB irradiation with an absorbed dose of 5 kGy ensures safe decontamination of different sorts of paper-based objects.

  14. The use of museum specimens with high-throughput DNA sequencers

    PubMed Central

    Burrell, Andrew S.; Disotell, Todd R.; Bergey, Christina M.

    2015-01-01

    Natural history collections have long been used by morphologists, anatomists, and taxonomists to probe the evolutionary process and describe biological diversity. These biological archives also offer great opportunities for genetic research in taxonomy, conservation, systematics, and population biology. They allow assays of past populations, including those of extinct species, giving context to present patterns of genetic variation and direct measures of evolutionary processes. Despite this potential, museum specimens are difficult to work with because natural postmortem processes and preservation methods fragment and damage DNA. These problems have restricted geneticists’ ability to use natural history collections primarily by limiting how much of the genome can be surveyed. Recent advances in DNA sequencing technology, however, have radically changed this, making truly genomic studies from museum specimens possible. We review the opportunities and drawbacks of the use of museum specimens, and suggest how to best execute projects when incorporating such samples. Several high-throughput (HT) sequencing methodologies, including whole genome shotgun sequencing, sequence capture, and restriction digests (demonstrated here), can be used with archived biomaterials. PMID:25532801

  15. Adaptability in the Development of Data Archiving Services at Johns Hopkins University

    NASA Astrophysics Data System (ADS)

    Petters, J.; DiLauro, T.; Fearon, D.; Pralle, B.

    2015-12-01

    Johns Hopkins University (JHU) Data Management Services provides archiving services for institutional researchers through the JHU Data Archive, thereby increasing the access to and use of their research data. From its inception, our unit's archiving service has evolved considerably. While some of these changes have been internally driven so that our unit can archive quality data collections more efficiently, we have also developed archiving policies and procedures on the fly in response to researcher needs. Providing our archiving services for JHU research groups from a variety of research disciplines has surfaced different sets of expectations and needs. We have used each interaction to help us refine our services and quickly satisfy the researchers we serve (following the first agile principle). Here we discuss the development of our newest archiving service model, its implementation over the past several months, and the processes by which we have continued to refine and improve our archiving services since its implementation. Through this discussion we will illustrate the benefits of planning, structure, and flexibility in the development of archiving services that maximize the potential value of research data. We will describe interactions with research groups, including those from environmental engineering and international health, and how we were able to rapidly modify and develop our archiving services to meet their needs (e.g., in an 'agile' way). For example, our interactions with both of these research groups led first to discussion in regular standing meetings and eventually to the development of new archiving policies and procedures. These policies and procedures centered on limiting access to archived research data while associated manuscripts progress through peer review and publication.

  16. Unfinished Business: The Uneven Past and Uncertain Future of One Historically Black University's Archives--A Personal Reflection

    ERIC Educational Resources Information Center

    Pevar, Susan Gunn

    2011-01-01

    This article presents a perspective on how the restructuring of a historically black university's library and resulting closure of its special collections and archives puts important records pertaining to African American history in jeopardy. This article traces the recent history of special collections and archives at the Lincoln University…

  17. An automated, web-enabled and searchable database system for archiving electrogram and related data from implantable cardioverter defibrillators.

    PubMed

    Zong, W; Wang, P; Leung, B; Moody, G B; Mark, R G

    2002-01-01

    The advent of implantable cardioverter defibrillators (ICDs) has resulted in significant reductions in mortality in patients at high risk for sudden cardiac death. Extensive related basic research and clinical investigation continue. ICDs typically record intracardiac electrograms and inter-beat intervals along with device settings during episodes of device-delivered therapy. Researchers wishing to study these data further have until now been limited to viewing paper plots. In support of multi-center clinical studies of patients with ICDs, we have developed a web-based, searchable ICD data archiving system, which allows users to use a web browser to upload ICD data from diskettes to a server where the data are automatically processed and archived. Users can view and download the archived ICD data directly via the web. The entire system is built from open source software. At present more than 500 patient ICD data sets have been uploaded to and archived in the system. This project will be of value not only to those who wish to conduct research using ICD data, but also to clinicians who need to archive and review ICD data collected from their patients.
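
    The record gives no implementation details beyond the use of open source software. Purely as an illustration of the upload-then-archive pattern it describes, a minimal web endpoint might look like the sketch below (Python with Flask; the route, directory, and field names are hypothetical, not those of the actual system).

        from pathlib import Path
        from flask import Flask, request

        app = Flask(__name__)
        ARCHIVE_DIR = Path("icd_archive")          # hypothetical local archive location
        ARCHIVE_DIR.mkdir(exist_ok=True)

        @app.route("/upload", methods=["POST"])
        def upload():
            # Accept an ICD data file posted from a browser form, store it in the
            # archive directory, and acknowledge; a real system would also parse the
            # electrograms and index them so they can be searched and viewed online.
            f = request.files["icd_data"]
            destination = ARCHIVE_DIR / f.filename
            f.save(str(destination))
            return {"archived": f.filename}

        if __name__ == "__main__":
            app.run()

    A form post (for example, curl -F "icd_data=@episode001.dat" http://localhost:5000/upload) would exercise the upload step; the automated processing and search layers described in the abstract would sit behind it.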

  18. Empirical analysis and modeling of manual turnpike tollbooths in China

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2017-03-01

    To address low levels of service satisfaction at tollbooths on many turnpikes in China, we conduct an empirical study and use a queueing model to investigate performance measures. In this paper, we collect archived data from six tollbooths of a turnpike in China. An empirical analysis of the vehicles' time-dependent arrival process and the collectors' time-dependent service times is conducted. It shows that the vehicle arrival process follows a non-homogeneous Poisson process while the collector service time follows a log-normal distribution. Further, for mathematical tractability we model the toll-collection process at tollbooths as a MAP/PH/1/FCFS queue and present some numerical examples.
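
    The fitted model parameters are not reproduced in this record. As a rough illustration of the modeling idea only, the sketch below (Python, with made-up rates) simulates a single FCFS tollbooth fed by a time-varying Poisson arrival stream (generated by thinning) with log-normal service times and reports the average wait; it uses a plain non-homogeneous Poisson arrival process rather than the paper's MAP/PH/1/FCFS formulation.

        import math
        import random

        # Hypothetical time-varying arrival rate (vehicles per minute) over one hour.
        def arrival_rate(t):
            return 2.0 + 1.5 * math.sin(math.pi * t / 60.0)

        LAMBDA_MAX = 3.5                    # upper bound on arrival_rate, used for thinning
        HORIZON = 60.0                      # simulated period, minutes
        MU, SIGMA = math.log(0.2), 0.5      # hypothetical log-normal service-time parameters

        def mean_wait(seed=0):
            rng = random.Random(seed)
            # Generate non-homogeneous Poisson arrivals by thinning a rate-LAMBDA_MAX stream.
            arrivals, t = [], 0.0
            while True:
                t += rng.expovariate(LAMBDA_MAX)
                if t > HORIZON:
                    break
                if rng.random() < arrival_rate(t) / LAMBDA_MAX:
                    arrivals.append(t)
            # Serve vehicles one at a time, first-come first-served, with log-normal service times.
            waits, server_free_at = [], 0.0
            for a in arrivals:
                start = max(a, server_free_at)
                waits.append(start - a)
                server_free_at = start + rng.lognormvariate(MU, SIGMA)
            return sum(waits) / len(waits) if waits else 0.0

        print("average wait (minutes): %.2f" % mean_wait())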

  19. 77 FR 53921 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... following information collections: 1. Title: Request to Microfilm Records. OMB number: 3095-0017. Agency...

  20. Archive of ground penetrating radar data collected during USGS field activity 13BIM01—Dauphin Island, Alabama, April 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Smith, Christopher G.; Reynolds, Billy J.

    2016-03-18

    From April 13 to 20, 2013, scientists from the U.S. Geological Survey St. Petersburg Coastal and Marine Science Center (USGS-SPCMSC) conducted geophysical and sediment sampling surveys on Dauphin Island, Alabama, as part of Field Activity 13BIM01. The objectives of the study were to quantify inorganic and organic accretion rates in back-barrier and mainland marsh and estuarine environments. Various field and laboratory methods were used to achieve these objectives, including subsurface imaging using Ground Penetrating Radar (GPR), sediment sampling, lithologic and microfossil analyses, and geochronology techniques to produce barrier island stratigraphic cross sections to help interpret the recent (last 2000 years) geologic evolution of the island. This data series report is an archive of GPR and associated Global Positioning System (GPS) data collected in April 2013 from Dauphin Island and adjacent barrier-island environments. In addition to GPR data, marsh core and vibracore data were also collected but are not reported (or included) in the current report. Data products, including elevation-corrected subsurface profile images of the processed GPR data, unprocessed digital GPR trace data, post-processed GPS data, Geographic Information System (GIS) files and accompanying Federal Geographic Data Committee (FGDC) metadata, can be downloaded from the Data Downloads page.

  1. 39. Photocopy of engineering drawing (LBNL Archives and Records Collection). ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    39. Photocopy of engineering drawing (LBNL Archives and Records Collection). December 10, 1948. 2 BEVATRON EXTERIOR PRELIMINARY PERSPECTIVE - University of California Radiation Laboratory, Bevatron, 1 Cyclotron Road, Berkeley, Alameda County, CA

  2. Connecting to Collections in Florida: Current Conditions and Critical Needs in Libraries, Archives, and Museums

    ERIC Educational Resources Information Center

    Jorgensen, Corinne; Marty, Paul F.; Braun, Kathy

    2012-01-01

    This article presents results from an IMLS-funded project to evaluate the current state of collections in Florida's libraries, archives, and museums, current practices to preserve and conserve these collections, and perceived needs to maintain and improve these collections for future generations. The survey, modeled after the Heritage Health Index…

  3. Making Archival and Special Collections More Accessible

    ERIC Educational Resources Information Center

    Renspie, Melissa, Comp.; Shepard, Linda, Comp.; Childress, Eric, Comp.

    2015-01-01

    Revealing hidden assets stewarded by research institutions so they can be made available for research and learning locally and globally is a prime opportunity for libraries to create and deliver new value. "Making Archival and Special Collections More Accessible" collects important work OCLC Research has done to help achieve the…

  4. Archive of digital chirp subbottom profile data collected during USGS Cruise 13GFP01, Brownlee Dam and Hells Canyon Reservoir, Idaho and Oregon, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.

    2014-01-01

    From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.

  5. Digitized Archival Primary Sources in STEM: A Selected Webliography

    ERIC Educational Resources Information Center

    Jankowski, Amy

    2017-01-01

    Accessibility and findability of digitized archival resources can be a challenge, particularly for students or researchers not familiar with archival formats and digital interfaces, which adhere to different descriptive standards than more widely familiar library resources. Numerous aggregate archival collection databases exist, which provide a…

  6. 38. Photocopy of engineering drawing (LBNL Archives and Records Collection). ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    38. Photocopy of engineering drawing (LBNL Archives and Records Collection). December 10, 1948. 1 BEVATRON EXTERIOR PRELIMINARY PERSPECTIVE-BIRD'S-EYE VIEW - University of California Radiation Laboratory, Bevatron, 1 Cyclotron Road, Berkeley, Alameda County, CA

  7. Documenting genomics: Applying archival theory to preserving the records of the Human Genome Project.

    PubMed

    Shaw, Jennifer

    2016-02-01

    The Human Genome Archive Project (HGAP) aimed to preserve the documentary heritage of the UK's contribution to the Human Genome Project (HGP) by using archival theory to develop a suitable methodology for capturing the results of modern, collaborative science. After assessing past projects and different archival theories, the HGAP used an approach based on the theory of documentation strategy to try to capture the records of a scientific project that had an influence beyond the purely scientific sphere. The HGAP was an archival survey that ran for two years. It led to ninety scientists being contacted and has, so far, led to six collections being deposited in the Wellcome Library, with additional collections being deposited in other UK repositories. In applying documentation strategy the HGAP was attempting to move away from traditional archival approaches to science, which have generally focused on retired Nobel Prize winners. It has been partially successful in this aim, having managed to secure collections from people who are not 'big names', but who made an important contribution to the HGP. However, the attempt to redress the gender imbalance in scientific collections and to improve record-keeping in scientific organisations has continued to be difficult to achieve. Copyright © 2015 The Author. Published by Elsevier Ltd.. All rights reserved.

  8. Documenting genomics: Applying archival theory to preserving the records of the Human Genome Project

    PubMed Central

    Shaw, Jennifer

    2016-01-01

    The Human Genome Archive Project (HGAP) aimed to preserve the documentary heritage of the UK's contribution to the Human Genome Project (HGP) by using archival theory to develop a suitable methodology for capturing the results of modern, collaborative science. After assessing past projects and different archival theories, the HGAP used an approach based on the theory of documentation strategy to try to capture the records of a scientific project that had an influence beyond the purely scientific sphere. The HGAP was an archival survey that ran for two years. It led to ninety scientists being contacted and has, so far, led to six collections being deposited in the Wellcome Library, with additional collections being deposited in other UK repositories. In applying documentation strategy the HGAP was attempting to move away from traditional archival approaches to science, which have generally focused on retired Nobel Prize winners. It has been partially successful in this aim, having managed to secure collections from people who are not ‘big names’, but who made an important contribution to the HGP. However, the attempt to redress the gender imbalance in scientific collections and to improve record-keeping in scientific organisations has continued to be difficult to achieve. PMID:26388555

  9. The Canadian Astronomy Data Centre

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.; Schade, D.; Astronomy Data Centre, Canadian

    2011-01-01

    The Canadian Astronomy Data Centre (CADC) is the world's largest astronomical data center, holding over 0.5 Petabytes of information, and serving nearly 3000 astronomers worldwide. Its current data collections include BLAST, CFHT, CGPS, FUSE, Gemini, HST, JCMT, MACHO, MOST, and numerous other archives and services. It provides extensive data archiving, curation, and processing expertise, via projects such as MegaPipe, and enables substantial day-to-day collaboration between resident astronomers and computer specialists. It is a stable, powerful, persistent, and properly supported environment for the storage and processing of large volumes of data, a condition that is now absolutely vital for their science potential to be exploited by the community. Through initiatives such as the Common Archive Observation Model (CAOM), the Canadian Virtual Observatory (CVO), and the Canadian Advanced Network for Astronomical Research (CANFAR), the CADC is at the global forefront of advancing astronomical research through improved data services. The CAOM aims to provide homogeneous data access, and hence viable interoperability between a potentially unlimited number of different data collections, at many wavelengths. It is active in the definition of numerous emerging standards within the International Virtual Observatory, and several datasets are already available. The CANFAR project is an initiative to make cloud computing for storage and data-intensive processing available to the community. It does this via a Virtual Machine environment that is equivalent to managing a local desktop. Several groups are already processing science data. CADC is also at the forefront of advanced astronomical data analysis, driven by the science requirements of astronomers both locally and further afield. The emergence of 'Astroinformatics' promises to provide not only utility items like object classifications, but to directly enable new science by accessing previously undiscovered or intractable information. We are currently in the early stages of implementing Astroinformatics tools, such as machine learning, on CANFAR.

  10. Survey of Special Collections and Archives in the United Kingdom and Ireland

    ERIC Educational Resources Information Center

    Dooley, Jackie M.; Beckett, Rachel; Cullingford, Alison; Sambrook, Katie; Sheppard, Chris; Worrall, Sue

    2013-01-01

    It has become widely recognised across the academic and research libraries sector that special collections and archives play a key role in differentiating each institution from its peers. In recognition of this, Research Libraries UK (RLUK) established the workstrand "Unique and Distinctive Collections" (UDC) in support of its strategic…

  11. Academic or Community Resource? Stakeholder Interests and Collection Management at Charles Sturt University Regional Archives, 1973-2003

    ERIC Educational Resources Information Center

    Boadle, Don

    2003-01-01

    This analysis of the transformation of the Charles Sturt University Regional Archives from a library special collection to a multi-function regional repository highlights the importance of stakeholder interests in determining institutional configurations and collection development priorities. It also demonstrates the critical importance of…

  12. The Destruction of Jewish Libraries and Archives in Cracow during World War II.

    ERIC Educational Resources Information Center

    Sroka, Marek

    2003-01-01

    Examines the loss of various collections, especially school libraries and the Ezra Library, in Cracow (Poland) during World War II. Highlights include Nazi policies toward Cracow's Jews; the destruction of libraries, archives, and collections; Jewish book collections in the Staatsbibliotek Krakau (state library); and the removal of books by Jewish…

  13. Data archiving and serving system implementation in CLEP's GRAS Core System

    NASA Astrophysics Data System (ADS)

    Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai

    2017-04-01

    The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management, and application, and it also serves as the operation control center for in-orbit satellite and payload operations management. Chang'E-1, Chang'E-2 and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in the CLEP GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all the elements required to complete the data archiving flow, from data acquisition through to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming them to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution aims at storing data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem and the Scientific Application & Research Subsystem. The serving solution aims at serving data to the various business systems, scientific researchers, and public users. Data-driven and component-clustering methods were adopted in this system: the former is used to provide real-time data archiving and data persistence services, while the latter keeps the archive and service able to continuously support new data from Chang'E missions. This approach also reduces software development cost.

  14. Technologically Enhanced Archival Collections: Using the Buddy System

    ERIC Educational Resources Information Center

    Holz, Dayna

    2006-01-01

    Based in the context of challenges faced by archives when managing digital projects, this article explores options of looking outside the existing expertise of archives staff to find collaborative partners. In teaming up with other departments and organizations, the potential scope of traditional archival digitization projects is expanded beyond…

  15. A Generic Archive Protocol and an Implementation

    NASA Astrophysics Data System (ADS)

    Jordan, J. M.; Jennings, D. G.; McGlynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.

    1993-01-01

    Archiving vast amounts of data has become a major part of every scientific space mission today. GRASP, the Generic Retrieval/Archive Services Protocol, addresses the question of how to archive the data collected in an environment where the underlying hardware archives and computer hosts may be rapidly changing.

  16. Resources for Archives: Developing Collections, Constituents, Colleagues, and Capital

    ERIC Educational Resources Information Center

    Primer, Ben

    2009-01-01

    The essential element for archival success is to be found in the quality of management decisions made and public services provided. Archivists can develop first-class archives operations through understanding the organizational context; planning; hiring, retaining, and developing staff; meeting archival standards for storage and access; and…

  17. Records & Information Management Services | Alaska State Archives

    Science.gov Websites

    The Records and Information Management Services (RIMS) program of the Alaska State Archives provides records and information management for the State of Alaska. The site also links to the Archives' collections, imaging services (IMS), ASHRAB, resources for researchers, frequently asked questions, and instructions for submitting records, within Alaska's Division of Libraries, Archives, & Museums.

  18. The Diverse Origins of Neutron-capture Elements in the Metal-poor Star HD 94028: Possible Detection of Products of I-Process Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Roederer, Ian U.; Karakas, Amanda I.; Pignatari, Marco; Herwig, Falk

    2016-04-01

    We present a detailed analysis of the composition and nucleosynthetic origins of the heavy elements in the metal-poor ([Fe/H] = -1.62 ± 0.09) star HD 94028. Previous studies revealed that this star is mildly enhanced in elements produced by the slow neutron-capture process (s process; e.g., [Pb/Fe] = +0.79 ± 0.32) and rapid neutron-capture process (r process; e.g., [Eu/Fe] = +0.22 ± 0.12), including unusually large molybdenum ([Mo/Fe] = +0.97 ± 0.16) and ruthenium ([Ru/Fe] = +0.69 ± 0.17) enhancements. However, this star is not enhanced in carbon ([C/Fe] = -0.06 ± 0.19). We analyze an archival near-ultraviolet spectrum of HD 94028, collected using the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, and other archival optical spectra collected from ground-based telescopes. We report abundances or upper limits derived from 64 species of 56 elements. We compare these observations with s-process yields from low-metallicity AGB evolution and nucleosynthesis models. No combination of s- and r-process patterns can adequately reproduce the observed abundances, including the super-solar [As/Ge] ratio (+0.99 ± 0.23) and the enhanced [Mo/Fe] and [Ru/Fe] ratios. We can fit these features when including an additional contribution from the intermediate neutron-capture process (i process), which perhaps operated through the ingestion of H in He-burning convective regions in massive stars, super-AGB stars, or low-mass AGB stars. Currently, only the i process appears capable of consistently producing the super-solar [As/Ge] ratios and ratios among neighboring heavy elements found in HD 94028. Other metal-poor stars also show enhanced [As/Ge] ratios, hinting that operation of the i process may have been common in the early Galaxy. Some data presented in this paper were obtained from the Barbara A. Mikulski Archive for Space Telescopes (MAST); the Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555, and these data are associated with Programs GO-7402 and GO-8197. This work is also based on data obtained from the European Southern Observatory (ESO) Science Archive Facility, associated with Program 072.B-0585(A) (PI: Silva). This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.
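
    For readers outside stellar spectroscopy, the bracket notation used throughout this abstract is the standard logarithmic abundance ratio relative to the Sun (this definition is added here for convenience and is not part of the original record):

        [\mathrm{X}/\mathrm{Y}] \;\equiv\; \log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Y}}\right)_{\star} \;-\; \log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Y}}\right)_{\odot}

    So [Fe/H] = -1.62 means the star's iron-to-hydrogen number ratio is 10^{-1.62}, roughly 1/42 of the solar value, and [Pb/Fe] = +0.79 means lead is about a factor of six more abundant relative to iron than in the Sun.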

  19. Preservation Environments

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.

    2004-01-01

    The long-term preservation of digital entities requires mechanisms to manage the authenticity of massive data collections that are written to archival storage systems. Preservation environments impose authenticity constraints and manage the evolution of the storage system technology by building infrastructure independent solutions. This seeming paradox, the need for large archives, while avoiding dependence upon vendor specific solutions, is resolved through use of data grid technology. Data grids provide the storage repository abstractions that make it possible to migrate collections between vendor specific products, while ensuring the authenticity of the archived data. Data grids provide the software infrastructure that interfaces vendor-specific storage archives to preservation environments.

  20. Web Services and Handle Infrastructure - WDCC's Contributions to International Projects

    NASA Astrophysics Data System (ADS)

    Föll, G.; Weigelt, T.; Kindermann, S.; Lautenschlager, M.; Toussaint, F.

    2012-04-01

    Climate science demands on data management are growing rapidly as climate models grow in the precision with which they depict spatial structures and in the completeness with which they describe a vast range of physical processes. The ExArch project is exploring the challenges of developing a software management infrastructure which will scale to the multi-exabyte archives of climate data which are likely to be crucial to major policy decisions by the end of the decade. The ExArch approach to future integration of exascale climate archives is based, on the one hand, on a distributed web service architecture providing data analysis and quality control functionality across archives. On the other hand, a consistent persistent identifier infrastructure is deployed to support distributed data management and data replication. Distributed data analysis functionality is based on the CDO (Climate Data Operators) package. The CDO tool is used for processing the archived data and metadata. CDO is a collection of command-line operators for manipulating and analysing climate and forecast model data. A range of formats is supported and over 500 operators are provided. CDO presently is designed to work in a scripting environment with local files. ExArch will extend the tool to support efficient usage in an exascale archive with distributed data and computational resources by providing flexible scheduling capabilities. Quality control will become increasingly important in an exascale computing context. Researchers will be dealing with millions of data files from multiple sources and will need to know whether the files satisfy a range of basic quality criteria. Hence ExArch will provide a flexible and extensible quality control system. The data will be held at more than 30 computing centres and data archives around the world, but for users it will appear as a single archive due to a standardized ExArch Web Processing Service. Data infrastructures such as the one built by ExArch can greatly benefit from assigning persistent identifiers (PIDs) to the main entities, such as data and metadata records. A PID should then not only consist of a globally unique identifier, but also support built-in facilities to relate PIDs to each other, to build multi-hierarchical virtual collections and to enable attaching basic metadata directly to PIDs. With such a toolset, PIDs can support crucial data management tasks. For example, data replication performed in ExArch can be supported through PIDs as they can help to establish durable links between identical copies. By linking derivative data objects together, their provenance can be traced with a level of detail and reliability currently unavailable in the Earth system modelling domain. Regarding data transfers, virtual collections of PIDs may be used to package data prior to transmission. If the PID of such a collection is used as the primary key in data transfers, the safety of transfer and the traceability of data objects across repositories increase. End-users can benefit from PIDs as well since they make data discovery independent from particular storage sites and enable user-friendly communication about primary research objects. A generic PID system can in fact be a fundamental building block for scientific e-infrastructures across projects and domains.
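
    As a concrete example of the kind of operation CDO provides (a sketch only: it assumes CDO is installed on the local machine, and the NetCDF file names are placeholders), a processing script might call the timmean operator to reduce a model-output time series to its temporal mean:

        import subprocess

        def cdo_time_mean(infile="model_output.nc", outfile="model_output_timmean.nc"):
            # Invoke the CDO "timmean" operator: cdo timmean <infile> <outfile>.
            # File names here are placeholders for real archived NetCDF files.
            subprocess.run(["cdo", "timmean", infile, outfile], check=True)

        if __name__ == "__main__":
            cdo_time_mean()

    In the setting the abstract describes, the same operator call would be scheduled by the ExArch Web Processing Service against distributed archive holdings rather than a local file.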

  1. Use of MCIDAS as an earth science information systems tool

    NASA Technical Reports Server (NTRS)

    Goodman, H. Michael; Karitani, Shogo; Parker, Karen G.; Stooksbury, Laura M.; Wilson, Gregory S.

    1988-01-01

    The application of the man computer interactive data access system (MCIDAS) to information processing is examined. The computer systems that interface with the MCIDAS are discussed. Consideration is given to the computer networking of MCIDAS, data base archival, and the collection and distribution of real-time special sensor microwave/imager data.

  2. The Miracle of Microfilm: The Foundation of the Largest Genealogical Record Collection in the World.

    ERIC Educational Resources Information Center

    Powell, Ted F.

    1985-01-01

    Traces origins of the Genealogical Department of The Church of Jesus Christ of Latter-Day Saints (Mormon), highlighting microfilm technology, equipment used, development of the 16mm camera, film processing, quality control, filming techniques, specialized microfilming, archival storage (The Granite Mountain Records Vault), the genealogical library…

  3. Crossing Boundaries, Creating Community, Reorganizing a College of Education.

    ERIC Educational Resources Information Center

    Ginn, Linda W.

    This paper chronicles a process of structural change in the College of Education at the University of Tennessee in Knoxville. Data for the study were derived from interviews with 40 of the participants, plus archival material collected from the college planning office. The paper summarizes some of the historical context surrounding the change and…

  4. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Comet Halley (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  5. Archived Data User Service self evaluation report : FAST

    DOT National Transportation Integrated Search

    2000-11-01

    The Archived Data User Service (ADUS) is a recent addition to the National Intelligent Transportation System (ITS) Architecture. This user service requires ITS systems to have the capability to receive, collect and archive ITS-generated operational...

  6. Finding "Science" in the Archives of the Spanish Monarchy.

    PubMed

    Portuondo, Maria M

    2016-03-01

    This essay explores the history of several archives that house the early modern records of Spanish imperial science. The modern "archival turn" urges us to think critically about archives and to recognize in the history of these collections an embedded, often implicit, history that--unless properly recognized, acknowledged, and understood--can distort the histories we are trying to tell. This essay uses a curious episode in the history of science to illustrate how Spanish archives relate to each other and shape the collections they house. During the late eighteenth century a young navy officer, Martín Fernández de Navarrete, was dispatched to all the principal archives of the Spanish monarchy with a peculiar mission: he was to search for evidence that the Spanish in fact had a scientific tradition. This essay uses his mission to explain how the original purpose of an archive--the archive's telos--may persist as a strong and potentially deterministic force in the work of historians of science. In the case of the archives discussed, this telos was shaped by issues as wide ranging as defending a nation's reputation against claims of colonial neglect and as idiosyncratic as an archivist's selection criteria.

  7. Archive of Digital Chirp Subbottom Profile Data Collected During USGS Cruise 14BIM05 Offshore of Breton Island, Louisiana, August 2014

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.

    2016-03-29

    The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images processed and gained using SU software and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
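
    Because the record notes that the 3,200-byte card image header is ASCII rather than the EBCDIC encoding SEG Y files traditionally use, it can be useful to check the encoding before further processing. The sketch below (Python; the file name is a placeholder) reads the textual header and applies a crude heuristic based on the first character, which is conventionally the letter 'C' (byte 0x43 in ASCII, 0xC3 in the cp037 EBCDIC code page).

        # Read the 3,200-byte textual ("card image") header of a SEG Y file and guess
        # its encoding from the first byte. The file name is a placeholder.
        def read_textual_header(path="example.sgy"):
            with open(path, "rb") as f:
                raw = f.read(3200)
            encoding = "ascii" if raw[:1] == b"C" else "cp037"
            text = raw.decode(encoding, errors="replace")
            # The textual header is conventionally 40 "card images" of 80 characters each.
            cards = [text[i:i + 80] for i in range(0, 3200, 80)]
            return encoding, cards

        if __name__ == "__main__":
            encoding, cards = read_textual_header()
            print(encoding)
            print(cards[0])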

  8. SPINS: standardized protein NMR storage. A data dictionary and object-oriented relational database for archiving protein NMR spectra.

    PubMed

    Baran, Michael C; Moseley, Hunter N B; Sahota, Gurmukh; Montelione, Gaetano T

    2002-10-01

    Modern protein NMR spectroscopy laboratories have a rapidly growing need for an easily queried local archival system of raw experimental NMR datasets. SPINS (Standardized ProteIn Nmr Storage) is an object-oriented relational database that provides facilities for high-volume NMR data archival, organization of analyses, and dissemination of results to the public domain by automatic preparation of the header files required for submission of data to the BioMagResBank (BMRB). The current version of SPINS coordinates the process from data collection to BMRB deposition of raw NMR data by standardizing and integrating the storage and retrieval of these data in a local laboratory file system. Additional facilities include a data mining query tool, graphical database administration tools, and a NMRStar v2.1.1 file generator. SPINS also includes a user-friendly internet-based graphical user interface, which is optionally integrated with Varian VNMR NMR data collection software. This paper provides an overview of the data model underlying the SPINS database system, a description of its implementation in Oracle, and an outline of future plans for the SPINS project.
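
    The SPINS data dictionary itself is not reproduced in this record. The fragment below is only a generic illustration (Python with SQLite; table and column names are invented, not SPINS's) of the kind of relational structure used to track raw NMR data sets from collection through analysis.

        import sqlite3

        # Illustrative-only schema: one table of raw NMR data sets and one of analyses
        # that reference them. Table and column names are invented for this sketch.
        conn = sqlite3.connect("nmr_archive_demo.db")
        conn.executescript("""
        CREATE TABLE IF NOT EXISTS dataset (
            dataset_id   INTEGER PRIMARY KEY,
            protein      TEXT NOT NULL,
            experiment   TEXT NOT NULL,   -- e.g. 'HSQC', 'NOESY'
            collected_on TEXT,            -- ISO date of data collection
            raw_path     TEXT             -- location of the raw spectrometer data
        );
        CREATE TABLE IF NOT EXISTS analysis (
            analysis_id  INTEGER PRIMARY KEY,
            dataset_id   INTEGER REFERENCES dataset(dataset_id),
            software     TEXT,
            result_path  TEXT
        );
        """)
        conn.execute(
            "INSERT INTO dataset (protein, experiment, collected_on, raw_path) VALUES (?, ?, ?, ?)",
            ("example protein", "HSQC", "2002-10-01", "/data/example/hsqc.fid"),
        )
        conn.commit()
        conn.close()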

  9. Treasures in Archived Histopathology Collections: Preserving the Past for Future Understanding.(SETAC)

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater mollusks, crustaceans, echinoderms, and other organisms are archived in the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency, NOAA’s National Marine Fisheri...

  10. Treasures in Archived Histopathology Collections: Preserving the Past for Future Understanding (NACSETAC)

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater mollusks, crustaceans, echinoderms, and other organisms are archived in the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency, NOAA’s National Marine Fishe...

  11. Treasures in Archived Histolopathology Collections: Preserving the Past for Future Understanding

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater mollusks, crustaceans, echinoderms, and other organisms are archived in the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency, NOAA’s National Marine Fishe...

  12. 75 FR 69473 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... effectiveness of the Public Vaults and its several exhibits in enhancing visitors' understanding that records...

  13. WWLLN Data User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lay, Erin Hoffmann; Wiens, Kyle Cameron; Delapp, Dorothea Marcia

    2016-03-11

    The World Wide Lightning Location Network (WWLLN) provides continuous global lightning monitoring and detection. At LANL we collect and archive these data on a daily basis. This document describes the WWLLN data, how they are collected and archived, and how to use the data at LANL.

  14. Archive & Data Management Activities for ISRO Science Archives

    NASA Astrophysics Data System (ADS)

    Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.

    2012-07-01

    ISRO has taken a step ahead by extending its remote sensing missions to planetary and astronomical exploration. It started with Chandrayaan-1, which successfully completed its moon-imaging mission during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive, and disseminate the acquired science data to the user community for analysis and scientific use. These science missions will last from a few months to a few years, but the data received must remain archived, interoperable, and seamlessly accessible to the user community into the future. ISRO has laid out definite plans to archive these data sets to specified standards and to develop the access tools needed to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore; it is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 is the first among the planetary missions launched or to be launched by ISRO, and we took up the challenge of developing a system for archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer in the Planetary Data System (PDS 3.0) standard through an automated pipeline. A dataset, once stored, is of little use unless it is made public, which requires a Web-based dissemination system accessible to all the planetary scientists and data users working in this field. Towards this, a Web-based Browse and Dissemination system has been developed, wherein users can register, search for their area of interest, and view the data archived for TMC and HYSI with relevant browse chips and metadata. Users can also order the data and receive it on their desktop in PDS format. For the other AO payloads, users can view the metadata, and the data are available through an FTP site. The same archival and dissemination strategy will be extended to the next moon mission, Chandrayaan-2. ASTROSAT will be the first multi-wavelength astronomical mission for which the data are archived at ISSDC. It consists of five astronomical payloads that allow simultaneous multi-wavelength observations, from X-ray to ultraviolet (UV), of astronomical objects. It is planned to archive these data sets in FITS format. The ASTROSAT archive will be maintained in the Archive Layer at ISSDC, and its browse interface will be available through the ISDA (Indian Science Data Archive) web site. The browse interface will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on a request basis via an FTP site after the lock-in period is over. It is planned that the Level-2 pipeline software and various modules for processing the data sets will also be available on the web site. This paper describes the archival procedure for Chandrayaan-1 and the archival plan for ASTROSAT, Chandrayaan-2, and other future ISRO missions, including a discussion of data management activities.
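
    The Chandrayaan-1 labels themselves are not shown here. As a rough, hypothetical illustration of the PDS 3.0 keyword = value label style mentioned above (the keywords and values below are placeholders, not actual TMC or HYSI label contents), an archiving pipeline might emit something like:

        # Write a minimal, illustrative PDS3-style label: keyword = value pairs,
        # an IMAGE object, and a terminating END statement. All values here are
        # placeholders, not actual Chandrayaan-1 label contents.
        def write_label(path="example.lbl", product_id="TMC_EXAMPLE_0001",
                        lines=1024, samples=1024):
            label = "\n".join([
                "PDS_VERSION_ID      = PDS3",
                'PRODUCT_ID          = "%s"' % product_id,
                'INSTRUMENT_NAME     = "EXAMPLE IMAGING CAMERA"',
                "OBJECT              = IMAGE",
                "  LINES             = %d" % lines,
                "  LINE_SAMPLES      = %d" % samples,
                "  SAMPLE_BITS       = 8",
                "END_OBJECT          = IMAGE",
                "END",
                "",
            ])
            with open(path, "w") as f:
                f.write(label)

        if __name__ == "__main__":
            write_label()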

  15. Geodetic Seamless Archive Centers Modernization - Information Technology for Exploiting the Data Explosion

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.

    2010-12-01

    Space geodetic science and other disciplines using geodetic products have benefited immensely from open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC and UNAVCO) have been funded to expand the original GSAC capability for multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating it into its daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis and testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example by identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.

  16. Supporting the Use of GPM-GV Field Campaign Data Beyond Project Scientists

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Smith, D. K.; Sinclair, L.; Bugbee, K.

    2017-12-01

    The Global Precipitation Measurement (GPM) Mission Ground Validation (GV) consisted of a collection of field campaigns at various locations focusing on particular aspects of precipitation. Data collected during the GPM-GV are necessary for better understanding the instruments and algorithms used to monitor water resources, study the global hydrologic cycle, understand climate variability, and improve weather prediction. The GPM-GV field campaign data have been archived at the NASA Global Hydrology Resource Center (GHRC) Distributed Active Archive Center (DAAC). These data consist of a heterogeneous collection of observations that require careful handling, full descriptive user guides, and helpful instructions for data use. These actions are part of the data archival process. In addition, the GHRC focuses on expanding the use of GPM-GV data beyond the validation and instrument researchers that participated in the field campaigns. To accomplish this, GHRC ties together the similarities and differences between the various field campaigns with the goal of improving user documents to be more easily read by those outside the field of research. In this poster, the authors will describe the GPM-GV datasets, discuss data use among the broader community, outline the types of problems/issues with these datasets, demonstrate what tools support data visualization and use, and highlight the outreach materials developed to educate both younger and general audiences about the data.

  17. Archive of chirp seismic reflection data collected during USGS cruises 00SCC02 and 00SCC04, Barataria Basin, Louisiana, May 12-31 and June 17-July 2, 2000

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, S.V.; Kindinger, J.L.; Flocks, J.G.; Wiese, D.S.; Kulp, Mark; Penland, Shea; Britsch, L.D.; Brooks, G.R.

    2003-01-01

    This archive consists of two-dimensional marine seismic reflection profile data collected in the Barataria Basin of southern Louisiana. These data were acquired in May, June, and July of 2000 aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper-Text Markup Language (HTML), shapefiles, and Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) information provided here is compatible with Environmental Systems Research Institute (ESRI) GIS software.

  18. Strike Up the Score: Deriving Searchable and Playable Digital Formats from Sheet Music; Smart Objects and Open Archives; Building the Archives of the Future: Advanced in Preserving Electronic Records at the National Archives and Records Administration; From the Digitized to the Digital Library.

    ERIC Educational Resources Information Center

    Choudhury, G. Sayeed; DiLauro, Tim; Droettboom, Michael; Fujinaga, Ichiro; MacMillan, Karl; Nelson, Michael L.; Maly, Kurt; Thibodeau, Kenneth; Thaller, Manfred

    2001-01-01

    These articles describe the experiences of the Johns Hopkins University library in digitizing their collection of sheet music; motivation for buckets, Smart Object, Dumb Archive (SODA) and the Open Archives Initiative (OAI), and initial experiences using them in digital library (DL) testbeds; requirements for archival institutions, the National…

  19. Uncovering Information Hidden in Web Archives: Glimpse at Web Analysis Building on Data Warehouses; Towards Continuous Web Archiving: First Results and an Agenda for the Future; The Open Video Digital Library; After Migration to an Electronic Journal Collection: Impact on Faculty and Doctoral Students; Who Is Reading On-Line Education Journals? Why? And What Are They Reading?; Report on eLibrary@UBC4: Research, Collaboration and the Digital Library - Visions for 2010.

    ERIC Educational Resources Information Center

    Rauber, Andreas; Bruckner, Robert M.; Aschenbrenner, Andreas; Witvoet, Oliver; Kaiser, Max; Masanes, Julien; Marchionini, Gary; Geisler, Gary; King, Donald W.; Montgomery, Carol Hansen; Rudner, Lawrence M.; Gellmann, Jennifer S.; Miller-Whitehead, Marie; Iverson, Lee

    2002-01-01

    These six articles discuss Web archives and Web analysis building on data warehouses; international efforts at continuous Web archiving; the Open Video Digital Library; electronic journal collections in academic libraries; online education journals; and an electronic library symposium at the University of British Columbia. (LRW)

  20. A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories

    NASA Astrophysics Data System (ADS)

    Brown, Christa L.

    National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.

  1. Availability of Previously Unprocessed ALSEP Raw Instrument Data, Derivative Data, and Metadata Products

    NASA Technical Reports Server (NTRS)

    Nagihara, S.; Nakamura, Y.; Williams, D. R.; Taylor, P. T.; Kiefer, W. S.; Hager, M. A.; Hills, H. K.

    2016-01-01

    In 2010, 440 original data archival tapes for the Apollo Lunar Science Experiment Package (ALSEP) experiments were found at the Washington National Records Center. These tapes hold raw instrument data received from the Moon for all the ALSEP instruments for the period of April through June 1975. We have recently completed extraction of binary files from these tapes, and we have delivered them to the NASA Space Science Data Coordinated Archive (NSSDCA). We are currently processing the raw data into higher order data products in file formats more readily usable by contemporary researchers. These data products will fill a number of gaps in the current ALSEP data collection at NSSDCA. In addition, we have established a digital, searchable archive of ALSEP documents and metadata as part of the web portal of the Lunar and Planetary Institute. It currently holds approx. 700 documents totaling approx. 40,000 pages.

  2. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.
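
    One way a data service can subset a granule held in web object storage without staging the whole file is an HTTP range read. The sketch below assumes an S3-compatible store and uses placeholder bucket and object names; it illustrates the idea rather than describing the actual NASA archive interface.

        import boto3

        # Fetch only the first kilobyte of a granule from S3-compatible web object
        # storage with an HTTP Range request; bucket and key names are placeholders.
        s3 = boto3.client("s3")
        response = s3.get_object(
            Bucket="example-eo-archive",
            Key="granules/sample_granule.h5",
            Range="bytes=0-1023",
        )
        chunk = response["Body"].read()
        print("%d bytes read" % len(chunk))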

  3. [A collection of scientific instruments at the dawn of the modern hospital: Vincenzo Viviani's physical-mathematical instruments and Santa Maria Hospital in Florence (1871-1895)].

    PubMed

    Diana, Esther

    2008-01-01

    Around the second half of the nineteenth century, the collection of physics-mathematical instruments that Vincenzo Viviani (1622-1703) had bequeathed to the Santa Maria Nuova Hospital of Florence stirred new interest. The process of modernising the hospital was indeed to lead to the progressive alienation of the institution's rich historical patrimony, including the scientific collections. In tracing back the negotiations that led to the sale of the Viviani collection, archive documents have also brought to light the collection inventory, which is now proposed anew to help recount the history of how scientific instruments became museum collectibles in Florence.

  4. Treasures in Archived Histopathology Collections: Preserving the Past for Future Understanding (Histologic)

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater fish, mollusks, crustaceans, echinoderms, and other organisms are archived at the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency (EPA), NOAA’s National Ma...

  5. Treasures in Archived Histopathology Collections: Preserving the Past for Future Understanding (ISAAH-6)

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater fish, mollusks, crustaceans, echinoderms, and other organisms are archived in the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency, NOAA’s National Marine F...

  6. Treasures in Archived Histopathology Collections: Preserving the Past for Future Understanding (IMCC09)

    EPA Science Inventory

    Extensive collections of histopathology materials from studies of marine and freshwater fish, mollusks, crustaceans, echinoderms, and other organisms are archived in the Registry of Tumors in Lower Animals (RTLA), the U.S. Environmental Protection Agency, NOAA’s National Marine...

  7. Fermilab History and Archives Project | Golden Books - The Early History of

    Science.gov Websites

    The Fermilab History and Archives Project web site presents the Fermilab Golden Book Collection, including "The Early History of URA and Fermilab: Viewpoint of a URA President (1966-1981)" by Norman F…

  8. Getting to MARS: Working with an Automated Retrieval System in the Special Collections Department at the University of Nevada, Reno

    ERIC Educational Resources Information Center

    Sundstrand, Jacquelyn K.

    2011-01-01

    The University of Nevada, Reno's Special Collections and University Archives Department moved into a new facility and had to utilize an automated storage and retrieval system (ASRS) for storage of manuscript and archival collections. Using ASRS bins presented theoretical challenges in planning for the move. This article highlights how well the…

  9. Mega-precovery and data mining of near-Earth asteroids and other Solar System objects

    NASA Astrophysics Data System (ADS)

    Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team

    2014-07-01

    The vast collection of CCD images and photographic plate archives available from observatories and telescopes worldwide is still insufficiently exploited. Within the EURONEAR project we designed two data-mining tools to search very large collections of archives for images which serendipitously include known asteroids or comets in their fields, with the main aims of extending observed arcs and improving orbits. In this sense, "Precovery" (published in 2008, aiming to search all known NEAs in a few archives via IMCCE's SkyBoT server) and "Mega-Precovery" (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery aims to search one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including VLT), NVO (8 instruments served by the U.S. NVO Archive), CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam and AAT-WFI, adding together 39 instruments and 4.3 million images (Mar 2014), and our Mega-Archive is growing. Here we present some of the most important results obtained with our data-mining software and some newly planned search options of Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included, and new search options will be made available (such as query by orbital elements and by observations) to be able to target new Solar System objects such as Virtual Impactors, bolides, planetary satellites and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the "AΩA" factor (archival etendue), proportional to the etendue (AΩ) and the number of images in an archive. With the aim of enlarging the Mega-Archive database, we invite observatories (particularly those storing their images online and also those that own plate archives which could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. We intend in the future to synchronise our service with the Virtual Observatory.
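
    The core test in any precovery search is whether an object's predicted position at an archival image's epoch falls inside that image's field of view. The sketch below (Python; coordinates and field radius are made up) applies the standard spherical angular-separation formula to make that decision for one archival pointing.

        import math

        def angular_separation_deg(ra1, dec1, ra2, dec2):
            # Great-circle separation between two (RA, Dec) positions, in degrees.
            ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
            cos_sep = (math.sin(dec1) * math.sin(dec2)
                       + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
            return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

        # Hypothetical example: predicted asteroid position versus an archival pointing.
        asteroid = (150.123, 2.456)        # RA, Dec (degrees) predicted at the image epoch
        image_center = (150.000, 2.500)    # telescope pointing of the archival exposure
        field_radius = 0.5                 # half the field of view, degrees

        sep = angular_separation_deg(*asteroid, *image_center)
        verdict = "candidate precovery image" if sep < field_radius else "outside the field"
        print("separation = %.3f deg -> %s" % (sep, verdict))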

  10. Astro-H/Hitomi data analysis, processing, and archive

    NASA Astrophysics Data System (ADS)

    Angelini, Lorella; Terada, Yukikatsu; Dutka, Michael; Eggen, Joseph; Harrus, Ilana; Hill, Robert S.; Krimm, Hans; Loewenstein, Michael; Miller, Eric D.; Nobukawa, Masayoshi; Rutkowski, Kristin; Sargent, Andrew; Sawada, Makoto; Takahashi, Hiromitsu; Yamaguchi, Hiroya; Yaqoob, Tahir; Witthoeft, Michael

    2018-01-01

    Astro-H is the x-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. Soon after launch, Astro-H was renamed Hitomi. The payload consists of four different instruments (SXS, SXI, HXI, and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. On March 27, 2016, JAXA lost contact with the satellite and, on April 28, they announced the cessation of the efforts to restore mission operations. Hitomi collected about one month's worth of data with its instruments. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and United States.

  11. Margaret W. Mayall in the AAVSO Archives

    NASA Astrophysics Data System (ADS)

    Saladyga, M.

    2012-06-01

    (Abstract only) AAVSO Director Margaret W. Mayall's presence in the AAVSO Archives is unique in that it was only by her effort that the AAVSO's institutional memory survived the organization's years of struggle. The history of the AAVSO could not have been written thoroughly and accurately without its archival collections. Similarly, the story of Mayall and the AAVSO within that history is not only informed, but is also formed by the materials that she chose to collect and preserve over the years.

  12. Earth observation archive activities at DRA Farnborough

    NASA Technical Reports Server (NTRS)

    Palmer, M. D.; Williams, J. M.

    1993-01-01

    The Space Sector of the Defence Research Agency (DRA), Farnborough, has been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.

  13. 78 FR 64254 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-28

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION [NARA-2014-002] Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice. SUMMARY: NARA is giving public notice that the agency has submitted to OMB for...

  14. 76 FR 7591 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-10

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... in a timely fashion the volume of requests received for these records and the need to obtain specific...

  15. 75 FR 66166 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission for OMB Review; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION... appropriate NARA research room or who request copies of records as a result of visiting a research room. NARA...

  16. Lost in a Giant Database: The Potentials and Pitfalls of Secondary Analysis for Deaf Education

    ERIC Educational Resources Information Center

    Kluwin, T. N.; Morris, C. S.

    2006-01-01

    Secondary research or archival research is the analysis of data collected by another person or agency. It offers several advantages, including reduced cost, a less time-consuming research process, and access to larger populations and thus greater generalizability. At the same time, it offers several limitations, including the fact that the…

  17. The French Astronomical Archives Alidade Project

    NASA Astrophysics Data System (ADS)

    Debarbat, S.; Bobis, L.

    2004-12-01

    The present state of Alidade, an archival project of Paris Observatory covering not only archival papers but also instruments, documents, iconography, paintings, etc., of various institutions, is described. Documents and collections, e.g. from donations or purchases, are still being integrated into the archives, and selected material is displayed in temporary exhibits at the Observatory. Modern uses of old material are briefly mentioned.

  18. The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier

    NASA Technical Reports Server (NTRS)

    Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.

    2017-01-01

    The NASA Ames Institutional Scientific Collection involves the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guide the development of NASA directives and policy for curation of biological collections.

  19. Kepler Data Release 3 Notes

    NASA Technical Reports Server (NTRS)

    Cleve, Jeffrey E.

    2010-01-01

    These notes describe the collection of the Kepler data and the processing applied to them, so that researchers around the world who receive Kepler data sets (each the set of pixels of a particular target, such as a star or galaxy, gathered over a three-month period) know what operations were already performed (for example, the subtraction of a particular wavelength component) and can calibrate their own algorithms knowing exactly what they are starting with. The notes are posted so that anyone accessing the publicly available data can understand it; most Kepler data are under restriction for one to four years and are not yet available, but the notes serve both public and restricted users. The Data Analysis Working Group has released long- and short-cadence materials, including FFIs and dropped targets, for the public. The Kepler Science Office considers Data Release 3 to provide "browse quality" data. These notes have been prepared to give Kepler users of the Multimission Archive at STScI (MAST) a summary of how the data were collected and prepared, and how well the data processing pipeline is functioning on flight data. They will be updated for each release of data to the public archive and placed on MAST along with other Kepler documentation, at http://archive.stsci.edu/kepler/documents.html. Data Release 3 is meant to give users the opportunity to examine the data for possibly interesting science and to involve the users in improving the pipeline for future data releases. To support the latter, users are encouraged to notice and document artifacts, either in the raw or processed data, and report them to the Science Office.

  20. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 2; Comet Halley

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  1. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  2. St. Petersburg Coastal and Marine Science Center's Core Archive Portal

    USGS Publications Warehouse

    Reich, Chris; Streubert, Matt; Dwyer, Brendan; Godbout, Meg; Muslic, Adis; Umberger, Dan

    2012-01-01

    This Web site contains information on rock cores archived at the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC). Archived cores consist of 3- to 4-inch-diameter coral cores, 1- to 2-inch-diameter rock cores, and a few unlabeled loose coral and rock samples. This document - and specifically the archive Web site portal - is intended to be a 'living' document that will be updated continually as additional cores are collected and archived. This document may also contain future references and links to a catalog of sediment cores. Sediment cores will include vibracores, pushcores, and other loose sediment samples collected for research purposes. This document will: (1) serve as a database for locating core material currently archived at the USGS SPCMSC facility; (2) provide a protocol for entry of new core material into the archive system; and, (3) set the procedures necessary for checking out core material for scientific purposes. Core material may be loaned to other governmental agencies, academia, or non-governmental organizations at the discretion of the USGS SPCMSC curator.

  3. Creating Trading Networks of Digital Archives.

    ERIC Educational Resources Information Center

    Cooper, Brian; Garcia-Molina, Hector

    Digital materials are vulnerable to a number of different kinds of failures, including decay of the digital media, loss due to hackers and viruses, accidental deletions, natural disasters, and bankruptcy of the institution holding the collection. Digital archives can best survive failures if they have made several copies of their collections at…

  4. Managing Archival Collections in an Automated Environment: The Joys of Barcoding

    ERIC Educational Resources Information Center

    Hamburger, Susan; Charles, Jane Veronica

    2006-01-01

    In a desire for automated collection control, archival repositories are adopting barcoding from their library and records center colleagues. This article discusses the planning, design, and implementation phases of barcoding. The authors focus on reasons for barcoding, security benefits, in-room circulation tracking, potential for gathering…

  5. Remotely Sensed Imagery from USGS: Update on Products and Portals

    NASA Astrophysics Data System (ADS)

    Lamb, R.; Lemig, K.

    2016-12-01

    The USGS Earth Resources Observation and Science (EROS) Center has recently implemented a number of additions and changes to its existing suite of products and user access systems. Together, these changes will enhance the accessibility, breadth, and usability of the remotely sensed image products and delivery mechanisms available from USGS. As of late 2016, several new image products are now available for public download at no charge from USGS/EROS Center. These new products include: (1) global Level 1T (precision terrain-corrected) products from Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), provided via NASA's Land Processes Distributed Active Archive Center (LP DAAC); and (2) Sentinel-2 Multispectral Instrument (MSI) products, available through a collaborative effort with the European Space Agency (ESA). Other new products are also planned to become available soon. In an effort to enable future scientific analysis of the full 40+ year Landsat archive, the USGS also introduced a new "Collection Management" strategy for all Landsat Level 1 products. This new archive and access schema involves quality-based tier designations that will support future time series analysis of the historic Landsat archive at the pixel level. Along with the quality tier designations, the USGS has also implemented a number of other Level 1 product improvements to support Landsat science applications, including: enhanced metadata, improved geometric processing, refined quality assessment information, and angle coefficient files. The full USGS Landsat archive is now being reprocessed in accordance with the new `Collection 1' specifications. Several USGS data access and visualization systems have also seen major upgrades. These user interfaces include a new version of the USGS LandsatLook Viewer which was released in Fall 2017 to provide enhanced functionality and Sentinel-2 visualization and access support. A beta release of the USGS Global Visualization Tool ("GloVis Next") was also released in Fall 2017, with many new features including data visualization at full resolution. The USGS also introduced a time-enabled web mapping service (WMS) to support time-based access to the existing LandsatLook "natural color" full-resolution browse image services.

  6. Improving Access to NASA Earth Science Data through Collaborative Metadata Curation

    NASA Astrophysics Data System (ADS)

    Sisco, A. W.; Bugbee, K.; Shum, D.; Baynes, K.; Dixon, V.; Ramachandran, R.

    2017-12-01

    The NASA-developed Common Metadata Repository (CMR) is a high-performance metadata system that currently catalogs over 375 million Earth science metadata records. It serves as the authoritative metadata management system of NASA's Earth Observing System Data and Information System (EOSDIS), enabling NASA Earth science data to be discovered and accessed by a worldwide user community. The size of the EOSDIS data archive is steadily increasing, and the ability to manage and query this archive depends on the input of high quality metadata to the CMR. Metadata that does not provide adequate descriptive information diminishes the CMR's ability to effectively find and serve data to users. To address this issue, an innovative and collaborative review process is underway to systematically improve the completeness, consistency, and accuracy of metadata for approximately 7,000 data sets archived by NASA's twelve EOSDIS data centers, or Distributed Active Archive Centers (DAACs). The process involves automated and manual metadata assessment of both collection and granule records by a team of Earth science data specialists at NASA Marshall Space Flight Center. The team communicates results to DAAC personnel, who then make revisions and reingest improved metadata into the CMR. Implementation of this process relies on a network of interdisciplinary collaborators leveraging a variety of communication platforms and long-range planning strategies. Curating metadata at this scale and resolving metadata issues through community consensus improves the CMR's ability to serve current and future users and also introduces best practices for stewarding the next generation of Earth Observing System data. This presentation will detail the metadata curation process, its outcomes thus far, and also share the status of ongoing curation activities.
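
    A minimal sketch of how collection-level metadata can be retrieved from the CMR for the kind of assessment described above. The endpoint and parameter names follow the public CMR search API as commonly documented, but they are assumptions here and should be verified against the current API; the keyword is arbitrary.

      # Sketch: query collection-level metadata records from the CMR.
      # The endpoint and parameters are assumed from the public CMR search
      # API and should be verified; error handling is kept minimal.
      import requests

      CMR_COLLECTIONS = "https://cmr.earthdata.nasa.gov/search/collections.json"

      def search_collections(keyword, page_size=10):
          """Return (short_name, title) pairs for collections matching a keyword."""
          resp = requests.get(
              CMR_COLLECTIONS,
              params={"keyword": keyword, "page_size": page_size},
              timeout=30,
          )
          resp.raise_for_status()
          entries = resp.json().get("feed", {}).get("entry", [])
          return [(e.get("short_name", ""), e.get("title", "")) for e in entries]

      if __name__ == "__main__":
          for short_name, title in search_collections("precipitation"):
              print(short_name, "-", title)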

  7. The ARM Data System and Archive

    DOE PAGES

    McCord, Raymond A.; Voyles, Jimmy W.

    2016-07-05

    Every observationally based research program needs a way to collect data from instruments, convert the data from its raw format into a more usable format, apply quality control, process it into higher-order data products, store the data, and make the data available to its scientific community. This data flow is illustrated pictorially in Fig. 11-1. These are the basic requirements of any scientific data system, and ARM's data system would have to address these requirements and more. This research provides one view of the development of the ARM data system, which includes the ARM Data Archive, and some of the more notable decisions that were made along the way.

  8. Digital management and regulatory submission of medical images from clinical trials: role and benefits of the core laboratory

    NASA Astrophysics Data System (ADS)

    Robbins, William L.; Conklin, James J.

    1995-10-01

    Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, X-ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format present both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.

  9. TCIA: An information resource to enable open science.

    PubMed

    Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo

    2013-01-01

    Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).

  10. Dynamic Data Management Based on Archival Process Integration at the Centre for Environmental Data Archival

    NASA Astrophysics Data System (ADS)

    Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles

    2015-04-01

    In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high-level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are: • Management processes that govern the operation of an archive, including archival governance (preservation state management, selection of archival candidates and strategic management). • Operational processes that constitute the core activities of the archive and maintain the value of research assets: acquisition, ingestion, deletion, generation of metadata and preservation activities. • Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment. We then describe the feasibility testing of extended risk management and planning procedures that integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and National Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We discuss the following core model entities and their relationships: • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. • Risk entities, which act as drivers for change within the data lifecycle; these include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks. • Plan entities, which detail the actions to bring about change within an archive; these include Acquisition Plans, Preservation Plans and Monitoring Plans. • Result entities, which describe the successful outcomes of the executed plans; these include Acquisitions, Mitigations and Accepted Risks.
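
    The core model entities listed above lend themselves to a compact illustration. The sketch below encodes the aspirational, risk, plan and result entities and their relationships as plain data classes; the field names and example values are assumptions for illustration, not the CEDA information model itself.

      # Illustrative encoding of the dynamic data management entities named
      # above; structure and field names are assumptions, not the CEDA schema.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class PreservationObjective:          # aspirational entity
          description: str

      @dataclass
      class DataEntity:                     # aspirational entity
          name: str
          objectives: List[PreservationObjective] = field(default_factory=list)

      @dataclass
      class Risk:                           # driver for change in the lifecycle
          category: str                     # e.g. "Technical", "Strategic"
          description: str

      @dataclass
      class Plan:                           # action to bring about change
          kind: str                         # e.g. "Preservation", "Monitoring"
          addresses: Risk

      @dataclass
      class Result:                         # successful outcome of a plan
          kind: str                         # e.g. "Mitigation", "Accepted Risk"
          plan: Plan

      entity = DataEntity("example model output",
                          [PreservationObjective("keep data usable for 20 years")])
      risk = Risk("Technical", "format obsolescence of packed binary files")
      plan = Plan("Preservation", addresses=risk)
      result = Result("Mitigation", plan=plan)
      print(entity.name, "->", result.kind, "via", plan.kind, "plan")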

  11. Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time

    NASA Technical Reports Server (NTRS)

    Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.; hide

    2017-01-01

    To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update existing standard break-up models. Updating standard break-up models requires detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test, and all generated fragments with at least one dimension greater than 2 mm are collected, characterized, and archived. Since the beginning of the characterization phase of the DebriSat project, over 130,000 fragments have been collected, and approximately 250,000 fragments are expected to be collected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post-hypervelocity-impact test activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments. The procedures for fragment collection, extraction, and characterization were painstakingly designed and implemented to maintain the post-impact state of the fragments, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data; however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the process, such as parallel processing, have been explored and implemented while continuing to maintain the highest integrity and value of the data. To minimize fragment handling, automated systems have been developed and implemented. Errors due to human inputs are also minimized by the use of these automated systems. This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments as well as the time required to complete the processes. The objective is to provide the orbital debris community an understanding of the scale of the effort required to generate and archive high quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.

  12. Library and Archives of Canada Collections as Resources for Classroom Learning

    ERIC Educational Resources Information Center

    Sly, Gordon

    2006-01-01

    This article promotes the online use of primary documents from Library and Archives of Canada (LAC) collections by high school students conducting historical inquiry into a major historic event in Canada's past. It outlines a unit of seven history lessons that the author wrote for the "Learning Centre" at…

  13. 76 FR 4634 - Proposed Information Collection; Comment Request; Implantation and Recovery of Archival Tags for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... tags in, or affix archival tags to, selected Atlantic Highly Migratory Species (tunas, sharks... for scientists researching the movements and behavior of individual fish. It is often necessary to retrieve the tags in order to obtain the collected data; therefore, persons catching tagged fish are...

  14. Mexican American Archives at the Benson Collection: A Guide for Users.

    ERIC Educational Resources Information Center

    Flores, Maria G., Comp.; Gutierrez-Witt, Laura, Ed.

    This guide, which documents the first phase of a continuing program to strengthen and develop Mexican American materials and research sources at the University of Texas at Austin, lists and describes both textual and non-textual materials in the collections of literary manuscripts, organizational archives, and personal papers. The first section…

  15. SCOPE - Stellar Classification Online Public Exploration

    NASA Astrophysics Data System (ADS)

    Harenberg, Steven

    2010-01-01

    The Astronomical Photographic Data Archive (APDA) has been established as the primary North American archive for collections of astronomical photographic plates. Located at the Pisgah Astronomical Research Institute (PARI) in Rosman, NC, the archive contains hundreds of thousands of stellar spectra, many of which have never before been classified. To help classify the vast number of stars, the public is invited to participate in a distributed computing online environment called Stellar Classification Online - Public Exploration (SCOPE). Through a website, participants work through a tutorial on stellar spectra and practice classifying. After practice, the participants classify spectra on photographic plates uploaded online from APDA. These classifications are recorded in a database where the results from many users are statistically analyzed. Stars with known spectral types will be included to test the reliability of classifications. The process of building the database of stars from APDA, which the citizen scientist will be able to classify, includes: scanning the photographic plates, orienting the plate to correct for the change in right ascension/declination using Aladin, stellar HD catalog identification using Simbad, marking the boundaries for each spectrum, and setting up the image for use on the website. We will describe the details of this process.
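
    The statistical combination of many users' classifications, and the reliability check against stars of known type, can be sketched as follows; the records, user names and spectral types are invented for illustration and do not come from the SCOPE database.

      # Sketch: reduce many users' classifications of one spectrum to a
      # consensus type, and score users against spectra with known types.
      # All records below are invented for illustration.
      from collections import Counter

      # (spectrum_id, user, classified_type); ids starting with "known_"
      # have catalogued types used only for reliability checks.
      classifications = [
          ("plate12_star07", "user_a", "G2"),
          ("plate12_star07", "user_b", "G2"),
          ("plate12_star07", "user_c", "K0"),
          ("known_star01", "user_a", "A0"),
          ("known_star01", "user_b", "A2"),
      ]
      known_types = {"known_star01": "A0"}

      def consensus(spectrum_id):
          """Majority vote over all classifications of one spectrum."""
          votes = Counter(t for s, _, t in classifications if s == spectrum_id)
          return votes.most_common(1)[0][0]

      def user_reliability(user):
          """Fraction of a user's known-type spectra classified correctly."""
          scored = [(s, t) for s, u, t in classifications
                    if u == user and s in known_types]
          return sum(t == known_types[s] for s, t in scored) / len(scored) if scored else None

      print(consensus("plate12_star07"))   # -> "G2"
      print(user_reliability("user_a"))    # -> 1.0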

  16. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    NASA Astrophysics Data System (ADS)

    Tokareva, Victoria

    2018-04-01

    New-generation medicine demands better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. It therefore becomes urgent not only to develop advanced modern hardware, but also to implement special software infrastructure for using it in everyday clinical practice, the so-called Picture Archiving and Communication Systems (PACS). Developing distributed PACS is a challenging task in present-day medical informatics. The paper discusses the architecture of a distributed PACS server for processing large, high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards in medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed in order to make the design of the distributed PACS as ergonomic and as well adapted to the needs of end users as possible.
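
    The MapReduce idea proposed above can be illustrated without Hadoop: tiles of an image are processed independently in a map step and reassembled in a reduce step. The sketch below uses Python's multiprocessing as a stand-in for the cluster framework; the tiling scheme and the per-tile "processing" are invented for illustration.

      # Framework-free illustration of the MapReduce pattern for image
      # reconstruction: map tiles to processed (key, value) pairs in
      # parallel, then reduce them back into one array. A real PACS
      # deployment would run the same pattern on Hadoop.
      from multiprocessing import Pool
      import numpy as np

      TILE = 256

      def map_tile(args):
          """Map step: process one tile and key it by its grid position."""
          (row, col), tile = args
          return (row, col), tile.astype(np.float32) * 2.0   # stand-in filter

      def reduce_tiles(shape, keyed_tiles):
          """Reduce step: place every processed tile back into a full image."""
          out = np.zeros(shape, dtype=np.float32)
          for (row, col), tile in keyed_tiles:
              out[row:row + TILE, col:col + TILE] = tile
          return out

      if __name__ == "__main__":
          image = np.random.rand(1024, 1024)
          tiles = [((r, c), image[r:r + TILE, c:c + TILE])
                   for r in range(0, 1024, TILE)
                   for c in range(0, 1024, TILE)]
          with Pool() as pool:
              processed = pool.map(map_tile, tiles)
          print(reduce_tiles(image.shape, processed).shape)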

  17. Drowning in Data: Going Beyond Traditional Data Archival to Educate Data Users

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Smith, T.; Smith, D. K.; Bugbee, K.; Sinclair, L.

    2017-12-01

    Increasing quantities of Earth science data and information prove overwhelming to new and unfamiliar users. Data discovery and use challenges faced by these users are compounded with atmospheric science field campaign data collected by a variety of instruments and stored, visualized, processed and analyzed in different ways. To address data and user needs assessed through annual surveys and user questions, the NASA Global Hydrology Resource Center Distributed Active Archive Center (GHRC DAAC), in collaboration with a graphic designer, has developed a series of resources to help users learn about GHRC science focus areas, field campaigns, instruments, data, and data processing techniques. In this talk, GHRC data recipes, micro articles, interactive data visualization techniques, and artistic science outreach and education efforts, such as ESRI story maps and research as art, will be overviewed. The objective of this talk is to stress the importance artistic information visualization has in communicating with and educating Earth science data users.

  18. Rejected Manuscripts in Publishers' Archives: Legal Rights and Access

    ERIC Educational Resources Information Center

    Hamburger, Susan

    2011-01-01

    This article focuses on an analysis of how various archival repositories deal with rejected manuscripts in publishers' archives as part of existing collections and as potential donations, and includes suggestions for ways to provide access while maintaining the author's legal rights. Viewpoints from the journal editor, author, archivist, and…

  19. Processing the CONSOL Energy, Inc. Mine Maps and Records Collection at the University of Pittsburgh

    ERIC Educational Resources Information Center

    Rougeux, Debora A.

    2011-01-01

    This article describes the efforts of archivists and student assistants at the University of Pittsburgh's Archives Service Center to organize, describe, store, and provide timely and efficient access to over 8,000 maps of underground coal mines in southwestern Pennsylvania, as well the records that accompanied them, donated by CONSOL Energy, Inc.…

  20. Higher Education and Employability: A Case Study of Debt and Justice in the Process to Becoming A Work College

    ERIC Educational Resources Information Center

    Bolger, Andrew T.

    2017-01-01

    This study presents the findings that emerged in a qualitative policy-oriented case study of an institution's transition to a work college. Using Resource Dependence Theory as the theoretical framework, 32 individual interviews were collected, along with other observational data and institutional archives to understand the appeal of federal policy…

  1. 78 FR 65334 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-31

    ... information, including certain labeling information, to FDA for approval to market a product in interstate... and in a form that FDA can process, review, and archive. This requirement is in addition to the....12(b) in table 1 of this document. In July 1997, FDA revised Form FDA 356h ``Application to Market a...

  2. Stability and reproducibility of proteomic profiles measured with an aptamer-based platform.

    PubMed

    Kim, Claire H; Tworoger, Shelley S; Stampfer, Meir J; Dillon, Simon T; Gu, Xuesong; Sawyer, Sherilyn J; Chan, Andrew T; Libermann, Towia A; Eliassen, A Heather

    2018-05-30

    The feasibility of SOMAscan, a multiplex, high sensitivity proteomics platform, for use in studies using archived plasma samples has not yet been assessed. We quantified 1,305 proteins from plasma samples donated by 16 Nurses' Health Study (NHS) participants, 40 NHSII participants, and 12 local volunteers. We assessed assay reproducibility using coefficients of variation (CV) from duplicate samples and intra-class correlation coefficients (ICC) and Spearman correlation coefficients (r) of samples processed (i.e., centrifuged and aliquoted into separate components) immediately, 24, and 48 hours after collection, as well as those of samples collected from the same individuals 1 year apart. CVs were <20% for 99% of proteins overall and <10% for 92% of proteins in heparin samples compared to 66% for EDTA samples. We observed ICC or Spearman r (comparing immediate vs. 24-hour delayed processing) ≥0.75 for 61% of proteins, with some variation by anticoagulant (56% for heparin and 70% for EDTA) and protein class (ranging from 49% among kinases to 83% among hormones). Within-person stability over 1 year was good (ICC or Spearman r ≥ 0.4) for 91% of proteins. These results demonstrate the feasibility of SOMAscan for analyses of archived plasma samples.
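
    Two of the reproducibility measures described above, the coefficient of variation from duplicate samples and the Spearman correlation between immediately processed and delay-processed aliquots, can be computed as in the sketch below. The data are synthetic and the ICC step is omitted; this is an illustration of the metrics, not the study's analysis code.

      # Sketch: per-protein CV from duplicate samples and Spearman r
      # between immediate and delayed processing, on synthetic data.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(0)

      # duplicate measurements of one protein across 12 pairs
      dup_a = rng.normal(1000, 50, size=12)
      dup_b = dup_a + rng.normal(0, 30, size=12)

      def duplicate_cv(a, b):
          """Mean within-pair coefficient of variation, in percent."""
          pairs = np.stack([a, b], axis=1)
          return float(np.mean(pairs.std(axis=1, ddof=1) / pairs.mean(axis=1)) * 100)

      # the same protein measured after immediate vs. delayed processing
      immediate = rng.normal(1000, 200, size=40)
      delayed = immediate * rng.normal(1.0, 0.1, size=40)
      rho, _ = spearmanr(immediate, delayed)

      print(f"duplicate CV = {duplicate_cv(dup_a, dup_b):.1f}%")
      print(f"Spearman r (immediate vs. delayed) = {rho:.2f}")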

  3. Genetic Inventory Task Final Report. Volume 2

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag

    2012-01-01

    Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.

  4. Building a Digital Library for Multibeam Data, Images and Documents

    NASA Astrophysics Data System (ADS)

    Miller, S. P.; Staudigel, H.; Koppers, A.; Johnson, C.; Cande, S.; Sandwell, D.; Peckman, U.; Becker, J. J.; Helly, J.; Zaslavsky, I.; Schottlaender, B. E.; Starr, S.; Montoya, G.

    2001-12-01

    The Scripps Institution of Oceanography, the UCSD Libraries and the San Diego Supercomputing Center have joined forces to establish a digital library for accessing a wide range of multibeam and marine geophysical data, to a community that ranges from the MGG researcher to K-12 outreach clients. This digital library collection will include 233 multibeam cruises with grids, plots, photographs, station data, technical reports, planning documents and publications, drawn from the holdings of the Geological Data Center and the SIO Archives. Inquiries will be made through an Ocean Exploration Console, reminiscent of a cockpit display where a multitude of data may be displayed individually or in two or three-dimensional projections. These displays will provide access to cruise data as well as global databases such as Global Topography, crustal age, and sediment thickness, thus meeting the day-to-day needs of researchers as well as educators, students, and the public. The prototype contains a few selected expeditions, and a review of the initial approach will be solicited from the user community during the poster session. The search process can be focused by a variety of constraints: geospatial (lat-lon box), temporal (e.g., since 1996), keyword (e.g., cruise, place name, PI, etc.), or expert-level (e.g., K-6 or researcher). The Storage Resource Broker (SRB) software from the SDSC manages the evolving collection as a series of distributed but related archives in various media, from shipboard data through processing and final archiving. The latest version of MB-System provides for the systematic creation of standard metadata, and for the harvesting of metadata from multibeam files. Automated scripts will be used to load the metadata catalog to enable queries with an Oracle database management system. These new efforts to bridge the gap between libraries and data archives are supported by the NSF Information Technology and National Science Digital Library (NSDL) programs, augmented by UC funds, and closely coordinated with Digital Library for Earth System Education (DLESE) activities.

  5. Concept for Future Data Services at the Long-Term Archive of WDCC combining DOIs with common PIDs

    NASA Astrophysics Data System (ADS)

    Stockhause, Martina; Weigel, Tobias; Toussaint, Frank; Höck, Heinke; Thiemann, Hannes; Lautenschlager, Michael

    2013-04-01

    The World Data Center for Climate (WDCC) hosted at the German Climate Computing Center (DKRZ) maintains a long-term archive (LTA) of climate model data as well as observational data. WDCC distinguishes between two types of LTA data. Structured data: the output of an instrument or of a climate model run consists of numerous, highly structured individual datasets in a uniform format. Part of these data is also published on an ESGF (Earth System Grid Federation) data node. Detailed metadata is available, allowing for fine-grained user-defined data access. Unstructured data: LTA data of finished scientific projects are in general unstructured and consist of datasets of different formats, different sizes, and different contents. For these data, compact metadata is available as content information. The structured data is suitable for WDCC's DataCite DOI process; the project data only in exceptional cases. The DOI process includes a thorough quality control of technical as well as scientific aspects by the publication agent and the data creator. DOIs are assigned to data collections appropriate to be cited in scientific publications, such as a simulation run; the data collection is defined in agreement with the data creator. At the moment there is no way to identify and cite individual datasets within such a DOI data collection, analogous to the citation of chapters in a book; also missing is a compact citation convention for a user-specified collection of data. WDCC therefore complements its existing LTA/DOI concept with Persistent Identifier (PID) assignment to datasets using Handles. In addition to data identification for internal and external use, the PID concept allows relations among PIDs to be defined. Such structural information is stored as key-value pairs directly in the Handles; the relations thus provide basic provenance or lineage information, even if part of the data, such as intermediate results, is lost. WDCC intends to use additional PIDs on metadata entities with a relation to the data PID(s). These add background information on the data creation process (e.g. descriptions of the experiment, model, model set-up, and platform for the model run) to the data. These pieces of additional information significantly increase the re-usability of the archived model data. Other valuable additional information for scientific collaboration, such as quality information and annotations, could be added by the same mechanism. Apart from relations among data and metadata entities, PIDs on collections are advantageous for model data: collections allow for persistent references to single datasets or subsets of data assigned a DOI, and data objects and additional information objects can be consistently connected via relations (provenance, creation, quality information for data),
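
    The idea of storing relations as key-value pairs directly in the Handles can be pictured with a small hypothetical registry. The handle prefixes, key names and URLs below are invented for illustration and do not reflect WDCC's actual handle layout.

      # Hypothetical PID registry: each handle carries key-value pairs that
      # link data, metadata and collection identifiers. All identifiers and
      # key names are invented examples.
      pid_registry = {
          "21.T00000/data-0001": {
              "URL": "https://example.org/archive/dataset/0001",
              "IS_PART_OF": "10.0000/EXAMPLE_COLLECTION_DOI",   # citable DOI collection
              "IS_DESCRIBED_BY": "21.T00000/meta-0001",         # metadata PID
          },
          "21.T00000/meta-0001": {
              "URL": "https://example.org/archive/experiment/0001",
              "DESCRIBES": "21.T00000/data-0001",
              "ENTITY_TYPE": "experiment description",
          },
      }

      def related(pid, relation):
          """Follow one relation stored as a key-value pair in a handle."""
          return pid_registry.get(pid, {}).get(relation)

      print(related("21.T00000/data-0001", "IS_DESCRIBED_BY"))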

  6. Surviving the Downturn: Challenges Faced by Appalachian Regional Collections during a Time of Reduced Resources

    ERIC Educational Resources Information Center

    Brodsky, Marc; Hyde, Gene

    2012-01-01

    How have college and university-based archives and special collections fared over the past few years in the midst of an historically grim economic downturn? The authors conducted in-depth interviews with directors of 13 archival repositories at state universities and private colleges in the Appalachian region of Virginia, West Virginia, North…

  7. The Inauguration of the Child Development Film Archives. Final Report.

    ERIC Educational Resources Information Center

    Popplestone, John A.; McPherson, Marion White

    The cataloging of films acquired by the Child Development Film Archives disclosed approximately 3500 separate units, or 804,232 feet of footage. An inventory sheet has been completed for each film; each sheet provides a record of the content of that film, and collectively the sheets constitute a table of contents to the total collection. Variables specified on…

  8. BOREAS Level-0 ER-2 Navigation Data

    NASA Technical Reports Server (NTRS)

    Strub, Richard; Dominguez, Roseanne; Newcomer, Jeffrey A.; Hall, Forrest G. (Editor)

    2000-01-01

    The BOREAS Staff Science effort covered those activities that were BOREAS community-level activities or required uniform data collection procedures across sites and time. These activities included the acquisition, processing, and archiving of aircraft navigation/attitude data to complement the digital image data. The level-0 ER-2 navigation data files contain aircraft attitude and position information acquired during the digital image and photographic data collection missions. Temporally, the data were acquired from April to September 1994. Data were recorded at intervals of 5 seconds. The data are stored in tabular ASCII files.

  9. Archive of Boomer seismic reflection data: collected during USGS Cruise 96CCT01, nearshore south central South Carolina coast, June 26 - July 1, 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Wiese, Dana S.

    2003-01-01

    This archive consists of marine seismic reflection profile data collected in four survey areas from southeast of Charleston Harbor to the mouth of the North Edisto River of South Carolina. These data were acquired June 26 - July 1, 1996, aboard the R/V G.K. Gilbert. Included here are data in a variety of formats including binary, American Standard Code for Information Interchange (ASCII), Hyper Text Markup Language (HTML), Portable Document Format (PDF), Rich Text Format (RTF), Graphics Interchange Format (GIF) and Joint Photographic Experts Group (JPEG) images, and shapefiles. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. Reference maps and GIF images of the profiles may be viewed with a web browser. The Geographic Information Systems (GIS) map documents provided were created with Environmental Systems Research Institute (ESRI) GIS software ArcView 3.2 and 8.1.

  10. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
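
    A minimal sketch of the general flow described above: gather archived files, turn each one into an object (key plus payload), and hand the objects to a cloud object store. It uses boto3's S3 client as a generic object-store interface; the bucket and directory are placeholders, and this simplification is not the patented log-structured middleware.

      # Sketch: convert archived files to objects and store them in a cloud
      # object store. Bucket and paths are placeholders; this is a
      # simplification, not the PLFS-based middleware of the patent.
      import pathlib
      import boto3

      def files_to_objects(archive_dir):
          """Yield (object_key, payload) pairs for every archived file."""
          root = pathlib.Path(archive_dir)
          for path in sorted(root.rglob("*")):
              if path.is_file():
                  yield str(path.relative_to(root)), path.read_bytes()

      def archive_to_cloud(archive_dir, bucket):
          """Upload every archived file as one object in the given bucket."""
          s3 = boto3.client("s3")
          for key, payload in files_to_objects(archive_dir):
              s3.put_object(Bucket=bucket, Key=f"checkpoints/{key}", Body=payload)

      # archive_to_cloud("/scratch/app/checkpoints", "my-archive-bucket")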

  11. The GTC Public Archive

    NASA Astrophysics Data System (ADS)

    Alacid, J. Manuel; Solano, Enrique

    2015-12-01

    The Gran Telescopio Canarias (GTC) archive has been operational since November 2011. The archive, maintained by the Data Archive Unit at CAB in the framework of the Spanish Virtual Observatory project, provides access to both raw and science-ready data and has been designed in compliance with the standards defined by the International Virtual Observatory Alliance (IVOA) to guarantee a high level of data accessibility and handling. In this presentation I will describe the main capabilities the GTC archive offers to the community, in terms of functionalities and data collections, to carry out an efficient scientific exploitation of GTC data.

  12. The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Schuster, D.; Worley, S. J.

    2013-12-01

    The NCAR Research Data Archive (RDA http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPeNDAP and other web-based protocols, primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is ready. External users are provided with RDA server-generated scripts to download the resulting request output. Similarly, they can download native dataset collection files or partial files using Wget- or cURL-based scripts supplied by the RDA server. Internal users can access the resulting request output or native dataset collection files directly from centralized file systems.
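
    Remote subsetting through a THREDDS/OPeNDAP endpoint of the kind mentioned above can be sketched with xarray, which opens such URLs lazily so that only the selected slice is transferred. The URL and variable name below are placeholders, not an actual RDA dataset, and netCDF4/OPeNDAP support is assumed to be installed.

      # Sketch: lazy remote access to an OPeNDAP endpoint and slice-style
      # subsetting. URL and variable name are placeholders.
      import xarray as xr

      def open_remote_subset(opendap_url, variable, time_slice):
          """Open a remote dataset lazily and pull only one time slice."""
          ds = xr.open_dataset(opendap_url)      # no full download here
          return ds[variable].sel(time=time_slice)

      # subset = open_remote_subset(
      #     "https://example.org/thredds/dodsC/example_dataset.nc",
      #     "air_temperature", "2010-01")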

  13. Sound Recordings in the Audiovisual Archives Division of the National Archives. Preliminary Draft.

    ERIC Educational Resources Information Center

    Bray, Mayfield S.; Waffen, Leslie C.

    Some 47,000 sound recordings dating from the turn of the century have been collected in the National Archives. These recordings, consisting of recordings of press conferences, panel discussions, interviews, speeches, court and conference proceedings, entertainment programs, and news broadcasts, are listed in this paper by government agency. The…

  14. User Engagement with Digital Archives for Research and Teaching: A Case Study of "Emblematica Online"

    ERIC Educational Resources Information Center

    Green, Harriett E.; Lamprin, Patricia

    2017-01-01

    Researchers increasingly engage with the digital archives built by libraries, archives, and museums, but many institutions still seek to learn more about researchers' needs and practices with these digital collections. This paper presents a user assessment study for "Emblematica Online," a research digital library that provides digitized…

  15. The next Landsat satellite; the Landsat Data Continuity Mission

    USGS Publications Warehouse

    Irons, James R.; Dwyer, John L.; Barsi, Julia A.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) and the Department of Interior United States Geological Survey (USGS) are developing the successor mission to Landsat 7 that is currently known as the Landsat Data Continuity Mission (LDCM). NASA is responsible for building and launching the LDCM satellite observatory. USGS is building the ground system and will assume responsibility for satellite operations and for collecting, archiving, and distributing data following launch. The observatory will consist of a spacecraft in low-Earth orbit with a two-sensor payload. One sensor, the Operational Land Imager (OLI), will collect image data for nine shortwave spectral bands over a 185 km swath with a 30 m spatial resolution for all bands except a 15 m panchromatic band. The other instrument, the Thermal Infrared Sensor (TIRS), will collect image data for two thermal bands with a 100 m resolution over a 185 km swath. Both sensors offer technical advancements over earlier Landsat instruments. OLI and TIRS will coincidently collect data and the observatory will transmit the data to the ground system where it will be archived, processed to Level 1 data products containing well calibrated and co-registered OLI and TIRS data, and made available for free distribution to the general public. The LDCM development is on schedule for a December 2012 launch. The USGS intends to rename the satellite "Landsat 8" following launch. By either name a successful mission will fulfill a mandate for Landsat data continuity. The mission will extend the almost 40-year Landsat data archive with images sufficiently consistent with data from the earlier missions to allow long-term studies of regional and global land cover change.

  16. Analysis of the request patterns to the NSSDC on-line archive

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1994-01-01

    NASA missions, both for Earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth of the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of the service, NSSDC has maintained log files that record all accesses to the archive. In this report, we present an analysis of the NDADS log files. We analyze the log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
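
    A request-pattern analysis of the kind described can be sketched on a toy log. The record layout (timestamp, user, file) is invented and is not the actual NDADS log format; the sketch simply tallies requests per month and file popularity, the raw ingredients of the caching and reference-pattern questions discussed in the report.

      # Sketch: tally requests per month and per file from a hypothetical
      # "timestamp,user,file" access log. The format is invented.
      import csv
      from collections import Counter
      from io import StringIO

      sample_log = StringIO(
          "1993-02-01T10:12:00,alice,mission_a/file_0001.dat\n"
          "1993-02-01T11:40:00,bob,mission_a/file_0001.dat\n"
          "1993-03-02T09:05:00,alice,mission_b/file_0042.dat\n"
      )

      requests_per_month = Counter()
      file_popularity = Counter()
      for timestamp, user, filename in csv.reader(sample_log):
          requests_per_month[timestamp[:7]] += 1      # YYYY-MM
          file_popularity[filename] += 1

      print(requests_per_month)                # activity trend
      print(file_popularity.most_common(1))    # candidate for cache retention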

  17. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.

    PubMed

    Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred

    2013-12-01

    The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.

  18. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

    NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney, ESDIS Deputy Project Manager.
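
    The subsetting challenge mentioned above, reading part of a granule that lives in web object storage rather than on a local file system, often comes down to ranged reads of the object. The sketch below issues a byte-range GET with boto3; bucket, key and offsets are placeholders, and in practice the offsets would come from the file's own internal index (for example HDF5 chunk metadata).

      # Sketch: fetch only a byte range of an object in cloud object
      # storage instead of seeking within a local file. All names and
      # offsets are placeholders.
      import boto3

      def read_byte_range(bucket, key, start, length):
          """Fetch `length` bytes of an object beginning at byte `start`."""
          s3 = boto3.client("s3")
          resp = s3.get_object(
              Bucket=bucket,
              Key=key,
              Range=f"bytes={start}-{start + length - 1}",
          )
          return resp["Body"].read()

      # chunk = read_byte_range("eo-archive-bucket", "granules/example.hdf",
      #                         start=4096, length=65536)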

  19. Radiometric characterization of Landsat Collection 1 products

    USGS Publications Warehouse

    Micijevic, Esad; Haque, Md. Obaidul; Mishra, Nischal

    2017-01-01

    Landsat data in the U.S. Geological Survey (USGS) archive are being reprocessed to generate a tiered collection of consistently geolocated and radiometrically calibrated products that are suitable for time series analyses. With the implementation of the collection management, no major updates will be made to calibration of the Landsat sensors within a collection. Only calibration parameters needed to maintain the established calibration trends without an effect on derived environmental records will be regularly updated, while all other changes will be deferred to a new collection. This first collection, Collection 1, incorporates various radiometric calibration updates to all Landsat sensors including absolute and relative gains for Landsat 8 Operational Land Imager (OLI), stray light correction for Landsat 8 Thermal Infrared Sensor (TIRS), absolute gains for Landsat 4 and 5 Thematic Mappers (TM), recalibration of Landsat 1-5 Multispectral Scanners (MSS) to ensure radiometric consistency among different formats of archived MSS data, and a transfer of Landsat 8 OLI reflectance based calibration to all previous Landsat sensors. While all OLI/TIRS, ETM+ and majority of TM data have already been reprocessed to Collection 1, a completion of MSS and remaining TM data reprocessing is expected by the end of this year. It is important to note that, although still available for download from the USGS web pages, the products generated using the Pre-Collection processing do not benefit from the latest radiometric calibration updates. In this paper, we are assessing radiometry of solar reflective bands in Landsat Collection 1 products through analysis of trends in on-board calibrator and pseudo invariant site (PICS) responses.

  20. Radiometric characterization of Landsat Collection 1 products

    NASA Astrophysics Data System (ADS)

    Micijevic, Esad; Haque, Md. Obaidul; Mishra, Nischal

    2017-09-01

    Landsat data in the U.S. Geological Survey (USGS) archive are being reprocessed to generate a tiered collection of consistently geolocated and radiometrically calibrated products that are suitable for time series analyses. With the implementation of the collection management, no major updates will be made to calibration of the Landsat sensors within a collection. Only calibration parameters needed to maintain the established calibration trends without an effect on derived environmental records will be regularly updated, while all other changes will be deferred to a new collection. This first collection, Collection 1, incorporates various radiometric calibration updates to all Landsat sensors including absolute and relative gains for Landsat 8 Operational Land Imager (OLI), stray light correction for Landsat 8 Thermal Infrared Sensor (TIRS), absolute gains for Landsat 4 and 5 Thematic Mappers (TM), recalibration of Landsat 1-5 Multispectral Scanners (MSS) to ensure radiometric consistency among different formats of archived MSS data, and a transfer of Landsat 8 OLI reflectance based calibration to all previous Landsat sensors. While all OLI/TIRS, ETM+ and majority of TM data have already been reprocessed to Collection 1, a completion of MSS and remaining TM data reprocessing is expected by the end of this year. It is important to note that, although still available for download from the USGS web pages, the products generated using the Pre-Collection processing do not benefit from the latest radiometric calibration updates. In this paper, we are assessing radiometry of solar reflective bands in Landsat Collection 1 products through analysis of trends in on-board calibrator and pseudo invariant site (PICS) responses.

  1. Deposition of extreme-tolerant bacterial strains isolated during different phases of Phoenix spacecraft assembly in a public culture collection.

    PubMed

    Venkateswaran, Kasthuri; Vaishampayan, Parag; Benardini, James N; Rooney, Alejandro P; Spry, J Andy

    2014-01-01

    Extreme-tolerant bacteria (82 strains; 67 species) isolated during various assembly phases of the Phoenix spacecraft were permanently archived within the U.S. Department of Agriculture's Agricultural Research Service Culture Collection in Peoria, Illinois. This represents the first microbial collection of spacecraft-associated surfaces within the United States to be deposited into a freely available, government-funded culture collection. Archiving extreme-tolerant microorganisms from NASA mission(s) will provide opportunities for scientists who are involved in exploring microbes that can tolerate extreme conditions.

  2. Status of the TESS Science Processing Operations Center

    NASA Astrophysics Data System (ADS)

    Jenkins, Jon Michael; Caldwell, Douglas A.; Davies, Misty; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill

    2018-06-01

    The Transiting Exoplanet Survey Satellite (TESS) was selected by NASA’s Explorer Program to conduct a search for Earth’s closest cousins starting in 2018. TESS will conduct an all-sky transit survey of F, G and K dwarf stars between 4 and 12 magnitudes and M dwarf stars within 200 light years. TESS is expected to discover 1,000 small planets less than twice the size of Earth, and to measure the masses of at least 50 of these small worlds. The TESS science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler science pipeline. Like the Kepler pipeline, the TESS pipeline provides calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline searches through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline generates a suite of diagnostic metrics for each transit-like signature, and then extracts planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search are modeled on the Kepler transit search products (tabulated numerical results, time series products, and PDF reports), all of which will be archived to MAST. Synthetic sample data products are available at https://archive.stsci.edu/tess/ete-6.html. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
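
    To make the transit-search step concrete, here is a deliberately simplified, brute-force box search in Python. It only illustrates the idea of scanning trial periods and epochs for a box-shaped dip; it is not the SPOC pipeline, and the synthetic light curve, grid sizes, and function name are invented for the example.

        # Toy box search over a synthetic light curve (illustrative only).
        import numpy as np

        def simple_box_search(time, flux, periods, duration, n_phase=200):
            """Return (period, epoch, depth) of the strongest box-shaped dip."""
            flux = flux / np.median(flux)  # normalize the light curve
            best = (None, None, 0.0)
            for period in periods:
                phase = (time % period) / period
                half_width = (duration / period) / 2.0
                for t0 in np.linspace(0.0, 1.0, n_phase, endpoint=False):
                    in_tr = np.abs((phase - t0 + 0.5) % 1.0 - 0.5) < half_width
                    if in_tr.sum() < 3 or (~in_tr).sum() < 3:
                        continue
                    depth = flux[~in_tr].mean() - flux[in_tr].mean()
                    if depth > best[2]:
                        best = (period, t0 * period, depth)
            return best

        rng = np.random.default_rng(0)
        t = np.arange(0.0, 27.0, 0.02)              # roughly one sector, in days
        f = 1.0 + 5e-4 * rng.standard_normal(t.size)
        f[(t % 3.5) < 0.1] -= 0.005                 # injected 3.5-day, 0.5% transit
        print(simple_box_search(t, f, periods=np.linspace(3.0, 4.0, 101), duration=0.1))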

  3. [The Istituto di Storia della Medicina archive and video collection].

    PubMed

    Aruta, Alessandro; De Angelis, Elio

    2006-01-01

    The Istituto di Storia della Medicina at Rome University was to a certain extent a one-man achievement. Founded by Adalberto Pazzini in 1937, its collections comprised books and objects, as well as photographs, films, and other didactic videos. The Istituto was also a center for publications, conferences, and meetings. The archival sources that document its activity have been re-evaluated and restored in recent years, together with the collections housed in the Library and in the Museum.

  4. NASA's small spacecraft technology initiative "Clark" spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1996-11-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of "Clark," a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  5. Monthly Surface Air Temperature Time Series Area-Averaged Over the 30-Degree Latitudinal Belts of the Globe

    DOE Data Explorer

    Lugina, K. M. [Department of Geography, St. Petersburg State University, St. Petersburg, Russia]; Groisman, P. Ya. [National Climatic Data Center, Asheville, North Carolina, USA]; Vinnikov, K. Ya. [Department of Atmospheric Sciences, University of Maryland, College Park, Maryland, USA]; Koknaeva, V. V. [State Hydrological Institute, St. Petersburg, Russia]; Speranskaya, N. A. [State Hydrological Institute, St. Petersburg, Russia]

    2006-01-01

    The mean monthly and annual values of surface air temperature compiled by Lugina et al. have been taken mainly from the World Weather Records, Monthly Climatic Data for the World, and Meteorological Data for Individual Years over the Northern Hemisphere Excluding the USSR. These published records were supplemented with information from different national publications. In the original archive, after removal of station records believed to be nonhomogeneous or biased, 301 and 265 stations were used to determine the mean temperature for the Northern and Southern hemispheres, respectively. The new version of the station temperature archive (used for evaluation of the zonally-averaged temperatures) was created in 1995. The change to the archive was required because data from some stations became unavailable for analyses in the 1990s. During this process, special care was taken to secure homogeneity of zonally averaged time series. When a station (or a group of stations) stopped reporting, a "new" station (or group of stations) was selected in the same region, and its data for the past 50 years were collected and added to the archive. The processing (area-averaging) was organized in such a way that each time series from a new station spans the reference period (1951-1975) and the years thereafter. It was determined that the addition of the new stations had essentially no effect on the zonally-averaged values for the pre-1990 period.
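
    A minimal sketch of the kind of belt averaging described above is given below: station anomalies are taken relative to the 1951-1975 reference period and combined with a cosine-latitude area weight. The data layout, function name, and weighting choice are illustrative assumptions, not the processing actually used by Lugina et al.

        # Hedged illustration: area-weighted anomaly over one 30-degree belt.
        import numpy as np

        REF_START, REF_END = 1951, 1975  # reference period named in the record

        def belt_anomaly(stations, year, belt=(30.0, 60.0)):
            """stations: iterable of dicts with 'lat', 'years' (int array), 'temps'."""
            lo, hi = belt
            num = den = 0.0
            for st in stations:
                if not (lo <= st["lat"] < hi):
                    continue
                ref = (st["years"] >= REF_START) & (st["years"] <= REF_END)
                sel = st["years"] == year
                if ref.sum() == 0 or sel.sum() == 0:
                    continue  # station must span the reference period and target year
                anomaly = st["temps"][sel][0] - st["temps"][ref].mean()
                w = np.cos(np.deg2rad(st["lat"]))  # crude area weight
                num += w * anomaly
                den += w
            return num / den if den else float("nan")

        example = [{"lat": 45.0,
                    "years": np.arange(1950, 2001),
                    "temps": 10.0 + 0.01 * np.arange(51)}]
        print(round(belt_anomaly(example, 2000), 3))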

  6. Informatics in radiology: RADTF: a semantic search-enabled, natural language processor-generated radiology teaching file.

    PubMed

    Do, Bao H; Wu, Andrew; Biswal, Sandip; Kamaya, Aya; Rubin, Daniel L

    2010-11-01

    Storing and retrieving radiology cases is an important activity for education and clinical research, but this process can be time-consuming. In the process of structuring reports and images into organized teaching files, incidental pathologic conditions not pertinent to the primary teaching point can be omitted, as when a user saves images of an aortic dissection case but disregards the incidental osteoid osteoma. An alternate strategy for identifying teaching cases is text search of reports in radiology information systems (RIS), but retrieved reports are unstructured, teaching-related content is not highlighted, and patient identifying information is not removed. Furthermore, searching unstructured reports requires sophisticated retrieval methods to achieve useful results. An open-source, RadLex(®)-compatible teaching file solution called RADTF, which uses natural language processing (NLP) methods to process radiology reports, was developed to create a searchable teaching resource from the RIS and the picture archiving and communication system (PACS). The NLP system extracts and de-identifies teaching-relevant statements from full reports to generate a stand-alone database, thus converting existing RIS archives into an on-demand source of teaching material. Using RADTF, the authors generated a semantic search-enabled, Web-based radiology archive containing over 700,000 cases with millions of images. RADTF combines a compact representation of the teaching-relevant content in radiology reports and a versatile search engine with the scale of the entire RIS-PACS collection of case material. ©RSNA, 2010
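
    A much-simplified sketch of the two steps at the heart of such a system, de-identification and selection of teaching-relevant statements, is shown below. The regular expressions, keyword list, and sample report are illustrative assumptions only; the actual RADTF system relies on a full NLP pipeline rather than these heuristics.

        # Simplified sketch only -- the actual RADTF system uses a full NLP pipeline.
        # Naive regexes remove obvious identifiers and a keyword filter keeps
        # sentences that look teaching-relevant; patterns and terms are illustrative.
        import re

        ID_PATTERNS = [
            (re.compile(r"\b(MRN|Acc(?:ession)?)[:#]?\s*\d+\b", re.I), "[ID]"),
            (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
            (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[PHYSICIAN]"),
        ]
        TEACHING_TERMS = {"classic", "consistent with", "diagnostic of", "pathognomonic"}

        def deidentify(text):
            for pattern, token in ID_PATTERNS:
                text = pattern.sub(token, text)
            return text

        def teaching_sentences(report):
            clean = deidentify(report)
            sentences = re.split(r"(?<=[.!?])\s+", clean)
            return [s for s in sentences if any(t in s.lower() for t in TEACHING_TERMS)]

        report = ("MRN: 1234567. Exam 3/14/2010 read by Dr. Smith. "
                  "Findings are consistent with an osteoid osteoma of the femur.")
        print(teaching_sentences(report))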

  7. Investigation of Magnetic Field Measurements.

    DTIC Science & Technology

    1983-02-28

    Key words: AFGL Magnetometer Network; Fluxgate Magnetometer; Induction Coil Magnetometer; Temperature Dependency of Fluxgate; Automation of Testing. Abstract (excerpt): ...data collection platforms. Support was also provided to AFGL to process the fluxgate magnetometer archive tapes in order to make the data available to...

  8. BOREAS HYD-8 1996 Gravimetric Moss Moisture Data

    NASA Technical Reports Server (NTRS)

    Fernandes, Richard; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Hydrology (HYD)-8 team made measurements of surface hydrological processes that were collected at the southern study area-Old Black Spruce (SSA-OBS) Tower Flux site in 1996 to support its research into point hydrological processes and the spatial variation of these processes. Data collected may be useful in characterizing canopy interception, drip, throughfall, moss interception, drainage, evaporation, and capacity during the growing season at daily temporal resolution. This particular data set contains the gravimetric moss moisture measurements from July to August 1996. To collect these data, a nested spatial sampling plan was implemented to support research into spatial variations of the measured hydrological processes and ultimately the impact of these variations on modeled carbon and water budgets. These data are stored in ASCII text files. The HYD-08 1996 gravimetric moss moisture data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  9. Archiving Mars Mission Data Sets with the Planetary Data System

    NASA Technical Reports Server (NTRS)

    Guinness, Edward A.

    2006-01-01

    This viewgraph presentation reviews the use of the Planetary Data System (PDS) to archive the datasets that are received from the Mars Missions. It reviews the lessons learned in the actual archiving process, and presents an overview of the actual archiving process. It also reviews the lessons learned from the perspectives of the projects, the data producers and the data users.

  10. Minimal Processing: Its Context and Influence in the Archival Community

    ERIC Educational Resources Information Center

    Gorzalski, Matt

    2008-01-01

    Since its publication in 2005, Mark A. Greene and Dennis Meissner's "More Product, Less Process: Revamping Traditional Archival Processing" has led to much discussion and self-examination within the archival community about working through backlogs. This article discusses the impact of Greene and Meissner's work and considers the questions and…

  11. Heritage Quay: What Will You Discover? Transforming the Archives of the University of Huddersfield, Yorkshire, UK

    ERIC Educational Resources Information Center

    Wickham, M. Sarah

    2015-01-01

    The University of Huddersfield presents a key case study of the transformation of its Archives Service, using the newly-developed Staff/Space/Collections dependency model for archives and the lessons of the UK's Customer Service Excellence (CSE) scheme in order to examine and illustrate service development. Heritage Lottery Fund (HLF) and…

  12. Remembering the Backstory: Why Keeping and Promoting School Archives Matters

    ERIC Educational Resources Information Center

    St. Germain, Janine

    2016-01-01

    As a consulting archivist, the author is always thinking about the "backstory" of a place, how an institution's legacy is cultivated and preserved, and how all of the stuff a school collects over time can reveal the culture of the place. School archives are by far a favorite "genre" of archive for the author, primarily because she…

  13. "I Spent 1-1/2 Hours Sifting through One Large Box...": Diaries as Information Behavior of the Archives User: Lessons Learned.

    ERIC Educational Resources Information Center

    Toms, Elaine G.; Duff, Wendy

    2002-01-01

    This article describes how diaries were implemented in a study of the use of archives and archival finding aids by history graduate students. The issues concerning diary use as a data collection technique are discussed as well as the different types of diaries. (Author)

  14. Concerning Descriptive Standards: A Partnership between Public Archives and Private Collections in Geneva, Switzerland

    ERIC Educational Resources Information Center

    Roth-Lochner, Barbara; Grange, Didier

    2005-01-01

    This paper presents the results of a partnership begun in 2002 in the field of archival description between the Geneva City Archives (AVG) and the Manuscripts Department of the Public and University Library of Geneva (BPU). This cooperation has allowed the creation of two computer applications, which share technical and conceptual foundations.…

  15. Restoration of Apollo Data by the NSSDC and the PDS Lunar Data Node

    NASA Technical Reports Server (NTRS)

    Williams, David R.; Hills, H. Kent; Lowman, Paul D.; Taylor, Patrick T.; Guinness, Edward A.

    2011-01-01

    The Lunar Data Node (LDN), under the auspices of the Geosciences Node of the Planetary Data System (PDS), is restoring Apollo data archived at the National Space Science Data Center. The Apollo data were archived on older media (7-track tapes, microfilm, microfiche) and in obsolete digital formats, which limits use of the data. The LDN is making these data accessible by restoring them to standard formats and archiving them through PDS. The restoration involves reading the older media, collecting supporting data (metadata), deciphering and understanding the data, and organizing into a data set. The data undergo a peer review before archive at PDS. We will give an update on last year's work. We have scanned notebooks from Otto Berg, P.I. for the Lunar Ejecta and Meteorites Experiment. These notebooks contain information on the data and calibration coefficients which we hope to be able to use to restore the raw data into a usable archive. We have scanned Apollo 14 and 15 Dust Detector data from microfilm and are in the process of archiving the scans with PDS. We are also restoring raw dust detector data from magnetic tape supplied by Yosio Nakamura (UT Austin). Seiichi Nagihara (Texas Tech Univ.) and others in cooperation with NSSDC are recovering ARCSAV tapes (tapes containing raw data streams from all the ALSEP instruments). We will be preparing these data for archive with PDS. We are also in the process of recovering and archiving data not previously archived, from the Apollo 16 Gamma Ray Spectrometer and the Apollo 17 Infrared Spectrometer.

  16. Flexible Workflow Software enables the Management of an Increased Volume and Heterogeneity of Sensors, and evolves with the Expansion of Complex Ocean Observatory Infrastructures.

    NASA Astrophysics Data System (ADS)

    Tomlin, M. C.; Jenkyns, R.

    2015-12-01

    Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.

  17. Archive of single-beam bathymetry data collected from select areas in Weeks Bay and Weeks Bayou, southwest Louisiana, January 2013

    USGS Publications Warehouse

    DeWitt, Nancy T.; Reich, Christopher D.; Smith, Christopher G.; Reynolds, Billy J.

    2014-01-01

    A team of scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center, collected 92 line-kilometers of dual-frequency single-beam bathymetry data in the tidal creeks, bayous, and coastal areas near Weeks Bay, southwest Louisiana. Limited bathymetry data exist for these tidally and meteorologically influenced shallow-water estuarine environments. In order to reduce the present knowledge gap, the objectives of this study were to (1) develop methods for regional inland bathymetry mapping and monitoring, (2) test the inland bathymetry mapping system in pilot locations for integrating multiple elevation (aerial and terrestrial lidar) and bathymetry datasets, (3) implement inland bathymetry mapping and monitoring in highly focused sites, and (4) evaluate changes in bathymetry and channel-fill sediment storage using these methods. This report contains single-beam bathymetric data collected between January 14 and 18, 2013. Data were collected from the RV Mako (5-meter vessel) in shallow water depths. This report serves as an archive of processed bathymetry data. Geographic information system data provided in this document include a 10-meter cell-size interpolated gridded bathymetry surface and trackline maps. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata. Do not use these data for navigational purposes.

  18. Archive of boomer seismic reflection data collected during USGS field activities 01ASR01, 01ASR02, 02ASR01, 02ASR02, Miami, Florida, November 2001-January 2002

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Wiese, Dana S.; Flocks, James G.

    2002-01-01

    This appendix consists of two-dimensional marine seismic reflection profile data collected in canals in the Lake Belt Area of Miami, Florida. These data were acquired in November and December of 2001 and January and February of 2002 using a 4.9-m (16-ft) jonboat. The data are available in a variety of formats, including binary, ASCII, HTML, shapefiles, and GIF images. Binary data are in Society of Exploration Geophysicists (SEG) SEG-Y format and may be downloaded for further processing or display. The SEG-Y data files are too large to fit on one CD-ROM, so they have been distributed onto two CD-ROMs as explained below. Reference maps and GIF images of the profiles may be viewed with your web browser. The GIS information provided is compatible with ESRI's GIS software. A reconnaissance test line (02ASR02-02b02) was collected northwest of the survey area during Field Activity 02ASR02 for possible use in a future project. It is archived here for organizational purposes only.

  19. The vanishing Black Indian: Revisiting craniometry and historic collections.

    PubMed

    Geller, Pamela L; Stojanowski, Christopher M

    2017-02-01

    This article uses craniometric allocation as a platform for discussing the legacy of Samuel G. Morton's collection of crania, the process of racialization, and the value of contextualized biohistoric research perspectives in biological anthropology. Standard craniometric measurements were recorded for seven Seminoles in the Samuel G. Morton Crania Collection and 10 European soldiers from the Fort St. Marks Military Cemetery; all individuals were men and died in Florida during the 19th century. Fordisc 3.1 was used to assess craniometric affinity with respect to three samples: the Forensic Data Bank, Howells data set, and an archival sample that best fits the target populations collected from 19th century Florida. Discriminant function analyses were used to evaluate how allocations change across the three comparative databases, which roughly reflect a temporal sequence. Most Seminoles allocated as Native American, while most soldiers allocated as Euro-American. Allocation of Seminole crania, however, was unstable across analysis runs with more individuals identifying as African Americans when compared to the Howells and Forensic Data Bank. To the contrary, most of the soldiers produced consistent allocations across analyses. Repeatability for the St. Marks sample was lower when using the archival sample database, contrary to expectations. For the Seminole crania, Cohen's κ indicates significantly lower repeatability. A possible Black Seminole individual was identified in the Morton Collection. Recent articles discussing the merits and weaknesses of comparative craniometry focus on methodological issues. In our biohistoric approach, we use the patterning of craniometric allocations across databases as a platform for discussing social race and its development during the 19th century, a process known as racialization. Here we propose that differences in repeatability for the Seminoles and Euro-American soldiers reflect this process and transformation of racialized identities during 19th century U.S. nation-building. In particular, notions of whiteness were and remain tightly controlled, while other racial categorizations were affected by legal, social, and political contexts that resulted in hybridity in lieu of boundedness. © 2016 Wiley Periodicals, Inc.

  20. Laying the groundwork for NEON's continental-scale ecological research

    NASA Astrophysics Data System (ADS)

    Dethloff, G.; Denslow, M.

    2013-12-01

    The National Ecological Observatory Network (NEON) is designed to examine a suite of ecological issues. Field-collected data from 96 terrestrial and aquatic sites across the U.S. will be combined with remotely sensed data and existing continental-scale data sets. Field collections will include a range of physical and biological types, including soil, sediment, surface water, groundwater, precipitation, plants, animals, insects, and microbes as well as biological sub-samples such as leaf material, blood and tissue samples, and DNA extracts. Initial data analyses and identifications of approximately 175,000 samples per year will occur at numerous external laboratories when all sites are fully staffed in 2017. Additionally, NEON will archive biotic and abiotic specimens at collections facilities where they will be curated and available for additional analyses by the scientific community. The number of archived specimens is currently estimated to exceed 130,000 per year by 2017. We will detail how NEON is addressing the complexities and challenges around this set of analyses and specimens and how the resulting high-quality data can impact ecological understanding. The raw data returned from external laboratories that are quality checked and served by NEON will be the foundation for many NEON data products. For example, sequence-quality nucleic acids extracted from surface waters, benthic biofilms, and soil samples will be building blocks for data products on microbial diversity. The raw sequence data will also be available for uses such as evolutionary investigations, and the extracts will be archived so others can acquire them for additional research. Currently, NEON is establishing contracts for the analysis and archiving of field-collected samples through 2017. During this period, NEON will gather information on the progress and success of this large-scale effort in order to determine the most effective course to pursue with external facilities. Two areas that NEON already knows it must evaluate are the need for geographic expertise in taxonomic identifications and the capacity necessary to handle the volume of samples. NEON is also addressing challenges associated with external entities and the logistics of sample movement, data formatting, data ingestion, and reporting. For example, NEON is considering tools, such as web APIs, which could allow efficient transfer of data from external facilities. Having a standard format in place for that data will be critical to transfer success and quality assessment. NEON is also working on the implementation of quality control measures for diverse analytical and taxonomic processes across laboratories, and is developing an external audit process. Additionally, given NEON's open access approach, the Network is focused on selecting a sample identification protocol that aids in tracking samples with more involved analytical needs and also allows maximum utility for the scientific community. Given the complex nature and breadth of the project, NEON will be developing novel sample management systems as well as metadata schemas. These efforts ensure integrity and quality from field to external facility to archive for each sample taken, providing high-quality data now and confidence in future research stemming from raw data generated by NEON and its collection specimens.

  1. The Protein Data Bank: unifying the archive

    PubMed Central

    Westbrook, John; Feng, Zukang; Jain, Shri; Bhat, T. N.; Thanki, Narmada; Ravichandran, Veerasamy; Gilliland, Gary L.; Bluhm, Wolfgang F.; Weissig, Helge; Greer, Douglas S.; Bourne, Philip E.; Berman, Helen M.

    2002-01-01

    The Protein Data Bank (PDB; http://www.pdb.org/) is the single worldwide archive of structural data of biological macromolecules. This paper describes the progress that has been made in validating all data in the PDB archive and in releasing a uniform archive for the community. We have now produced a collection of mmCIF data files for the PDB archive (ftp://beta.rcsb.org/pub/pdb/uniformity/data/mmCIF/). A utility application that converts the mmCIF data files to the PDB format (called CIFTr) has also been released to provide support for existing software. PMID:11752306

  2. The Emirates Space Data Center, a PDS4-Compliant Data Archive

    NASA Astrophysics Data System (ADS)

    DeWolfe, A. W.; Al Hammadi, O.; Amiri, S.

    2017-12-01

    As part of the UAE's Emirates Mars Mission (EMM), we are constructing a data archive to preserve and distribute science data from this and future missions. The archive will be publicly accessible and will provide access to Level 2 and 3 science data products from EMM, as well as ancillary data such as SPICE kernels and mission event timelines. As a member of the International Planetary Data Alliance (IPDA), the UAE has committed to making its archive PDS4-compatible, and maintaining the archive beyond the end of the mission. EMM is scheduled to begin collecting science data in spring 2021, and the archive is expected to begin releasing data in September 2021.

  3. In-depth investigation of archival and prospectively collected samples reveals no evidence for XMRV infection in prostate cancer.

    PubMed

    Lee, Deanna; Das Gupta, Jaydip; Gaughan, Christina; Steffen, Imke; Tang, Ning; Luk, Ka-Cheung; Qiu, Xiaoxing; Urisman, Anatoly; Fischer, Nicole; Molinaro, Ross; Broz, Miranda; Schochetman, Gerald; Klein, Eric A; Ganem, Don; Derisi, Joseph L; Simmons, Graham; Hackett, John; Silverman, Robert H; Chiu, Charles Y

    2012-01-01

    XMRV, or xenotropic murine leukemia virus (MLV)-related virus, is a novel gammaretrovirus originally identified in studies that analyzed tissue from prostate cancer patients in 2006 and blood from patients with chronic fatigue syndrome (CFS) in 2009. However, a large number of subsequent studies failed to confirm a link between XMRV infection and CFS or prostate cancer. On the contrary, recent evidence indicates that XMRV is a contaminant originating from the recombination of two mouse endogenous retroviruses during passaging of a prostate tumor xenograft (CWR22) in mice, generating laboratory-derived cell lines that are XMRV-infected. To confirm or refute an association between XMRV and prostate cancer, we analyzed prostate cancer tissues and plasma from a prospectively collected cohort of 39 patients as well as archival RNA and prostate tissue from the original 2006 study. Despite comprehensive microarray, PCR, FISH, and serological testing, XMRV was not detected in any of the newly collected samples or in archival tissue, although archival RNA remained XMRV-positive. Notably, archival VP62 prostate tissue, from which the prototype XMRV strain was derived, tested negative for XMRV on re-analysis. Analysis of viral genomic and human mitochondrial sequences revealed that all previously characterized XMRV strains are identical and that the archival RNA had been contaminated by an XMRV-infected laboratory cell line. These findings reveal no association between XMRV and prostate cancer, and underscore the conclusion that XMRV is not a naturally acquired human infection.

  4. 9. Photographic copy of photograph. (Source: National Archives Photo Collection, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. Photographic copy of photograph. (Source: National Archives Photo Collection, Denver, NN-366-114, Box 12, Photo 4464) Rebuilt Rock Creek Diversion Dam. Intake structure for canal is at left with sluiceway and overflow section to right. April 24, 1950. - Bitter Root Irrigation Project, Rock Creek Diversion Dam, One mile east of Como Dam, west of U.S. Highway 93, Darby, Ravalli County, MT

  5. The Desert Laboratory Repeat Photography Collection - An Invaluable Archive Documenting Landscape Change

    USGS Publications Warehouse

    Webb, Robert H.; Boyer, Diane E.; Turner, Raymond M.; Bullock, Stephen H.

    2007-01-01

    The Desert Laboratory Repeat Photography Collection, the largest collection of its kind in the world, is housed at the U.S. Geological Survey (USGS) in Tucson, Arizona. The collection preserves thousands of photos taken precisely in the same places but at different times. This archive of 'repeat photographs' documents changes in the desert landscape and vegetation of the American Southwest, and also includes images from northwestern Mexico and Kenya. These images are an invaluable asset to help understand the effects of climate variation and land-use practices on arid and semiarid environments.

  6. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) aids in supporting the decision making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution at each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
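
    The contrast between the two strategies can be sketched in a few lines of Python for a two-objective minimization case. The function names and the tiny candidate list are illustrative; this is not the authors' code, only a reading of the two archiving rules described above.

        # Sketch of the two archiving strategies (minimization assumed; the
        # two-objective setup and names are illustrative, not the authors' code).
        import math

        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def eps_archive(archive, candidate, eps):
            """Epsilon archiving: keep at most one solution per epsilon grid cell."""
            cell = tuple(math.floor(v / e) for v, e in zip(candidate, eps))
            incumbent = archive.get(cell)
            if incumbent is None or dominates(candidate, incumbent):
                archive[cell] = candidate
            return archive

        def round_then_archive(archive, candidate, eps):
            """Alternative: round objectives to the desired precision, then apply
            plain Pareto-dominance archiving on the rounded values."""
            rounded = tuple(round(v / e) * e for v, e in zip(candidate, eps))
            if any(dominates(a, rounded) or a == rounded for a in archive):
                return archive
            archive = [a for a in archive if not dominates(rounded, a)]
            archive.append(rounded)
            return archive

        eps = (0.1, 0.1)
        grid_archive, rounded_archive = {}, []
        for sol in [(0.23, 0.81), (0.21, 0.84), (0.75, 0.30)]:
            grid_archive = eps_archive(grid_archive, sol, eps)
            rounded_archive = round_then_archive(rounded_archive, sol, eps)
        print(list(grid_archive.values()), rounded_archive)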

  7. [The contribution of the Archive of the Military Medical Museum in summarizing the experience of the Great Patriotic War].

    PubMed

    Budko, A A; Gribovskaya, G A

    2015-06-01

    During the 70 years of the Military Medical Museum's existence, one of its main goals has been to create an archive of military medical documents, to shape its ideology and build its holdings, and to preserve and promote unique materials related to the history of Russia and to the feats of arms of the defenders of the Fatherland in the wars of the XX century. The creation of a military medical museum archive was conceived at the beginning of the Great Patriotic War by leading figures of military medicine--E.I. Smirnov, S.M. Bagdasaryan, V.N. Shevkunenko, and A.N. Maksimenkov--who proposed collecting all materials about the work of physicians at the front and in the rear, and regarding the archive as a base for scientific works on the history of military health care. Today the archive has been removed from the Military Medical Museum and is a branch of the Central Archive of the Ministry of Defence of the Russian Federation (military medical records). Its staff keeps the traditions established by its predecessors and draws in its activities on a wealth of experience and background research.

  8. Production of Previews and Advanced Data Products for the ESO Science Archive

    NASA Astrophysics Data System (ADS)

    Rité, C.; Slijkhuis, R.; Rosati, P.; Delmotte, N.; Rino, B.; Chéreau, F.; Malapert, J.-C.

    2008-08-01

    We present a project being carried out by the Virtual Observatory Systems Department/Advanced Data Products group in order to populate the ESO Science Archive Facility with image previews and advanced data products. The main goal is to provide users of the ESO Science Archive Facility with the possibility of viewing pre-processed images associated with instruments like WFI, ISAAC and SOFI before actually retrieving the data for full processing. The image processing is done using the ESO/MVM image reduction software developed at ESO to produce astrometrically calibrated FITS images, ranging from simple previews of single archive images to fully stacked mosaics. These data products can be accessed via the ESO Science Archive Query Form and can also be viewed with the browser VirGO {http://archive.eso.org/cms/virgo}.

  9. Landsat: A Global Land-Observing Program

    USGS Publications Warehouse

    ,

    2003-01-01

    Landsat represents the world's longest continuously acquired collection of space-based land remote sensing data. The Landsat Project is a joint initiative of the U.S. Geological Survey (USGS) and the National Aeronautics and Space Administration (NASA) designed to gather Earth resource data from space. NASA developed and launched the spacecraft, while the USGS handles the operations, maintenance, and management of all ground data reception, processing, archiving, product generation, and distribution.

  10. The design of a petabyte archive and distribution system for the NASA ECS project

    NASA Technical Reports Server (NTRS)

    Caulk, Parris M.

    1994-01-01

    The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built - the ECS Science and Data Processing System (SDPS). SDPS is designed to support long term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating that data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archiving Centers (DAAC's). Individual DAAC's will archive different collections of earth science data, and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products, and provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint, and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.

  11. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer

    J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter

    2016-02-02

    This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  12. Earth imaging and scientific observations by SSTI ``Clark'' a NASA technology demonstration spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1997-01-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of ``Clark,'' a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  13. Perceptual approach for unsupervised digital color restoration of cinematographic archives

    NASA Astrophysics Data System (ADS)

    Chambah, Majed; Rizzi, Alessandro; Gatta, Carlo; Besserer, Bernard; Marini, Daniele

    2003-01-01

    The cinematographic archives represent an important part of our collective memory. We present in this paper some advances in automating the color fading restoration process, especially with regard to the automatic color correction technique. The proposed color correction method is based on the ACE model, an unsupervised color equalization algorithm based on a perceptual approach and inspired by some adaptation mechanisms of the human visual system, in particular lightness constancy and color constancy. There are some advantages in a perceptual approach: mainly its robustness and its local filtering properties, which lead to more effective results. The resulting technique is not just an application of ACE to movie images, but an enhancement of ACE principles to meet the requirements of the digital film restoration field. The presented preliminary results are satisfying and promising.
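
    For orientation only, the sketch below applies an ACE-like correction to one channel: each pixel is compared with a random sample of other pixels, the differences are clipped (a simple saturation function) and distance-weighted, and the result is linearly rescaled. Parameter values, the sampling shortcut, and the function names are assumptions made for illustration; the paper's actual method is an adaptation of ACE to film restoration, not this toy.

        # Toy ACE-style equalization of a single channel (illustrative only).
        import numpy as np

        def ace_channel(channel, n_samples=300, slope=10.0, seed=0):
            """channel: 2-D float array in [0, 1]; returns an equalized copy."""
            rng = np.random.default_rng(seed)
            h, w = channel.shape
            ys = rng.integers(0, h, n_samples)
            xs = rng.integers(0, w, n_samples)
            yy, xx = np.mgrid[0:h, 0:w]
            result = np.zeros((h, w), dtype=float)
            norm = np.zeros((h, w), dtype=float)
            for y, x in zip(ys, xs):
                dist = np.maximum(np.hypot(yy - y, xx - x), 1.0)               # spatial weight
                diff = np.clip(slope * (channel - channel[y, x]), -1.0, 1.0)   # saturation
                result += diff / dist
                norm += 1.0 / dist
            result /= norm
            lo, hi = result.min(), result.max()
            return (result - lo) / (hi - lo + 1e-12)                            # rescale to [0, 1]

        def ace_restore(img):
            """img: H x W x 3 float RGB in [0, 1]; channels processed independently."""
            return np.dstack([ace_channel(img[..., c]) for c in range(3)])

        demo = np.random.default_rng(1).random((32, 32, 3))
        out = ace_restore(demo)
        print(out.shape, float(out.min()), float(out.max()))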

  14. Planetary Data Archiving Activities of ISRO

    NASA Astrophysics Data System (ADS)

    Gopala Krishna, Barla; D, Rao J.; Thakkar, Navita; Prashar, Ajay; Manthira Moorthi, S.

    ISRO launched its first planetary mission, to the Moon, viz. Chandrayaan-1, on October 22, 2008. This mission carried eleven instruments; a wealth of science data was collected during its mission life (November 2008 to August 2009), which is archived at the Indian Space Science Data Centre (ISSDC). The data centre ISSDC is responsible for the ingest, storage, processing, archiving, and dissemination of the payload and related ancillary data, in addition to real-time spacecraft operations support. ISSDC is designed to provide high computation power and large storage, and to host a variety of applications necessary to support all the planetary and space science missions of ISRO. The state-of-the-art architecture of ISSDC provides the facility to ingest the raw payload data of all the science payloads of the science satellites in an automatic manner, process the raw data and generate payload-specific processed outputs, generate higher-level products, and disseminate the data sets to principal investigators, guest observers, payload operations centres (POC), and the general public. The data archive makes use of the well-proven archive standards of the Planetary Data System (PDS). The long-term archive (LTA) for five payloads of Chandrayaan-1 data, viz. TMC, HySI, SARA, M3 and MiniSAR, was released from ISSDC on 19th April 2013 (http://www.issdc.gov.in) to the users. Additionally, DEMs generated from possible passes of Chandrayaan-1 TMC stereo data and sample map sheets of the Lunar Atlas are also archived and released from ISSDC along with the LTA. The Mars Orbiter Mission (MOM) is the most recent planetary mission, launched on November 5, 2013, and currently en route to Mars, carrying five instruments (http://www.isro.org): the Mars Color Camera (MCC) to map various morphological features on Mars with varying resolution and scales using the unique elliptical orbit, the Methane Sensor for Mars (MSM) to measure the total column of methane in the Martian atmosphere, the Thermal Infrared Imaging Spectrometer (TIS) to map the surface composition and mineralogy of Mars, the Mars Exospheric Neutral Composition Analyser (MENCA) to study the composition and density of the Martian neutral atmosphere, and the Lyman Alpha Photometer (LAP) to investigate the loss process of water in the Martian atmosphere, towards fulfilling the mission objectives. An active archive created in PDS for some of the instrument data during the Earth phase of the mission is being analysed by the PIs. The Mars science data from the onboard instruments are expected during September 2014. The next planetary mission planned to the Moon is Chandrayaan-2, which consists of an orbiter carrying five instruments (http://www.isro.org), viz. (i) an Imaging IR Spectrometer (IIRS) for mineral mapping, (ii) TMC-2 for topographic mapping, (iii) MiniSAR to detect water ice in the permanently shadowed regions of the lunar poles, up to a depth of a few meters, (iv) a Large Area Soft X-ray Spectrometer (CLASS) and Solar X-ray Monitor (XSM) for mapping the major elements present on the lunar surface, and (v) a Neutral Mass Spectrometer (ChACE2) to carry out a detailed study of the lunar exosphere towards Moon exploration; a rover for some specific experiments; and a lander for technology experiments and demonstration. The data are planned to be archived to PDS standards.

  15. ODI - Portal, Pipeline, and Archive (ODI-PPA): a web-based astronomical compute archive, visualization, and analysis service

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin

    2014-07-01

    The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers a modern web interface that acts as a single point of access to their data, and rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets, and to enhance WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive that has built-in frameworks including: (1) Collections that allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or to execute data processing tasks (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5 capable web browser, and overlaid standard catalog and Source Extractor-generated source markers (3) Workflow framework which enables rapid integration of data processing pipelines on an associated compute cluster and users to request such pipelines to be executed on their data via custom user interfaces. ODI-PPA is made up of several light-weight services connected by a message bus; the web portal built using Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, and backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.

  16. Archived data management system in Kentucky.

    DOT National Transportation Integrated Search

    2007-05-01

    Archived Data User Service (ADUS) was added to the national ITS architecture in 1999 to enable multiple uses for ITS-generated data. In Kentucky, ARTIMIS and TRIMARC are collecting volume, speed, occupancy, length-based classification, and incident d...

  17. GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.

    2012-12-01

    Data Stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However there is additional information from the product generation and science teams needed to ensure the observations will be useful for long term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation, preparing for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to ensure future researchers that the scientific results are accurate, understandable, and useable. Our experience suggests data centers know what to preserve in most cases. That is, the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information gathering with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the NSIDC Distributed Active Archive Center. This presentation will also cover how we envision user support through the years of the Long-Term Archive.

  18. Hydratools, a MATLAB® based data processing package for Sontek Hydra data

    USGS Publications Warehouse

    Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.

    2005-01-01

    The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
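
    The conversion idea can be illustrated in Python with the netCDF4 package (the Hydratools toolbox itself is MATLAB); the variable names, units, and attributes below are placeholders rather than the EPIC-mandated values used by the USGS.

        # Illustrative Python analogue: writing an oceanographic time series to
        # netCDF with embedded metadata. Names and attributes are examples only.
        from netCDF4 import Dataset
        import numpy as np

        def write_timeseries(path, times, speeds, metadata):
            with Dataset(path, "w", format="NETCDF4_CLASSIC") as nc:
                nc.setncatts(metadata)                  # global metadata (instrument, site, ...)
                nc.createDimension("time", len(times))
                t = nc.createVariable("time", "f8", ("time",))
                t.units = "seconds since 1970-01-01 00:00:00 UTC"
                t[:] = times
                v = nc.createVariable("current_speed", "f4", ("time",), fill_value=-9999.0)
                v.units = "m s-1"
                v.long_name = "near-bottom current speed"
                v[:] = speeds

        write_timeseries(
            "hydra_example.nc",
            times=np.arange(0, 3600, 60, dtype=float),
            speeds=np.random.default_rng(1).uniform(0.0, 0.5, 60).astype("f4"),
            metadata={"instrument": "Sontek Hydra ADV", "mooring": "example-site"},
        )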

  19. A MySQL Based EPICS Archiver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Slominski

    2009-10-01

    Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; currently archiving 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off the shelf hardware to reach high performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
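
    The relational layout behind such an archiver can be sketched as one table of channels and one table of timestamped samples. The schema below is a hypothetical illustration, not Mya's actual design, and it uses the sqlite3 module only so the example is self-contained (Mya itself runs on MySQL).

        # Hypothetical sketch of the relational idea behind an EPICS archiver:
        # one row per (channel, timestamp, value) update event.
        import sqlite3
        import time

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE channels (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
            CREATE TABLE samples (
                channel_id INTEGER REFERENCES channels(id),
                t REAL,                -- epoch seconds of the update event
                value REAL,
                PRIMARY KEY (channel_id, t)
            );
        """)

        def record_event(name, value, t=None):
            conn.execute("INSERT OR IGNORE INTO channels(name) VALUES (?)", (name,))
            (cid,) = conn.execute("SELECT id FROM channels WHERE name=?", (name,)).fetchone()
            conn.execute("INSERT OR REPLACE INTO samples VALUES (?, ?, ?)",
                         (cid, t if t is not None else time.time(), value))

        record_event("BPM01:X", 0.12)
        record_event("BPM01:X", 0.14, t=time.time() + 1)
        rows = conn.execute("""
            SELECT s.t, s.value FROM samples s
            JOIN channels c ON c.id = s.channel_id
            WHERE c.name = 'BPM01:X' ORDER BY s.t
        """).fetchall()
        print(rows)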

  20. Developing an EarthCube Governance Structure for Big Data Preservation and Access

    NASA Astrophysics Data System (ADS)

    Leetaru, H. E.; Leetaru, K. H.

    2012-12-01

    The underlying vision of the NSF EarthCube initiative is of an enduring resource serving the needs of the earth sciences for today and the future. We must therefore view this effort through the lens of what the earth sciences will need tomorrow and on how the underlying processes of data compilation, preservation, and access interplay with the scientific processes within the communities EarthCube will serve. Key issues that must be incorporated into the EarthCube governance structure include authentication, retrieval, and unintended use cases, the emerging role of whole-corpus data mining, and how inventory, citation, and archive practices will impact the ability of scientists to use EarthCube's collections into the future. According to the National Academies, the US federal government spends over $140 billion dollars a year in support of the nation's research base. Yet, a critical issue confronting all of the major scientific disciplines in building upon this investment is the lack of processes that guide how data are preserved for the long-term, ensuring that studies can be replicated and that experimental data remains accessible as new analytic methods become available or theories evolve. As datasets are used years or even decades after their creation, far richer metadata is needed to describe the underlying simulation, smoothing algorithms or bounding parameters of the data collection process. This is even truer as data are increasingly used outside their intended disciplines, as geoscience researchers apply algorithms from one discipline to datasets from another, where their analytical techniques may make extensive assumptions about the data. As science becomes increasingly interdisciplinary and emerging computational approaches begin applying whole-corpus methodologies and blending multiple archives together, we are facing new data access modalities distinct from the needs of the past, drawing into focus the question of centralized versus distributed architectures. In the past geoscience data have been distributed, with each site maintaining its own collections and centralized inventory metadata supporting discovery. This was based on the historical search-browse-download modality where access was primarily to download a copy to a researcher's own machine and datasets were measured in gigabytes. New "big data" approaches to the geosciences are already demonstrating the need to analyze the entirety of multiple collections from multiple sites totaling hundreds of terabytes in size. Yet, datasets are outpacing the ability of networks to share them, forcing a new paradigm in high-performance computing where computation must migrate to centralized data stores. The next generation of geoscientists are going to need a system designed for exploring and understanding data from multiple scientific domains and vantages where data are preserved for decades. We are not alone in this endeavor and there are many lessons we can learn from similar initiatives such as more than 40 years of governance policies for data warehouses and 15 years of open web archives, all of which face the same challenges. The entire EarthCube project will fail if the new governance structure does not account for the needs of integrated cyberinfrastructure that allows big data to stored, archived, analyzed, and made accessible to large numbers of scientists.

  1. You Can See Film through Digital: A Report from Where the Archiving of Motion Picture Film Stands

    NASA Astrophysics Data System (ADS)

    Tochigi, Akira

    In recent years, digital technology has brought drastic change to the archiving of motion picture film. By collecting digital media as well as film, many conventional film archives have transformed themselves into moving image archives or audiovisual archives. Digital technology has also expanded the possibilities for restoring motion picture film beyond those of conventional photochemical (analog) restoration. This paper first redefines some fundamental terms regarding the archiving of motion picture film and discusses the conditions that need to be considered for film archiving in Japan. With a few examples of recent restoration projects conducted by the National Film Center of the National Museum of Modern Art, Tokyo, this paper then clarifies new challenges inherent in digital restoration and argues for a better appreciation of motion picture film.

  2. An Assessment of a Science Discipline Archive Against ISO 16363

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Downs, R. R.

    2016-12-01

    The Planetary Data System (PDS) is a federation of science discipline nodes formed in response to the findings of the Committee on Data Management and Computing (CODMAC 1986) that a "wealth of science data would ultimately cease to be useful and probably lost if a process was not developed to ensure that the science data were properly archived." Starting operations in 1990, the stated mission of the PDS is to "facilitate achievement of NASA's planetary science goals by efficiently collecting, archiving, and making accessible digital data and documentation produced by or relevant to NASA's planetary missions, research programs, and data analysis programs." In 2008 the PDS initiated a transition to a more modern system based on key principles found in the Open Archival Information System (OAIS) Reference Model (ISO 14721), a set of functional requirements provided by the designated community, and about twenty years of lessons learned. With science digital data now being archived under the new PDS4, the PDS is a good use case to be assessed as a trusted repository against ISO 16363, a recommended practice for assessing the trustworthiness of digital repositories. This presentation will summarize the OAIS principles adopted for PDS4 and the findings of a desk assessment of the PDS against ISO 16363. Also presented will be specific items of evidence, for example the PDS mission statement above, and how they impact the level of certainty that the ISO 16363 metrics are being met.

  3. Data management and digital delivery of analog data

    USGS Publications Warehouse

    Miller, W.A.; Longhenry, Ryan; Smith, T.

    2008-01-01

    The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in preserving data, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that access to these collections is available. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium resolution image files. Single frame metadata records are also created to improve access that otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.

  4. Life Sciences Data Archive (LSDA)

    NASA Technical Reports Server (NTRS)

    Fitts, M.; Johnson-Throop, Kathy; Thomas, D.; Shackelford, K.

    2008-01-01

    In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types and geographical locations. While serving the needs of individual research teams, these data were largely unknown/unavailable to the scientific community at large. As a result, the Space Act of 1958 and the Science Data Management Policy mandated that research data collected by the National Aeronautics and Space Administration be made available to the science community at large. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This program constitutes a formal system for the acquisition, archival and distribution of data for Life Sciences-sponsored experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data using a variety of media which are accessible and responsive to inquiries from the science communities.

  5. The archiving and dissemination of biological structure data.

    PubMed

    Berman, Helen M; Burley, Stephen K; Kleywegt, Gerard J; Markley, John L; Nakamura, Haruki; Velankar, Sameer

    2016-10-01

    The global Protein Data Bank (PDB) was the first open-access digital archive in biology. The history and evolution of the PDB are described, together with the ways in which molecular structural biology data and information are collected, curated, validated, archived, and disseminated by the members of the Worldwide Protein Data Bank organization (wwPDB; http://wwpdb.org). Particular emphasis is placed on the role of community in establishing the standards and policies by which the PDB archive is managed day-to-day.

  6. The Astronomical Photographic Data Archive

    NASA Astrophysics Data System (ADS)

    Cline, J. Donald; Barker, T.; Castelaz, M.

    2010-01-01

    Pisgah Astronomical Research Institute is the home of the Astronomical Photographic Data Archive (APDA), a national effort to preserve, archive, and digitize astronomical photographic plate collections. APDA was formed in 2007 and presently holds more than 100,000 plates and films from more than a dozen observatory collections. While the photographic data pre-dates modern observational data taken with electronic instruments, it is nevertheless of extremely high quality. When one considers that the APDA collection holds 100,000 plates and films, some with hundreds or thousands of objects per plate, taken over a span of more than 100 years, the value of the data in APDA becomes apparent. In addition to the astronomical photographic data collections, APDA also possesses two high-precision glass plate measuring machines, GAMMA I and GAMMA II, that were built for NASA and the Space Telescope Science Institute. The measuring machines were used by a team of scientists under the leadership of the late Dr. Barry Lasker to develop the Guide Star Catalog and Digitized Sky Survey that guide and direct the Hubble Space Telescope. We will describe the current set of collections, plans for the measuring machines, and the efforts that have been made to assure preservation of plate collections.

  7. 6. Photographic copy of photograph (Source: Salt River Project Archives, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Photographic copy of photograph (Source: Salt River Project Archives, Tempe, Lubken collection, #R-295) Transformer house under construction. View looking north. October 5, 1908. - Theodore Roosevelt Dam, Transformer House, Salt River, Tortilla Flat, Maricopa County, AZ

  8. 5. Photographic copy of photograph (Source: Salt River Project Archives, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Photographic copy of photograph (Source: Salt River Project Archives, Tempe, Lubken collection, #R-273) Transformer house under construction. View looking north. July 1, 1908. - Theodore Roosevelt Dam, Transformer House, Salt River, Tortilla Flat, Maricopa County, AZ

  9. MODIS Collection 6 Data at the National Snow and Ice Data Center (NSIDC)

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Steiker, A. E.; Johnston, T.; Haran, T. M.; Fowler, C.; Wyatt, P.

    2015-12-01

    For over 15 years, the NASA National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC) has archived and distributed snow and sea ice products derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on the NASA Earth Observing System (EOS) Aqua and Terra satellites. Collection 6 represents the next revision to NSIDC's MODIS archive, mainly affecting the snow-cover products. Although MODIS snow-cover maps are typically 90 percent accurate or better under good observing conditions, certain scenarios have historically confounded snow detection and introduced errors into the snow-cover and fractional snow-cover maps. Collection 6 specifically addresses these scenarios: it uses revised algorithms to discriminate between snow and clouds, resolve uncertainties along the edges of snow-covered regions, and detect summer snow cover in mountains. Furthermore, Collection 6 applies modified and additional snow detection screens and new Quality Assessment protocols that enhance the overall accuracy of the snow maps compared with Collection 5. Collection 6 also introduces several new MODIS snow products, including a daily Climate Modelling Grid (CMG) cloud-gap-filled (CGF) snow-cover map, which generates cloud-free maps by using the most recent clear observations. The MODIS Collection 6 sea ice extent and ice surface temperature algorithms and products are much the same as in Collection 5; however, Collection 6 updates to algorithm inputs (in particular, the L1B calibrated radiances, land and water mask, and cloud mask products) have improved the sea ice outputs. The MODIS sea ice products are currently available at NSIDC, and the snow-cover products are soon to follow in 2016. NSIDC offers a variety of methods for obtaining these data. Users can download data directly from an online archive or use the NASA Reverb Search & Order Tool to perform spatial, temporal, and parameter subsetting, reformatting, and re-projection of the data.

  10. Mission to Planet Earth

    NASA Technical Reports Server (NTRS)

    Wilson, Gregory S.; Huntress, Wesley T.

    1990-01-01

    The rationale behind Mission to Planet Earth is presented, and the program plan is described in detail. NASA and its interagency and international partners will place satellites carrying advanced sensors in strategic earth orbits to collect multidisciplinary data. A sophisticated data system will process and archive an unprecedentedly large amount of information about the earth and how it functions as a system. Attention is given to the space observatories, the data and information systems, and the interdisciplinary research.

  11. The Role of an Archivist in Shaping Collective Memory on Kibbutz. Through Her Work on the Photographic Archive

    ERIC Educational Resources Information Center

    Barromi Perlman, Edna

    2011-01-01

    An archivist from a kibbutz in the north of Israel has been managing the kibbutz archive for close to a decade. I have chosen to present her enterprise and the role she is playing by means of her archival work, which is changing the historiography of her kibbutz. The archivist at the kibbutz in question reevaluates her kibbutz's history while…

  12. Improved discovery of NEON data and samples though vocabularies, workflows, and web tools

    NASA Astrophysics Data System (ADS)

    Laney, C. M.; Elmendorf, S.; Flagg, C.; Harris, T.; Lunch, C. K.; Gulbransen, T.

    2017-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale ecological observation facility sponsored by the National Science Foundation and operated by Battelle. NEON supports research on the impacts of invasive species, land use change, and environmental change on natural resources and ecosystems by gathering and disseminating a full suite of observational, instrumented, and airborne datasets from field sites across the U.S. NEON also collects thousands of samples from soil, water, and organisms every year, and partners with numerous institutions to analyze and archive samples. We have developed numerous new technologies to support processing and discovery of this highly diverse collection of data. These technologies include applications for data collection and sample management, processing pipelines specific to each collection system (field observations, installed sensors, and airborne instruments), and publication pipelines. NEON data and metadata are discoverable and downloadable via both a public API and data portal. We solicit continued engagement and advice from the informatics and environmental research communities, particularly in the areas of data versioning, usability, and visualization.
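
    As a hedged illustration of the public API mentioned above, the sketch below lists NEON data products over HTTP. The endpoint path and response fields (data, productCode, productName) reflect NEON's documented v0 API but should be treated as assumptions and confirmed against the current API documentation.

      import requests

      # Query the NEON data products endpoint (assumed path; verify against current docs).
      resp = requests.get("https://data.neonscience.org/api/v0/products", timeout=30)
      resp.raise_for_status()

      # Each entry describes one data product (observational, instrumented, or airborne).
      for product in resp.json()["data"][:5]:
          print(product["productCode"], "-", product["productName"])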

  13. Life Sciences Data Archive Scientific Development

    NASA Technical Reports Server (NTRS)

    Buckey, Jay C., Jr.

    1995-01-01

    The Life Sciences Data Archive will provide scientists, managers and the general public with access to biomedical data collected before, during and after spaceflight. These data are often irreplaceable and represent a major resource from the space program. For these data to be useful, however, they must be presented with enough supporting information, description and detail so that an interested scientist can understand how, when and why the data were collected. The goal of this contract was to provide a scientific consultant to the archival effort at the NASA-Johnson Space Center. This consultant (Jay C. Buckey, Jr., M.D.) is a scientist, who was a co-investigator on both the Spacelab Life Sciences-1 and Spacelab Life Sciences-2 flights. In addition he was an alternate payload specialist for the Spacelab Life Sciences-2 flight. In this role he trained on all the experiments on the flight and so was familiar with the protocols, hardware and goals of all the experiments on the flight. Many of these experiments were flown on both SLS-1 and SLS-2. This background was useful for the archive, since the first mission to be archived was Spacelab Life Sciences-1. Dr. Buckey worked directly with the archive effort to ensure that the parameters, scientific descriptions, protocols and data sets were accurate and useful.

  14. The Archives of the Department of Terrestrial Magnetism: Documenting 100 Years of Carnegie Science

    NASA Astrophysics Data System (ADS)

    Hardy, S. J.

    2005-12-01

    The archives of the Department of Terrestrial Magnetism (DTM) of the Carnegie Institution of Washington document more than a century of geophysical and astronomical investigations. Primary source materials available for historical research include field and laboratory notebooks, equipment designs, plans for observatories and research vessels, scientists' correspondence, and thousands of expedition and instrument photographs. Yet despite its history, DTM long lacked a systematic approach to managing its documentary heritage. A preliminary records survey conducted in 2001 identified more than 1,000 linear feet of historically-valuable records languishing in dusty, poorly-accessible storerooms. Intellectual control at that time was minimal. With support from the National Historical Publications and Records Commission, the "Carnegie Legacy Project" was initiated in 2003 to preserve, organize, and facilitate access to DTM's archival records, as well as those of the Carnegie Institution's administrative headquarters and Geophysical Laboratory. Professional archivists were hired to process the 100-year backlog of records. Policies and procedures were established to ensure that all work conformed to national archival standards. Records were appraised, organized, and rehoused in acid-free containers, and finding aids were created for the project web site. Standardized descriptions of each collection were contributed to the WorldCat bibliographic database and the AIP International Catalog of Sources for History of Physics. Historic photographs and documents were digitized for online exhibitions to raise awareness of the archives among researchers and the general public. The success of the Legacy Project depended on collaboration between archivists, librarians, historians, data specialists, and scientists. This presentation will discuss key aspects (funding, staffing, preservation, access, outreach) of the Legacy Project and is aimed at personnel in observatories, research institutes, and other organizations interested in establishing their own archival programs.

  15. The spectral database Specchio: Data management, data sharing and initial processing of field spectrometer data within the Dimensions of Biodiversity project

    NASA Astrophysics Data System (ADS)

    Hueni, A.; Schweiger, A. K.

    2015-12-01

    Field spectrometry has substantially gained importance in vegetation ecology due to the increasing knowledge about causal ties between vegetation spectra and biochemical and structural plant traits. Additionally, worldwide databases enable the exchange of spectral and plant trait data and promote global research cooperation. This can be expected to further enhance the use of field spectrometers in ecological studies. However, the large amount of data collected during spectral field campaigns poses major challenges regarding data management, archiving and processing. The spectral database Specchio is designed to organize, manage, process and share spectral data and metadata. We provide an example for using Specchio based on leaf level spectra of prairie plant species collected during the 2015 field campaign of the Dimensions of Biodiversity research project, conducted at the Cedar Creek Long-Term Ecological Research site, in central Minnesota. We show how spectral data collections can be efficiently administered, organized and shared between distinct research groups and explore the capabilities of Specchio for data quality checks and initial processing steps.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roederer, Ian U.; Karakas, Amanda I.; Pignatari, Marco

    We present a detailed analysis of the composition and nucleosynthetic origins of the heavy elements in the metal-poor ([Fe/H] = −1.62 ± 0.09) star HD 94028. Previous studies revealed that this star is mildly enhanced in elements produced by the slow neutron-capture process (s process; e.g., [Pb/Fe] = +0.79 ± 0.32) and rapid neutron-capture process (r process; e.g., [Eu/Fe] = +0.22 ± 0.12), including unusually large molybdenum ([Mo/Fe] = +0.97 ± 0.16) and ruthenium ([Ru/Fe] = +0.69 ± 0.17) enhancements. However, this star is not enhanced in carbon ([C/Fe] = −0.06 ± 0.19). We analyze an archival near-ultraviolet spectrum of HD 94028, collected using the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, and other archival optical spectra collected from ground-based telescopes. We report abundances or upper limits derived from 64 species of 56 elements. We compare these observations with s-process yields from low-metallicity AGB evolution and nucleosynthesis models. No combination of s- and r-process patterns can adequately reproduce the observed abundances, including the super-solar [As/Ge] ratio (+0.99 ± 0.23) and the enhanced [Mo/Fe] and [Ru/Fe] ratios. We can fit these features when including an additional contribution from the intermediate neutron-capture process (i process), which perhaps operated through the ingestion of H in He-burning convective regions in massive stars, super-AGB stars, or low-mass AGB stars. Currently, only the i process appears capable of consistently producing the super-solar [As/Ge] ratios and ratios among neighboring heavy elements found in HD 94028. Other metal-poor stars also show enhanced [As/Ge] ratios, hinting that operation of the i process may have been common in the early Galaxy.
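
    For readers unfamiliar with the bracket abundance notation used throughout this record ([Fe/H], [Pb/Fe], [As/Ge], and so on), it is the standard logarithmic ratio measured relative to the Sun:

      [\mathrm{X}/\mathrm{Y}] \equiv \log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Y}}\right)_{\star} - \log_{10}\!\left(\frac{N_\mathrm{X}}{N_\mathrm{Y}}\right)_{\odot}

    Thus [Fe/H] = −1.62 corresponds to roughly 10^(−1.62), or about 1/40, of the solar iron-to-hydrogen ratio, and a positive [X/Fe] indicates that element X is enhanced relative to iron compared with the Sun.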

  17. Implementing DOIs for Oceanographic Satellite Data at PO.DAAC

    NASA Astrophysics Data System (ADS)

    Hausman, J.; Tauer, E.; Chung, N.; Chen, C.; Moroni, D. F.

    2013-12-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) is NASA's archive for physical oceanographic satellite data. It distributes over 500 datasets from gravity, ocean wind, sea surface topography, sea ice, ocean currents, salinity, and sea surface temperature satellite missions. A dataset is a collection of granules/files that share the same mission/project, versioning, processing level, spatial, and temporal characteristics. The large number of datasets is partially due to the number of satellite missions, but mostly because a single satellite mission typically has multiple versions or even temporal and spatial resolutions of data. As a result, a user might mistake one dataset for a different dataset from the same satellite mission. Due to PO.DAAC's vast variety and volume of data and growing requirements to report dataset usage, it has begun implementing DOIs for the datasets it archives and distributes. However, this was not as simple as registering a name for a DOI and providing a URL. Before implementing DOIs, multiple questions needed to be answered. What are the sponsor and end-user expectations regarding DOIs? At what level does a DOI get assigned (dataset, file/granule)? Do all data get a DOI, or only selected data? How do we create a DOI? How do we create landing pages and manage them? What changes need to be made to the data archive, life cycle policy and web portal to accommodate DOIs? What if the data also exists at another archive and a DOI already exists? How is a DOI included if the data were obtained via a subsetting tool? How does a researcher or author provide a unique, definitive reference (standard citation) for a given dataset? This presentation will discuss how these questions were answered through changes in policy, process, and system design. Implementing DOIs is not a trivial undertaking, but as DOIs are rapidly becoming the de facto approach, it is worth the effort. Researchers have historically referenced the source satellite and data center (or archive), but scientific writings do not typically provide enough detail to point to a singular, uniquely identifiable dataset. DOIs provide the means to help researchers be precise in their data citations and provide needed clarity, standardization and permanence.
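
    The "standard citation" question raised above is usually answered by assembling a fixed set of fields around the DOI. The sketch below is illustrative only; the dataset name, version, and DOI are placeholders, not actual PO.DAAC entries.

      def format_data_citation(creator, year, title, version, archive, doi, access_date):
          """Build a dataset citation of the general form used once a DOI is assigned."""
          return (f"{creator} ({year}). {title}, Version {version}. {archive}. "
                  f"https://doi.org/{doi}. Accessed {access_date}.")

      # Placeholder values for illustration; not a real PO.DAAC dataset or DOI.
      print(format_data_citation(
          creator="Example Mission Science Team",
          year=2013,
          title="Example Sea Surface Temperature Level 3 Dataset",
          version="4.1",
          archive="PO.DAAC, NASA Jet Propulsion Laboratory",
          doi="10.5067/EXAMPLE-PLACEHOLDER",
          access_date="2013-12-01",
      ))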

  18. STScI Archive Manual, Version 7.0

    NASA Astrophysics Data System (ADS)

    Padovani, Paolo

    1999-06-01

    The STScI Archive Manual provides information a user needs to know to access the HST archive via its two user interfaces: StarView and a World Wide Web (WWW) interface. It provides descriptions of the StarView screens used to access information in the database and the format of that information, and introduces the user to the WWW interface. Using the two interfaces, users can search for observations, preview public data, and retrieve data from the archive. Using StarView one can also find calibration reference files and perform detailed association searches. With the WWW interface, archive users can access, and obtain information on, all Multimission Archive at Space Telescope (MAST) data, a collection of mainly optical and ultraviolet datasets which include, amongst others, the International Ultraviolet Explorer (IUE) Final Archive. Both interfaces feature a name resolver which simplifies searches based on target name.

  19. FAWKES Information Management for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Spetka, S.; Ramseyer, G.; Tucker, S.

    2010-09-01

    Current space situational awareness assets can be fully utilized by managing their inputs and outputs in real time. Ideally, sensors are tasked to perform specific functions to maximize their effectiveness. Many sensors are capable of collecting more data than is needed for a particular purpose, leading to the potential to enhance a sensor’s utilization by allowing it to be re-tasked in real time when it is determined that sufficient data has been acquired to meet the first task’s requirements. In addition, understanding a situation involving fast-traveling objects in space may require inputs from more than one sensor, leading to a need for information sharing in real time. Observations that are not processed in real time may be archived to support forensic analysis for accidents and for long-term studies. Space Situational Awareness (SSA) requires an extremely robust distributed software platform to appropriately manage the collection and distribution for both real-time decision-making as well as for analysis. FAWKES is being developed as a Joint Space Operations Center (JSPOC) Mission System (JMS) compliant implementation of the AFRL Phoenix information management architecture. It implements a pub/sub/archive/query (PSAQ) approach to communications designed for high performance applications. FAWKES provides an easy to use, reliable interface for structuring parallel processing, and is particularly well suited to the requirements of SSA. In addition to supporting point-to-point communications, it offers an elegant and robust implementation of collective communications, to scatter, gather and reduce values. A query capability is also supported that enhances reliability. Archived messages can be queried to re-create a computation or to selectively retrieve previous publications. PSAQ processes express their role in a computation by subscribing to their inputs and by publishing their results. Sensors on the edge can subscribe to inputs by appropriately authorized users, allowing dynamic tasking capabilities. Previously, the publication of sensor data collected by mobile systems was demonstrated. Thumbnails of infrared imagery that were imaged in real time by an aircraft [1] were published over a grid. This airborne system subscribed to requests for and then published the requested detailed images. In another experiment a system employing video subscriptions [2] drove the analysis of live video streams, resulting in a published stream of processed video output. We are currently implementing an SSA system that uses FAWKES to deliver imagery from telescopes through a pipeline of processing steps that are performed on high performance computers. PSAQ facilitates the decomposition of a problem into components that can be distributed across processing assets from the smallest sensors in space to the largest high performance computing (HPC) centers, as well as the integration and distribution of the results, all in real time. FAWKES supports the real-time latency requirements demanded by all of these applications. It also enhances reliability by easily supporting redundant computation. This study shows how FAWKES/PSAQ is utilized in SSA applications, and presents performance results for latency and throughput that meet these needs.
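
    The pub/sub/archive/query (PSAQ) pattern described above can be summarized with a minimal in-memory sketch. This is not the FAWKES or JMS API; the class and method names are illustrative only.

      from collections import defaultdict

      class PSAQBroker:
          """Conceptual publish/subscribe/archive/query broker (illustrative, not FAWKES)."""
          def __init__(self):
              self.subscribers = defaultdict(list)   # topic -> list of callbacks
              self.archive = defaultdict(list)        # topic -> list of archived messages

          def subscribe(self, topic, callback):
              """Register a consumer; e.g., a processing step subscribes to its inputs."""
              self.subscribers[topic].append(callback)

          def publish(self, topic, message):
              """Deliver to current subscribers and archive for later forensic queries."""
              self.archive[topic].append(message)
              for callback in self.subscribers[topic]:
                  callback(message)

          def query(self, topic, predicate=lambda m: True):
              """Retrieve previously published messages, e.g., to re-create a computation."""
              return [m for m in self.archive[topic] if predicate(m)]

      # Usage: a sensor publishes observations; an analysis step subscribes to them.
      broker = PSAQBroker()
      broker.subscribe("telescope/raw", lambda m: print("processing frame", m["id"]))
      broker.publish("telescope/raw", {"id": 1, "pixels": [0, 1, 2]})
      late_analysis = broker.query("telescope/raw", lambda m: m["id"] == 1)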

  20. Exploring cloud and big data components for SAR archiving and analysis

    NASA Astrophysics Data System (ADS)

    Baker, S.; Crosby, C. J.; Meertens, C.; Phillips, D.

    2017-12-01

    Under the Geodesy Advancing Geoscience and EarthScope (GAGE) NSF Cooperative Agreement, UNAVCO has seen the volume of the SAR Data Archive grow at a substantial rate, from 2 TB in Y1 and 5 TB in Y2 to 41 TB in Y3, primarily due to WInSAR PI proposal management of ALOS-2/JAXA (Japan Aerospace Exploration Agency) data and to a lesser extent Supersites and other data collections. JAXA provides a fixed number of scenes per year for each PI, and some data files are 50-60 GB each, which accounts for the large volume of data. In total, over 100 TB of SAR data are in the WInSAR/UNAVCO archive and a large portion of these are available unrestricted for WInSAR members. In addition to the existing data, newer data streams from the Sentinel-1 and NISAR missions will require efficient processing pipelines and easily scalable infrastructure to handle processed results. With these growing data sizes and space concerns, the SAR archive operations migrated to the Texas Advanced Computing Center (TACC) via an NSF XSEDE proposal in spring 2017. Data are stored on an HPC system while data operations are running on Jetstream virtual machines within the same datacenter. In addition to the production data operations, testing was done in early 2017 with container-based InSAR processing analysis using JupyterHub and Docker images deployed on a VM cluster on Jetstream. The JupyterHub environment is well suited for short courses and other training opportunities for the community such as labs for university courses on InSAR. UNAVCO is also exploring new processing methodologies using DC/OS (the datacenter operating system) for batch and stream processing workflows and time series analysis with Big Data open source components like the Spark, Mesos, Akka, Cassandra, Kafka (SMACK) stack. The comparison of the different methodologies will provide insight into the pros and cons for each and help the SAR community with decisions about infrastructure and software requirements to meet their research goals.

  1. The public cancer radiology imaging collections of The Cancer Imaging Archive.

    PubMed

    Prior, Fred; Smith, Kirk; Sharma, Ashish; Kirby, Justin; Tarbox, Lawrence; Clark, Ken; Bennett, William; Nolan, Tracy; Freymann, John

    2017-09-19

    The Cancer Imaging Archive (TCIA) is the U.S. National Cancer Institute's repository for cancer imaging and related information. TCIA contains 30.9 million radiology images representing data collected from approximately 37,568 subjects. This data is organized into collections by tumor-type with many collections also including analytic results or clinical data. TCIA staff carefully de-identify and curate all incoming collections prior to making the information available via web browser or programmatic interfaces. Each published collection within TCIA is assigned a Digital Object Identifier that references the collection. Additionally, researchers who use TCIA data may publish the subset of information used in their analysis by requesting a TCIA generated Digital Object Identifier. This data descriptor is a review of a selected subset of existing publicly available TCIA collections. It outlines the curation and publication methods employed by TCIA and makes available 15 collections of cancer imaging data.

  2. Land processes distributed active archive center product lifecycle plan

    USGS Publications Warehouse

    Daucsavage, John C.; Bennett, Stacie D.

    2014-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.

  3. Architectural Strategies for Enabling Data-Driven Science at Scale

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Law, E. S.; Doyle, R. J.; Little, M. M.

    2017-12-01

    The analysis of large data collections from NASA or other agencies is often executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Alternatively, data are hauled to large computational environments that provide centralized data analysis via traditional High Performance Computing (HPC). Scientific data archives, however, are not only growing massive, but are also becoming highly distributed. Neither traditional approach provides a good solution for optimizing analysis into the future. Assumptions across the NASA mission and science data lifecycle, which historically assume that all data can be collected, transmitted, processed, and archived, will not scale as more capable instruments stress legacy-based systems. New paradigms are needed to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural and analytical choices are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections, from point of collection (e.g., onboard) to analysis and decision support. The most effective approach to analyzing a distributed set of massive data may involve some exploration and iteration, putting a premium on the flexibility afforded by the architectural framework. The framework should enable scientist users to assemble workflows efficiently, manage the uncertainties related to data analysis and inference, and optimize deep-dive analytics to enhance scalability. In many cases, this "data ecosystem" needs to be able to integrate multiple observing assets, ground environments, archives, and analytics, evolving from stewardship of measurements of data to using computational methodologies to better derive insight from the data that may be fused with other sets of data. This presentation will discuss architectural strategies, including a 2015-2016 NASA AIST Study on Big Data, for evolving scientific research towards massively distributed data-driven discovery. It will include example use cases across earth science, planetary science, and other disciplines.

  4. New records for millipedes from southern Chile (Polydesmida: Dalodesmidae; Polyzoniida: Siphonotidae).

    PubMed

    Mesibov, Robert Evan

    2017-01-01

    Millipedes from 1983 collections by the author in southern Chile have been identified and registered as specimen lots at the Queen Victoria Museum and Art Gallery (QVMAG) in Launceston, Tasmania. Collection and specimen data from the new QVMAG specimen lots have been archived in Darwin Core format together with a KML file of occurrences. The 31 occurrence records in the Darwin Core Archive list 13 millipede taxa from 16 sites in Llanquihue and Osorno provinces, Chile.
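
    As a hedged sketch of what an occurrence file inside such a Darwin Core Archive looks like, the snippet below writes tab-separated records using standard Darwin Core term names. The values are placeholders, not the actual QVMAG specimen lots reported in the paper.

      import csv

      # Standard Darwin Core terms commonly used for occurrence records.
      DWC_FIELDS = ["occurrenceID", "institutionCode", "scientificName",
                    "decimalLatitude", "decimalLongitude", "eventDate", "recordedBy"]

      records = [{
          "occurrenceID": "urn:example:qvmag:12345",   # placeholder identifier
          "institutionCode": "QVMAG",
          "scientificName": "Dalodesmidae",            # placeholder taxon
          "decimalLatitude": -41.0,                    # placeholder coordinates
          "decimalLongitude": -73.0,
          "eventDate": "1983-02-15",                   # placeholder date
          "recordedBy": "R. Mesibov",
      }]

      # Write the occurrence core file as a tab-delimited text table.
      with open("occurrence.txt", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=DWC_FIELDS, delimiter="\t")
          writer.writeheader()
          writer.writerows(records)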

  5. Collaborative Metadata Curation in Support of NASA Earth Science Data Stewardship

    NASA Technical Reports Server (NTRS)

    Sisco, Adam W.; Bugbee, Kaylin; le Roux, Jeanne; Staton, Patrick; Freitag, Brian; Dixon, Valerie

    2018-01-01

    A growing collection of NASA Earth science data is archived and distributed by EOSDIS’s 12 Distributed Active Archive Centers (DAACs). Each collection and granule is described by a metadata record housed in the Common Metadata Repository (CMR). Multiple metadata standards are in use, and core elements of each are mapped to and from a common model – the Unified Metadata Model (UMM). This work is carried out by the Analysis and Review of CMR (ARC) Team.
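
    A hedged sketch of how such CMR collection records can be retrieved programmatically is shown below. The endpoint and response structure reflect the public CMR search API as commonly documented; treat them as assumptions and verify against the current interface.

      import requests

      # Search CMR collection-level metadata records (assumed endpoint; verify against docs).
      resp = requests.get(
          "https://cmr.earthdata.nasa.gov/search/collections.json",
          params={"keyword": "sea ice", "page_size": 5},
          timeout=30,
      )
      resp.raise_for_status()

      # Each entry is one collection-level metadata record held in the CMR.
      for entry in resp.json()["feed"]["entry"]:
          print(entry.get("short_name"), "-", entry.get("title"))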

  6. Software for Computing, Archiving, and Querying Semisimple Braided Monoidal Category Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software package collects various open source and freely available codes and algorithms to compute and archive the categorical data for certain semisimple braided monoidal categories. In particular, it computes the data for group-theoretical categories for academic research.

  7. Spin-Off Successes of SETI Research at Berkeley

    NASA Astrophysics Data System (ADS)

    Douglas, K. A.; Anderson, D. P.; Bankay, R.; Chen, H.; Cobb, J.; Korpela, E. J.; Lebofsky, M.; Parsons, A.; von Korff, J.; Werthimer, D.

    2009-12-01

    Our group contributes to the Search for Extra-Terrestrial Intelligence (SETI) by developing and using world-class signal processing computers to analyze data collected on the Arecibo telescope. Although no patterned signal of extra-terrestrial origin has yet been detected, and the immediate prospects for making such a detection are highly uncertain, the SETI@home project has nonetheless proven the value of pursuing such research through its impact on the fields of distributed computing, real-time signal processing, and radio astronomy. The SETI@home project has spun off the Center for Astronomy Signal Processing and Electronics Research (CASPER) and the Berkeley Open Infrastructure for Networked Computing (BOINC), both of which are responsible for catalyzing a smorgasbord of new research in scientific disciplines in countries around the world. Furthermore, the data collected and archived for the SETI@home project is proving valuable in data-mining experiments for mapping neutral galactic hydrogen and for detecting black-hole evaporation.

  8. Libraries, Archives and Museums: What's in Them for Us? PIALA '98. Selected Papers from the Pacific Islands Association of Libraries and Archives Conference (8th, Tofol, Kosrae, Federated States of Micronesia, November 17-20, 1998).

    ERIC Educational Resources Information Center

    Cohen, Arlene, Ed.

    This proceedings contains papers from the 1998 annual conference of the Pacific Islands Association of Libraries and Archives (PIALA). After welcoming remarks from Henry Robert and Isabel Rungrad, the following papers are included: "Sharing Our Successes, Discussing Our Future: A Survey of Pacific Collections Activities--Report from the…

  9. BOREAS TE-5 Diurnal CO2 Canopy Profile Data

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Curd, Shelaine (Editor); Ehleriinger, Jim; Brooks, J. Renee; Flanagan, Larry

    2000-01-01

    The BOREAS TE-5 team collected several data sets to investigate the vegetation-atmosphere CO2 and H2O exchange processes. These data were collected to provide detailed information within the canopy during times when TE-05 sampled canopy CO2 for carbon and oxygen isotope analysis. These measurements were made in both the NSA and SSA at the OJP, OBS, UBS, and OA sites from 25-May-1994 to 08-Sep-1994. CO2 profile data were not collected at SSA-OA during the first IFC. The data are available in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  10. [Psychiatric Archives: Archive for which History? The Writings and Drawings of René L.

    PubMed

    Artières, Philippe

    The archives of psychiatric institutions are often mobilized to investigate the history of the treatment of mental disorders and its modalities in our societies. Starting from the record of a patient interned in the Hôpital du Bon Sauveur, in the department of Manche in France, in 1963, this article shows how these archives can also be part of writing a different story: that of decolonization, and particularly the independence of Algeria and the end of French colonization. In particular, it studies the drawings made by this patient and the way they reflect this collective history.

  11. A generic archive protocol and an implementation

    NASA Technical Reports Server (NTRS)

    Jordan, J. M.; Jennings, D. G.; Mcglynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.

    1992-01-01

    Archiving vast amounts of data has become a major part of every scientific space mission today. The Generic Archive/Retrieval Services Protocol (GRASP) addresses the question of how to archive the data collected in an environment where the underlying hardware archives may be rapidly changing. GRASP is a device independent specification defining a set of functions for storing and retrieving data from an archive, as well as other support functions. GRASP is divided into two levels: the Transfer Interface and the Action Interface. The Transfer Interface is computer/archive independent code while the Action Interface contains code which is dedicated to each archive/computer addressed. Implementations of the GRASP specification are currently available for DECstations running Ultrix, Sparcstations running SunOS, and microVAX/VAXstation 3100's. The underlying archive is assumed to function as a standard Unix or VMS file system. The code, written in C, is a single suite of files. Preprocessing commands define the machine unique code sections in the device interface. The implementation was written, to the greatest extent possible, using only ANSI standard C functions.
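
    The two-level split described above, a device-independent Transfer Interface over archive-specific Action Interfaces, can be illustrated with a short sketch. It is written in Python rather than the original C, and the names are illustrative, not the actual GRASP functions.

      from abc import ABC, abstractmethod

      class ActionInterface(ABC):
          """Archive/computer-specific operations (one implementation per backend)."""
          @abstractmethod
          def store(self, name: str, data: bytes) -> None: ...
          @abstractmethod
          def retrieve(self, name: str) -> bytes: ...

      class FileSystemAction(ActionInterface):
          """Backend that treats the archive as a standard file system."""
          def __init__(self, root: str):
              self.root = root
          def store(self, name, data):
              with open(f"{self.root}/{name}", "wb") as f:
                  f.write(data)
          def retrieve(self, name):
              with open(f"{self.root}/{name}", "rb") as f:
                  return f.read()

      class TransferInterface:
          """Device-independent layer that the rest of the system programs against."""
          def __init__(self, action: ActionInterface):
              self.action = action
          def archive_data(self, name, data):
              self.action.store(name, data)
          def fetch_data(self, name):
              return self.action.retrieve(name)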

  12. BOREAS AFM-04 Twin Otter Aircraft Flux Data

    NASA Technical Reports Server (NTRS)

    MacPherson, J. Ian; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Desjardins, Raymond L.; Smith, David E. (Technical Monitor)

    2000-01-01

    The BOREAS AFM-5 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  13. BOREAS TE-5 Tree Ring and Carbon Isotope Ratio Data

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Curd, Shelaine (Editor); Ehleriinger, Jim; Brooks, J. Renee; Flanagan, Larry

    2000-01-01

    The BOREAS TE-5 team collected several data sets to investigate the vegetation-atmosphere CO2 and H2O exchange processes. These data include tree ring widths and cellulose carbon isotope data from coniferous trees collected at the BOREAS NSA and SSA in 1993 and 1994 by the BOREAS TE-5 team. Ring width data are provided for both Picea mariana and Pinus banksiana. The carbon isotope data are provided only for Pinus banksiana. The data are provided in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  14. NASA-SETI microwave observing project: Targeted Search Element (TSE)

    NASA Technical Reports Server (NTRS)

    Webster, L. D.

    1991-01-01

    The Targeted Search Element (TSE) performs one of two complementary search strategies of the NASA-SETI Microwave Observing Project (MOP): the targeted search. The principal objective of the targeted search strategy is to scan the microwave window between the frequencies of one and three gigahertz for narrowband microwave emissions emanating from the direction of 773 specifically targeted stars. The scanning process is accomplished at a minimum resolution of one or two Hertz at very high sensitivity. Detectable signals will be of a continuous wave or pulsed form and may also drift in frequency. The TSE will possess extensive radio frequency interference (RFI) mitigation and verification capability as the majority of signals detected by the TSE will be of local origin. Any signal passing through RFI classification and classifiable as an extraterrestrial intelligence (ETI) candidate will be further validated at non-MOP observatories using established protocol. The targeted search will be conducted using the capability provided by the TSE. The TSE provides six Targeted Search Systems (TSS) which independently or cooperatively perform automated collection, analysis, storage, and archive of signal data. Data is collected in 10 megahertz chunks and signal processing is performed at a rate of 160 megabits per second. Signal data is obtained utilizing the largest radio telescopes available for the Targeted Search such as those at Arecibo and Nancay or at the dedicated NASA-SETI facility. This latter facility will allow continuous collection of data. The TSE also provides for TSS utilization planning, logistics, remote operation, and for off-line data analysis and permanent archive of both the Targeted Search and Sky Survey data.

  15. VLBA Archive & Distribution Architecture

    NASA Astrophysics Data System (ADS)

    Wells, D. C.

    1994-01-01

    Signals from the 10 antennas of NRAO's VLBA [Very Long Baseline Array] are processed by a Correlator. The complex fringe visibilities produced by the Correlator are archived on magnetic cartridges using a low-cost architecture which is capable of scaling and evolving. Archive files are copied to magnetic media to be distributed to users in FITS format, using the BINTABLE extension. Archive files are labelled using SQL INSERT statements, in order to bind the DBMS-based archive catalog to the archive media.
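
    A hedged sketch of the two mechanisms mentioned above, writing data into a FITS BINTABLE extension and labelling the archive file with an SQL INSERT, is given below using astropy and sqlite3. The column names, table schema, and values are illustrative, not NRAO's actual catalog.

      import sqlite3
      import numpy as np
      from astropy.io import fits

      # FITS file with a BINTABLE extension holding made-up fringe-visibility columns.
      col_time = fits.Column(name="TIME", format="D", array=np.array([0.0, 1.0]))
      col_vis = fits.Column(name="VISIBILITY", format="2E",
                            array=np.array([[1.0, 0.1], [0.9, 0.2]]))
      hdul = fits.HDUList([fits.PrimaryHDU(),
                           fits.BinTableHDU.from_columns([col_time, col_vis], name="UV_DATA")])
      hdul.writeto("vlba_example.fits", overwrite=True)

      # Catalog entry binding the archive medium to a DBMS record via an SQL INSERT.
      con = sqlite3.connect("archive_catalog.db")
      con.execute("CREATE TABLE IF NOT EXISTS archive_files "
                  "(filename TEXT, cartridge_id TEXT, observed_date TEXT)")
      con.execute("INSERT INTO archive_files VALUES (?, ?, ?)",
                  ("vlba_example.fits", "CART-0001", "1994-01-01"))
      con.commit()
      con.close()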

  16. Researching the Vietnam Conflict through U.S. Archival Sources.

    ERIC Educational Resources Information Center

    Marlatt, Greta E.

    1995-01-01

    Presents a pathfinder for the researcher interested in locating materials pertaining to the U.S. involvement in the Vietnam conflict. Highlight include historical background; information sources, including the National Archives, oral histories, manuscripts, federal reports, National Technical Information Service, and special collections. Seven…

  17. The Videodisc as a Pilot Project of the Public Archives of Canada.

    ERIC Educational Resources Information Center

    Mole, Dennis

    1981-01-01

    Discusses a project in which a large variety of materials from the collection of the Canadian Public Archives were recorded and played back using laser optical videodisc technology. The videodisc's capabilities for preserving, storing, and retrieving information are discussed. (Author/JJD)

  18. Preserving Astronomy's Photographic Legacy: Current State and the Future of North American Astronomical Plates

    NASA Astrophysics Data System (ADS)

    Osborn, W.; Robbins, L.

    2009-08-01

    This book contains articles on preserving astronomy's valuable heritage of photographic observations, most of which are on glass plates. It is intended to serve as a reference for institutions charged with preserving and managing plate archives and astronomers interested in using archival photographic plates in their research. The first portion of the book focuses on previous activities and recommendations related to plate archiving. These include actions taken by the International Astronomical Union, activities in Europe and a detailed account of a workshop on preserving astronomical photographic data held in 2007 at the Pisgah Astronomical Research Institute, North Carolina. The workshop discussions covered a wide range of issues that must be considered in any effort to archive plates and culminated in a set of recommendations on preserving, cataloging and making publicly available these irreplaceable data. The second part of the book reports on some recent efforts to implement the recommendations. These include essays on the recently established Astronomical Photographic Data Archive, projects to make photographic collections available electronically, evaluations of commercial scanners for digitization of astronomical plates and the case for the continuing value of these data along with a report on the census of astronomical plate collections in North America carried out in 2008. The census cataloged the locations, numbers, and types of astronomical plates in the US and Canada. Comprehensive appendices identify all the significant collections in North America and detail the current contents, state and status of their holdings.

  19. The Ethics of Archival Research

    ERIC Educational Resources Information Center

    McKee, Heidi A.; Porter, James E.

    2012-01-01

    What are the key ethical issues involved in conducting archival research? Based on examination of cases and interviews with leading archival researchers in composition, this article discusses several ethical questions and offers a heuristic to guide ethical decision making. Key to this process is recognizing the person-ness of archival materials.…

  20. Meeting Students Where They Are: Advancing a Theory and Practice of Archives in the Classroom

    ERIC Educational Resources Information Center

    Saidy, Christina; Hannah, Mark; Sura, Tom

    2011-01-01

    This article uses theories of technical communication and archives to advance a pedagogy that includes archival production in the technical communication classroom. By developing and maintaining local classroom archives, students directly engage in valuable processes of appraisal, selection, collaboration, and retention. The anticipated outcomes…

  1. Using project life-cycles as guide for timing the archival of scientific data and supporting documentation

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Glassy, J. M.; Fowler, D. K.; Khayat, M.; Olding, S. W.

    2014-12-01

    The NASA Earth Science Data Systems Working Groups (ESDSWG) focus on improving technologies and processes related to science discovery and preservation. One particular group, the Data Preservation Practices working group, is defining a set of guidelines to aid data providers in planning both what to submit for archival and when to submit artifacts, so that the archival process can begin early in the project's life cycle. This has the benefit of leveraging knowledge within the project before staff roll off to other work. In this poster we describe various project archival use cases and identify possible archival life cycles that map closely to the pace and flow of work. To understand "archival life cycles", i.e., distinct project phases that produce archival artifacts such as instrument capabilities, calibration reports, and science data products, the working group initially mapped the archival requirements defined in the Preservation Content Specification to the typical NASA project life cycle. As described in the poster, this work resulted in a well-defined archival life cycle, but only for some types of projects; it did not fit well for the condensed project life cycles experienced within airborne and balloon campaigns. To understand the archival process for projects with compressed cycles, the working group gathered use cases from various communities. This poster will describe selected use cases that provided insight into the unique flow of these projects, as well as propose archival life cycles that map artifacts to projects with compressed timelines. Finally, the poster will conclude with some early recommendations for data providers, which will be captured in a formal Guidelines document - to be published in 2015.

  2. Archived 1976-1985 JPL Aircraft SAR Data

    NASA Technical Reports Server (NTRS)

    Thompson, Thomas W.; Blom, Ronald G.

    2016-01-01

    This report describes archived data from the Jet Propulsion Laboratory (JPL) aircraft radar expeditions in the mid-1970s through the mid-1980s collected by Ron Blom, JPL Radar Geologist. The dataset was collected during Ron's career at JPL from the 1970s through 2015. Synthetic Aperture Radar (SAR) data in the 1970s and 1980s were recorded optically on long strips of film. SAR imagery was produced via an optical, holographic technique that resulted in long strips of film imagery.

  3. Development of a Reference Image Collection Library for Histopathology Image Processing, Analysis and Decision Support Systems Research.

    PubMed

    Kostopoulos, Spiros; Ravazoula, Panagiota; Asvestas, Pantelis; Kalatzis, Ioannis; Xenogiannopoulos, George; Cavouras, Dionisis; Glotsos, Dimitris

    2017-06-01

    Histopathology image processing, analysis and computer-aided diagnosis have been shown to be effective assisting tools towards reliable and intra-/inter-observer invariant decisions in traditional pathology. Especially for cancer patients, decisions need to be as accurate as possible in order to increase the probability of optimal treatment planning. In this study, we propose a new image collection library (HICL-Histology Image Collection Library) comprising 3831 histological images of three different diseases, for fostering research in histopathology image processing, analysis and computer-aided diagnosis. Raw data comprised 93, 116 and 55 cases of brain, breast and laryngeal cancer respectively, collected from the archives of the University Hospital of Patras, Greece. The 3831 images were generated from the most representative regions of the pathology, specified by an experienced histopathologist. The HICL Image Collection is free for access under an academic license at http://medisp.bme.teiath.gr/hicl/ . Potential exploitations of the proposed library may span a broad spectrum, such as in image processing to improve visualization, in segmentation for nuclei detection, in decision support systems for second opinion consultations, in statistical analysis for investigation of potential correlations between clinical annotations and imaging findings and, generally, in fostering research on histopathology image processing and analysis. To the best of our knowledge, the HICL constitutes the first attempt towards creation of a reference image collection library in the field of traditional histopathology, publicly and freely available to the scientific community.

  4. Undergraduate Research and Academic Archives: Instruction, Learning and Assessment

    ERIC Educational Resources Information Center

    Krause, Magia G.

    2010-01-01

    Colleges and universities are increasingly investing resources to promote undergraduate research. Undergraduate research can be broadly defined to incorporate scientific inquiry, creative expression, and scholarship with the result of producing original work. Academic archives and special collections can play a vital role in the undergraduate…

  5. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo

    2014-05-01

    The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
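
    As a hedged illustration of the kind of server-side operation such a WPS wraps, the sketch below invokes a single Climate Data Operators command on an archived NetCDF file so that only the reduced result needs to be transferred. The file names are placeholders, and the cdo binary is assumed to be available on the PATH.

      import subprocess

      def monthly_mean(infile: str, outfile: str) -> None:
          """Run 'cdo monmean' to reduce a daily field to monthly means near the archive."""
          subprocess.run(["cdo", "monmean", infile, outfile], check=True)

      # Placeholder file names; in a WPS deployment these would be resolved archive paths.
      monthly_mean("tas_day_model_historical.nc", "tas_monmean.nc")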

  6. Flexible server-side processing of climate archives

    NASA Astrophysics Data System (ADS)

    Juckes, M. N.; Stephens, A.; da Costa, E. D.

    2013-12-01

    The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.

  7. BOREAS HYD-8 Throughfall Data

    NASA Technical Reports Server (NTRS)

    Wang, Xue-Wen; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Fernandes, Richard; Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Hydrology (HYD)-8 team made measurements of surface hydrological processes at the Southern Study Area (SSA) and Northern Study Area (NSA) Old Black Spruce (OBS) Tower Flux sites, supporting its research into point hydrological processes and the spatial variation of these processes. These data were collected during the 1994 and 1996 field campaigns. Data collected may be useful in characterizing canopy interception, drip, throughfall, moss interception, drainage, evaporation, and capacity during the growing season at daily temporal resolution. This particular data set contains the measurements of throughfall, which is the amount of precipitation that fell through the canopy. A nested spatial sampling plan was implemented to determine spatial variations of the measured hydrological processes and ultimately the impact of these variations on modeled carbon and water budgets. These data are stored in ASCII text files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  8. The European Radiobiology Archives (ERA)--content, structure and use illustrated by an example.

    PubMed

    Gerber, G B; Wick, R R; Kellerer, A M; Hopewell, J W; Di Majo, V; Dudoignon, N; Gössner, W; Stather, J

    2006-01-01

    The European Radiobiology Archives (ERA), supported by the European Commission and the European Late Effect Project Group (EULEP), together with the US National Radiobiology Archives (NRA) and the Japanese Radiobiology Archives (JRA) have collected all information still available on long-term animal experiments, including some selected human studies. The archives consist of a database in Microsoft Access, a website, databases of references and information on the use of the database. At present, the archives contain a description of the exposure conditions, animal strains, etc. from approximately 350,000 individuals; data on survival and pathology are available from approximately 200,000 individuals. Care has been taken to render pathological diagnoses compatible among different studies and to allow the lumping of pathological diagnoses into more general classes. 'Forms' in Access with an underlying computer code facilitate the use of the database. This paper describes the structure and content of the archives and illustrates an example for a possible analysis of such data.

  9. Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive

    NASA Technical Reports Server (NTRS)

    Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka,J. A.

    1995-01-01

    Life sciences data collected before, during and after spaceflight are valuable and often irreplaceable, yet much of the data is hard to find. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.

  10. NASA and USGS ASTER Expedited Satellite Data Services for Disaster Situations

    NASA Astrophysics Data System (ADS)

    Duda, K. A.

    2012-12-01

    Significant international disasters related to storms, floods, volcanoes, wildfires and numerous other themes reoccur annually, often inflicting widespread human suffering and fatalities with substantial economic consequences. During and immediately after such events it can be difficult to access the affected areas and become aware of the overall impacts, but insight on the spatial extent and effects can be gleaned from above through satellite images. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on the Terra spacecraft has offered such views for over a decade. On short notice, ASTER continues to deliver analysts multispectral imagery at 15 m spatial resolution in near real-time to assist participating responders, emergency managers, and government officials in planning for such situations and in developing appropriate responses after they occur. The joint U.S./Japan ASTER Science Team has developed policies and procedures to ensure such ongoing support is accessible when needed. Processing and distribution of data products occurs at the NASA Land Processes Distributed Active Archive Center (LP DAAC) located at the USGS Earth Resources Observation and Science Center in South Dakota. In addition to current imagery, the long-term ASTER mission has generated an extensive collection of nearly 2.5 million global 3,600 km2 scenes since the launch of Terra in late 1999. These are archived and distributed by LP DAAC and affiliates at Japan Space Systems in Tokyo. Advanced processing is performed to create higher level products of use to researchers. These include a global digital elevation model. Such pre-event imagery provides a comparative basis for use in detecting changes associated with disasters and to monitor land use trends to portray areas of increased risk. ASTER imagery acquired via the expedited collection and distribution process illustrates the utility and relevancy of such data in crisis situations.

  11. Mapping the Archives 1

    ERIC Educational Resources Information Center

    Jackson, Anthony

    2012-01-01

    With this issue, "RiDE" begins a new occasional series of short informational pieces on archives in the field of drama and theatre education and applied theatre and performance. Each instalment will include summaries of several collections of significant material in the field. Over time this will build into a readily accessible annotated directory…

  12. Motorcycle Safety Event

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  13. Wounded Warrior Diaries

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  14. Untitled Document

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  15. FAPA: Faculty Appointment Policy Archive, 1998. [CD-ROM].

    ERIC Educational Resources Information Center

    Trower, C. Ann

    This CD-ROM presents 220 documents collected in Harvard University's Faculty Appointment Policy Archive (FAPA), the ZyFIND search and retrieval system, and instructions for their use. The FAPA system and ZyFIND allow browsing through documents, inserting bookmarks in documents, attaching notes to documents without modifying them, and selecting…

  16. Mapping the Archives: 2

    ERIC Educational Resources Information Center

    Jackson, Anthony

    2012-01-01

    With this issue, "RiDE" continues its new occasional series of short informational pieces on archives in the field of drama and theatre education and applied theatre and performance. Each instalment includes summaries of one or more collections of significant material in the field. Over time this will build into a readily accessible…

  17. Mapping the Archives: 3

    ERIC Educational Resources Information Center

    Jackson, Anthony

    2013-01-01

    With this issue, "Research in Drama Education" (RiDE) continues its occasional series of short informational pieces on archives in the field of drama and theatre education and applied theatre and performance. Each instalment includes summaries of one or more collections of significant material in the field. Over time, this will build in…

  18. STABLE ISOTOPE RATIOS IN ARCHIVED STRIPED BASS SCALES SUGGEST CHANGES IN TROPHIC STRUCTURE

    EPA Science Inventory

    Stable carbon isotope ratios were measured in archived striped bass, Morone saxatilis (Walbaum), scales to identify changes in the feeding behaviour of this species over time. Striped bass tissue and scale samples were collected from Rhode Island coastal waters during 1996 and ar...

  19. Cracking the Egg: The South Carolina Digital Library's New Perspective

    ERIC Educational Resources Information Center

    Vinson, Christopher G.; Boyd, Kate Foster

    2008-01-01

    This article explores the historical foundations of the South Carolina Digital Library, a collaborative statewide program that ties together academic special collections and archives, public libraries, state government archives, and other cultural resource institutions in an effort to provide the state with a comprehensive database of online…

  20. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community was discussed. Several integrated retrieval, processing, and distribution capabilities were explained, their value to users was described, and potential future improvements were laid out. Users were interested in having the retrieval, processing, and archiving systems integrated so that they can get the data they want in the format and delivery mechanism of their choice.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voyles, Jimmy

    Individual datastreams from instrumentation at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility fixed and mobile research observatories (sites) are collected and routed to the ARM Data Center (ADC). The Data Management Facility (DMF), a component of the ADC, executes datastream processing in near-real time. Processed data are then delivered approximately daily to the ARM Data Archive, also a component of the ADC, where they are made freely available to the research community. For each instrument, ARM calculates the ratio of the actual number of processed data records received daily at the ARM Data Archive to the expected number of data records. DOE requires national user facilities to report time-based operating data.
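
    As a purely illustrative sketch of the completeness metric described above (the record counts are invented; this is not ARM's own code), the daily ratio of records received at the archive to records expected can be computed as:

        def completeness_ratio(records_received, records_expected):
            """Fraction of the expected daily data records that actually reached the archive."""
            if records_expected == 0:
                return 0.0
            return records_received / records_expected

        # e.g. an instrument expected to report once per minute (1440 records/day)
        print(round(completeness_ratio(1404, 1440), 3))  # -> 0.975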

  2. Archiving California’s historical duck nesting data

    USGS Publications Warehouse

    Ackerman, Joshua T.; Herzog, Mark P.; Brady, Caroline; Eadie, John M.; Yarris, Greg S.

    2015-07-14

    With the conclusion of this project, most duck nest data have been entered, but all nest-captured hen data and other breeding waterfowl data that were outside the scope of this project have still not been entered and electronically archived. Maintaining an up-to-date archive will require additional resources to archive and enter the new duck nest data each year in an iterative process. Further, data proofing should be conducted whenever possible, and also should be considered an iterative process as there was sometimes missing data that could not be filled in without more direct knowledge of specific projects. Despite these disclaimers, this duck data archive represents a massive and useful dataset to inform future research and management questions.

  3. The Value of Metrics for Science Data Center Management

    NASA Astrophysics Data System (ADS)

    Moses, J.; Behnke, J.; Watts, T. H.; Lu, Y.

    2005-12-01

    The Earth Observing System Data and Information System (EOSDIS) has been collecting and analyzing records of science data archive, processing and product distribution for more than 10 years. The types of information collected and the analysis performed has matured and progressed to become an integral and necessary part of the system management and planning functions. Science data center managers are realizing the importance that metrics can play in influencing and validating their business model. New efforts focus on better understanding of users and their methods. Examples include tracking user web site interactions and conducting user surveys such as the government authorized American Customer Satisfaction Index survey. This paper discusses the metrics methodology, processes and applications that are growing in EOSDIS, the driving requirements and compelling events, and the future envisioned for metrics as an integral part of earth science data systems.

  4. Degraded document image enhancement

    NASA Astrophysics Data System (ADS)

    Agam, G.; Bal, G.; Frieder, G.; Frieder, O.

    2007-01-01

    Poor quality documents are obtained in various situations such as historical document collections, legal archives, security investigations, and documents found in clandestine locations. Such documents are often scanned for automated analysis, further processing, and archiving. Due to the nature of such documents, degraded document images are often hard to read, have low contrast, and are corrupted by various artifacts. We describe a novel approach for the enhancement of such documents based on probabilistic models which increases the contrast, and thus, readability of such documents under various degradations. The enhancement produced by the proposed approach can be viewed under different viewing conditions if desired. The proposed approach was evaluated qualitatively and compared to standard enhancement techniques on a subset of historical documents obtained from the Yad Vashem Holocaust museum. In addition, quantitative performance was evaluated based on synthetically generated data corrupted under various degradation models. Preliminary results demonstrate the effectiveness of the proposed approach.

  5. Test Report for Cesium and Solids Removal from an 11.5L Composite of Archived Hanford Double Shell Tank Supernate for Off-Site Disposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doll, Stephanie R.; Cooke, Gary A.

    The 222-S Laboratory blended supernate waste from Hanford Tanks 241-AN-101, 241-AN-106, 241-AP-105, 241-AP-106, 241-AP-107, and 241-AY-101 from the hot cell archive to create a bulk composite. The composite was blended with 600 mL 19.4 M NaOH, which brought the total volume to approximately 11.5 L (3 gal). The composite was filtered to remove solids and passed through spherical resorcinol-formaldehyde ion-exchange resin columns to remove cesium. The composite masses were tracked as a treatability study. Samples collected before, during, and after the ion-exchange process were characterized for a full suite of analytes (inorganic, organic, and radionuclides) to aid in the classification of the waste for shipping, receiving, treatment, and disposal determinations.

  6. Test Report for Cesium and Solids Removal from an 11.5L Composite of Archived Hanford Double Shell Tank Supernate for Off-Site Disposal.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doll, S. R.; Cooke, G. A.

    The 222-S Laboratory blended supernate waste from Hanford Tanks 241-AN-101, 241-AN-106, 241-AP-105, 241-AP-106, 241-AP-107, and 241-AY-101 from the hot cell archive to create a bulk composite. The composite was blended with 600 mL 19.4 M NaOH, which brought the total volume to approximately 11.5 L (3 gal). The composite was filtered to remove solids and passed through spherical resorcinol-formaldehyde ion-exchange resin columns to remove cesium. The composite masses were tracked as a treatability study. Samples collected before, during, and after the ion exchange process were characterized for a full suite of analytes (inorganic, organic, and radionuclides) to aid in the classification of the waste for shipping, receiving, treatment, and disposal determinations.

  7. ClearedLeavesDB: an online database of cleared plant leaf images

    PubMed Central

    2014-01-01

    Background: Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers.
    Description: The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database.
    Conclusions: We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985

  8. ClearedLeavesDB: an online database of cleared plant leaf images.

    PubMed

    Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S

    2014-03-28

    Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB), is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
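
    The two records above mention an open API for uploading processed images and trait data via a web services client. The sketch below is only a generic illustration of that idea; the endpoint, field names, trait value and image file are hypothetical placeholders, not the documented ClearedLeavesDB interface:

        import requests

        API_URL = "http://clearedleavesdb.org/api/upload"   # placeholder endpoint, not the real API
        payload = {
            "species": "Quercus rubra",          # example metadata
            "collection": "example-collection",
            "vein_density_mm_per_mm2": 4.2,      # example derived trait value
        }
        with open("cleared_leaf.png", "rb") as img:           # hypothetical local image file
            resp = requests.post(API_URL, data=payload, files={"image": img}, timeout=60)
        resp.raise_for_status()
        print("upload accepted:", resp.status_code)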

  9. Demonstration Report for Geonics EM-63 Cued-Interrogation Data Collection, Processing and Archiving at Camp Sibert, Alabama

    DTIC Science & Technology

    2008-08-01

    Requirements for UXO Discrimination: Paper prepared for UXO Location Workshop, Annapolis, May 2005. Billings, S. D., Pasion , L. R., Beran, L., Oldenburg, D...Discrimination Strategies for Application to Live Sites. Billings, S. D., Pasion , L. R. and Oldenburg, D., 2007b, Sky Research/University of British Columbia...moved RTS to reacquire SE1-12 which was lost by the DAS issue as well. o Talked with Len Pasion about the best approach for full coverage data given

  10. SAM 2 data user's guide

    NASA Technical Reports Server (NTRS)

    Chu, W. P.; Osborn, M. T.; Mcmaster, L. R.

    1988-01-01

    This document is intended to serve as a guide to the use of the data products from the Stratospheric Aerosol Measurement (SAM) 2 experiment for scientific investigations of polar stratospheric aerosols. Included is a detailed description of the Beta and Aerosol Number Density Archive Tape (BANAT), which is the SAM 2 data product containing the aerosol extinction data available for these investigations. Also included are brief descriptions of the instrument operation, data collection, processing and validation, and some of the scientific analyses conducted to date.

  11. 76 FR 14433 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-16

    ... concerning the following information collection: 1. Title: Presidential Library Facilities. OMB number: 3095... Presidential library facility. The report contains information that can be furnished only by the foundation or... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed...

  12. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
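
    The pattern described above, asking an archive-side service for only the reduced content an analysis needs instead of downloading whole files, can be sketched as a simple web-service request. The endpoint and parameter names below are placeholders for illustration, not an actual LP DAAC interface:

        import requests

        SUBSET_SERVICE = "https://archive.example.org/subset"   # placeholder endpoint
        params = {
            "product": "MOD13Q1",                  # example product short name
            "bbox": "-103.0,43.0,-102.0,44.0",     # lon/lat box of interest
            "bands": "NDVI",                       # only the layer needed for the analysis
            "quality": "good",                     # server-side quality filtering
            "format": "csv",                       # reduced, analysis-ready output
        }
        resp = requests.get(SUBSET_SERVICE, params=params, timeout=120)
        resp.raise_for_status()
        with open("ndvi_subset.csv", "wb") as out:
            out.write(resp.content)                # only the subset is transferred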

  13. The Self-Organized Archive: SPASE, PDS and Archive Cooperatives

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hughes, J. S.; Roberts, D. A.; Walker, R. J.; Joy, S. P.

    2005-05-01

    Information systems with high quality metadata enable uses and services which often go beyond the original purpose. There are two types of metadata: annotations, which are items that comment on or describe the content of a resource, and identification attributes, which describe the external properties of the resource itself. For example, annotations may indicate which columns are present in a table of data, whereas an identification attribute would indicate the source of the table, such as the observatory, instrument, organization, and data type. When the identification attributes are collected and used as the basis of a search engine, a user can constrain on an attribute, and the archive can then self-organize around the constraint, presenting the user with a particular view of the archive. In an archive cooperative where each participating data system or archive may have its own metadata standards, providing a multi-system search engine requires that individual archive metadata be mapped to a broad-based standard. To explore how cooperative archives can form a larger self-organized archive, we will show how the Space Physics Archive Search and Extract (SPASE) data model will allow different systems to create a cooperative and will use the Planetary Data System (PDS) plus existing space physics activities as a demonstration.
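
    A toy sketch of the self-organizing idea described above: constraining on one identification attribute and grouping the matching records by another. The records and attribute names are invented for illustration, not real SPASE resource descriptions:

        from collections import defaultdict

        resources = [
            {"observatory": "Galileo", "instrument": "MAG", "data_type": "time series"},
            {"observatory": "Galileo", "instrument": "PLS", "data_type": "spectrogram"},
            {"observatory": "Wind",    "instrument": "MFI", "data_type": "time series"},
        ]

        def organize(records, constraint, group_by):
            """Keep records matching the constraint, then group them by another attribute."""
            view = defaultdict(list)
            for rec in records:
                if all(rec.get(key) == value for key, value in constraint.items()):
                    view[rec[group_by]].append(rec)
            return dict(view)

        # Constrain on data_type; the remaining records organize themselves by observatory.
        print(organize(resources, {"data_type": "time series"}, "observatory"))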

  14. Reconstructing the history of holography

    NASA Astrophysics Data System (ADS)

    Johnston, Sean F.

    2003-05-01

    This paper discusses large-scale but gradual changes in the subject of holography that have only recently become readily observable. Presenting an analysis of publications in holography over the past half century, the paper illustrates and discusses the evolving shape of the subject. Over 40,000 international information sources have been recorded, including some 20,000 papers and 10,000 books, as well as at least 500 exhibitions. This statistical and sociological approach is combined with the identification of specific factors - notably the role of individuals, conferences, proof-of-concept demonstrations and exhibitions - to suggest that the development of holography has been unusually contingent on a variety of intellectual and social influences. The paper situates these observations about holography and holographers in the context of a wider discussion about the styles, purposes and difficulties of historical writing on technological subjects. It further suggests that this ongoing process of both recording and reconstructing technological history can be aided by identification of sources sometimes overlooked or undervalued by practitioners: unpublished archival materials such as private file collections; business records; accounts of unsuccessful activities; and, by no means least, anecdotal accounts inter-linked between participants.

  15. Integrated Stewardship of NASA Satellite and Field Campaign Data

    NASA Astrophysics Data System (ADS)

    Hausman, J.; Tsontos, V. M.; Hardman, S. H.

    2016-02-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) is NASA's archive, steward and distributor for physical oceanographic satellite data. Those data are typically organized along the lines of single parameters, such as Sea Surface Temperature, Ocean Winds, Salinity, etc. However, there is a need to supplement satellite data with in situ and various other remote sensing data to provide higher spatial and temporal sampling and information on physical processes that the satellites are not capable of measuring. This presentation will discuss how PO.DAAC is creating a stewardship and distribution plan that will accommodate satellite, in situ and other remote sensing data and support a more integrated approach to data access and utilization along thematic lines in support of science and applications, specifically those posed by the Salinity Processes in the Upper Ocean Regional Study (SPURS) and Oceans Melting Greenland (OMG) projects. SPURS used shipboard data, moorings and in situ instruments to investigate changes in salinity and how that information can be used in explaining the water cycle. OMG is studying ice melt in Greenland and how it contributes to changes in sea level through shipboard measurements, airborne surveys and a variety of in situ instruments. PO.DAAC plans to adapt to stewarding and distributing these varieties of data through application of file format and metadata standards (so data are discoverable and interoperable), extension of the internal data system (to allow for better archiving, collection generation and querying of in situ and airborne data) and integration into tools (visualization and data access). We are also working on Virtual Collections with ESDWG, which could provide access to relevant data across DAACs/Agencies along thematic lines. These improvements will strengthen long-term data management and make it easier for users of various backgrounds, whether working with remote sensing or in situ data, to discover and use the data.

  16. Collection of LAI and FPAR Data Over The Terra Core Sites

    NASA Technical Reports Server (NTRS)

    Myneni, Ranga B.; Knjazihhin, J.; Tian, Y.; Wang, Y.

    2001-01-01

    The objective of our effort was to collect and archive data on LAI (leaf area index) and FPAR (Fraction of Photosynthetically active Radiation absorbed by vegetation) at the EOS Core validation sites as well as to validate and evaluate global fields of LAI and FPAR derived from atmospherically corrected MODIS (Moderate Resolution Imaging Spectroradiometer) surface reflectance data by comparing these fields with the EOS Core validation data set. The above has been accomplished by: (a) the participation in selected field campaigns within the EOS Validation Program; (b) the processing of the collected data so that suitable comparison between field measurements and the MODIS LAI/FPAR fields can be made; (c) the comparison of the MODIS LAI/FPAR fields with the EOS Terra Core validation data set.

  17. The Golosiiv on-line plate archive database, management and maintenance

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Sergeeva, T.

    2007-08-01

    We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL+PHP. The data management system provides a user interface that supports detailed, traditional form-filling radial searches of plates, auxiliary samplings, listings of each collection, and browsing of the detailed descriptions of the collections. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed according to the demands and principles of international data archives; it has to be strongly generalized in order to enable data mining through standard interfaces and to best fit the requirements of the WFPDB Group for plate-catalogue databases. Ongoing enhancement of the database toward the WFPDB brings the problem of data verification to the forefront, as it demands a high degree of data reliability. The process of data verification is practically endless and inseparable from data management owing to the diverse nature of data errors, which require a variety of approaches for their identification and correction. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: the enhancement of the log-book database with new sets of observational data, as well as the creation of the generalized database and the cross-identification between them. The VO-compatible version of the database is being supplied with digitized plate data obtained with a MicroTek ScanMaker 9800 XL TMA. The scanning procedure is not exhaustive but is conducted selectively within the framework of special projects.
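
    As a hypothetical sketch of the form-filling radial plate search mentioned above (the table and column names are invented; this is not the MAO NASU code), a coordinate-box SQL pre-filter followed by an exact angular-distance check might look like this:

        import math

        def radial_search(conn, ra0, dec0, radius_deg):
            """Return plate ids whose centres lie within radius_deg of (ra0, dec0).

            conn is any DB-API connection to a (hypothetical) log-book database
            with a table plates(plate_id, ra_deg, dec_deg).
            """
            half = radius_deg / max(math.cos(math.radians(dec0)), 1e-6)  # widen the RA box at high declination
            rows = conn.execute(
                "SELECT plate_id, ra_deg, dec_deg FROM plates "
                "WHERE ra_deg BETWEEN ? AND ? AND dec_deg BETWEEN ? AND ?",
                (ra0 - half, ra0 + half, dec0 - radius_deg, dec0 + radius_deg),
            )
            hits = []
            for plate_id, ra, dec in rows:
                # Exact angular separation via the spherical law of cosines.
                cos_d = (math.sin(math.radians(dec0)) * math.sin(math.radians(dec)) +
                         math.cos(math.radians(dec0)) * math.cos(math.radians(dec)) *
                         math.cos(math.radians(ra0 - ra)))
                if math.degrees(math.acos(max(-1.0, min(1.0, cos_d)))) <= radius_deg:
                    hits.append(plate_id)
            return hits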

  18. Science data archives of Indian Space Research Organisation (ISRO): Chandrayaan-1

    NASA Astrophysics Data System (ADS)

    Gopala Krishna, Barla; Singh Nain, Jagjeet; Moorthi, Manthira

    The Indian Space Research Organisation (ISRO) has started a new initiative to launch dedicated scientific satellites earmarked for planetary exploration, astronomical observation and space sciences. The Chandrayaan-1 mission to the Moon is one of the approved missions of this new initiative. The basic objective of the Chandrayaan-1 mission, scheduled for launch in mid-2008, is photoselenological and chemical mapping of the Moon with better spatial and spectral resolution. Consistent with this scientific objective, the following baseline payloads are included in this mission: (i) Terrain mapping stereo camera (TMC) with 20 km swath (400-900 nm band) for 3D imaging of the lunar surface at a spatial resolution of 5 m; (ii) Hyper Spectral Imager in the 400-920 nm band with 64 channels and spatial resolution of 80 m (20 km swath) for mineralogical mapping; (iii) High-energy X-ray (30-270 keV) spectrometer having a footprint of 40 km for study of volatile transport on the Moon; and (iv) Laser ranging instrument with vertical resolution of 5 m. ISRO offered an opportunity to the international scientific community to participate in the Chandrayaan-1 mission, and six payloads that complement the basic objective of the Chandrayaan-1 mission have been selected and included, viz., (i) a miniature imaging radar instrument (Mini-SAR) from APL, NASA to look for the presence of ice in the polar region, (ii) a near infrared spectrometer (SIR-2) from the Max Planck Institute, Germany, (iii) a Moon Mineralogy Mapper (M3) from JPL, NASA for mineralogical mapping in the infra-red region (0.7 - 3.0 micron), (iv) a sub-keV atom reflecting analyzer (SARA) from Sweden, India, Switzerland and Japan for detection of low energy neutral atoms emanating from the lunar surface, (v) a radiation dose monitor (RADOM) from Bulgaria for monitoring energetic particle flux in the lunar environment, and (vi) a collimated low energy (1-10 keV) X-ray spectrometer (C1XS) from RAL, UK with a field of view of 20 km for chemical mapping of the lunar surface. Science data from the Chandrayaan-1 instruments is planned to be archived through the combined efforts of the instrument and Payload Operations Centre (POC) teams, the Indian Space Science Data Centre (ISSDC), and the Chandrayaan-1 Spacecraft Control Centre (SCC). The Chandrayaan-1 Science Data Archive (CSDA) is planned at ISSDC, which is the primary data center for the payload data archives of Indian space science missions. This data center is responsible for the ingest, archive, and dissemination of the payload and related ancillary data for space science missions like Chandrayaan-1. The archiving process includes the design, generation, validation and transfer of the data archive. The archive will include raw and reduced data, calibration data, auxiliary data, higher-level derived data products, documentation and software. The CSDA will make use of the well-proven archive standards of the Planetary Data System (PDS) and is planned to follow IPDA guidelines. This is to comply with the global standards for long term preservation of the data, maintain their usability and provide the scientific community with high quality data for their analysis. The primary users of this facility will initially be the principal investigators of the science payloads, until the end of the lock-in period. After this, the data will be made accessible to scientists from other institutions and also to the general public.
The raw payload data received through the data reception stations is further processed to generate Level-0 and Level-1 data products, which are stored in the CSDA for subsequent dissemination. According to the well documented Chandrayaan-1 archive plan agreed by the experiment teams, the data collection period is decided to be six months. The first data delivery to long term archive of CSDA after peer review is expected to be eighteen months after launch. At present, Experimenter to Archive ICDs of the instrument data are under the process of review.

  19. Recovery and archiving key Arctic Alaska vegetation map and plot data for the Arctic-Boreal Vulnerability Field Experiment (ABoVE)

    NASA Astrophysics Data System (ADS)

    Walker, D. A.; Breen, A. L.; Broderson, D.; Epstein, H. E.; Fisher, W.; Grunblatt, J.; Heinrichs, T.; Raynolds, M. K.; Walker, M. D.; Wirth, L.

    2013-12-01

    Abundant ground-based information will be needed to inform remote-sensing and modeling studies of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE). A large body of plot and map data collected by the Alaska Geobotany Center (AGC) and collaborators from the Arctic regions of Alaska and the circumpolar Arctic over the past several decades is being archived and made accessible to scientists and the public via the Geographic Information Network of Alaska's (GINA's) 'Catalog' display and portal system. We are building two main types of data archives: Vegetation Plot Archive: For the plot information we use a Turboveg database to construct the Alaska portion of the international Arctic Vegetation Archive (AVA) http://www.geobotany.uaf.edu/ava/. High quality plot data and non-digital legacy datasets in danger of being lost have highest priority for entry into the archive. A key aspect of the database is the PanArctic Species List (PASL-1), developed specifically for the AVA to provide a standard of species nomenclature for the entire Arctic biome. A wide variety of reports, documents, and ancillary data are linked to each plot's geographic location. Geoecological Map Archive: This database includes maps and remote sensing products and links to other relevant data associated with the maps, mainly those produced by the Alaska Geobotany Center. Map data include GIS shape files of vegetation, land-cover, soils, landforms and other categorical variables and digital raster data of elevation, multispectral satellite-derived data, and data products and metadata associated with these. The map archive will contain all the information that is currently in the hierarchical Toolik-Arctic Geobotanical Atlas (T-AGA) in Alaska http://www.arcticatlas.org, plus several additions that are in the process of development and will be combined with GINA's already substantial holdings of spatial data from northern Alaska. The Geoecological Atlas Portal uses GINA's Catalog tool to develop a web interface to view and access the plot and map data. The mapping portal allows visualization of GIS data, sample-point locations and imagery and access to the map data. Catalog facilitates the discovery and dissemination of science-based information products in support of analysis and decision-making concerned with development and climate change and is currently used by GINA in several similar archive/distribution portals.

  20. Up the Beanstalk: An Evolutionary Organizational Structure for Libraries.

    ERIC Educational Resources Information Center

    Hoadley, Irene B.; Corbin, John

    1990-01-01

    Presents a functional organizational model for research libraries consisting of six major divisions and subunits: acquisition (buying, borrowing, leasing); organization (records creation, records maintenance); collections (collections management, selection, preservation, special collections and archives); interpretation (reference, instructional…

  1. Data acquisition and processing system for the HT-6M tokamak fusion experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Y.T.; Liu, G.C.; Pang, J.Q.

    1987-08-01

    This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both data acquisition software RSX-DAS and data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.

  2. The Birmingham Burn Centre archive: A photographic history of post-war burn care in the United Kingdom.

    PubMed

    Hardwicke, Joseph; Kohlhardt, Angus; Moiemen, Naiem

    2015-06-01

    The Medical Research Council Burns and Industrial Injuries Unit at the Birmingham Accident Hospital pioneered civilian burn care and research in the United Kingdom during the post-war years. A photographic archive has been discovered that documents this period from 1945 to 1975. The aim of this project was to sort, digitize and archive the images in a secure format for future reference. The photographs detail the management of burns patients, from injury causation and surgical intervention, to nursing care, rehabilitation and long-term follow-up. A total of 2650 image files were collected from over 600 patients. Many novel surgical, nursing, dressing and rehabilitation strategies are documented and discussed. We have chosen to report part of the archive under the sections of (1) aseptic and antimicrobial burn care; (2) burn excision and wound closure; (3) rehabilitation, reconstruction and long-term outcomes; (4) accident prevention; and (5) response to a major burns incident. The Birmingham collection gives us a valuable insight into the approach to civilian burn care in the post-war years, and we present a case from the archive to the modern day, the longest clinical photographic follow-up to date. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  3. Sampling and Chemical Analysis of Potable Water for ISS Expeditions 12 and 13

    NASA Technical Reports Server (NTRS)

    Straub, John E. II; Plumlee, Deborah K.; Schultz, John R.

    2007-01-01

    The crews of Expeditions 12 and 13 aboard the International Space Station (ISS) continued to rely on potable water from two different sources, regenerated humidity condensate and Russian ground-supplied water. The Space Shuttle launched twice during the 12 months spanning both expeditions and docked with the ISS for delivery of hardware and supplies. However, no Shuttle potable water was transferred to the station during either of these missions. The chemical quality of the ISS onboard potable water supplies was verified by performing ground analyses of archival water samples at the Johnson Space Center (JSC) Water and Food Analytical Laboratory (WAFAL). Since no Shuttle flights launched during Expedition 12 and there was restricted return volume on the Russian Soyuz vehicle, only one chemical archive potable water sample was collected with U.S. hardware and returned during Expedition 12. This sample was collected in March 2006 and returned on Soyuz 11. The number and sensitivity of the chemical analyses performed on this sample were limited due to low sample volume. Shuttle flights STS-121 (ULF1.1) and STS-115 (12A) docked with the ISS in July and September of 2006, respectively. These flights returned to Earth with eight chemical archive potable water samples that were collected with U.S. hardware during Expedition 13. The average collected volume increased for these samples, allowing full chemical characterization to be performed. This paper presents a discussion of the results from chemical analyses performed on Expeditions 12 and 13 archive potable water samples. In addition to the results from the U.S. samples analyzed, results from pre-flight samples of Russian potable water delivered to the ISS on Progress vehicles and in-flight samples collected with Russian hardware during Expeditions 12 and 13 and analyzed at JSC are also discussed.

  4. The ethical use of existing samples for genome research.

    PubMed

    Bathe, Oliver F; McGuire, Amy L

    2009-10-01

    Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data are no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.

  5. Contrast in Terahertz Images of Archival Documents—Part I: Influence of the Optical Parameters from the Ink and Support

    NASA Astrophysics Data System (ADS)

    Bardon, Tiphaine; May, Robert K.; Jackson, J. Bianca; Beentjes, Gabriëlle; de Bruin, Gerrit; Taday, Philip F.; Strlič, Matija

    2017-04-01

    This study aims to objectively inform curators when terahertz time-domain (TD) imaging set in reflection mode is likely to give well-contrasted images of inscriptions in a complex archival document and is a useful non-invasive alternative to current digitisation processes. To this end, the dispersive refractive indices and absorption coefficients from various archival materials are assessed and their influence on contrast in terahertz images from historical documents is explored. Sepia ink and inks produced with bistre or verdigris mixed with a solution of Arabic gum or rabbit skin glue are unlikely to lead to well-contrasted images. However, dispersions of bone black, ivory black, iron gall ink, malachite, lapis lazuli, minium and vermilion are likely to lead to well-contrasted images. Inscriptions written with lamp black, carbon black and graphite give the best imaging results. The characteristic spectral signatures from iron gall ink, minium and vermilion pellets between 5 and 100 cm-1 relate to a ringing effect at late collection times in TD waveforms transmitted through these pellets. The same ringing effect can be probed in waveforms reflected from iron gall, minium and vermilion ink deposits at the surface of a document. Since TD waveforms collected for each scanning pixel can be Fourier-transformed into spectral information, terahertz TD imaging in reflection mode can serve as a hyperspectral imaging tool. However, chemical recognition and mapping of the ink is currently limited by the fact that the morphology of the document influences the terahertz spectral response more strongly than the resonant behaviour of the ink does.
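
    As a generic numerical sketch of the Fourier-transform step mentioned above (the waveform is synthetic and the sampling parameters are assumed, so this is not the authors' processing pipeline), a per-pixel time-domain trace can be turned into an amplitude spectrum as follows:

        import numpy as np

        dt = 0.05e-12                       # assumed 50 fs sampling step
        t = np.arange(0.0, 40e-12, dt)      # a 40 ps time-domain trace
        # Synthetic pulse: Gaussian envelope centred at 5 ps with a 1 THz carrier.
        waveform = np.exp(-((t - 5e-12) / 0.4e-12) ** 2) * np.cos(2 * np.pi * 1.0e12 * t)

        spectrum = np.abs(np.fft.rfft(waveform))            # amplitude spectrum
        freqs_thz = np.fft.rfftfreq(t.size, d=dt) / 1e12    # frequency axis in THz

        peak = freqs_thz[np.argmax(spectrum[1:]) + 1]       # ignore the DC bin
        print("dominant spectral component near %.2f THz" % peak)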

  6. Enhanced Endometriosis Archiving Software (ENEAS): An Application for Storing, Retrieving, Comparing, and Sharing Data of Patients Affected by Endometriosis Integrated in the Daily Practice.

    PubMed

    Centini, Gabriele; Zannoni, Letizia; Lazzeri, Lucia; Buiarelli, Paolo; Limatola, Gianluca; Petraglia, Felice; Seracchioli, Renato; Zupi, Errico

    Endometriosis is a chronic benign disease affecting women of fertile age, associated with pelvic pain and subfertility, often with negative impacts on quality of life. A series of meetings involving 5 gynecologists skilled in endometriosis management and 2 informatics technology consultants competent in data management and website administration led to the creation of an endometriosis databank known as ENEAS (Enhanced Endometriosis Archiving Software). This processing system allows users to store, retrieve, compare, and correlate all data collected in conjunction with different Italian endometriosis centers, with the collective objective of obtaining homogeneous data for a large population sample. ENEAS is a web-oriented application that can be based on any open-source database that can be installed locally on a server, allowing collection of data on approximately 700 items, providing standardized and homogeneous data for comparison and analysis. ENEAS is capable of generating a sheet incorporating all data on the management of endometriosis that is both accurate and suitable for each individual patient. ENEAS is an effective and universal web application that encourages providers in the field of endometriosis to use a common language and share data to standardize medical and surgical treatment, with the main benefits being improved patient satisfaction and quality of life. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  7. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activities 93LCA01 and 94LCA01 in Kingsley, Orange, and Lowry Lakes, Northeast Florida, 1993 and 1994

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2004-01-01

    In August and September of 1993 and January of 1994, the U.S. Geological Survey, under a cooperative agreement with the St. Johns River Water Management District (SJRWMD), conducted geophysical surveys of Kingsley Lake, Orange Lake, and Lowry Lake in northeast Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, observer's logbook, Field Activity Collection System (FACS) logs, and formal FGDC metadata. A filtered and gained GIF image of each seismic profile is also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. The data archived here were collected under a cooperative agreement with the St. Johns River Water Management District as part of the USGS Lakes and Coastal Aquifers (LCA) Project. For further information about this study, refer to http://coastal.er.usgs.gov/stjohns, Kindinger and others (1994), and Kindinger and others (2000). The USGS Florida Integrated Science Center (FISC) - Coastal and Watershed Studies in St. Petersburg, Florida, assigns a unique identifier to each cruise or field activity. For example, 93LCA01 tells us the data were collected in 1993 for the Lakes and Coastal Aquifers (LCA) Project and the data were collected during the first field activity for that project in that calendar year. For a detailed description of the method used to assign the field activity ID, see http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html. The boomer is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled at the sea surface and when discharged emits a short acoustic pulse, or shot, that propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (e.g., 0.5 s) and recorded for specific intervals of time (e.g., 100 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Acquisition geometry for 94LCA01 is recorded in the operations logbook. No logbook exists for 93LCA01. Table 1 displays acquisition parameters for both field activities. For more information about the acquisition equipment used, refer to the FACS equipment logs. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for more information about these files. Processed profiles can be viewed as GIF images from the Profiles page. Refer to the Software page for details about the processing and examples of the processing scripts. Detailed information about the navigation systems used for each field activity can be found in Table 1 and the FACS equipment logs.
To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. The original trace files were recorded in nonstandard ELICS format and later converted to standard SEG-Y format. The original trace files for 94LCA01 lines ORJ127_1, ORJ127_3, and ORJ131_1 were divided into two or more trace files (e.g., ORJ127_1 became ORJ127_1a and ORJ127_1b) because the original total number of traces exceeded the maximum allowed by the processing system. Digital data were not recoverable for 93LCA
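
    One way to inspect a downloaded trace file in Python is the third-party segyio library; it is not part of the archive itself, and the file name below (built from one of the line IDs mentioned above, with an assumed .sgy extension) is only an example:

        import segyio

        # Open without assuming a regular inline/crossline geometry, since these are 2D boomer lines.
        with segyio.open("ORJ127_1a.sgy", "r", ignore_geometry=True) as f:
            print("traces:", f.tracecount)
            print("samples per trace:", len(f.samples))
            first_trace = f.trace[0]   # amplitude samples of the first recorded shot
            print("first trace min/max:", first_trace.min(), first_trace.max())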

  8. [The Museu da Saúde in Portugal: a physical space, a virtual space].

    PubMed

    Oliveira, Inês Cavadas de; Andrade, Helena Rebelo de; Miguel, José Pereira

    2015-12-01

    Museu da Saúde (Museum of Health) in Portugal, based on the dual concept of a multifaceted physical space and a virtual space, is preparing an inventory of its archive. So far, it has studied five of its collections in greater depth: tuberculosis, urology, psychology, medicine, and malaria. In this article, these collections are presented, and the specificities of developing museological activities within a national laboratory, Instituto Nacional de Saúde Doutor Ricardo Jorge, are also discussed, highlighting the issues of the store rooms and exhibition spaces, the inventory process, and the communication activities, with a view to overcoming the challenges inherent to operating in a non-museological space.

  9. Atmospheric Radiation Measurement Program Climate Research Facility Operations Quarterly Report July 1 – September 30, 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sisterson, DL

    2008-09-30

    Individual raw data streams from instrumentation at the Atmospheric Radiation Measurement (ARM) Program Climate Research Facility (ACRF) fixed and mobile sites are collected and sent to the Data Management Facility (DMF) at Pacific Northwest National Laboratory (PNNL) for processing in near real-time. Raw and processed data are then sent daily to the ACRF Archive, where they are made available to users. For each instrument, we calculate the ratio of the actual number of data records received daily at the Archive to the expected number of data records. The results are tabulated by (1) individual data stream, site, and month for the current year and (2) site and fiscal year (FY) dating back to 1998. The U.S. Department of Energy (DOE) requires national user facilities to report time-based operating data. The requirements concern the actual hours of operation (ACTUAL); the estimated maximum operation or uptime goal (OPSMAX), which accounts for planned downtime; and the VARIANCE [1 – (ACTUAL/OPSMAX)], which accounts for unplanned downtime. The OPSMAX time for the fourth quarter of FY 2008 for the Southern Great Plains (SGP) site is 2,097.60 hours (0.95 x 2,208 hours this quarter). The OPSMAX for the North Slope Alaska (NSA) locale is 1,987.20 hours (0.90 x 2,208), and for the Tropical Western Pacific (TWP) locale is 1,876.80 hours (0.85 x 2,208). The OPSMAX time for the ARM Mobile Facility (AMF) is not reported this quarter because the data have not yet been released from China to the DMF for processing. The differences in OPSMAX performance reflect the complexity of local logistics and the frequency of extreme weather events. It is impractical to measure OPSMAX for each instrument or data stream. Data availability reported here refers to the average of the individual, continuous data streams that have been received by the Archive. Data not at the Archive are caused by downtime (scheduled or unplanned) of the individual instruments. Therefore, data availability is directly related to individual instrument uptime. Thus, the average percentage of data in the Archive represents the average percentage of the time (24 hours per day, 92 days for this quarter) the instruments were operating this quarter.
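
    The operating metrics quoted above follow a simple arithmetic definition: OPSMAX is the uptime goal multiplied by the hours in the quarter, and VARIANCE is 1 minus the ratio of actual to maximum hours. The short sketch below reproduces that arithmetic for the three fixed-site uptime goals; the ACTUAL hours used are invented for illustration and are not reported values.

        # Time-based operating metrics as defined in the report:
        # OPSMAX = uptime goal x hours in the quarter; VARIANCE = 1 - ACTUAL/OPSMAX.
        # The ACTUAL value below is a made-up illustration, not a reported figure.
        HOURS_IN_QUARTER = 24 * 92                   # 2,208 hours in a 92-day quarter

        uptime_goals = {"SGP": 0.95, "NSA": 0.90, "TWP": 0.85}

        def variance(actual_hours, opsmax_hours):
            """Unplanned-downtime fraction."""
            return 1.0 - actual_hours / opsmax_hours

        for site, goal in uptime_goals.items():
            opsmax = goal * HOURS_IN_QUARTER         # e.g., SGP: 0.95 x 2,208 = 2,097.6
            actual = 0.93 * HOURS_IN_QUARTER         # hypothetical observed uptime
            print(f"{site}: OPSMAX={opsmax:.2f} h, VARIANCE={variance(actual, opsmax):.3f}")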

  10. The Land Processes Distributed Active Archive Center (LP DAAC)

    USGS Publications Warehouse

    Golon, Danielle K.

    2016-10-03

    The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.

  11. NASA'S Earth Science Data Stewardship Activities

    NASA Technical Reports Server (NTRS)

    Lowe, Dawn R.; Murphy, Kevin J.; Ramapriyan, Hampapuram

    2015-01-01

    NASA has been collecting Earth observation data for over 50 years using instruments on board satellites, aircraft and ground-based systems. With the inception of the Earth Observing System (EOS) Program in 1990, NASA established the Earth Science Data and Information System (ESDIS) Project and initiated development of the Earth Observing System Data and Information System (EOSDIS). A set of Distributed Active Archive Centers (DAACs) was established at locations based on science discipline expertise. Today, EOSDIS consists of 12 DAACs and 12 Science Investigator-led Processing Systems (SIPS), processing data from the EOS missions, as well as the Suomi National Polar Orbiting Partnership mission, and other satellite and airborne missions. The DAACs archive and distribute the vast majority of data from NASA’s Earth science missions, with data holdings exceeding 12 petabytes. The data held by EOSDIS are available to all users consistent with NASA’s free and open data policy, which has been in effect since 1990. The EOSDIS archives consist of raw instrument data counts (level 0 data), as well as higher level standard products (e.g., geophysical parameters, products mapped to standard spatio-temporal grids, results of Earth system models using multi-instrument observations, and long time series of Earth System Data Records resulting from multiple satellite observations of a given type of phenomenon). EOSDIS data stewardship responsibilities include ensuring that the data and information content are reliable, of high quality, easily accessible, and usable for as long as they are considered to be of value.

  12. The Social and Organizational Life Data Archive (SOLDA).

    ERIC Educational Resources Information Center

    Reed, Ken; Blunsdon, Betsy; Rimme, Malcolm

    2000-01-01

    Outlines the rationale and design of the Social and Organizational Life Data Archive (SOLDA), an on-line collection of survey and other statistical data relevant to research in the fields of management, organizational studies, industrial relations, marketing, and related social sciences. The database uses CD-ROM technology and the World Wide Web…

  13. DefenseLink Special: Faces of Afghanistan

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  14. Life In Iraq: A Photo Essay

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  15. Military Appreciation Day at RFK stadium

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  16. Wounded Warriors Climb Strong

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  17. Defense.gov - Special Report

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  18. DefenseLink.mil - Staying Power

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  19. George Washington Birthday Parade

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  20. DefenseLink Feature: Travels with Gates

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  1. Memorial Day Parade 2007

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  2. DefenseLink Special: Faces of Iraq

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  3. DefenseLink.mil - 2008 Year in Pictures

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  4. Just Kids - Department of Defense

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  5. Recon - Citizens in Service - March 19, 2007 - U.S. Department of Defense

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer under active maintenance and/or administration. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov

  6. DefenseLink Special: Center for the Intrepid

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  7. Audiovisual Records in the National Archives Relating to Black History. Preliminary Draft.

    ERIC Educational Resources Information Center

    Waffen, Leslie; And Others

    A representative selection of the National Archives and Records Services' audiovisual collection relating to black history is presented. The intention is not to provide an exhaustive survey, but rather to indicate the breadth and scope of materials available for study and to suggest areas for concentrated research. The materials include sound…

  8. The USA PATRIOT Act: Archival Implications

    ERIC Educational Resources Information Center

    Trinkaus-Randall, Gregor

    2005-01-01

    In October 2001, Congress passed the USA PATRIOT Act to strengthen the ability of the U.S. government to combat terrorism. Unfortunately, some sections of the Act strike at core values and practices of libraries and archives, especially in the areas of record keeping, privacy, confidentiality, security, and access to the collections. This article…

  9. Home Economics Archive: Research, Tradition and History (HEARTH)

    Science.gov Websites

    HEARTH is a core electronic collection of books and journals in Home Economics, with additional information, images, and readings on the history of Home Economics. Home Economics Archive: Research, Tradition and History (HEARTH). Ithaca, NY: Albert R. Mann Library, Cornell University

  10. Study on Integrated Pest Management for Libraries and Archives.

    ERIC Educational Resources Information Center

    Parker, Thomas A.

    This study addresses the problems caused by the major insect and rodent pests and molds and mildews in libraries and archives; the damage they do to collections; and techniques for their prevention and control. Guidelines are also provided for the development and initiation of an Integrated Pest Management program for facilities housing library…

  11. Oral History and American Advertising: How the "Pepsi Generation" Came Alive.

    ERIC Educational Resources Information Center

    Dreyfus, Carol; Connors, Thomas

    1985-01-01

    Described is a project in which the Archives Center of the National Museum of American History and the George Meany Memorial Archives analyzed a collection of advertising materials of the Pepsi-Cola USA company and conducted interviews to gather historically valuable information concerning the company. Valuable social history information was…

  12. Environment and Passive Climate Control Chiefly in Tropical Climates.

    ERIC Educational Resources Information Center

    Dean, John F.

    This paper focuses on some of the effects of climate on library and archives collections in tropical climates, and discusses some prudent alternatives to the mechanical and chemical approaches commonly used to control climate and its immediate effects. One of the most important factors affecting the longevity of library and archival materials is…

  13. Research Capacity Building in Education: The Role of Digital Archives

    ERIC Educational Resources Information Center

    Carmichael, Patrick

    2011-01-01

    Accounts of how research capacity in education can be developed often make reference to electronic networks and online resources. This paper presents a theoretically driven analysis of the role of one such resource, an online archive of educational research studies that includes not only digitised collections of original documents but also videos…

  14. Exploiting Data Intensive Applications on High Performance Computers to Unlock Australia's Landsat Archive

    NASA Astrophysics Data System (ADS)

    Purss, Matthew; Lewis, Adam; Edberg, Roger; Ip, Alex; Sixsmith, Joshua; Frankish, Glenn; Chan, Tai; Evans, Ben; Hurst, Lachlan

    2013-04-01

    Australia's Earth Observation Program has downlinked and archived satellite data acquired under the NASA Landsat mission for the Australian Government since the establishment of the Australian Landsat Station in 1979. Geoscience Australia maintains this archive and produces image products to aid the delivery of government policy objectives. Due to the labor intensive nature of processing of this data there have been few national-scale datasets created to date. To compile any Earth Observation product the historical approach has been to select the required subset of data and process "scene by scene" on an as-needed basis. As data volumes have increased over time, and the demand for the processed data has also grown, it has become increasingly difficult to rapidly produce these products and achieve satisfactory policy outcomes using these historic processing methods. The result is that we have been "drowning in a sea of uncalibrated data" and scientists, policy makers and the public have not been able to realize the full potential of the Australian Landsat Archive and its value is therefore significantly diminished. To overcome this critical issue, the Australian Space Research Program has funded the "Unlocking the Landsat Archive" (ULA) Project from April 2011 to June 2013 to improve the access and utilization of Australia's archive of Landsat data. The ULA Project is a public-private consortium led by Lockheed Martin Australia (LMA) and involving Geoscience Australia (GA), the Victorian Partnership for Advanced Computing (VPAC), the National Computational Infrastructure (NCI) at the Australian National University (ANU) and the Cooperative Research Centre for Spatial Information (CRC-SI). The outputs from the ULA project will become a fundamental component of Australia's eResearch infrastructure, with the Australian Landsat Archive hosted on the NCI and made openly available under a creative commons license. NCI provides access to researchers through significant HPC supercomputers, cloud infrastructure and data resources along with a large catalogue of software tools that make it possible to fully explore the potential of this data. Under the ULA Project, Geoscience Australia has developed a data-intensive processing workflow on the NCI. This system has allowed us to successfully process 11 years of the Australian Landsat Archive (from 2000 to 2010 inclusive) to standardized well-calibrated and sensor independent data products at a rate that allows for both bulk data processing of the archive and near-realtime processing of newly acquired satellite data. These products are available as Optical Surface Reflectance 25m (OSR25) and other derived products, such as Fractional Cover.

  15. Update on CERN Search based on SharePoint 2013

    NASA Astrophysics Data System (ADS)

    Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.

    2017-10-01

    CERN’s enterprise Search solution “CERN Search” provides a central search solution for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections is indexed, including Indico, TWiki, Drupal, SharePoint, JACOW, E-group archives, EDMS, and CERN Web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data-feeding framework takes advantage of new functionality and facilitates the long-term maintenance of the system.

  16. Dawn: A Simulation Model for Evaluating Costs and Tradeoffs of Big Data Science Architectures

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Crichton, D. J.; Braverman, A. J.; Kyo, L.; Fuchs, T.; Turmon, M.

    2014-12-01

    In many scientific disciplines, scientists and data managers are bracing for an upcoming deluge of big data volumes, which will increase the size of current data archives by a factor of 10-100. For example, the next Climate Model Inter-comparison Project (CMIP6) will generate a global archive of model output of approximately 10-20 petabytes, while the upcoming next generation of NASA decadal Earth Observing instruments is expected to collect tens of gigabytes per day. In radio-astronomy, the Square Kilometre Array (SKA) will collect data in the exabytes-per-day range, of which (after reduction and processing) around 1.5 exabytes per year will be stored. The effective and timely processing of these enormous data streams will require the design of new data reduction and processing algorithms, new system architectures, and new techniques for evaluating computation uncertainty. Yet at present no general software tool or framework exists that will allow system architects to model their expected data processing workflow, and determine the network, computational and storage resources needed to prepare their data for scientific analysis. In order to fill this gap, at NASA/JPL we have been developing a preliminary model named DAWN (Distributed Analytics, Workflows and Numerics) for simulating arbitrarily complex workflows composed of any number of data processing and movement tasks. The model can be configured with a representation of the problem at hand (the data volumes, the processing algorithms, the available computing and network resources), and is able to evaluate tradeoffs between different possible workflows based on several estimators: overall elapsed time, separate computation and transfer times, resulting uncertainty, and others. So far, we have been applying DAWN to analyze architectural solutions for 4 different use cases from distinct science disciplines: climate science, astronomy, hydrology and a generic cloud computing use case. This talk will present preliminary results and discuss how DAWN can be evolved into a powerful tool for designing system architectures for data-intensive science.
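
    The kind of tradeoff DAWN is described as evaluating can be illustrated with a deliberately simplified cost model: if each task is characterized only by a data volume and whether it computes or transfers, elapsed time falls out of a rate assumption for each. The sketch below is such a toy, with hypothetical volumes and rates and serial execution assumed; it is not the DAWN model itself.

        # Toy workflow-cost comparison in the spirit of the tradeoff analysis
        # described above. Rates, volumes, and the serial-execution assumption
        # are all hypothetical; the real DAWN model is far more general.
        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            volume_tb: float     # data volume handled by the task, in TB
            kind: str            # "compute" or "transfer"

        def elapsed_hours(tasks, compute_tb_per_hour=2.0, network_tb_per_hour=0.5):
            total = 0.0
            for t in tasks:
                rate = compute_tb_per_hour if t.kind == "compute" else network_tb_per_hour
                total += t.volume_tb / rate          # tasks assumed to run serially
            return total

        move_then_reduce = [Task("transfer raw data", 10, "transfer"),
                            Task("reduce at destination", 10, "compute")]
        reduce_then_move = [Task("reduce in place", 10, "compute"),
                            Task("transfer products", 1, "transfer")]

        print("move then reduce:", elapsed_hours(move_then_reduce), "hours")
        print("reduce then move:", elapsed_hours(reduce_then_move), "hours")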

  17. OASIS: A Data Fusion System Optimized for Access to Distributed Archives

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Kong, M.; Good, J. C.

    2002-05-01

    The On-Line Archive Science Information Services (OASIS) is accessible as a Java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

  18. Determining the Completeness of the Nimbus Meteorological Data Archive

    NASA Technical Reports Server (NTRS)

    Johnson, James; Moses, John; Kempler, Steven; Zamkoff, Emily; Al-Jazrawi, Atheer; Gerasimov, Irina; Trivedi, Bhagirath

    2011-01-01

    NASA launched the Nimbus series of meteorological satellites in the 1960s and 70s. These satellites carried instruments for making observations of the Earth in the visible, infrared, ultraviolet, and microwave wavelengths. The original data archive consisted of a combination of digital data written to 7-track computer tapes and on various film media. Many of these data sets are now being migrated from the old media to the GES DISC modern online archive. The process involves recovering the digital data files from tape as well as scanning images of the data from film strips. Some of the challenges of archiving the Nimbus data include the lack of any metadata from these old data sets. Metadata standards and self-describing data files did not exist at that time, and files were written on now obsolete hardware systems and outdated file formats. This requires creating metadata by reading the contents of the old data files. Some digital data files were corrupted over time, or were possibly improperly copied at the time of creation. Thus there are data gaps in the collections. The film strips were stored in boxes and are now being scanned as JPEG-2000 images. The only information describing these images is what was written on them when they were originally created, and sometimes this information is incomplete or missing. We have the ability to cross-reference the scanned images against the digital data files to determine which of these best represents the data set from the various missions, or to see how complete the data sets are. In this presentation we compared data files and scanned images from the Nimbus-2 High-Resolution Infrared Radiometer (HRIR) for September 1966 to determine whether the data and images are properly archived with correct metadata.
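
    The completeness check described above amounts to comparing the inventory recovered from the digital files with the inventory recovered from the scanned film. A minimal version of that cross-reference is a set comparison of observation identifiers, as in the sketch below; the timestamps are invented for illustration and do not come from the Nimbus-2 HRIR holdings.

        # Minimal sketch of the cross-reference described above: compare the
        # observation times recovered from digital files with those recovered
        # from scanned film images to locate gaps in either collection.
        # The identifiers below are invented for illustration.
        digital_granules = {"1966-09-01T00", "1966-09-01T12", "1966-09-02T00"}
        scanned_images   = {"1966-09-01T00", "1966-09-02T00", "1966-09-02T12"}

        print("covered by both media: ", sorted(digital_granules & scanned_images))
        print("only on film:          ", sorted(scanned_images - digital_granules))
        print("only in digital files: ", sorted(digital_granules - scanned_images))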

  19. Archive of digital Chirp sub-bottom profile data collected during USGS Cruise 07SCC01 offshore of the Chandeleur Islands, Louisiana, June 2007

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2010-01-01

    In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 indicates that the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high resolution, shallow penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov.
The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
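
    Reading a depth off a sub-bottom profile, as the SonarWeb version of these profiles allows, ultimately rests on converting two-way travel time to depth with an assumed sound speed. The sketch below shows that conversion with a generic 1,500 m/s seawater value; the speed is an assumption for illustration, not a calibration from this survey.

        # Back-of-the-envelope conversion from two-way travel time (TWT) to depth.
        # The 1,500 m/s sound speed is a generic seawater value assumed here for
        # illustration, not a calibration taken from this survey.
        def twt_to_depth(twt_seconds, sound_speed_m_s=1500.0):
            """Depth below the source/receiver for a given two-way travel time."""
            return sound_speed_m_s * twt_seconds / 2.0

        # A 50 ms record length, as in the acquisition described above, spans roughly:
        print(twt_to_depth(0.050), "m")      # 37.5 m at 1,500 m/s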

  20. The GIK-Archive of sediment core radiographs with documentation

    NASA Astrophysics Data System (ADS)

    Grobe, Hannes; Winn, Kyaw; Werner, Friedrich; Driemel, Amelie; Schumacher, Stefanie; Sieger, Rainer

    2017-12-01

    The GIK-Archive of radiographs is a collection of X-ray negative and photographic images of sediment cores based on exposures taken since the early 1960s. During four decades of marine geological work at the University of Kiel, Germany, several thousand hours of sampling, careful preparation and X-raying were spent on producing a unique archive of sediment radiographs from several parts of the World Ocean. The archive consists of more than 18 500 exposures on chemical film that were digitized, geo-referenced, supplemented with metadata and archived in the data library PANGAEA®. With this publication, the images have become available open-access for use by the scientific community at https://doi.org/10.1594/PANGAEA.854841.

  1. Gaia DR2 documentation

    NASA Astrophysics Data System (ADS)

    van Leeuwen, F.; de Bruijne, J. H. J.; Arenou, F.; Bakker, J.; Blomme, R.; Busso, G.; Cacciari, C.; Castañeda, J.; Cellino, A.; Clotet, M.; Comoretto, G.; Eyer, L.; González-Núñez, J.; Guy, L.; Hambly, N.; Hobbs, D.; van Leeuwen, M.; Luri, X.; Manteiga, M.; Pourbaix, D.; Roegiers, T.; Salgado, J.; Sartoretti, P.; Tanga, P.; Ulla, A.; Utrilla Molina, E.; Abreu, A.; Altmann, M.; Andrae, R.; Antoja, T.; Audard, M.; Babusiaux, C.; Bailer-Jones, C. A. L.; Barache, C.; Bastian, U.; Beck, M.; Berthier, J.; Bianchi, L.; Biermann, M.; Bombrun, A.; Bossini, D.; Breddels, M.; Brown, A. G. A.; Busonero, D.; Butkevich, A.; Cantat-Gaudin, T.; Carrasco, J. M.; Cheek, N.; Clementini, G.; Creevey, O.; Crowley, C.; David, M.; Davidson, M.; De Angeli, F.; De Ridder, J.; Delbò, M.; Dell'Oro, A.; Diakité, S.; Distefano, E.; Drimmel, R.; Durán, J.; Evans, D. W.; Fabricius, C.; Fabrizio, M.; Fernández-Hernández, J.; Findeisen, K.; Fleitas, J.; Fouesneau, M.; Galluccio, L.; Gracia-Abril, G.; Guerra, R.; Gutiérrez-Sánchez, R.; Helmi, A.; Hernandez, J.; Holl, B.; Hutton, A.; Jean-Antoine-Piccolo, A.; Jevardat de Fombelle, G.; Joliet, E.; Jordi, C.; Juhász, Á.; Klioner, S.; Löffler, W.; Lammers, U.; Lanzafame, A.; Lebzelter, T.; Leclerc, N.; Lecoeur-Taïbi, I.; Lindegren, L.; Marinoni, S.; Marrese, P. M.; Mary, N.; Massari, D.; Messineo, R.; Michalik, D.; Mignard, F.; Molinaro, R.; Molnár, L.; Montegriffo, P.; Mora, A.; Mowlavi, N.; Muinonen, K.; Muraveva, T.; Nienartowicz, K.; Ordenovic, C.; Pancino, E.; Panem, C.; Pauwels, T.; Petit, J.; Plachy, E.; Portell, J.; Racero, E.; Regibo, S.; Reylé, C.; Rimoldini, L.; Ripepi, V.; Riva, A.; Robichon, N.; Robin, A.; Roelens, M.; Romero-Gómez, M.; Sarro, L.; Seabroke, G.; Segovia, J. C.; Siddiqui, H.; Smart, R.; Smith, K.; Sordo, R.; Soria, S.; Spoto, F.; Stephenson, C.; Turon, C.; Vallenari, A.; Veljanoski, J.; Voutsinas, S.

    2018-04-01

    The second Gaia data release, Gaia DR2, encompasses astrometry, photometry, radial velocities, astrophysical parameters (stellar effective temperature, extinction, reddening, radius, and luminosity), and variability information plus astrometry and photometry for a sample of pre-selected bodies in the solar system. The data collected during the first 22 months of the nominal, five-year mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC), resulting in this second data release. A summary of the release properties is provided in Gaia Collaboration et al. (2018b). The overall scientific validation of the data is described in Arenou et al. (2018). Background information on the mission and the spacecraft can be found in Gaia Collaboration et al. (2016), with a more detailed presentation of the Radial Velocity Spectrometer (RVS) in Cropper et al. (2018). In addition, Gaia DR2 is accompanied by various dedicated papers that describe the processing and validation of the various data products. Four more Gaia Collaboration papers present a glimpse of the scientific richness of the data. In addition to this set of refereed publications, this documentation provides a detailed, complete overview of the processing and validation of the Gaia DR2 data. Gaia data, from both Gaia DR1 and Gaia DR2, can be retrieved from the Gaia archive, which is accessible from https://archives.esac.esa.int/gaia. The archive also provides various tutorials on data access and data queries plus an integrated data model (i.e., description of the various fields in the data tables). In addition, Luri et al. (2018) provide concrete advice on how to deal with Gaia astrometry, with recommendations on how best to estimate distances from parallaxes. The Gaia archive features an enhanced visualisation service which can be used for quick initial explorations of the entire Gaia DR2 data set. Pre-computed cross matches between Gaia DR2 and a selected set of large surveys are provided. Gaia DR2 represents a major advance with respect to Gaia DR1 in terms of survey completeness, precision and accuracy, and the richness of the published data. Nevertheless, Gaia DR2 is still an early release based on a limited amount of input data, simplifications in the data processing, and imperfect calibrations. Many limitations hence exist which the user of Gaia DR2 should be aware of; they are described in Gaia Collaboration et al. (2018b).
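
    The documentation's pointer to Luri et al. (2018) exists because the obvious shortcut of inverting a parallax is only a rough approximation, reasonable when the fractional parallax uncertainty is small and biased otherwise. The sketch below shows that naive inversion and its fractional error, with made-up numbers; it is not the Bayesian treatment the paper recommends.

        # Naive distance estimate from a parallax: d [pc] = 1000 / parallax [mas].
        # Only a rough approximation for small fractional parallax errors; Luri et
        # al. (2018), cited above, recommend a proper Bayesian treatment instead.
        # The measurement values below are made up.
        def naive_distance_pc(parallax_mas):
            if parallax_mas <= 0:
                raise ValueError("non-positive parallax cannot be inverted naively")
            return 1000.0 / parallax_mas

        parallax_mas, sigma_mas = 4.0, 0.1           # hypothetical measurement
        print(f"distance ~ {naive_distance_pc(parallax_mas):.1f} pc "
              f"(fractional parallax error {sigma_mas / parallax_mas:.1%})")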

  2. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch and go (TAG) sampling maneuver in July 2020. After the sample is stowed and confirmed, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and Summer of 2015, and instruments (OTES, OVIRS, OLA, OCAMS, REXIS) were integrated from Summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (with the final archived items being related to launch operations)[4]. The materials fall into several general categories including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue samples (NVR), sapphire, and various miscellaneous materials. All through the ATLO process (from March 2015 until late August 2016) contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely. Information about each material archived is stored in an extensive database at NASA-JSC, and key summary information for each will be presented in an online catalog. b) Bulk Asteroid sample: The Touch and Go Sampling Mechanism (TAGSAM) head will contain up to 1.5 kg of asteroid material. Upon return to Earth, the TAGSAM head with the sample canister will be subjected to a nitrogen purge and then opened in a nitrogen cabinet in Houston. Once the TAGSAM head is removed from the canister, it will be disassembled slowly and carefully under nitrogen until the sample can be removed for processing in a dedicated nitrogen glovebox. Bennu surface samples are expected to be sub-cm sized, based on thermal infrared and radar polarization ratio measurements [1]. The upper limit on material collected by the TAGSAM head is 2 cm. Therefore, we will be prepared to handle, subdivide, and characterize materials of a wide grain size (from 10 μm to 2 cm), and for both organic (UV fluorescence) and inorganic (SEM, FTIR, optical) properties. Representative portions of the bulk sample will be prepared for JAXA (0.5 %; see also [5]) and Canadian Space Agency (4%), with the remainder divided between the science team (<25%) and archived for future studies (NASA) (>75%). c) Contact Pad samples: The base of the TAGSAM head contains 24 contact pads that are designed to trap the upper surface layer of material and thus offer an opportunity to study asteroid samples that have resided at the very top surface of the regolith.
Asteroid material is trapped on the pads in spring steel Velcro hooks, and material will have to be removed from these pads by curation specialists in the lab. d) Hardware: Some canister and SRC hardware items will contain information that will be important to understanding the collected samples, including the canister gas filter, temperature strips, flight witness plates, and the TAGSAM and canister parts that might have adhering dust grains. Some challenges remaining for both bulk sample and contact pad samples include: i) working with intermediate size range (200 to 500 micron) samples - a size range NASA has not previously worked with in such detail; ii) techniques for removal of contact pad material from the spring steel hooks; iii) static electrical effects of dust-sized particles during sample handling and curation are likely to be significant; and iv) the TAGSAM head and associated canister hardware will undoubtedly be coated with fine adhering dust grains from Bennu. In the case of collection of a large bulk sample mass, the adhering dust grains may be of lower priority. If a small sample mass is returned, the adhering dust may attain a higher priority, so recovery of adhering dust grains is an additional challenge to consider. In the year leading up to sample return we plan a variety of sample handling rehearsals that will enable the curation team to be prepared for many new aspects posed by this sample suite.
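
    The allocation percentages quoted above translate directly into masses once a returned sample mass is assumed. The sketch below applies them to a hypothetical 1,000 g return; note that treating the science-team and archive fractions as applying to the remainder after the partner-agency portions is a reading adopted here for illustration.

        # Illustrative arithmetic for the allocation percentages quoted above,
        # applied to a hypothetical returned mass. Applying the science-team and
        # archive fractions to the post-partner remainder is an assumption made
        # here for illustration.
        returned_mass_g = 1000.0                 # hypothetical; TAGSAM holds up to 1.5 kg

        jaxa = 0.005 * returned_mass_g           # 0.5 %
        csa = 0.04 * returned_mass_g             # 4 %
        remainder = returned_mass_g - jaxa - csa

        science_team_max = 0.25 * remainder      # "< 25 %"
        archive_min = 0.75 * remainder           # "> 75 %"

        print(f"JAXA: {jaxa:.1f} g, CSA: {csa:.1f} g, "
              f"science team: up to {science_team_max:.1f} g, archive: at least {archive_min:.1f} g")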

  3. Collections Care: A Basic Reference Shelflist.

    ERIC Educational Resources Information Center

    de Torres, Amparo R., Ed.

    This is an extensive bibliography of reference sources--i.e., books and articles--that relate to the care and conservation of library, archival, and museum collections. Bibliographies are presented under the following headings: (1) General Information; (2) Basic Collections Care; (3) Architectural Conservation; (4) Collections Management: Law,…

  4. First Light for ASTROVIRTEL Project

    NASA Astrophysics Data System (ADS)

    2000-04-01

    Astronomical data archives increasingly resemble virtual gold mines of information. A new project, known as ASTROVIRTEL ("Accessing Astronomical Archives as Virtual Telescopes"), aims to exploit these astronomical treasure troves by allowing scientists to use the archives as virtual telescopes. The competition for observing time on large space- and ground-based observatories such as the ESA/NASA Hubble Space Telescope and the ESO Very Large Telescope (VLT) is intense. On average, less than a quarter of applications for observing time are successful. The fortunate scientist who obtains observing time usually has one year of so-called proprietary time to work with the data before they are made publicly accessible and can be used by other astronomers. Precious data from these large research facilities retain their value far beyond their first birthday and may still be useful decades after they were first collected. The enormous quantity of valuable astronomical data now stored in the archives of the European Southern Observatory (ESO) and the Space Telescope-European Coordinating Facility (ST-ECF) is increasingly attracting the attention of astronomers. Scientists are aware that one set of observations can serve many different scientific purposes, including some that were not considered at all when the observations were first made. ASTROVIRTEL is supported by the European Commission (EC) within the "Access to Research Infrastructures" action under the "Improving Human Potential & the Socio-economic Knowledge Base" of the EC (under EU Fifth Framework Programme). ASTROVIRTEL has been established on behalf of the European Space Agency (ESA) and the European Southern Observatory (ESO) in response to rapid developments currently taking place in the fields of telescope and detector construction, computer hardware, data processing, archiving, and telescope operation. Nowadays astronomical telescopes can image increasingly large areas of the sky. They use more and more different instruments and are equipped with ever-larger detectors. The quantity of astronomical data collected is rising dramatically, generating a corresponding increase in potentially interesting research projects. These large collections of valuable data have led to the useful concept of "data mining", whereby large astronomical databases are exploited to support original research. However, it has become obvious that scientists need additional support to cope efficiently with the massive amounts of data available and so to exploit the true potential of the databases. ASTROVIRTEL is the first virtual astronomical telescope dedicated to data mining. It is currently being established at the joint ESO/Space Telescope-European Coordinating Facility Archive in Garching (Germany). Scientists from EC member countries and associated states will be able to apply for support for a scientific project based on access to and analysis of data from the Hubble Space Telescope (HST), Very Large Telescope (VLT), New Technology Telescope (NTT), and Wide Field Imager (WFI) archives, as well as a number of other related archives, including the Infrared Space Observatory (ISO) archive. Scientists will be able to visit the archive site and collaborate with the archive specialists there. 
Special software tools that incorporate advanced methods for exploring the enormous quantities of information available will be developed. The project co-ordinator, Piero Benvenuti, Head of ST-ECF, elaborates on the advantages of ASTROVIRTEL: "The observations by the ESA/NASA Hubble Space Telescope and, more recently, by the ESO Very Large Telescope, have already been made available on-line to the astronomical community, once the proprietary period of one year has elapsed. ASTROVIRTEL is different, in that astronomers are now invited to regard the archive as an "observatory" in its own right: a facility that, when properly used, may provide an answer to their specific scientific questions. The architecture of the archives as well as their suite of software tools may have to evolve to respond to the new demand. ASTROVIRTEL will try to drive this evolution on the basis of the scientific needs of its users." Peter Quinn, the Head of ESO's Data Management and Operations Division, is of the same opinion: "The ESO/HST Archive Facility at ESO Headquarters in Garching is currently the most rapidly growing astronomical archive resource in the world. This archive is projected to contain more than 100 Terabytes (100,000,000,000,000 bytes) of data within the next four years. The software and hardware technologies for the archive will be jointly developed and operated by ESA and ESO staff and will be common to both HST and ESO data archives. The ASTROVIRTEL project will provide us with real examples of scientific research programs that will push the capabilities of the archive and allow us to identify and develop new software tools for data mining. The growing archive facility will provide the European astronomical community with new digital windows on the Universe." Note [1]: This is a joint Press Release by the European Southern Observatory (ESO) and the Space Telescope European Coordinating Facility (ST-ECF). More information about ASTROVIRTEL can be found at the dedicated website: http://www.stecf.org/astrovirtel The European Southern Observatory (ESO) is an intergovernmental organisation, supported by eight European countries: Belgium, Denmark, France, Germany, Italy, The Netherlands, Sweden and Switzerland. The European Space Agency is an intergovernmental organisation supported by 15 European countries: Austria, Belgium, Denmark, Finland, France, Germany, Ireland, Italy, Netherlands, Norway, Portugal, Spain, Sweden, Switzerland and the United Kingdom. The Space Telescope European Coordinating Facility (ST-ECF) is a co-operation between the European Space Agency and the European Southern Observatory. The Hubble Space Telescope (HST) is a project of international co-operation between NASA and ESA.

  5. Use of history science methods in exposure assessment for occupational health studies

    PubMed Central

    Johansen, K; Tinnerberg, H; Lynge, E

    2005-01-01

    Aims: To show the power of history science methods for exposure assessment in occupational health studies, using the dry cleaning industry in Denmark around 1970 as the example. Methods: Exposure data and other information on exposure status were searched for in unconventional data sources such as the Danish National Archives, the Danish Royal Library, archives of Statistics Denmark, the National Institute of Occupational Health, Denmark, and the Danish Labor Inspection Agency. Individual census forms were retrieved from the Danish National Archives. Results: It was estimated that in total 3267 persons worked in the dry cleaning industry in Denmark in 1970. They typically worked in small shops with an average size of 3.5 persons. Of these, 2645 persons were considered exposed to solvents as they were dry cleaners or worked very close to the dry cleaning process, while 622 persons were office workers, drivers, etc in shops with 10 or more persons. It was estimated that tetrachloroethylene constituted 85% of the dry cleaning solvent used, and that a shop would normally have two machines using 4.6 tons of tetrachloroethylene annually. Conclusion: The history science methods, including retrieval of material from the Danish National Archives and a thorough search in the Royal Library for publications on dry cleaning, turned out to be a very fruitful approach for collection of exposure data on dry cleaning work in Denmark. The history science methods proved to be a useful supplement to the exposure assessment methods normally applied in epidemiological studies. PMID:15961618

  6. Use of history science methods in exposure assessment for occupational health studies.

    PubMed

    Johansen, K; Tinnerberg, H; Lynge, E

    2005-07-01

    To show the power of history science methods for exposure assessment in occupational health studies, using the dry cleaning industry in Denmark around 1970 as the example. Exposure data and other information on exposure status were searched for in unconventional data sources such as the Danish National Archives, the Danish Royal Library, archives of Statistics Denmark, the National Institute of Occupational Health, Denmark, and the Danish Labor Inspection Agency. Individual census forms were retrieved from the Danish National Archives. It was estimated that in total 3267 persons worked in the dry cleaning industry in Denmark in 1970. They typically worked in small shops with an average size of 3.5 persons. Of these, 2645 persons were considered exposed to solvents as they were dry cleaners or worked very close to the dry cleaning process, while 622 persons were office workers, drivers, etc in shops with 10 or more persons. It was estimated that tetrachloroethylene constituted 85% of the dry cleaning solvent used, and that a shop would normally have two machines using 4.6 tons of tetrachloroethylene annually. The history science methods, including retrieval of material from the Danish National Archives and a thorough search in the Royal Library for publications on dry cleaning, turned out to be a very fruitful approach for collection of exposure data on dry cleaning work in Denmark. The history science methods proved to be a useful supplement to the exposure assessment methods normally applied in epidemiological studies.

  7. NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data

    NASA Astrophysics Data System (ADS)

    Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.

    2011-12-01

    Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, serviced-based, message oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow or to develop a new workflow to suit the providers needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata as well as supporting the ADAM (Airborne Data For Assessing Models) site.

  8. Archive of digital boomer seismic reflection data collected during USGS cruises 94GFP01, 95GFP01, 96GFP01, 97GFP01, and 98GFP02 in Lakes Pontchartrain, Borgne, and Maurepas, Louisiana, 1994-1998

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Williams, S. Jeffress; Flocks, James G.; Penland, Shea; Wiese, Dana S.

    2003-01-01

    The U.S. Geological Survey, in cooperation with the University of New Orleans, the Lake Pontchartrain Basin Foundation, the National Oceanic and Atmospheric Administration, the Coalition to Restore Coastal Louisiana, the U.S. Army Corps of Engineers, the Environmental Protection Agency, and the University of Georgia, conducted five geophysical surveys of Lakes Pontchartrain, Borgne, and Maurepas in Louisiana from 1994 to 1998. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained digital GIF image of each seismic profile is provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y headers (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser, and scanned handwritten logbooks may be viewed with Adobe Reader. To access the information contained on these discs, open the file 'index.htm' located at the top level of the discs using a web browser. This report also contains hyperlinks to USGS collaborators and other agencies. These links are only accessible if access to the Internet is available while viewing these documents.

  9. Audit of a Scientific Data Center for Certification as a Trustworthy Digital Repository: A Case Study

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2011-12-01

    Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.

  10. Collection-Level Surveys for Special Collections: Coalescing Descriptors across Standards

    ERIC Educational Resources Information Center

    Ascher, James P.; Ferris, Anna M.

    2012-01-01

    Developing collection-level surveys to expose hidden collections in special collections and archives departments within ARL libraries has received a great deal of scholarly attention in the recent years. Numerous standards have been explored, and each has its strengths and weaknesses. This paper summarizes some of the major initiatives in…

  11. Supporting users through integrated retrieval, processing, and distribution systems at the Land Processes Distributed Active Archive Center

    USGS Publications Warehouse

    Kalvelage, Thomas A.; Willems, Jennifer

    2005-01-01

    The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore, the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
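
    As a simple illustration of the kind of reprojection service mentioned above (not code from the LP DAAC systems), the sketch below reprojects a geographic coordinate into a UTM zone with the pyproj library; the coordinate and target EPSG code are arbitrary examples.

        from pyproj import Transformer

        # Reproject a longitude/latitude pair (WGS84) into UTM zone 15N.
        # The coordinate and the target EPSG code are arbitrary illustrations.
        transformer = Transformer.from_crs("EPSG:4326", "EPSG:32615", always_xy=True)
        easting, northing = transformer.transform(-93.0, 45.0)
        print(round(easting, 1), round(northing, 1))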

  12. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) with funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control form the infrastructure of AstroCloud. Throughout the entire data life cycle, the archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and their metadata in a database. Quality control covers the whole process and all aspects of data archiving.

  13. Data Processing, Visualization and Distribution for Support of Science Programs in the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Johnson, P. D.; Edwards, M. H.; Wright, D.

    2006-12-01

    For the past two years the Hawaii Mapping Research Group (HMRG) and Oregon State University researchers have been building an on-line archive of geophysical data for the Arctic Basin. This archive is known as AAGRUUK - the Arctic Archive for Geophysical Research: Unlocking Undersea Knowledge (http://www.soest.hawaii.edu/hmrg/Aagruuk). It contains a wide variety of data including bathymetry, sidescan and subbottom data collected by: 1) U.S. Navy nuclear-powered submarines during the Science Ice Exercises (SCICEX), 2) icebreakers such as the USCGC Healy, R/V Nathaniel B. Palmer, and CCGS Amundsen, and 3) historical depth soundings from the T3 ice camp and pre-1990 nuclear submarine missions. Instead of simply soliciting data, reformatting it, and serving it to the community, we have focused our efforts on producing and serving an integrated dataset. We pursued this path after experimenting with dataset integration and discovering a multitude of problems including navigational inconsistencies and systemic offsets produced by acquiring data in an ice-covered ocean. Our goal in addressing these problems, integrating the processed datasets and producing a data compilation was to prevent the myriad researchers interested in these datasets, many of whom have less experience processing geophysical data than HMRG personnel, from having to repeat the same data processing efforts. For investigators interested in pursuing their own data processing approaches, AAGRUUK also serves most of the raw data that was included in the data compilation, as well as processed versions of individual datasets. The archive also provides downloadable static chart sets for users who desire derived products for inclusion in reports, planning documents, etc. We are currently testing a prototype mapserver that allows maps of the cleaned datasets to be accessed interactively as well as providing access to the edited files that make up the datasets. Previously we have documented the types of the problems that were encountered in a general way. Over the past year we have integrated two terabytes of data, which allows us to comment on system performance from a much broader context. In this presentation we will show the types of error for each data acquisition system and also for operating conditions (e.g. ice cover, time of year, etc.). Our error analysis both illuminates our approach to data processing and serves as a guide for, when possible, choosing the type of instruments and the optimal time to conduct these types of surveys in ice-covered oceans.

  14. 21 CFR 177.2400 - Perfluorocarbon cured elastomers.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... as are provided: Substances Limitations Carbon black (channel process of furnace combustion process... Park, MD 20740, or available for inspection at the National Archives and Records Administration (NARA....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (2) Thermogravimetry...

  15. Online Meta-data Collection and Monitoring Framework for the STAR Experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Betts, W.; Van Buren, G.

    2012-12-01

    The STAR Experiment further exploits scalable message-oriented model principles to achieve a high level of control over online data streams. In this paper we present an AMQP-powered Message Interface and Reliable Architecture framework (MIRA), which allows STAR to orchestrate the activities of Meta-data Collection, Monitoring, Online QA and several Run-Time and Data Acquisition system components in a very efficient manner. The very nature of the reliable message bus suggests parallel usage of multiple independent storage mechanisms for our meta-data. We describe our experience with a robust data-taking setup employing MySQL- and HyperTable-based archivers for meta-data processing. In addition, MIRA has an AJAX-enabled web GUI, which allows real-time visualisation of online process flow and detector subsystem states, and doubles as a sophisticated alarm system when combined with complex event processing engines like Esper, Borealis or Cayuga. The performance data and our planned path forward are based on our experience during the 2011-2012 running of STAR.
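
    The MIRA code itself is not reproduced in this summary; purely as a generic illustration of the message-oriented pattern it describes, the sketch below publishes one JSON metadata record to an AMQP topic exchange with the pika client. The broker address, exchange name, routing key, and record fields are hypothetical.

        import json
        import pika

        # Connection parameters and the exchange name are hypothetical placeholders.
        connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
        channel = connection.channel()
        channel.exchange_declare(exchange="metadata", exchange_type="topic", durable=True)

        # A metadata record describing one detector subsystem reading.
        record = {"subsystem": "tpc", "quantity": "anode_voltage", "value": 1390.0, "run": 13012001}

        channel.basic_publish(
            exchange="metadata",
            routing_key="online.tpc.anode_voltage",
            body=json.dumps(record),
            properties=pika.BasicProperties(delivery_mode=2),  # mark the message persistent
        )
        connection.close()

    In such a setup, independent archivers (for example a MySQL-backed one and a HyperTable-backed one) can each consume the same stream without coupling to the producers.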

  16. Cores to the rescue: how old cores enable new science

    NASA Astrophysics Data System (ADS)

    Ito, E.; Noren, A. J.; Brady, K.

    2016-12-01

    The value of archiving scientific specimens and collections for the purpose of enabling further research using new analytical techniques, resolving conflicting results, or repurposing them for entirely new research, is often discussed in abstract terms. We all agree that samples with adequate metadata ought to be archived systematically for easy access, for a long time and stored under optimal conditions. And yet, as storage space fills, there is a temptation to cull the collection, or when a researcher retires, to discard the collection unless the researcher manages to make his or her own arrangement for the collection to be accessioned elsewhere. Nobody has done anything with these samples in over 20 years! Who would want them? It turns out that plenty of us do want them, if we know how to find them and if they have sufficient metadata to assess past work and suitability for new analyses. The LacCore collection holds over 33 km of core from >6700 sites in diverse geographic locations worldwide with samples collected as early as the 1950s. From these materials, there are many examples to illustrate the scientific value of archiving geologic samples. One example that benefitted Ito personally was the set of cores from Lakes Mirabad and Zeribar, Iran, acquired in 1963 by Herb Wright and his associates. Several doctoral and postdoctoral students generated and published paleoecological reconstructions based on cladocerans, diatoms, pollen or plant macrofossils, mostly between 1963 and 1967. The cores were resampled in the 1990s by a student jointly advised by Wright and Ito for oxygen isotope analysis of endogenic calcite. The results were profitably compared with the pollen record and published in 2001 and 2006. From 1979 until very recently, visiting Iran for fieldwork was not allowed for US scientists. Other examples will be given to further illustrate the power of archived samples to advance science.

  17. A Tale of Two Archives: PDS3/PDS4 Archiving and Distribution of Juno Mission Data

    NASA Astrophysics Data System (ADS)

    Stevenson, Zena; Neakrase, Lynn; Huber, Lyle; Chanover, Nancy J.; Beebe, Reta F.; Sweebe, Kathrine; Johnson, Joni J.

    2017-10-01

    The Juno mission to Jupiter, which was launched on 5 August 2011 and arrived at the Jovian system in July 2016, represents the last mission to be officially archived under the PDS3 archive standards. Modernization and availability of the newer PDS4 archive standard has prompted the PDS Atmospheres Node (ATM) to provide on-the-fly migration of Juno data from PDS3 to PDS4. Data distribution under both standards presents challenges in terms of how to present data to the end user in both standards, without sacrificing accessibility to the data or impacting the active PDS3 mission pipelines tasked with delivering the data on predetermined schedules. The PDS Atmospheres Node has leveraged its experience with prior active PDS4 missions (e.g., LADEE and MAVEN) and ongoing PDS3-to-PDS4 data migration efforts providing a seamless distribution of Juno data in both PDS3 and PDS4. When ATM receives a data delivery from the Juno Science Operations Center, the PDS3 labels are validated and then fed through PDS4 migration software built at ATM. Specifically, a collection of Python methods and scripts has been developed to make the migration process as automatic as possible, even when working with the more complex labels used by several of the Juno instruments. This is used to create all of the PDS4 data labels at once and build PDS4 archive bundles with minimal human effort. Resultant bundles are then validated against the PDS4 standard and released alongside the certified PDS3 versions of the same data. The newer design of the distribution pages provides access to both versions of the data, utilizing some of the enhanced capabilities of PDS4 to improve search and retrieval of Juno data. Webpages are designed with the intent of offering easy access to all documentation for Juno data as well as the data themselves in both standards for users of all experience levels. We discuss the structure and organization of the Juno archive and associated webpages as examples of joint PDS3/PDS4 data access for end users.
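
    The ATM migration software is not shown in the abstract; purely as an illustration of the label-mapping idea, the sketch below converts a few flat PDS3 KEYWORD = VALUE pairs into simplified PDS4-style XML with the Python standard library. The element names are reduced placeholders, not the full PDS4 information model, and the sample label values are invented.

        import xml.etree.ElementTree as ET

        def parse_pds3_label(text):
            """Parse simple KEYWORD = VALUE lines from a PDS3 label (no OBJECT nesting)."""
            pairs = {}
            for line in text.splitlines():
                if "=" in line:
                    key, _, value = line.partition("=")
                    pairs[key.strip()] = value.strip().strip('"')
            return pairs

        def to_pds4_like_xml(pairs):
            """Emit a simplified, PDS4-flavoured XML tree; tag names are placeholders."""
            product = ET.Element("Product_Observational")
            ident = ET.SubElement(product, "Identification_Area")
            ET.SubElement(ident, "title").text = pairs.get("PRODUCT_ID", "UNKNOWN")
            obs = ET.SubElement(product, "Observation_Area")
            ET.SubElement(obs, "start_date_time").text = pairs.get("START_TIME", "")
            ET.SubElement(obs, "stop_date_time").text = pairs.get("STOP_TIME", "")
            return ET.tostring(product, encoding="unicode")

        pds3 = 'PRODUCT_ID = "JNO_EXAMPLE_001"\nSTART_TIME = 2016-08-27T12:00:00\nSTOP_TIME = 2016-08-27T12:10:00'
        print(to_pds4_like_xml(parse_pds3_label(pds3)))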

  18. Initial progress in the recording of crime scene simulations using 3D laser structured light imagery techniques for law enforcement and forensic applications

    NASA Astrophysics Data System (ADS)

    Altschuler, Bruce R.; Monson, Keith L.

    1998-03-01

    Representation of crime scenes as virtual reality 3D computer displays promises to become a useful and important tool for law enforcement evaluation and analysis, forensic identification and pathological study and archival presentation during court proceedings. Use of these methods for assessment of evidentiary materials demands complete accuracy of reproduction of the original scene, both in data collection and in its eventual virtual reality representation. The recording of spatially accurate information as soon as possible after first arrival of law enforcement personnel is advantageous for unstable or hazardous crime scenes and reduces the possibility that either inadvertent measurement error or deliberate falsification may occur or be alleged concerning processing of a scene. Detailed measurements and multimedia archiving of critical surface topographical details in a calibrated, uniform, consistent and standardized quantitative 3D coordinate method are needed. These methods would afford professional personnel in initial contact with a crime scene the means for remote, non-contacting, immediate, thorough and unequivocal documentation of the contents of the scene. Measurements of the relative and absolute global positions of objects and victims, and their dispositions within the scene before their relocation and detailed examination, could be made. Resolution must be sufficient to map both small and large objects. Equipment must be able to map regions at varied resolution as collected from different perspectives. Progress is presented in devising methods for collecting and archiving 3D spatial numerical data from crime scenes, sufficient for law enforcement needs, by remote laser structured light and video imagery. Two types of simulation studies were done. One study evaluated the potential of 3D topographic mapping and 3D telepresence using a robotic platform for explosive ordnance disassembly. The second study involved using the laser mapping system on a fixed optical bench with simulated crime scene models of people and furniture to assess feasibility, requirements and utility of such a system for crime scene documentation and analysis.

  19. Archive of digital Chirp subbottom profile data collected during USGS cruises 09CCT03 and 09CCT04, Mississippi and Alabama Gulf Islands, June and July 2009

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2011-01-01

    In June and July of 2009, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Cat Island, Mississippi, to Dauphin Island, Alabama, as part of a broader USGS study on Coastal Change and Transport (CCT). The surveys were funded through the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php). This report serves as an archive of unprocessed digital Chirp seismic profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Single-beam and Swath bathymetry data were also collected during these cruises and will be published as a separate archive. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  20. Women's History Resources at the State Historical Society of Wisconsin. Fourth Edition, Revised and Enlarged.

    ERIC Educational Resources Information Center

    Danky, James P.; And Others

    The library, archives, iconographic collections, museum, and film archives of the State Historical Society of Wisconsin (Madison) are described, with an emphasis on the history of women in the United States and Canada. The library features works on frontier/utopian communities, ethnic groups, women's organizations, women and labor unions, women's…

  1. University Counseling Center Clients' Expressed Preferences for Counselors: A Four Year Archival Exploration

    ERIC Educational Resources Information Center

    Speight, Suzette L.; Vera, Elizabeth M.

    2005-01-01

    This archival study explored patterns of client preferences from a randomized sample of 881 clients at a Midwestern university counseling center. Information from client intake forms was collected for a four year time frame. Results showed that 61% of the clients did not express preferences for particular types of counselors when asked on intake…

  2. U.S. Department of Defense Official Website - Battle for Iwo Jima

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  3. Recon - Inventing for the Future - U.S. Department of Defense Official Website

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the

  4. Afghanistan Today: A Photo Essay by U.S. Army National Guard Staff Sgt. Russell Lee Klika

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov

  5. DefenseLink Special: 30th Anniversary of the Fall of Saigon

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  6. DefenseLink.mil - Special Report - Flag Day - June 14, 2008

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  7. U.S. Department of Defense: Year in Pictures 2009

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  8. Defense.gov - Special Report - Training for the Fight

    Science.gov Websites

    You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the DoD Section 508

  9. Library and Archival Security: Policies and Procedures To Protect Holdings from Theft and Damage.

    ERIC Educational Resources Information Center

    Trinkaus-Randall, Gregor

    1998-01-01

    Firm policies and procedures that address the environment, patron/staff behavior, general attitude, and care and handling of materials need to be at the core of the library/archival security program. Discussion includes evaluating a repository's security needs, collections security, security in non-public areas, security in the reading room,…

  10. Library and Archival Resources for Social Science Research in the Spanish, French, Dutch Caribbean.

    ERIC Educational Resources Information Center

    Mathews, Thomas G.

    The working paper describes how a social scientist might go about locating resources for any particular study. Researchers are directed to non-Caribbean based material in European archives as well as collections in the United States. Caribbean resources are analyzed by country. The countries include Cuba, Dominican Republic, Puerto Rico,…

  11. The AstroBID: Searching through the Italian Astronomical Heritage

    NASA Astrophysics Data System (ADS)

    Cirella, E. O.; Gargano, M.; Gasperini, A.; Mandrino, A.; Randazzo, D.; Zanini, V.

    2015-04-01

    The scientific heritage held in the National Institute for Astrophysics (INAF), made up of rare and modern books, instruments, and archival documents spanning from the 15th to the early 20th century, marks the milestones in the history of astronomy in Italy. To promote this historical collection, the Libraries and Historical Archives Service and the Museums Service of INAF have developed a project aimed at creating a single web portal: Polvere di stelle. I beni culturali dell'astronomia italiana (Stardust. The cultural heritage of Italian astronomy). This portal searches for data coming from the libraries, the instrument collections, and the historical archives of the Italian Observatories. The aim of the BID (Books, Instruments, Documents) project is the creation of a multimedia web facility that allows the public to make simultaneous searches across the three different types of materials.

  12. NASA/IPAC Infrared Archive's General Image Cutouts Service

    NASA Astrophysics Data System (ADS)

    Alexov, A.; Good, J. C.

    2006-07-01

    The NASA/IPAC Infrared Archive (IRSA) ``Cutouts" Service (http://irsa.ipac.caltech.edu/applications/Cutouts) is a general tool for creating small ``cutout" FITS images and JPEGs from collections of data archived at IRSA. This service is a companion to IRSA's Atlas tool (http://irsa.ipac.caltech.edu/applications/Atlas/), which currently serves over 25 different data collections of various sizes and complexity and returns entire images for a user-defined region of the sky. The Cutouts Service sits on top of Atlas and extends the Atlas functionality by generating subimages at locations and sizes requested by the user from images already identified by Atlas. These results can be downloaded individually, in batch mode (using the program wget), or as a tar file. Cutouts re-uses IRSA's software architecture along with the publicly available Montage mosaicking tools. The advantages and disadvantages of this approach to generic cutout serving will be discussed.

  13. 36 CFR 1290.6 - Originals and copies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Section 1290.6 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION JFK... ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.6 Originals and copies. (a) For purposes of.... Kennedy Assassination Records Collection (JFK Assassination Records Collection) established under the JFK...

  14. 36 CFR 1290.6 - Originals and copies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Section 1290.6 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION JFK... ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.6 Originals and copies. (a) For purposes of.... Kennedy Assassination Records Collection (JFK Assassination Records Collection) established under the JFK...

  15. 36 CFR 1290.6 - Originals and copies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Section 1290.6 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION JFK... ASSASSINATION RECORDS COLLECTION ACT OF 1992 (JFK ACT) § 1290.6 Originals and copies. (a) For purposes of.... Kennedy Assassination Records Collection (JFK Assassination Records Collection) established under the JFK...

  16. Evolving the Living With a Star Data System Definition

    NASA Astrophysics Data System (ADS)

    Otranto, J.; Dijoseph, M.; Worrall, W.

    2003-04-01

    NASA’s Living With a Star (LWS) Program is a space weather-focused and applications-driven research program. The LWS Program is soliciting input from the solar, space physics, space weather, and climate science communities to develop a system that enables access to science data associated with these disciplines, and advances the development of discipline and interdisciplinary findings. The LWS Program will implement a data system that builds upon the existing and planned data capture, processing, and storage components put in place by individual spacecraft missions and also inter-project data management systems, such as active archives, deep archives, and multi-mission repositories. It is technically feasible for the LWS Program to integrate data from a broad set of resources, assuming they are either publicly accessible or access is permitted by the system’s administrators. The LWS Program data system will work in coordination with spacecraft mission data systems and science data repositories, integrating them into a common data representation. This common representation relies on a robust metadata definition that provides journalistic and technical data descriptions, plus linkages to supporting data products and tools. The LWS Program intends to become an enabling resource to PIs, interdisciplinary scientists, researchers, and students, facilitating access both to a broad collection of science data and to the necessary supporting components for understanding and making productive use of the data. For the LWS Program to represent science data that is physically distributed across various ground system elements, information about the data products stored on each system is collected through a series of LWS-created active agents. These active agents are customized to interface or interact with each one of these data systems, collect information, and forward updates to a single LWS-developed metadata broker. This broker, in turn, updates a centralized repository of LWS-specific metadata. A populated LWS metadata database is a single point-of-contact that can serve all users (the science community) with a “one-stop-shop” for data access. While data may not be physically stored in an LWS-specific repository, the LWS system enables data access from wherever the data are stored. Moreover, LWS provides the user access to information for understanding the data source, format, and calibration, enables access to ancillary and correlative data products, and provides links to processing tools and models associated with the data and to any corresponding findings. The LWS may also support an active archive for solar, space physics, space weather, and climate data when these data would otherwise be discarded or archived off-line. This archive could potentially serve as a backup facility for LWS missions. This plan is developed based upon input already received from the science community; the architecture is based on systems developed to date that have worked well on a smaller scale. The LWS Program continues to seek constructive input from the science community, examples of both successes and failures in dealing with science data systems, and insights regarding the obstacles between the current state-of-the-practice and this vision for the LWS Program data system.
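
    The abstract describes active agents that harvest product information from distributed mission systems and forward updates through a metadata broker to a central repository. The toy sketch below is entirely illustrative (the LWS interfaces are not specified in the abstract) and shows that flow with in-memory stand-ins.

        from datetime import datetime, timezone

        # A toy, in-memory stand-in for the centralized LWS metadata repository.
        # All names here are illustrative; the abstract does not specify interfaces.
        central_metadata = {}

        def broker_update(record):
            """Metadata broker: upsert one product description into the central store."""
            central_metadata[record["product_id"]] = record

        def active_agent(system_name, local_catalog):
            """Active agent: scan one mission data system and forward updates to the broker."""
            for product_id, description in local_catalog.items():
                broker_update({
                    "product_id": product_id,
                    "source_system": system_name,
                    "description": description,
                    "harvested": datetime.now(timezone.utc).isoformat(),
                })

        active_agent("mission_a_archive", {"mag_l2_daily": "Level-2 magnetometer daily averages"})
        active_agent("mission_b_archive", {"euv_images": "Full-disk EUV image collection"})
        print(sorted(central_metadata))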

  17. Charting the Course: Life Cycle Management of Mars Mission Digital Information

    NASA Technical Reports Server (NTRS)

    Reiz, Julie M.

    2003-01-01

    This viewgraph presentation reviews the life cycle management of MER Project information. This process was an essential key to the successful launch of the MER Project rovers. Incorporating digital information archive requirements early in the project life cycle resulted in: Design of an information system that included archive metadata, Reduced the risk of information loss through in-process appraisal, Easier transfer of project information to institutional online archive and Project appreciation for preserving information for reuse by future projects

  18. HRP Data Accessibility Current Status

    NASA Technical Reports Server (NTRS)

    Sams, Clarence

    2009-01-01

    Overview of talk: a) Content of Human Life Science data; b) Data archive structure; c) Applicable legal documents and policies; and d) Methods for data access. Life Science Data Archive (LSDA) contains research data from NASA-funded experiments, primarily data from flight experiments and ground analog data collected at NASA facilities. Longitudinal Study of Astronaut Health (LSAH) contains electronic health records (medical data) of all astronauts, including mission data. Data are collected for clinical purposes. Clinical data are analyzed by LSAH epidemiologists to identify trends in crew health and implement changes in pre-, in-, or post-flight medical care.

  19. The Master Archive Collection Inventory (MACI)

    NASA Astrophysics Data System (ADS)

    Lief, C. J.; Arnfield, J.; Sprain, M.

    2014-12-01

    The Master Archive Collection Inventory (MACI) project at the NOAA National Climatic Data Center (NCDC) is an effort to re-inventory all digital holdings to streamline data set and product titles and update documentation to discovery-level ISO 19115-2. Subject Matter Experts (SMEs) are being identified for each of the holdings and will be responsible for creating and maintaining metadata records. New user-friendly tools are available for the SMEs to easily create and update this documentation. Updated metadata will be available for retrieval by other aggregators and discovery tools, increasing the usability of NCDC data and products.

  20. The digital archive of the International Halley Watch

    NASA Technical Reports Server (NTRS)

    Klinglesmith, D. A., III; Niedner, M. B.; Grayzeck, E.; Aronsson, M.; Newburn, R. L.; Warnock, A., III

    1992-01-01

    The International Halley Watch was established to coordinate, collect, archive, and distribute the scientific data from Comet P/Halley that would be obtained from both the ground and space. This paper describes one of the end products of that effort, namely the IHW Digital Archive. The IHW Digital Archive consists of 26 CD-ROM's containing over 32 gigabytes of data from the 9 IHW disciplines as well as data from the 5 spacecraft missions flown to comet P/Haley and P/Giacobini-Zinner. The total archive contains over 50,000 observations by 1,500 observers from at least 40 countries. The first 24 CD's, which are currently available, contain data from the 9 IHW disciplines. The two remaining CD's will have the spacecraft data and should be available within the next year. A test CD-ROM of these data has been created and is currently under review.

  1. Enhancement of real-time EPICS IOC PV management for the data archiving system

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Ha

    2015-10-01

    For the operation of a 100-MeV linear proton accelerator, the major driving values and experimental data need to be archived. According to the experimental conditions, different data are required, so functions that can add new data and delete data in real time need to be implemented. In an experimental physics and industrial control system (EPICS) input output controller (IOC), the values of process variables (PVs) are matched with the driving values and data. The PV values are archived in text file format by using the channel archiver. There is no need to create a database (DB) server, just a need for a large hard disk. Through the web, the archived data can be loaded, and new PV values can be archived without stopping the archive engine. The details of the implementation of a data archiving system with the channel archiver are presented, and some preliminary results are reported.
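
    As a minimal illustration of text-file PV archiving of the kind described above (not the Channel Archiver itself), the sketch below samples a couple of PVs with the pyepics caget call and appends timestamped rows to a plain text file; the PV names are hypothetical and a reachable EPICS IOC is assumed.

        import time
        from epics import caget  # pyepics; assumes a reachable EPICS IOC

        # Hypothetical PV names; substitute the accelerator's real process variables.
        PVS = ["LINAC:BEAM:CURRENT", "LINAC:RF:FORWARD_POWER"]

        def sample_once(outfile):
            """Append one timestamped row of PV readings to a plain text file."""
            values = [caget(pv) for pv in PVS]
            row = "{}\t{}\n".format(time.strftime("%Y-%m-%d %H:%M:%S"),
                                    "\t".join(str(v) for v in values))
            with open(outfile, "a") as f:
                f.write(row)

        if __name__ == "__main__":
            for _ in range(10):          # sample ten times, one second apart
                sample_once("pv_archive.txt")
                time.sleep(1.0)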

  2. A fully automatic processing chain to produce Burn Scar Mapping products, using the full Landsat archive over Greece

    NASA Astrophysics Data System (ADS)

    Kontoes, Charalampos; Papoutsis, Ioannis; Herekakis, Themistoklis; Michail, Dimitrios; Ieronymidi, Emmanuela

    2013-04-01

    Remote sensing tools for the accurate, robust and timely assessment of the damages inflicted by forest wildfires provide information that is of paramount importance to public environmental agencies and related stakeholders before, during and after the crisis. The Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing of the National Observatory of Athens (IAASARS/NOA) has developed a fully automatic single- and/or multi-date processing chain that takes as input archived Landsat 4, 5 or 7 raw images and produces precise diachronic burnt area polygons and damage assessments over the Greek territory. The methodology consists of three fully automatic stages: 1) the pre-processing stage, where the metadata of the raw images are extracted, followed by the application of the LEDAPS software platform for calibration and mask production and of the Automated Precise Orthorectification Package, developed by NASA, for image geo-registration and orthorectification; 2) the core-BSM (Burn Scar Mapping) processing stage, which incorporates a published classification algorithm based on a series of physical indexes, the application of two filters for noise removal using graph-based techniques, and the grouping of pixels classified as burnt into appropriate pixel clusters before conversion from raster to vector; and 3) the post-processing stage, where the products are thematically refined and enriched using auxiliary GIS layers (underlying land cover/use, administrative boundaries, etc.) and human logic/evidence to suppress false alarms and omission errors. The established processing chain has been successfully applied to the entire archive of Landsat imagery over Greece spanning from 1984 to 2012, which has been collected and managed in IAASARS/NOA. The number of full Landsat frames processed in the framework of the study was 415. These burn scar mapping products are generated for the first time at such a temporal and spatial extent and are ideal for use in further environmental time series analyses, production of statistical indexes (frequency, geographical distribution and number of fires per prefecture) and applications, including change detection and climate change models, urban planning, correlation with manmade activities, etc.
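
    The abstract refers to a classification algorithm "based on a series of physical indexes" without listing them. Purely as an example of index-based burn mapping, the sketch below computes the widely used Normalized Burn Ratio (NBR) for pre- and post-fire scenes and thresholds the difference; the 0.27 cutoff is a commonly quoted moderate-severity value, not necessarily the one used in core-BSM, and the arrays stand in for calibrated NIR and SWIR bands.

        import numpy as np

        def nbr(nir, swir):
            """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR), with a small epsilon."""
            nir = nir.astype("float64")
            swir = swir.astype("float64")
            return (nir - swir) / (nir + swir + 1e-10)

        def burn_mask(nir_pre, swir_pre, nir_post, swir_post, dnbr_threshold=0.27):
            """Flag pixels whose NBR dropped sharply between the pre- and post-fire scenes.

            The threshold is a placeholder; the operational chain described above
            applies its own indexes, noise filtering, and GIS-based refinement.
            """
            dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
            return dnbr > dnbr_threshold

        # Tiny synthetic example: reflectance-like values for a 2x2 scene.
        nir_pre = np.array([[0.45, 0.40], [0.42, 0.44]])
        swir_pre = np.array([[0.20, 0.22], [0.21, 0.20]])
        nir_post = np.array([[0.18, 0.39], [0.17, 0.43]])
        swir_post = np.array([[0.30, 0.23], [0.31, 0.21]])
        print(burn_mask(nir_pre, swir_pre, nir_post, swir_post))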

  3. Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper

    PubMed Central

    Smith, LM; Burgoyne, LA

    2004-01-01

    Background Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology, however the storage and transport of samples collected in the field to the laboratory in such a manner to allow purification of intact nucleic acids can prove problematical. Results FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of samples of wildlife origin. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols and the rationale behind the processing procedures is also explained to allow the end user to adjust the protocols as required. Conclusions FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler. PMID:15072582

  4. The development of participatory health research among incarcerated women in a Canadian prison

    PubMed Central

    Murphy, K.; Hanson, D.; Hemingway, C.; Ramsden, V.; Buxton, J.; Granger-Brown, A.; Condello, L-L.; Buchanan, M.; Espinoza-Magana, N.; Edworthy, G.; Hislop, T. G.

    2009-01-01

    This paper describes the development of a unique prison participatory research project, in which incarcerated women formed a research team, the research activities and the lessons learned. The participatory action research project was conducted in the main short sentence minimum/medium security women's prison located in a Western Canadian province. An ethnographic multi-method approach was used for data collection and analysis. Quantitative data was collected by surveys and analysed using descriptive statistics. Qualitative data was collected from orientation package entries, audio recordings, and written archives of research team discussions, forums and debriefings, and presentations. These data and ethnographic observations were transcribed and analysed using iterative and interpretative qualitative methods and NVivo 7 software. Up to 15 women worked each day as prison research team members; a total of 190 women participated at some time in the project between November 2005 and August 2007. Incarcerated women peer researchers developed the research processes including opportunities for them to develop leadership and technical skills. Through these processes, including data collection and analysis, nine health goals emerged. Lessons learned from the research processes were confirmed by the common themes that emerged from thematic analysis of the research activity data. Incarceration provides a unique opportunity for engagement of women as expert partners alongside academic researchers and primary care workers in participatory research processes to improve their health. PMID:25759141

  5. 76 FR 31641 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-01

    ... following information collections: 1. Title: Presidential Library Facilities. OMB Number: 3095-0036. Agency... library facility. The report contains information that can be furnished only by the foundation or other... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Submission...

  6. Challenges of the science data processing, analysis and archiving approach in BepiColombo

    NASA Astrophysics Data System (ADS)

    Martinez, Santa

    BepiColombo is a joint mission of the European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) to the planet Mercury. It comprises two separate orbiters: the Mercury Planetary Orbiter (MPO) and the Mercury Magnetospheric Orbiter (MMO). After approximately 7.5 years of cruise, BepiColombo will arrive at Mercury in 2024 and will gather data during a 1-year nominal mission, with a possible 1-year extension. The approach selected for BepiColombo for the processing, analysis and archiving of the science data represents a significant change with respect to previous ESA planetary missions. Traditionally Instrument Teams are responsible for processing, analysing and preparing their science data for the long-term archive, however in BepiColombo, the Science Ground Segment (SGS), located in Madrid, Spain, will play a key role in these activities. Fundamental aspects of this approach include: the involvement of the SGS in the definition, development and operation of the instrument processing pipelines; the production of ready-to-archive science products compatible with NASA’s Planetary Data System (PDS) standards in all the processing steps; the joint development of a quick-look analysis system to monitor deviations between planned and executed observations to feed back the results into the different planning cycles when possible; and a mission archive providing access to the scientific products and to the operational data throughout the different phases of the mission (from the early development phase to the legacy phase). In order to achieve these goals, the SGS will need to overcome a number of challenges. The proposed approach requires a flexible infrastructure able to cope with a distributed data processing system, residing in different locations but designed as a single entity. For this, all aspects related to the integration of software developed by different Instrument Teams and the alignment of their development schedules will need to be considered. In addition, the SGS is taking full responsibility for the production of the first level of science data (un-calibrated), with the associated operational implications. An additional difficulty impacting the processing strategies relates to the various spacecraft data downlink mechanisms available for the MPO and their associated data latency. With regards to archiving, the main challenges include: the use of a new version of the PDS standards (so-called PDS4), being implemented for the first time in an ESA planetary mission; the use of external standards (CDF, FITS); and the implementation of interoperability protocols that aim to make all data (from both MPO and MMO) globally accessible through a distributed archive to the end-users. For the definition of the quick-look analysis system, it is very important to understand and harmonise the different views and expectations of the science team. Due to the long duration of the Cruise phase, and the fact that there are many years between the design of the system and the nominal mission, it might be difficult for some Instrument Teams to accurately define their needs so many years before operations. In particular, new scientific discoveries over the coming years by the MESSENGER spacecraft, currently orbiting Mercury, may influence how the Instrument Teams on BepiColombo define their operations and their reduction and analysis techniques. 
In addition, due to the long duration of the mission, it is not always possible or practical to document all accumulated knowledge on paper so if personnel leave some of their knowledge is lost as well. This is key, particularly for the Instrument Teams. By taking a pro-active role in the collection of requirements and expectations of the science team together with the definition of clear guidelines early in the mission and by developing close collaboration with the Instrument Teams, the SGS will be able to identify how to best exploit the expertise on both sides and to guarantee that the necessary support is provided when needed. This contribution will detail the main challenges and advantages associated with the data processing, analysis and archiving approach in BepiColombo, and will summarise the various efforts ongoing to guarantee that the scientific requirements of the mission and the expectations of the science team are fulfilled. Future ESA planetary missions (e.g. ExoMars, JUICE) will follow a similar approach, adapting the efforts to the profile of the mission.

  7. BOREAS HYD-8 1994 Gravimetric Moss Moisture Data

    NASA Technical Reports Server (NTRS)

    Wang, Xuewen; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Hydrology (HYD)-8 team made measurements of surface hydrological processes that were collected at the Northern Study Area-Old Black Spruce (NSA-OBS) Tower Flux site in 1994 and at Joey Lake, Manitoba, to support its research into point hydrological processes and the spatial variation of these processes. The data collected may be useful in characterizing canopy interception, drip, throughfall, moss interception, drainage, evaporation, and capacity during the growing season at daily temporal resolution. This particular data set contains the gravimetric moss moisture measurements from June to September 1994. A nested spatial sampling plan was implemented to support research into spatial variations of the measured hydrological processes and ultimately the impact of these variations on modeled carbon and water budgets. These data are stored in tabular ASCII files. The HYD-08 1994 gravimetric moss moisture data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  8. Political dreams, practical boundaries: the case of the Nursing Minimum Data Set, 1983-1990.

    PubMed

    Hobbs, Jennifer

    2011-01-01

    The initial development of the Nursing Minimum Data Set (NMDS) was analyzed based on archival material from Harriet Werley and Norma Lang, two nurses involved with the project, and American Nurses Association materials. The process of identifying information to be included in the NMDS was contentious. Individual nurses argued on behalf of particular data because of a strong belief in how nursing practice (through information collection) should be structured. Little attention was paid to existing practice conditions that would ultimately determine whether the NMDS would be used.

  9. Appraising U.S. Geological Survey science records

    USGS Publications Warehouse

    Faundeen, John L.

    2010-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center has legislative charters to preserve and make accessible land remote sensing records important to the United States. This essay explains the appraisal process developed by EROS to ensure the science records it holds and those offered to it align with those charters. The justifications behind the questions employed to weed and to complement the EROS archive are explained along with the literature reviewed supporting their inclusion. Appraisal results are listed by individual collection and include the recommendations accepted by EROS management.

  10. MODIS Snow and Ice Products from the NSIDC DAAC

    NASA Technical Reports Server (NTRS)

    Scharfen, Greg R.; Hall, Dorothy K.; Riggs, George A.

    1997-01-01

    The National Snow and Ice Data Center (NSIDC) Distributed Active Archive Center (DAAC) provides data and information on snow and ice processes, especially pertaining to interactions among snow, ice, atmosphere and ocean, in support of research on global change detection and model validation, and provides general data and information services to the cryospheric and polar processes research community. The NSIDC DAAC is an integral part of the multi-agency-funded support for snow and ice data management services at NSIDC. The Moderate Resolution Imaging Spectroradiometer (MODIS) will be flown on the first Earth Observing System (EOS) platform (AM-1) in 1998. The MODIS Instrument Science Team is developing geophysical products from data collected by the MODIS instrument, including snow and ice products which will be archived and distributed by the NSIDC DAAC. The MODIS snow and ice mapping algorithms will generate global snow, lake ice, and sea ice cover products on a daily basis. These products will augment the existing record of satellite-derived snow cover and sea ice products that began about 30 years ago. The characteristics of these products, their utility, and comparisons to other data sets are discussed. Current developments and issues are summarized.

  11. 36 CFR 1201.41 - What are NARA's procedures for collecting debts by tax refund offset?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... for collecting debts by tax refund offset? 1201.41 Section 1201.41 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION GENERAL RULES COLLECTION OF CLAIMS Tax Refund Offset § 1201.41 What are NARA's procedures for collecting debts by tax refund offset? (a) NARA's Financial Services...

  12. On detecting variables using ROTSE-IIId archival data

    NASA Astrophysics Data System (ADS)

    Yesilyaprak, C.; Yerli, S. K.; Aksaker, N.; Gucsav, B. B.; Kiziloglu, U.; Dikicioglu, E.; Coker, D.; Aydin, E.; Ozeren, F. F.

    ROTSE (Robotic Optical Transient Search Experiment) telescopes can also be used for variable star detection. As explained in the system description (2003PASP..115..132A), they have good sky coverage and allow fast data acquisition. The optical magnitude range varies between 7^m and 19^m. Thirty percent of the telescope time of the north-eastern leg of the network, namely ROTSE-IIId (located at the TUBITAK National Observatory, Bakirlitepe, Turkey; http://www.tug.tubitak.gov.tr/), is owned by Turkish researchers. Since its first light (May 2004), a considerably large amount of data (around 2 TB) has been collected from the Turkish time, and roughly one million objects have been identified from the reduced data. A robust pipeline has been constructed to discover new variables, transients and planetary nebulae from this archival data. In the detection process, different statistical methods were applied to the archive. We have detected thousands of variable stars by applying roughly four different tests to the light curve of each star. In this work a summary of the pipeline is presented. It uses a high performance computing (HPC) algorithm which performs inhomogeneous ensemble photometry of the data on a 36 core cluster. This study is supported by TUBITAK (Scientific and Technological Research Council of Turkey) with the grant number TBAG-108T475.
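
    The four detection tests are not named in the abstract. As one generic example of a light-curve variability screen, the sketch below computes the reduced chi-squared of a light curve against a constant-brightness model with numpy; values well above 1 flag candidate variables. The synthetic light curves are invented for illustration.

        import numpy as np

        def reduced_chi_squared(mag, mag_err):
            """Chi-squared per degree of freedom of a light curve against a constant model.

            A value well above 1 suggests variability beyond the photometric errors.
            This is a generic screen, not necessarily one of the tests used in the
            ROTSE-IIId pipeline described above.
            """
            mag = np.asarray(mag, dtype=float)
            err = np.asarray(mag_err, dtype=float)
            weights = 1.0 / err**2
            weighted_mean = np.sum(weights * mag) / np.sum(weights)
            chi2 = np.sum(((mag - weighted_mean) / err) ** 2)
            return chi2 / (mag.size - 1)

        # Synthetic example: a flat light curve versus a sinusoidally varying one.
        rng = np.random.default_rng(42)
        t = np.linspace(0.0, 30.0, 120)
        errors = np.full(t.size, 0.05)
        flat = 14.0 + rng.normal(0.0, 0.05, t.size)
        variable = 14.0 + 0.3 * np.sin(2 * np.pi * t / 3.7) + rng.normal(0.0, 0.05, t.size)
        print(reduced_chi_squared(flat, errors), reduced_chi_squared(variable, errors))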

  13. Exploiting NASA's Cumulus Earth Science Cloud Archive with Services and Computation

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Quinn, P.; Jazayeri, A.; Schuler, I.; Plofchan, P.; Baynes, K.; Ramachandran, R.

    2017-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) houses nearly 30 PB of critical Earth Science data and, with upcoming missions, is expected to balloon to between 200 PB and 300 PB over the next seven years. In addition to the massive increase in data collected, researchers and application developers want more and faster access - enabling complex visualizations, long time-series analysis, and cross dataset research without needing to copy and manage massive amounts of data locally. NASA has started prototyping with commercial cloud providers to make this data available in elastic cloud compute environments, allowing application developers direct access to the massive EOSDIS holdings. In this talk we'll explain the principles behind the archive architecture and share our experience of dealing with large amounts of data with serverless architectures including AWS Lambda, the Elastic Container Service (ECS) for long running jobs, and why we dropped thousands of lines of code for AWS Step Functions. We'll discuss best practices and patterns for accessing and using data available in a shared object store (S3) and leveraging events and message passing for sophisticated and highly scalable processing and analysis workflows. Finally, we'll share capabilities NASA and cloud services are making available on the archives to enable massively scalable analysis and computation in a variety of formats and tools.
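
    As a generic sketch of the serverless pattern described above (not Cumulus code), the function below is an AWS Lambda handler that reads a newly created S3 object with boto3 and reports its size; in a Cumulus-style workflow many such functions would be chained with AWS Step Functions. Bucket and key names come from the triggering event.

        import boto3

        s3 = boto3.client("s3")

        def handler(event, context):
            """Triggered by an S3 object-created event; reads the new granule and
            returns its size. Illustrative only; a production workflow would chain
            many such functions together with AWS Step Functions."""
            record = event["Records"][0]
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            response = s3.get_object(Bucket=bucket, Key=key)
            payload = response["Body"].read()

            return {"bucket": bucket, "key": key, "bytes": len(payload)}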

  14. Visualization of GPM Standard Products at the Precipitation Processing System (PPS)

    NASA Astrophysics Data System (ADS)

    Kelley, O.

    2010-12-01

    Many of the standard data products for the Global Precipitation Measurement (GPM) constellation of satellites will be generated at and distributed by the Precipitation Processing System (PPS) at NASA Goddard. PPS will provide several means to visualize these data products. These visualization tools will be used internally by PPS analysts to investigate potential anomalies in the data files, and these tools will also be made available to researchers. Currently, a free data viewer called THOR, the Tool for High-resolution Observation Review, can be downloaded and installed on Linux, Windows, and Mac OS X systems. THOR can display swath and grid products, and to a limited degree, the low-level data packets that the satellite itself transmits to the ground system. Observations collected since the 1997 launch of the Tropical Rainfall Measuring Mission (TRMM) satellite can be downloaded from the PPS FTP archive, and in the future, many of the GPM standard products will also be available from this FTP site. To provide easy access to this 80 terabyte and growing archive, PPS currently operates an on-line ordering tool called STORM that provides geographic and time searches, browse-image display, and the ability to order user-specified subsets of standard data files. Prior to the anticipated 2013 launch of the GPM core satellite, PPS will expand its visualization tools by integrating an on-line version of THOR within STORM to provide on-the-fly image creation of any portion of an archived data file at a user-specified degree of magnification. PPS will also provide OpenDAP access to the data archive and OGC WMS image creation of both swath and gridded data products. During the GPM era, PPS will continue to provide realtime globally-gridded 3-hour rainfall estimates to the public in a compact binary format (3B42RT) and in a GIS format (2-byte TIFF images + ESRI WorldFiles).
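
    As an illustration of the GIS delivery format mentioned above (a TIFF image plus an ESRI world file), the sketch below reads the six affine terms of a world file and maps a pixel index to map coordinates; the file contents and name are illustrative placeholders and the code is not part of PPS.

        def read_worldfile(path):
            """Read the six-line ESRI world file that georeferences a raster image.

            Line order: x pixel size (A), row rotation (D), column rotation (B),
            y pixel size, usually negative (E), x of the upper-left pixel centre (C),
            y of the upper-left pixel centre (F).
            """
            with open(path) as f:
                values = [float(line) for line in f if line.strip()]
            a, d, b, e, c, fcoord = values
            return a, d, b, e, c, fcoord

        def pixel_to_map(col, row, params):
            """Map a (column, row) pixel index to map coordinates with the affine terms."""
            a, d, b, e, c, fcoord = params
            x = a * col + b * row + c
            y = d * col + e * row + fcoord
            return x, y

        # Write a small example world file for an illustrative 0.25-degree grid.
        with open("example.tfw", "w") as f:
            f.write("0.25\n0.0\n0.0\n-0.25\n-179.875\n59.875\n")

        params = read_worldfile("example.tfw")
        print(pixel_to_map(0, 0, params))      # centre of the upper-left pixel
        print(pixel_to_map(1439, 0, params))   # easternmost column of a 1440-wide grid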

  15. Digital curation and online resources: digital scanning of surgical tools at the royal college of physicians and surgeons of Glasgow for an open university learning resource.

    PubMed

    Earley, Kirsty; Livingstone, Daniel; Rea, Paul M

    2017-01-01

    Collection preservation is essential for the cultural status of any city. However, presenting a collection publicly risks damage. Recently this drawback has been overcome by digital curation. Described here is a method of digitisation using photogrammetry and virtual reality software. Items were selected from the Royal College of Physicians and Surgeons of Glasgow archives, and implemented into an online learning module for the Open University. Images were processed via Agisoft Photoscan, Autodesk Memento, and Garden Gnome Object 2VR. Although problems arose due to specularity, 2VR digital models were developed for online viewing. Future research must minimise the difficulty of digitising specular objects.

  16. Clean and Cold Sample Curation

    NASA Technical Reports Server (NTRS)

    Allen, C. C.; Agee, C. B.; Beer, R.; Cooper, B. L.

    2000-01-01

    Curation of Mars samples includes both samples that are returned to Earth, and samples that are collected, examined, and archived on Mars. Both kinds of curation operations will require careful planning to ensure that the samples are not contaminated by the instruments that are used to collect and contain them. In both cases, sample examination and subdivision must take place in an environment that is organically, inorganically, and biologically clean. Some samples will need to be prepared for analysis under ultra-clean or cryogenic conditions. Inorganic and biological cleanliness are achievable separately by cleanroom and biosafety lab techniques. Organic cleanliness to the <50 ng/sq cm level requires material control and sorbent removal - techniques being applied in our Class 10 cleanrooms and sample processing gloveboxes.

  17. BOREAS AFM-5 Level-2 Upper Air Network Standard Pressure Level Data

    NASA Technical Reports Server (NTRS)

    Barr, Alan; Hrynkiw, Charmaine; Hall, Forrest G. (Editor); Newcomer, Jeffrey A. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The BOREAS AFM-5 team collected and processed data from the numerous radiosonde flights during the project. The goals of the AFM-05 team were to provide large-scale definition of the atmosphere by supplementing the existing AES aerological network, both temporally and spatially. This data set includes basic upper-air parameters interpolated at 0.5 kiloPascal increments of atmospheric pressure from data collected from the network of upper-air stations during the 1993, 1994, and 1996 field campaigns over the entire study region. The data are contained in tabular ASCII files. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  18. Archive of side scan sonar and swath bathymetry data collected during USGS cruise 10CCT01 offshore of Cat Island, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Wiese, Dana S.

    2010-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys east of Cat Island, Mississippi (fig. 1). The efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geological stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and provide protection for the historical Fort Massachusetts. For more information refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar (SSS) data. Data products herein include gridded and interpolated surfaces, surface images, and x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 10CCT01 tells us the data were collected in 2010 for the Coastal Change and Transport (CCT) study and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay Catamaran. Side scan sonar and interferometric swath bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the port side just slightly behind the vessel, close to the seafloor. The interferometric swath transducer was sled-mounted on a rail attached between the catamaran hulls, and during the survey the sled was secured in position. Navigation was acquired with a CodaOctopus F190 Precision Attitude and Positioning System and differentially corrected with OmniSTAR. See the digital FACS equipment log for details about the acquisition equipment used. Both raw datasets were stored digitally and processed using CARIS HIPS and SIPS software at the USGS St. Petersburg Coastal and Marine Science Center. For more information on processing refer to the Equipment and Processing page. Post-processing of the swath dataset revealed a motion artifact attributed to movement, relative to the boat, of the pole to which the swath transducers are attached. The survey took place in the winter months, when strong winds and rough waves contributed to a reduction in data quality. The rough seas contributed to both the movement of the pole and the very high noise base seen in the raw amplitude data of the side scan sonar.
Chirp data were also collected during this survey and are archived separately.
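
    The field-activity-ID convention described in the entry above (two-digit year, project code, two-digit sequence number) can be captured in a few lines. A minimal sketch, assuming IDs of the form shown in the examples 10CCT01 and 05FGS01 and post-2000 collection years:

      import re

      def parse_field_activity_id(fid):
          # e.g. "10CCT01" -> collected in 2010, Coastal Change and Transport (CCT),
          # first field activity of that project in that calendar year.
          m = re.fullmatch(r"(\d{2})([A-Z]+)(\d{2})", fid)
          if m is None:
              raise ValueError(f"unrecognized field activity ID: {fid}")
          year, project, seq = m.groups()
          return {"year": 2000 + int(year), "project": project, "activity": int(seq)}

      print(parse_field_activity_id("10CCT01"))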

  19. Defense.gov Special Report: Pearl Harbor - Anniversary of the Attack on

    Science.gov Websites

    Pearl Harbor. You have reached a collection of archived material. The content available is no longer being updated. If you wish to see the latest content, please visit the current version of the site. For persons with disabilities experiencing difficulties accessing content on archive.defense.gov, please use the…

  20. HeinOnline: An Online Archive of Law Journals.

    ERIC Educational Resources Information Center

    Marisa, Richard J.

    Law is grounded in the past, in the decisions and reasoning of generations of lawyers, judges, juries, and professors. Ready access to this history is vital to solid legal research, and yet, until 2000, much of it was buried in vast collections of aging paper journals. HeinOnline is a new online archive of law journals. Development of HeinOnline…

  1. 3rd Annual PIALA Conference Saipan--Collecting, Preserving & Sharing Information in Micronesia. Conference Proceedings. October 13-15, 1993.

    ERIC Educational Resources Information Center

    Edmundson, Margaret, Ed.

    1993-01-01

    This PIALA 1993 Proceedings contains many of the papers presented at the 3rd annual conference of the Pacific Islands Association of Libraries and Archives. This publication is the first time papers from this Micronesian regional library and archives conference have ever been published. The conference addressed various topics of interest to…

  2. The Global Streamflow Indices and Metadata archive (G-SIM): A compilation of global streamflow time series indices and meta-data

    NASA Astrophysics Data System (ADS)

    Do, Hong; Gudmundsson, Lukas; Leonard, Michael; Westra, Seth; Senerivatne, Sonia

    2017-04-01

    In-situ observations of daily streamflow with global coverage are a crucial asset for understanding large-scale freshwater resources, which are an essential component of the Earth system and a prerequisite for societal development. Here we present the Global Streamflow Indices and Metadata archive (G-SIM), a collection of indices derived from more than 20,000 daily streamflow time series across the globe. These indices are designed to support global assessments of change in wet and dry extremes and have been compiled from 12 free-to-access online databases (seven national databases and five international collections). The G-SIM archive also includes substantial metadata to support detailed understanding of streamflow dynamics, including drainage-area shapefiles and many essential catchment properties such as land cover type and soil and topographic characteristics. The automated data handling and quality control procedures of the project make G-SIM a reproducible, extendible archive that can be utilised for many purposes in large-scale hydrology. Some potential applications include the identification of observational trends in hydrological extremes, the assessment of climate change impacts on streamflow regimes, and the validation of global hydrological models.
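
    As a sketch of the kind of indices such an archive compiles, the following Python/pandas fragment computes two common streamflow indices (annual maximum daily flow and annual 7-day minimum flow) from a daily series; the file and column names are hypothetical, and G-SIM's actual index definitions and quality-control rules are more extensive.

      import pandas as pd

      # Hypothetical daily streamflow series (m^3/s) indexed by date.
      q = pd.read_csv("station_0001_daily.csv",
                      parse_dates=["date"], index_col="date")["flow"]

      annual_max = q.resample("A").max()              # annual maximum daily flow
      low7 = q.rolling(7).mean().resample("A").min()  # annual 7-day low flow
      n_obs = q.resample("A").count()                 # records available per year

      indices = pd.DataFrame({"amax": annual_max, "min7": low7, "n_obs": n_obs})
      print(indices.head())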

  3. Semi-automated Data Set Submission Work Flow for Archival with the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wright, D.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Eby, P.; Heinz, S. L.; Hook, L. A.; McMurry, B. F.; Shanafield, H. A.; Sill, D.; Santhana Vannan, S.; Wei, Y.

    2013-12-01

    The ORNL DAAC archives and publishes, free of charge, data and information relevant to biogeochemical, ecological, and environmental processes. The ORNL DAAC primarily archives data produced by NASA's Terrestrial Ecology Program; however, any data that are pertinent to the biogeochemical and ecological community are of interest. The data set submission process to the ORNL DAAC has recently been updated and semi-automated to provide a consistent data provider experience and to create a uniform data product. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. If the ORNL DAAC is the appropriate archive for a data set, the data provider is sent an email with several URL links to guide them through the submission process. The data provider is asked to fill out a short online form to help the ORNL DAAC staff better understand the data set. These questions cover information about the data set, a description of the data set, temporal and spatial characteristics of the data set, and how the data were prepared and delivered. The questionnaire is generic and has been designed to gather input on the diverse data sets the ORNL DAAC archives. A data upload module and metadata editor further guide the data provider through the submission process. For submission purposes, a complete data set includes data files, document(s) describing the data, supplemental files, metadata record(s), and the online form. The ORNL DAAC performs five major functions during the process of archiving data: 1) Ingestion is the ORNL DAAC side of submission; data are checked, metadata records are compiled, and files are converted to archival formats. 2) Metadata records and data set documentation are made searchable, and the data set is given a permanent URL. 3) The data set is published, assigned a DOI, and advertised. 4) The data set is provided long-term post-project support. 5) Stewardship of data ensures the data are stored on state-of-the-art computer systems with reliable backups.

  4. PACS archive upgrade and data migration: clinical experiences

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Sarti, Dennis A.; Huang, H. K.; Donnelly, John

    2002-05-01

    Saint John's Health Center PACS data volumes have increased dramatically since the hospital became filmless in April of 1999. This is due in part to continuous image accumulation and the integration of a new multi-slice detector CT scanner into PACS. The original PACS archive would not be able to handle the distribution and archiving load and capacity in the near future. Furthermore, there is no secondary copy backup of all the archived PACS image data for disaster recovery purposes. The purpose of this paper is to present a clinical and technical process template to upgrade and expand the PACS archive, migrate existing PACS image data to the new archive, and provide a backup and disaster recovery function not currently available. Discussion of the technical and clinical pitfalls and challenges involved in this process is presented as well. The server hardware configuration was upgraded and a secondary backup implemented for disaster recovery. The upgrade includes new software versions, database reconfiguration, and installation of a new tape jukebox to replace the current MOD jukebox. Upon completion, all PACS image data from the original MOD jukebox were migrated to the new tape jukebox and verified. The migration was performed continuously in the background during clinical operation. Once the data migration was completed, the MOD jukebox was removed. All newly acquired PACS exams are now archived to the new tape jukebox. All PACS image data residing on the original MOD jukebox have been successfully migrated into the new archive. In addition, a secondary backup of all PACS image data has been implemented for disaster recovery and has been verified using disaster scenario testing. No PACS image data were lost during the entire process, and there was very little clinical impact during the upgrade and data migration. Some of the pitfalls and challenges during this upgrade process included hardware reconfiguration for the original archive server, clinical downtime involved with the upgrade, and data migration planning to minimize impact on clinical workflow. The impact was minimized with a downtime contingency plan.

  5. European distributed seismological data archives infrastructure: EIDA

    NASA Astrophysics Data System (ADS)

    Clinton, John; Hanka, Winfried; Mazza, Salvatore; Pederson, Helle; Sleeman, Reinoud; Stammler, Klaus; Strollo, Angelo

    2014-05-01

    The European Integrated waveform Data Archive (EIDA) is a distributed data center system within ORFEUS that (a) securely archives seismic waveform data and related metadata gathered by European research infrastructures, and (b) provides transparent access to the archives for the geosciences research communities. EIDA was founded in 2013 by the ORFEUS Data Center, GFZ, RESIF, ETH, INGV, and BGR to ensure the sustainability of a distributed archive system, the implementation of standards (e.g. FDSN StationXML, FDSN web services), and the coordination of new developments. Under the mandate of the ORFEUS Board of Directors and Executive Committee, the founding group is responsible for steering and maintaining the technical developments and organization of the European distributed seismic waveform data archive and its integration within broader multidisciplinary frameworks like EPOS. EIDA currently offers uniform access to unrestricted data from 8 European archives (www.orfeus-eu.org/eida), linked by the Arclink protocol, hosting data from 75 permanent networks (1800+ stations) and 33 temporary networks (1200+ stations). Moreover, each archive may also provide unique, restricted datasets. A web interface, developed at GFZ, offers interactive access to different catalogues (EMSC, GFZ, USGS) and EIDA waveform data. Clients and toolboxes like arclink_fetch and ObsPy can connect directly to any EIDA node to collect data. Current developments are directed toward the implementation of quality parameters and strong-motion parameters.
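
    Since the abstract notes that clients such as ObsPy can connect directly to any EIDA node, a minimal sketch of such a request is shown below; the network, station, and channel codes are illustrative examples of unrestricted data, not a prescription.

      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      # Connect to one EIDA node (here GFZ) through its FDSN web services.
      client = Client("GFZ")

      t0 = UTCDateTime("2014-01-01T00:00:00")
      stream = client.get_waveforms(network="GE", station="APE", location="*",
                                    channel="BHZ", starttime=t0, endtime=t0 + 600)
      print(stream)  # one (or more) 10-minute traces, ready for processing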

  6. Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture

    NASA Astrophysics Data System (ADS)

    Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel

    2003-11-01

    Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.

  7. A historian among scientists: reflections on archiving the history of science in postcolonial India.

    PubMed

    Chowdhury, Indira

    2013-06-01

    How might we overcome the lack of archival resources while doing the history of science in India? Offering reflections on the nature of archival resources that could be collected for scientific institutions and the need for new interpretative tools with which to understand these resources, this essay argues for the use of oral history in order to understand the practices of science in the postcolonial context. The oral history of science can become a tool with which to understand the hidden interactions between the world of scientific institutions and the larger world of the postcolonial nation.

  8. Metrically preserving the USGS aerial film archive

    USGS Publications Warehouse

    Moe, Donald; Longhenry, Ryan

    2013-01-01

    Since 1972, the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota, has provided film-based products to the public. EROS is home to an archive of 12 million frames of analog photography ranging from 1937 to the present. The archive contains collections from both aerial and satellite platforms including programs such as the National High Altitude Program (NHAP), National Aerial Photography Program (NAPP), U.S. Antarctic Resource Center (USARC), Declass 1 (CORONA, ARGON, and LANYARD), Declass 2 (KH-7 and KH-9), and Landsat (1972–1992, Landsat 1–5).

  9. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes for Health and the National Science Foundation, architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  10. Quantifying grain shape with MorpheoLV: A case study using Holocene glacial marine sediments

    NASA Astrophysics Data System (ADS)

    Charpentier, Isabelle; Staszyc, Alicia B.; Wellner, Julia S.; Alejandro, Vanessa

    2017-06-01

    As demonstrated in earlier works, quantitative grain shape analysis has proved to be a strong proxy for determining sediment transport history and depositional environments. MorpheoLV, devoted to the calculation of roughness coefficients from pictures of individual clastic sediment grains using Fourier analysis, drives computations for a collection of samples of grain images. This process may be applied to sedimentary deposits, assuming core/interval/image archives are used to store samples collected along depth. This study uses a 25.8 m jumbo piston core, NBP1203 JPC36, taken from a 100 m thick sedimentary drift deposit from Perseverance Drift on the northern Antarctic Peninsula continental shelf. Changes in ocean and ice conditions throughout the Holocene recorded in this sedimentary archive can be assessed by studying grain shape, grain texture, and other proxies. Ninety-six intervals were sampled, and a total of 2,319 individual particle images were used. Microtextures of individual grains observed by SEM show a very high abundance of authigenically precipitated silica that obscures the original grain shape. Grain roughness, computed along depth with MorpheoLV, shows only small variation, confirming the qualitative observation deduced from the SEM. Despite this, trends can be seen, confirming the reliability of MorpheoLV as a tool for quantitative grain shape analysis.
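
    One common Fourier-descriptor formulation of grain roughness (MorpheoLV's exact definition may differ) expands the centroid-to-boundary radius of the outline into harmonics and sums the power of the higher harmonics. A minimal sketch with a synthetic outline:

      import numpy as np

      def roughness(x, y, n_min=5, n_max=25):
          # Radius signature of the closed outline about the grain centroid.
          r = np.hypot(x - x.mean(), y - y.mean())
          # Harmonic amplitudes, normalised by the zeroth harmonic (mean radius).
          amp = np.abs(np.fft.rfft(r)) / len(r)
          rel = amp / amp[0]
          # Roughness: summed power of the higher harmonics (small-scale irregularity).
          return float(np.sum(rel[n_min:n_max + 1] ** 2))

      theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
      bumps = 1 + 0.05 * np.cos(12 * theta)  # synthetic "rough" grain outline
      print(roughness(np.cos(theta) * bumps, np.sin(theta) * bumps))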

  11. A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team

    NASA Astrophysics Data System (ADS)

    Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome

    2013-12-01

    The scope of this work is to present the new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first handles data collection and management; the second is the processing toolkit. The CryoSat FBR (Full Bit Rate) data are downlinked uncompressed from the satellite and contain un-averaged individual echoes. These data are made available on the Kiruna CalVal server in a 10-day rolling archive. Daily at ESRIN, all the CryoSat FBR data in SAR and SARin mode are downloaded (around 30 gigabytes), catalogued, and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data is over 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets available at ESTEC and NOAA that were kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the Project CP4O (STSE CryoSat Plus for Ocean), CNES, and NOAA. The second component of the service is the processing toolkit. On the EOP-SER workstations there is internally and independently developed software that is able to process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently to re-track them in SAR and SARin mode (Level 2) over open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes: supporting awarded development contracts by cross-checking deliverables to ESA, on-site demonstrations and training for selected users, cross-comparison against third-party products (CLS/CNES CPP products, for instance), preparation for the Sentinel-3 mission, publications, etc. Samples of these experimental SAR/SARin L1b/L2 products can be provided to the scientific community, on request, for comparison with self-processed data. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capacity to coastal zones, inland waters, and land, with a view to maximizing the exploitation of the upcoming Sentinel-3 topographic mission over all surfaces. There are also plans to make the toolkit fully accessible through software “gridification” to run in the ESRIN G-POD (Grid Processing on Demand) service and to extend the tool's functionality to support the Sentinel-3 mission (both simulated and real data). Graphs and statistics about the spatial coverage and amount of FBR data actually archived on the EOP-SER workstations, along with some scientific results, will be shown in this paper, together with the tests that have been designed and performed to validate the products (tests against CryoSat Kiruna PDGS products and against transponder data).

  12. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  13. Archive of digital boomer subbottom data collected during USGS cruise 05FGS01 offshore east-central Florida, July 17-29, 2005

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2012-01-01

    In July of 2005, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey (FGS), conducted a geophysical survey of the Atlantic Ocean offshore of Florida's east coast from Flagler Beach to Daytona Beach. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. The USGS Saint Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 05FGS01 tells us the data were collected in 2005 for cooperative work with the FGS and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The boomer subbottom profiling system consists of an acoustic energy source that is made up of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water column and the shallow stratigraphy below. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver (a hydrophone streamer), and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y format (Barry and others, 1975), except that an ASCII format is used for the first 3,200 bytes of the card image header instead of the standard EBCDIC format. For a detailed description of the recorded trace headers, refer to the SEG Y Format page. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts. The processed SEG Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce a geospatially interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position; this information is displayed in the status bar of the browser. Please note that clicking on the profile image switches it to "Expanded View" (a compressed image of the entire line) and that cursor tracking is not available in this mode.
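
    The 3,200-byte card-image header mentioned above consists of 40 "cards" of 80 characters; in this archive it is stored as ASCII rather than the standard EBCDIC. A minimal sketch for reading it (the file name is hypothetical, and cp037 is used as a representative EBCDIC codec for the fallback):

      def read_segy_text_header(path):
          with open(path, "rb") as f:
              raw = f.read(3200)                            # 40 cards x 80 characters
          text = raw.decode("ascii", errors="replace")      # this archive: ASCII header
          if not text.lstrip().startswith("C"):
              text = raw.decode("cp037", errors="replace")  # fall back to EBCDIC
          return [text[i:i + 80] for i in range(0, 3200, 80)]

      for card in read_segy_text_header("line01.sgy")[:5]:  # hypothetical file name
          print(card.rstrip())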

  14. Archive of digital chirp subbottom profile data collected during USGS cruise 12BIM03 offshore of the Chandeleur Islands, Louisiana, July 2012

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.

    2014-01-01

    From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. 
The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.

  15. Historical files from Federal government mineral exploration-assistance programs, 1950 to 1974

    USGS Publications Warehouse

    Frank, David G.

    2010-01-01

    Congress enacted the Defense Production Act in 1950 to provide funding and support for the exploration and development of critical mineral resources. From 1950 to 1974, three Department of the Interior agencies carried out this mission. Contracts with mine owners provided financial assistance for mineral exploration on a joint-participation basis. These contracts are documented in more than 5,000 'dockets' now archived online by the U.S. Geological Survey. This archive provides access to unique and difficult to recreate information, such as drill logs, assay results, and underground geologic maps, that is invaluable to land and resource management organizations and the minerals industry. An effort to preserve the data began in 2009, and the entire collection of dockets was electronically scanned. The scanning process used optical character recognition (OCR) when possible, and files were converted into Portable Document Format (.pdf) files, which require Adobe Reader or similar software for viewing. In 2010, the scans were placed online (http://minerals.usgs.gov/dockets/) and are available to download free of charge.

  16. On a path towards long-term sampling following the Deepwater Horizon: Initial insights

    NASA Astrophysics Data System (ADS)

    Reddy, C. M.

    2012-12-01

    During the past two decades in the United States, few areas contaminated by an oil spill have been revisited on time scales of months to years. The lack of sampling is a missed opportunity to shed light on long-term processes, evaluate recovery, identify compounds most likely to persist, and apply new chemical and biological techniques. To address this issue, my laboratory has begun a land-based effort to collect oiled samples from Gulf of Mexico beaches affected by the Deepwater Horizon disaster. Each sample is archived, analyzed, and made available to others via an online repository. Detailed analysis of many of these samples has already been fruitful in determining the fate of the spilled oil, which will be discussed. This meeting is an ideal time to discuss strategies for long-term sampling and archiving. With support from the Gulf Research Initiative for the next nine years, the opportunities to use these samples will be frequent.

  17. Body-object interaction ratings for 1,618 monosyllabic nouns.

    PubMed

    Tillotson, Sherri M; Siakaluk, Paul D; Pexman, Penny M

    2008-11-01

    Body-object interaction (BOI) assesses the ease with which a human body can physically interact with a word's referent. Recent research has shown that BOI influences visual word recognition processes in such a way that responses to high-BOI words (e.g., couch) are faster and less error prone than responses to low-BOI words (e.g., cliff). Importantly, the high-BOI words and the low-BOI words that were used in those studies were matched on imageability. In the present study, we collected BOI ratings for a large set of words. BOI ratings, on a 1-7 scale, were obtained for 1,618 monosyllabic nouns. These ratings allowed us to test the generalizability of BOI effects to a large set of items, and they should be useful to researchers who are interested in manipulating or controlling for the effects of BOI. The body-object interaction ratings for this study may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  18. The ``One Archive'' for JWST

    NASA Astrophysics Data System (ADS)

    Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.

    2015-09-01

    The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.

  19. The NASA Ames Research Center Institutional Scientific Collection: History, Best Practices and Scientific Opportunities

    NASA Technical Reports Server (NTRS)

    Rask, Jon C.; Chakravarty, Kaushik; French, Alison; Choi, Sungshin; Stewart, Helen

    2017-01-01

    The NASA Ames Life Sciences Institutional Scientific Collection (ISC), which is composed of the Ames Life Sciences Data Archive (ALSDA) and the Biospecimen Storage Facility (BSF), is managed by the Space Biosciences Division and has been operational since 1993. The ALSDA is responsible for archiving information and animal biospecimens collected from life science spaceflight experiments and matching ground control experiments. Both fixed and frozen spaceflight and ground tissues are stored in the BSF within the ISC. The ALSDA also manages a Biospecimen Sharing Program, performs curation and long-term storage operations, and makes biospecimens available to the scientific community for research purposes via the Life Science Data Archive public website (https://lsda.jsc.nasa.gov). As part of our best practices, a viability testing plan has been developed for the ISC, which will assess the quality of archived samples. We expect that results from the viability testing will catalyze sample use, enable broader science community interest, and improve operational efficiency of the ISC. The current viability test plan focuses on generating disposition recommendations and is based on using ribonucleic acid (RNA) integrity number (RIN) scores as a criterion for measuring biospecimen viability for downstream functional analysis. The plan includes (1) sorting and identification of candidate samples, (2) conducting a statistically based power analysis to generate representative cohorts from the population of stored biospecimens, (3) completion of RIN analysis on select samples, and (4) development of disposition recommendations based on the RIN scores. Results of this work will also support NASA open science initiatives and guide development of the NASA Scientific Collections Directive (a policy on best practices for curation of biological collections). Our RIN-based methodology for characterizing the quality of tissues stored in the ISC since the 1980s also creates unique scientific opportunities for temporal assessment across historical missions. Support from the NASA Space Biology Program and the NASA Human Research Program is gratefully acknowledged.
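
    As an illustration of the statistically based power analysis mentioned in step (2), the sketch below sizes cohorts for comparing mean RIN scores between two groups of biospecimens; the effect size, significance level, and power are placeholder assumptions, not the ISC's actual study-design values.

      from statsmodels.stats.power import TTestIndPower

      effect_size = 0.5   # assumed standardized difference in mean RIN between cohorts
      alpha = 0.05        # significance level
      power = 0.80        # desired statistical power

      n_per_cohort = TTestIndPower().solve_power(effect_size=effect_size,
                                                 alpha=alpha, power=power)
      print(f"samples needed per cohort: {n_per_cohort:.1f}")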

  20. Understanding the Data Complexity continuum to reduce data management costs and increase data usability through partnerships with the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Mesick, S.; Weathers, K. W.

    2017-12-01

    Data complexity can be seen as a continuum from complex to simple. The term data complexity refers to data collections that are disorganized, poorly documented, and generally do not follow best data management practices. Complex data collections are challenging and expensive to manage. Simplified collections readily support automated archival processes, enhanced discovery and data access, as well as production of services that make data easier to reuse. In this session, NOAA NCEI scientific data stewards will discuss the data complexity continuum. This talk will explore data simplification concepts, methods, and tools that data managers can employ which may offer more control over data management costs and processes, while achieving policy goals for open data access and ready reuse. Topics will include guidance for data managers on best allocation of limited data management resources; models for partnering with NCEI to accomplish shared data management goals; and will demonstrate through case studies the benefits of investing in documentation, accessibility, and services to increase data value and return on investment.

  1. Digitized Special Collections and Multiple User Groups

    ERIC Educational Resources Information Center

    Gueguen, Gretchen

    2010-01-01

    Many organizations have evolved since their early attempts to mount digital exhibits on the Web and are experimenting with ways to increase the scale of their digitized collections by utilizing archival finding aid description rather than resource-intensive collections and exhibits. This article examines usability research to predict how such…

  2. Milne "en Masse": A Case Study in Digitizing Large Image Collections

    ERIC Educational Resources Information Center

    Harkema, Craig; Avery, Cheryl

    2015-01-01

    In December 2012, the University of Saskatchewan Library's University Archives and Special Collections acquired the complete image collection of Courtney Milne, a professional photographer whose work encompassed documentary, abstract, and fine art photographs. From acquisition to digital curation, the authors identify, outline, and discuss the…

  3. Mind the Gap: Integrating Special Collections Teaching

    ERIC Educational Resources Information Center

    Samuelson, Todd; Coker, Cait

    2014-01-01

    Instruction is a vital part of the academic librarian's public services mission, but the teaching efforts of special collections librarians can be overlooked due to the culture and particularities of teaching in an archival setting. This article documents the challenges special collections librarians face in integrating their teaching program into…

  4. People's Collection Wales: Online Access to the Heritage of Wales from Museums, Archives and Libraries

    ERIC Educational Resources Information Center

    Tedd, Lucy A.

    2011-01-01

    Purpose: The People's Collection Wales aims to collect, interpret, distribute and discuss Wales' cultural heritage in an online environment. Individual users or local history societies are able to create their own digital collections, contribute relevant content, as well as access digital resources from heritage institutions. This paper aims to…

  5. The American Film Heritage; Impressions from the American Film Institute Archives.

    ERIC Educational Resources Information Center

    Shales, Tom; And Others

    The American Film Institute has an archive which presently contains more than 9,000 films, many of them rare. The articles in this volume are based on some of the films in the collection. Among the topics of these essays are: pioneers like D. W. Griffith and Thomas H. Ince, treatment of blacks and Indians in films, development of the techniques…

  6. Film Program Notes from the Current Holdings of the Anthology Film Archives; Outlines of 41 Films.

    ERIC Educational Resources Information Center

    Anthology Film Archives, New York, NY.

    This collection of film program notes includes mixed commentary on some of the films held in the Anthology Film Archives (a film and book library in New York City). Some of the films are described by synopsis of the episodes and others by translation into English of the foreign language subtitles. However, each film noted is identified by full…

  7. Archives and the Boundaries of Early Modern Science.

    PubMed

    Popper, Nicholas

    2016-03-01

    This contribution argues that the study of early modern archives suggests a new agenda for historians of early modern science. While in recent years historians of science have begun to direct increased attention toward the collections amassed by figures and institutions traditionally portrayed as proto-scientific, archives proliferated across early modern Europe, emerging as powerful tools for creating knowledge in politics, history, and law as well as natural philosophy, botany, and more. The essay investigates the methods of production, collection, organization, and manipulation used by English statesmen and Crown officers such as Keeper of the State Papers Thomas Wilson and Secretary of State Joseph Williamson to govern their disorderly collections. Their methods, it is shown, were shared with contemporaries seeking to generate and manage other troves of evidence and in fact reflect a complex ecosystem of imitation and exchange across fields of inquiry. These commonalities suggest that historians of science should look beyond the ancestors of modern scientific disciplines to examine how practices of producing knowledge emerged and migrated throughout cultures of learning in Europe and beyond. Creating such a map of knowledge production and exchange, the essay concludes, would provide a renewed and expansive ambition for the field.

  8. [Development of a medical equipment support information system based on PDF portable document].

    PubMed

    Cheng, Jiangbo; Wang, Weidong

    2010-07-01

    In line with the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately, and comprehensively and are kept in electronic archives. The workflow of medical equipment support work was analysed, and all work processes were recorded in portable electronic documents. Using XML middleware technology and a SQL Server database, the system implements process management, data calculation, submission, storage, and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized, digital, automated, efficient, orderly, and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.

  9. Ground-penetrating radar and differential global positioning system data collected from Long Beach Island, New Jersey, April 2015

    USGS Publications Warehouse

    Zaremba, Nicholas J.; Smith, Kathryn E.L.; Bishop, James M.; Smith, Christopher G.

    2016-08-04

    Scientists from the United States Geological Survey, St. Petersburg Coastal and Marine Science Center, U.S. Geological Survey Pacific Coastal and Marine Science Center, and students from the University of Hawaii at Manoa collected sediment cores, sediment surface grab samples, ground-penetrating radar (GPR) and Differential Global Positioning System (DGPS) data from within the Edwin B. Forsythe National Wildlife Refuge–Holgate Unit located on the southern end of Long Beach Island, New Jersey, in April 2015 (FAN 2015-611-FA). The study’s objective was to identify washover deposits in the stratigraphic record to aid in understanding barrier island evolution. This report is an archive of GPR and DGPS data collected from Long Beach Island in 2015. Data products, including raw GPR and processed DGPS data, elevation corrected GPR profiles, and accompanying Federal Geographic Data Committee metadata can be downloaded from the Data Downloads page.

  10. Summary of water-surface-elevation data for 116 U.S. Geological Survey lake and reservoir stations in Texas and comparison to data for water year 2006

    USGS Publications Warehouse

    Asquith, William H.; Vrabel, Joseph; Roussel, Meghan C.

    2007-01-01

    The U.S. Geological Survey (USGS), in cooperation with numerous Federal, State, municipal, and local agencies, currently (2007) collects data for more than 120 lakes and reservoirs in Texas through a realtime, data-collection network. The National Water Information System that processes and archives water-resources data for the Nation provides a central source for retrieval of real-time as well as historical data. This report provides a brief description of the real-time, data-collection network and graphically summarizes the period-of-record daily mean water-surface elevations for 116 active and discontinued USGS lake and reservoir stations in Texas. The report also graphically depicts selected statistics (minimum, maximum, and mean) of daily mean water-surface-elevation data. The data for water year 2006 are compared to the selected statistics.

  11. Virtual Collections: An Earth Science Data Curation Service

    NASA Astrophysics Data System (ADS)

    Bugbee, K.; Ramachandran, R.; Maskey, M.; Gatlin, P. N.

    2016-12-01

    The role of Earth science data centers has traditionally been to maintain central archives that serve openly available Earth observation data. However, in order to ensure data are as useful as possible to a diverse user community, Earth science data centers must move beyond simply serving as an archive to offering innovative data services to user communities. A virtual collection, the end product of a curation activity that searches, selects, and synthesizes diffuse data and information resources around a specific topic or event, is a data curation service that improves the discoverability, accessibility and usability of Earth science data and also supports the needs of unanticipated users. Virtual collections minimize the amount of time and effort needed to begin research by maximizing certainty of reward and by providing a trustworthy source of data for unanticipated users. This presentation will define a virtual collection in the context of an Earth science data center and will highlight a virtual collection case study created at the Global Hydrology Resource Center data center.

  12. Virtual Collections: An Earth Science Data Curation Service

    NASA Technical Reports Server (NTRS)

    Bugbee, Kaylin; Ramachandran, Rahul; Maskey, Manil; Gatlin, Patrick

    2016-01-01

    The role of Earth science data centers has traditionally been to maintain central archives that serve openly available Earth observation data. However, in order to ensure data are as useful as possible to a diverse user community, Earth science data centers must move beyond simply serving as an archive to offering innovative data services to user communities. A virtual collection, the end product of a curation activity that searches, selects, and synthesizes diffuse data and information resources around a specific topic or event, is a data curation service that improves the discoverability, accessibility, and usability of Earth science data and also supports the needs of unanticipated users. Virtual collections minimize the amount of time and effort needed to begin research by maximizing certainty of reward and by providing a trustworthy source of data for unanticipated users. This presentation will define a virtual collection in the context of an Earth science data center and will highlight a virtual collection case study created at the Global Hydrology Resource Center data center.

  13. A collection of non-human primate computed tomography scans housed in MorphoSource, a repository for 3D data

    PubMed Central

    Copes, Lynn E.; Lucas, Lynn M.; Thostenson, James O.; Hoekstra, Hopi E.; Boyer, Doug M.

    2016-01-01

    A dataset of high-resolution microCT scans of primate skulls (crania and mandibles) and certain postcranial elements was collected to address questions about primate skull morphology. The sample consists of 489 scans taken from 431 specimens, representing 59 species from most primate families. These data have transformative reuse potential, as such datasets are necessary for conducting high-powered research into primate evolution but require significant time and funding to collect. Similar datasets were previously available only to select research groups across the world. The physical specimens are vouchered at Harvard’s Museum of Comparative Zoology. The data collection took place at the Center for Nanoscale Systems at Harvard. The dataset is archived on MorphoSource.org. Though this is the largest high-fidelity comparative dataset yet available, its provisioning on a web archive that allows unlimited researcher contributions promises a future with vastly increased digital collections available at researchers’ fingertips. PMID:26836025

  14. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We have established and operate an integrated data system for managing, archiving, and sharing marine geology and geophysical data around Korea produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent through continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and converting, data quality control, data uploading, database maintenance, etc. The database of this system comprises two parts, an ARCHIVE DB and a GIS DB. The ARCHIVE DB stores archived data in the original forms and formats received from data providers, while the GIS DB manages all other compiled, processed, and reproduced data and information for data services and GIS application services. A relational database management system, Oracle 11g, was adopted as the DBMS, and open-source GIS technologies were applied for the GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS with PostgreSQL for the GIS database. For convenient use of geophysical data in SEG Y format, a viewer program was developed and embedded in the system. Users can search for data through the GIS user interface and save the results as a report.

  15. Approaching data publication as part of the scholarly communication enterprise: some obstacles, some solutions (Invited)

    NASA Astrophysics Data System (ADS)

    Vision, T. J.

    2010-12-01

    Many datasets collected by academic research teams, despite being difficult or impossible to effectively reproduce, are never shared with the wider community even after the findings based upon them appear in print. This limits the extent to which published scientific findings can be verified and cuts short the opportunity for data to realize their full potential value through reuse and repurposing. While many scientists perceive data to be public goods that should be made available upon publication, they also perceive limited incentive for doing so themselves. This, combined with the lack of mandates for data archiving and the absence of a trusted public repository that can host any kind of data, means that the practice of data archiving is rare. When data are shared post-publication, it is often through ad hoc mechanisms and under terms that present obstacles to reuse. When data are archived, it is generally achieved through routes that do not ensure preservation or discoverability. To address this mix of technical and sociocultural obstacles to data reuse, a consortium of journals in ecology and evolutionary biology recently launched a digital data repository (Dryad) and developed a joint policy mandating data archiving at the time of publication. Dryad has a number of features specifically designed to make it possible for universal data archiving to be achieved with low burden and low cost at the time of publication. These include a streamlined submission process based on the exchange of metadata with the manuscript processing system, handshaking with more specialized data repositories, and metadata curation that is assisted by automated generation of cataloging terms. To directly benefit data depositors, data are treated as a citable scholarly product through the assignment of trackable data DOIs. The data are permanently linked from the original article and are made freely available with an explicit waiver of restrictions on reuse. The Dryad Consortium, which includes both society-owned and publisher-owned journals, is responsible for governing and sustaining the repository. For scientists, Dryad provides a rich source of data for confirmation of findings, tests of new methodology, and synthetic studies. It also provides the means for depositors to tangibly increase the scientific impact of their work. For journals, Dryad archives data in a more permanent, feature-rich, and cost-effective way than through the use of supplementary online materials. Despite its biological origins, Dryad provides a discipline-neutral model for including data fully within the fold of scholarly communication.

  16. Recommendations for a service framework to access astronomical archives

    NASA Technical Reports Server (NTRS)

    Travisano, J. J.; Pollizzi, J.

    1992-01-01

    There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.

  17. Atmospheric Radiation Measurement program climate research facility operations quarterly report October 1 - December 31, 2006.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sisterson, D. L.

    2007-03-14

    Individual raw data streams from instrumentation at the Atmospheric Radiation Measurement (ARM) Program Climate Research Facility (ACRF) fixed and mobile sites are collected and sent to the Data Management Facility (DMF) at Pacific Northwest National Laboratory (PNNL) for processing in near real time. Raw and processed data are then sent daily to the ACRF Archive, where they are made available to users. For each instrument, we calculate the ratio of the actual number of data records received daily at the Archive to the expected number of data records. The results are tabulated by (1) individual data stream, site, and month for the current year and (2) site and fiscal year dating back to 1998. Table 1 shows the accumulated maximum operation time (planned uptime), the actual hours of operation, and the variance (unplanned downtime) for the period October 1 through December 31, 2006, for the fixed and mobile sites. Although the AMF is currently up and running in Niamey, Niger, Africa, the AMF statistics are reported separately and not included in the aggregate average with the fixed sites. The first quarter comprises a total of 2,208 hours. For all fixed sites, the actual data availability (and therefore actual hours of operation) exceeded the individual (as well as the aggregate average of the fixed sites) operational goal for the first quarter of fiscal year (FY) 2007. The Site Access Request System is a web-based database used to track visitors to the fixed sites, all of which have facilities that can be visited. The NSA locale has the Barrow and Atqasuk sites. The SGP site has a Central Facility, 23 extended facilities, 4 boundary facilities, and 3 intermediate facilities. The TWP locale has the Manus, Nauru, and Darwin sites. NIM represents the AMF statistics for the current deployment in Niamey, Niger, Africa. PYE represents the AMF statistics for the Point Reyes, California, past deployment in 2005. In addition, users who do not want to wait for data to be provided through the ACRF Archive can request an account on the local site data system. The eight research computers are located at the Barrow and Atqasuk sites; the SGP Central Facility; the TWP Manus, Nauru, and Darwin sites; the DMF at PNNL; and the AMF in Niger. This report provides the cumulative numbers of visitors and user accounts by site for the period January 1, 2006 - December 31, 2006. The U.S. Department of Energy requires national user facilities to report facility use by total visitor days, broken down by institution type, gender, race, citizenship, visitor role, visit purpose, and facility, for actual visitors and for active user research computer accounts. During this reporting period, the ACRF Archive did not collect data on user characteristics in this way. Work is under way to collect and report these data. Table 2 shows the summary of cumulative users for the period January 1, 2006 - December 31, 2006. For the first quarter of FY 2007, the overall number of users is up from the last reporting period. The historical data show that there is an apparent relationship between the total number of users and the 'size' of field campaigns, called Intensive Operation Periods (IOPs): larger IOPs draw more of the site facility resources, which are reflected by the number of site visits and site visit days, research accounts, and device accounts. These types of users typically collect and analyze data in near-real time for a site-specific IOP that is in progress. However, the Archive accounts represent persistent (year-to-year) ACRF data users that often mine from the entire collection of ACRF data, which mostly includes routine data from the fixed and mobile sites, as well as cumulative IOP data sets. Archive data users continue to show steady growth, which is independent of the size of IOPs. For this quarter, the number of Archive data user accounts was 961, the highest since record-keeping began. For reporting purposes, the three ACRF sites and the AMF operate 24 hours per day, 7 days per week, and 52 weeks per year. Although the AMF is not officially collecting data this quarter, personnel are regularly involved with teardown, packing, shipping, unpacking, setup, and maintenance activities, so they are included in the safety statistics. Time is reported in days instead of hours. If any lost work time is incurred by any employee, it is counted as a workday loss. Table 3 reports the consecutive days since the last recordable or reportable injury or incident causing damage to property, equipment, or vehicle for the period October 1 - December 31, 2006. There were no recordable or lost workdays or incidents for the first quarter of FY 2007.
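
    The availability statistic described above is simply the ratio of records received at the Archive to records expected. A toy illustration in Python, with made-up stream names and counts rather than ACRF values:

```python
# Toy data-availability calculation: received vs. expected records per data stream.
# Stream names and counts are made up for illustration; they are not ACRF statistics.
expected_per_day = 24  # e.g. one record per hour for a hypothetical instrument
received = {"sgp_met_E13": 22, "nsa_skyrad_C1": 24, "twp_mwr_C2": 18}

for stream, actual in received.items():
    availability = actual / expected_per_day
    print(f"{stream}: {availability:.1%} of expected records received")
```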

  18. From Ephemeral to Legitimate: An Inquiry into Television's Material Traces in Archival Spaces, 1950s-1970s

    ERIC Educational Resources Information Center

    Bratslavsky, Lauren Michelle

    2013-01-01

    The dissertation offers a historical inquiry about how television's material traces entered archival spaces. Material traces refer to both the moving image products and the assortment of documentation about the processes of television as industrial and creative endeavors. By identifying the development of television-specific archives and…

  19. Lifetime Surveillance of Astronaut Health (LSAH) / Life Sciences Data Archive (LSDA) Data Request Helpdesk

    NASA Technical Reports Server (NTRS)

    Young, Millennia; Van Baalen, Mary

    2016-01-01

    This session is intended to provide to HRP IWS attendees instant feedback on archived astronaut data, including such topics as content of archives, access, request processing, and data format. Members of the LSAH and LSDA teams will be available at a 'help desk' during the poster sessions to answer questions from researchers.

  20. Garbage in, Garbage Out: Data Collection, Quality Assessment and Reporting Standards for Social Media Data Use in Health Research, Infodemiology and Digital Disease Detection.

    PubMed

    Kim, Yoonsang; Huang, Jidong; Emery, Sherry

    2016-02-26

    Social media have transformed the communications landscape. People increasingly obtain news and health information online and via social media. Social media platforms also serve as novel sources of rich observational data for health research (including infodemiology, infoveillance, and digital disease detection). While the number of studies using social data is growing rapidly, very few of these studies transparently outline their methods for collecting, filtering, and reporting those data. Keywords and search filters applied to social data form the lens through which researchers may observe what and how people communicate about a given topic. Without a properly focused lens, research conclusions may be biased or misleading. Standards of reporting data sources and quality are needed so that data scientists and consumers of social media research can evaluate and compare methods and findings across studies. We aimed to develop and apply a framework of social media data collection and quality assessment and to propose a reporting standard, which researchers and reviewers may use to evaluate and compare the quality of social data across studies. We propose a conceptual framework consisting of three major steps in collecting social media data: develop, apply, and validate search filters. This framework is based on two criteria: retrieval precision (how much of the retrieved data is relevant) and retrieval recall (how much of the relevant data is retrieved). We then discuss two conditions on which estimation of retrieval precision and recall relies, accurate human coding and full data collection, and how to calculate these statistics in cases that deviate from these two ideal conditions. We then apply the framework to a real-world example using approximately 4 million tobacco-related tweets collected from the Twitter firehose. We developed and applied a search filter to retrieve e-cigarette-related tweets from the archive based on three keyword categories: devices, brands, and behavior. The search filter retrieved 82,205 e-cigarette-related tweets from the archive and was validated. Retrieval precision was above 95% in all cases. Retrieval recall was 86% assuming ideal conditions (no human coding errors and full data collection), 75% when unretrieved messages could not be archived, 86% assuming no false negative errors by coders, and 93% allowing both false negative and false positive errors by human coders. This paper sets forth a conceptual framework for the filtering and quality evaluation of social data that addresses several common challenges and moves toward establishing a standard of reporting social data. Researchers should clearly delineate data sources, how data were accessed and collected, the search filter building process, and how retrieval precision and recall were calculated. The proposed framework can be adapted to other public social media platforms.
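
    As a small worked illustration of the two criteria defined above, the sketch below computes retrieval precision and recall from a hypothetical hand-coded sample; the counts are invented and the ideal conditions (error-free coding, full data collection) are assumed.

```python
# Toy retrieval precision/recall calculation from a hypothetical hand-coded sample.
retrieved_relevant = 780    # retrieved by the filter and coded relevant (true positives)
retrieved_irrelevant = 20   # retrieved but coded irrelevant (false positives)
missed_relevant = 120       # relevant messages the filter failed to retrieve (false negatives)

precision = retrieved_relevant / (retrieved_relevant + retrieved_irrelevant)
recall = retrieved_relevant / (retrieved_relevant + missed_relevant)
print(f"retrieval precision = {precision:.1%}, retrieval recall = {recall:.1%}")
```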

  1. Seabird tissue archival and monitoring project: Egg collections and analytical results 1999-2002

    USGS Publications Warehouse

    Vander Pol, Stacy S.; Christopher, Steven J.; Roseneau, David G.; Becker, Paul R.; Day, Russel D.; Kucklick, John R.; Pugh, Rebecca S.; Simac, Kristin S.; Weston-York, Geoff

    2003-01-01

    In 1998, the U.S. Geological Survey Biological Resources Division (USGS-BRD), the U.S. Fish and Wildlife Service (USFWS) Alaska Maritime National Wildlife Refuge (AMNWR), and the National Institute of Standards and Technology (NIST) began the Seabird Tissue Archival and Monitoring Project (STAMP) to collect and cryogenically bank tissues from seabirds in Alaska for future retrospective analysis of anthropogenic contaminants. The approach of STAMP was similar to that of the Alaska Marine Mammal Tissue Archival Project (AMMTAP). AMMTAP was started in 1987 by NIST and the National Oceanic and Atmospheric Administration (NOAA) as part of the Outer Continental Shelf Environmental Assessment Program sponsored by the Minerals Management Service. Presently sponsored by the USGS-BRD, AMMTAP continues its work as part of a larger national program, the Marine Mammal Health and Stranding Response Program. AMMTAP developed carefully designed sampling and specimen banking protocols. Since 1987, AMMTAP has collected tissues from marine mammals taken in Alaska Native subsistence hunts and has cryogenically banked these tissues at the NIST National Biomonitoring Specimen Bank (NBSB). Through its own analytical work and working in partnership with other researchers both within and outside Alaska, AMMTAP has helped to develop a substantial database on contaminants in Alaska marine mammals. In contrast, data and information are limited on contaminants in Alaska seabirds, which are similar to marine mammals in that they feed near the top of the food chain and have the potential for accumulating anthropogenic contaminants. During its early planning stages, STAMP managers identified the seabird egg as the first tissue of choice for study by the project. There is a relatively long history of using bird eggs for environmental monitoring and for investigating the health status of bird populations. Since 1998, protocols for collecting and processing eggs and for cryogenically banking egg samples have been developed by STAMP (see York et al. 2001). Eggs are being collected on an annual basis for several species at nesting colonies throughout Alaska. Aliquots of these egg samples are being analyzed on a regular basis for persistent organic pollutants and mercury. Results of this work have been published in scientific journals (Christopher et al. 2002) and in conference proceedings (Kucklick et al. 2002; Vander Pol et al. 2002a, 2002b). The intent of this report is to provide an up-to-date description of STAMP. The report contains the most recent egg collection inventory, analytical data, preliminary interpretations based on these data, and a discussion of possible future directions of the project.

  2. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were ingested into the ISDS. Real-time EGSEs include an ICDS Simulator, a Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple targets for the radiometer's external calibration. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG, and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies them to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process, and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry, and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
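
    A rough sketch of the parse-and-store step described above is given below in Python, using the standard-library sqlite3 module as a stand-in for the PHP/MATLAB-to-MySQL pipeline; the file name and the assumed record layout (timestamp, channel, value per line) are illustrative only.

```python
# Sketch of parsing a periodic EGSE text file and storing records in a database.
# sqlite3 stands in for MySQL here; the assumed record layout is "timestamp channel value".
import sqlite3

def ingest(path, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS housekeeping (ts TEXT, channel TEXT, value REAL)")
    with open(path) as fh:
        rows = [line.split() for line in fh if line.strip() and not line.startswith("#")]
    conn.executemany("INSERT INTO housekeeping VALUES (?, ?, ?)",
                     [(ts, ch, float(val)) for ts, ch, val in rows])
    conn.commit()

conn = sqlite3.connect("isds.db")
ingest("radiometer_hk_001.txt", conn)  # hypothetical EGSE data file name
```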

  3. A Counter-Proposal for Process: Toward the Development of Online Writing Archives

    ERIC Educational Resources Information Center

    Jensen, Kyle

    2009-01-01

    This dissertation advances an alternate vision for research and teaching in rhetoric and composition studies that centers on the development of online writing archives. To justify the need for this alternate vision, it assesses the limitations of the field's predominant research and teaching program: process theory. More specifically, it examines…

  4. The Convergence of Information Technology, Data, and Management in a Library Imaging Program

    ERIC Educational Resources Information Center

    France, Fenella G.; Emery, Doug; Toth, Michael B.

    2010-01-01

    Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…

  5. Reconstructing Forty Years of Landsat Observations

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Dwyer, J. L.; Steinwand, D.

    2013-12-01

    In July 1972, NASA launched the Earth Resources Technology Satellite (ERTS), the first of what was to be the series of Earth-observing satellites we now know as the Landsat system. This system, originally conceived in the 1960s within the US Department of the Interior and US Geological Survey (USGS), has continued with little interruption for over 40 years, creating the longest record of satellite-based global land observations. The current USGS archive of Landsat images exceeds 4 million scenes, and the recently launched Landsat 8 platform will extend that archive to nearly 50 years of observations. Clearly, these observations are critical to the study of Earth system processes, and the interaction between these processes and human activities. However, the seven successful Landsat missions represent more of an ad hoc program than a long-term record of consistent observations, due largely to changing Federal policies and challenges in finding an operational home for the program. Technologically, these systems evolved from the original Multispectral Scanning System (MSS) through the Thematic Mapper and Enhanced Thematic Mapper Plus (ETM+) systems, to the current Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) systems. Landsat data were collected globally by a network of international cooperators having diverse data management policies. Much of the oldest data were stored on archaic media that could not be retrieved using modern media readers. Collecting these data from various sensors and sources, and reconstructing them into coherent Earth observation records, posed numerous challenges. We present here a brief overview of work done to overcome these challenges and create a consistent, long-term Landsat observation record. Much of the current archive was 'repatriated' from international cooperators and often required the reconstruction of (sometimes absent) metadata for geo-location and radiometric calibration. The older MSS data, some of which had been successfully retrieved from outdated wide band video media, required similar metadata reconstruction. TM data from Landsats 4 and 5 relied on questionable on-board lamp data for calibration, thus the calibration history for these missions was reconstructed to account for sensor degradation over time. To improve continuity between platforms, the Landsat 7 and 8 missions employed 'under-flight' maneuvers to reduce inter-calibration error. Data from the various sensors, platforms and sources were integrated into a common metadata standard, with quality assurance information, to ensure understandability of the data for long-term preservation. Because of these efforts, the current Landsat archive can now support the creation of the long-term climate data records and essential climate variables required to monitor changes on the Earth's surface quantitatively over decades of observations.

  6. An Archive of Spectra from the Mayall Fourier Transform Spectrometer at Kitt Peak

    NASA Astrophysics Data System (ADS)

    Pilachowski, C. A.; Hinkle, K. H.; Young, M. D.; Dennis, H. B.; Gopu, A.; Henschel, R.; Hayashi, S.

    2017-02-01

    We describe the SpArc science gateway for spectral data obtained using the Fourier Transform Spectrometer (FTS) in operation at the Mayall 4-m telescope at the Kitt Peak National Observatory during the period from 1975 through 1995. SpArc is hosted by Indiana University Bloomington and is available for public access. The archive includes nearly 10,000 individual spectra of more than 800 different astronomical sources including stars, nebulae, galaxies, and solar system objects. We briefly describe the FTS instrument itself and summarize the conversion of the original interferograms into spectral data and the process for recovering the data into FITS files. The architecture of the archive is discussed and the process for retrieving data from the archive is introduced. Sample use cases showing typical FTS spectra are presented.
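
    As an illustration of how a recovered spectrum in a FITS file might be inspected, the snippet below uses astropy; it assumes a simple single-extension file with the spectrum in the primary HDU, and the file name and header keyword are hypothetical rather than the actual SpArc layout.

```python
# Sketch of opening a recovered FTS spectrum stored as a simple FITS file.
# File name and header keyword usage are hypothetical.
from astropy.io import fits

with fits.open("fts_spectrum_example.fits") as hdul:
    header = hdul[0].header
    flux = hdul[0].data  # assumed: spectrum stored in the primary HDU
    print(header.get("OBJECT", "unknown source"), flux.shape)
```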

  7. Heterogeneous Compression of Large Collections of Evolutionary Trees.

    PubMed

    Matthews, Suzanne J

    2015-01-01

    Compressing heterogeneous collections of trees is an open problem in computational phylogenetics. In a heterogeneous tree collection, each tree can contain a unique set of taxa. An ideal compression method would allow for the efficient archival of large tree collections and enable scientists to identify common evolutionary relationships over disparate analyses. In this paper, we extend TreeZip to compress heterogeneous collections of trees. TreeZip is the most efficient algorithm for compressing homogeneous tree collections. To the best of our knowledge, no other domain-based compression algorithm exists that handles large heterogeneous tree collections or enables their rapid analysis. Our experimental results indicate that TreeZip averages 89.03 percent (72.69 percent) space savings on unweighted (weighted) collections of trees when the level of heterogeneity in a collection is moderate. The organization of the TRZ file allows for efficient computations over heterogeneous data. For example, consensus trees can be computed in mere seconds. Lastly, combining the TreeZip compressed (TRZ) file with general-purpose compression yields average space savings of 97.34 percent (81.43 percent) on unweighted (weighted) collections of trees. Our results lead us to believe that TreeZip will prove invaluable in the efficient archival of tree collections and will enable scientists to develop novel methods for relating heterogeneous collections of trees.

  8. The cool-star spectral catalog: A uniform collection of IUE SWP-LOs

    NASA Technical Reports Server (NTRS)

    Ayres, T.; Lenz, D.; Burton, R.; Bennett, J.

    1992-01-01

    Over the past decade and a half of its operations, the International Ultraviolet Explorer has recorded low-dispersion spectrograms in the 1150-2000 A interval of more than 800 stars of late spectral type (F-M). The sub-2000 A region contains a number of emission lines that are key diagnostics of physical conditions in the high-excitation chromospheres and subcoronal 'transition zones' of such stars. Many of the sources have been observed a number of times, and the available collection of SWP-LO exposures in the IUE Archives exceeds 4,000. With support from the Astrophysics Data Program, we have assembled the archival material into a catalog of IUE far-UV fluxes of late-type stars. In order to ensure uniform processing of the spectra, we: (1) photometrically corrected the raw vidicon images with a custom version of the 1985 SWP ITF; (2) identified and eliminated sharp cosmic-ray 'hits' by means of a spatial filter; (3) extracted the spectral traces with the 'optimal' (weighted-slit) strategy; and (4) calibrated them against a well-characterized reference source, the DA white dwarf G191-B2B. Our approach is similar to that adopted by the IUE Project for its 'Final Archive', but our implementation is specialized to the case of chromospheric emission-line sources. We measured the resulting SWP-LO spectra using a semi-autonomous algorithm that establishes a smooth continuum by numerical filtering, and then fits the significant emissions (or absorptions) by means of a constrained Bevington-type multiple-Gaussian procedure. The algorithm assigns errors to the fitted fluxes (or upper limits in the absence of a significant detection) according to a model based on careful measurements of the noise properties of the IUE's intensified SEC cameras. Here, we describe the 'visualization' strategies we adopted to ensure human review of the semi-autonomous processing and measuring algorithms; the derivation of the noise model and the assignment of errors; and the structure of the final catalog as delivered to the Astrophysics Data System.
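
    The line-measurement step described above fits significant emissions with constrained multiple Gaussians. The sketch below is not the authors' Bevington-type code; it only illustrates a single-Gaussian-plus-continuum fit on synthetic data using scipy.

```python
# Single Gaussian emission line plus flat continuum, fitted to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, sigma, continuum):
    return continuum + amp * np.exp(-0.5 * ((x - center) / sigma) ** 2)

wave = np.linspace(1540.0, 1560.0, 200)                       # wavelength grid (Angstroms)
flux = gaussian(wave, 3.0, 1550.0, 0.8, 1.0)                  # synthetic emission line
flux += np.random.default_rng(0).normal(0.0, 0.1, wave.size)  # add noise

popt, _ = curve_fit(gaussian, wave, flux, p0=[2.0, 1550.0, 1.0, 1.0])
amp, center, sigma, cont = popt
line_flux = amp * sigma * np.sqrt(2.0 * np.pi)                # integrated flux above continuum
print(f"center = {center:.2f} A, integrated line flux = {line_flux:.2f}")
```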

  9. Digital Archive Issues from the Perspective of an Earth Science Data Producer

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.

    2004-01-01

    Contents include the following: Introduction. A Producer Perspective on Earth Science Data. Data Producers as Members of a Scientific Community. Some Unique Characteristics of Scientific Data. Spatial and Temporal Sampling for Earth (or Space) Science Data. The Influence of the Data Production System Architecture. The Spatial and Temporal Structures Underlying Earth Science Data. Earth Science Data File (or Relation) Schemas. Data Producer Configuration Management Complexities. The Topology of Earth Science Data Inventories. Some Thoughts on the User Perspective. Science Data User Communities. Spatial and Temporal Structure Needs of Different Users. User Spatial Objects. Data Search Services. Inventory Search. Parameter (Keyword) Search. Metadata Searches. Documentation Search. Secondary Index Search. Print Technology and Hypertext. Inter-Data Collection Configuration Management Issues. An Archive View. Producer Data Ingest and Production. User Data Searching and Distribution. Subsetting and Supersetting. Semantic Requirements for Data Interchange. Tentative Conclusions. An Object Oriented View of Archive Information Evolution. Scientific Data Archival Issues. A Perspective on the Future of Digital Archives for Scientific Data. References Index for this paper.

  10. Thermal Infrared Radiometric Calibration of the Entire Landsat 4, 5, and 7 Archive (1982-2010)

    NASA Technical Reports Server (NTRS)

    Schott, John R.; Hook, Simon J.; Barsi, Julia A.; Markham, Brian L.; Miller, Jonathan; Padula, Francis P.; Raqueno, Nina G.

    2012-01-01

    Landsat's continuing record of the thermal state of the earth's surface represents the only long-term (1982 to the present) global record with spatial scales appropriate for human-scale studies (i.e., tens of meters). Temperature drives many of the physical and biological processes that impact the global and local environment. As our knowledge of, and interest in, the role of temperature on these processes have grown, the value of Landsat data to monitor trends and processes has also grown. The value of the Landsat thermal data archive will continue to grow as we develop more effective ways to study the long-term processes and trends affecting the planet. However, in order to take proper advantage of the thermal data, we need to be able to convert the data to surface temperatures. A critical step in this process is to have the entire archive completely and consistently calibrated into absolute radiance so that it can be atmospherically compensated to surface-leaving radiance and then to surface radiometric temperature. This paper addresses the methods and procedures that have been used to perform the radiometric calibration of the earliest sizable thermal data set in the archive (Landsat 4 data). The completion of this effort, along with the updated calibration of the earlier (1985-1999) Landsat 5 data, also reported here, concludes a comprehensive calibration of the Landsat thermal archive of data from 1982 to the present.
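
    As a back-of-the-envelope illustration of the conversion chain that a consistently calibrated archive supports, the sketch below takes a digital number to at-sensor radiance and then to at-sensor brightness temperature. The gain and bias are placeholder rescaling coefficients; K1 and K2 are the commonly published Landsat 5 TM band 6 constants, quoted here for illustration only, and atmospheric compensation and emissivity correction are omitted.

```python
# DN -> at-sensor radiance -> at-sensor brightness temperature (kelvin).
# GAIN and BIAS are placeholders; K1/K2 are the widely published TM band 6 constants.
import math

GAIN, BIAS = 0.055, 1.18   # hypothetical rescaling coefficients (W m-2 sr-1 um-1 per DN)
K1, K2 = 607.76, 1260.56   # TM band 6 thermal calibration constants

def brightness_temperature(dn):
    radiance = GAIN * dn + BIAS                # at-sensor spectral radiance
    return K2 / math.log(K1 / radiance + 1.0)  # inverse Planck approximation

print(f"{brightness_temperature(140):.1f} K")
```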

  11. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  12. Three archives of the U. S. Geological Survey's Western Mineral Resources Team

    USGS Publications Warehouse

    Bolm, Karen Sue; Frank, David G.; Schneider, Jill L.

    2000-01-01

    The Western Mineral Resources Team of the U.S. Geological Survey (USGS) has three archives, which hold unpublished or difficult-to-obtain records and literature. The Technical Data Unit in Anchorage, Alaska, holds maps, field notes, and other records of USGS work in Alaska. The USGS Field Office in Spokane, Washington, houses more than 5,000 files from Federal government exploration programs that contracted to fund exploration for some commodities from 1950 until 1974. The Latin American Archive in Tucson, Arizona, holds material on Latin American mineral resources collected by the Center for Inter-American Mineral Resources Investigations.

  13. ModelArchiver—A program for facilitating the creation of groundwater model archives

    USGS Publications Warehouse

    Winston, Richard B.

    2018-03-01

    ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step-by-step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions; descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in the creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which is also described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.

  14. On-the-fly Data Reprocessing and Analysis Capabilities from the XMM-Newton Archive

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Sarmiento, M.; Colomo, E.; Loiseau, N.; Salgado, J.; Gabriel, C.

    2017-10-01

    Since its last release, the XMM-Newton Science Archive (XSA) includes the possibility of performing on-the-fly data processing with SAS through the Remote Interface for Science Analysis (RISA) server. This enables scientists to analyse data without downloading or installing any data or software. The analysis options presently available include extraction of spectra and light curves of user-defined EPIC source regions and full reprocessing of data for which the currently archived pipeline products were processed with older SAS versions or calibration files. The current pipeline is fully aligned with the most recent SAS and calibration, while the last full reprocessing of the archive was performed in 2013. The on-the-fly data processing functionality in this release is experimental, and we invite the community to test it and let us know their results. Known issues and workarounds are described in the 'Watchouts' section of the XSA web page. Feedback on how this functionality should evolve will be highly appreciated.

  15. Archive of digital boomer seismic reflection data collected offshore east-central Florida during USGS cruise 00FGS01, July 14-22, 2000

    USGS Publications Warehouse

    Subino, Janice A.; Dadisman, Shawn V.; Wiese, Dana S.; Calderon, Karynna; Phelps, Daniel C.

    2009-01-01

    In July of 2000, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey (FGS), conducted a geophysical survey of the Atlantic Ocean offshore Florida's east coast from Brevard County to northern Martin County. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) information, digital and handwritten Field Activity Collection System (FACS) logs, and Federal Geographic Data Committee (FGDC) metadata. A filtered and gained (a relative increase in signal amplitude) digital image of each seismic profile is also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). Example SU processing scripts and USGS Software for viewing the SEG-Y files (Zihlman, 1992) are also provided. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 00FGS01 tells us the data were collected in 2000 for cooperative work with the Florida Geological Survey (FGS) and the data were collected during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and when discharged, emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). In this way, a two-dimensional (2D) vertical profile of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for download instructions. The printable profiles provided are GIF images that were filtered and gained using Seismic Unix software. Refer to the Software page for details about the processing and examples of the processing scripts. The printable profiles can be viewed from the Profiles page or from links located on the trackline maps. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. Detailed information about the navigation system used can be found in table 1. Of a total record length of 200 ms, only the upper 100 ms of each profile are displayed because no useful information was observed deeper in the sections. A 10 ms deep water delay appears on lines b57-b63 and sl2-sl28. No digital data were collected for line sl6. 
However, line sl6r is a second attempt to collect digital data for this line. Digital data and 500-shot-interval location navigation are not available for the last 1,161 shots of line sl26 due to an equipment malfunction.
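
    For readers who prefer Python to Seismic Unix, a short sketch of loading one of the archived SEG-Y profiles with ObsPy follows; the file name is hypothetical and the bandpass corner frequencies are illustrative, not the parameters of the report's processing scripts.

```python
# Load a boomer SEG-Y profile and apply a simple bandpass filter with ObsPy.
# File name and corner frequencies are illustrative only.
from obspy import read

stream = read("b57.sgy", format="SEGY")  # hypothetical line from the archive
print(f"{len(stream)} traces, {stream[0].stats.npts} samples per trace")

stream.filter("bandpass", freqmin=400.0, freqmax=3000.0)  # assumes a sampling rate well above 6 kHz
print("max filtered amplitude:", abs(stream[0].data).max())
```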

  16. Historical Time-Domain: Data Archives, Processing, and Distribution

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan E.; Griffin, R. Elizabeth

    2012-04-01

    The workshop on Historical Time-Domain Astronomy (TDA) was attended by a near-capacity gathering of ~30 people. From information provided in turn by those present, an up-to-date overview was created of available plate archives, progress in their digitization, the extent of actual processing of those data, and plans for data distribution. Several recommendations were made for prioritising the processing and distribution of historical TDA data.

  17. Classical Pop: Documenting Popular Musical Culture in Library Audio Collections.

    ERIC Educational Resources Information Center

    Tarakan, Sheldon Lewis

    1983-01-01

    Discusses the library's role in developing a classical pop collection (defined as that music which is best representative of an era, event, or recognizable cultural trend). Popular culture, establishing the collection, funding, and archives are highlighted. A 230-item discography, addresses of five record companies, and 14 references are appended.…

  18. Integrated skills, integrated data: Mapping best practice and collections for innovation and engagement

    NASA Astrophysics Data System (ADS)

    Hosker, Rachel; Knowles, Claire; Rodger, Norman

    2015-02-01

    In recent years the University of Edinburgh (UoE) has seen change, mergers, external partnerships and innovation at the heart of its growth and activity. Collections at UoE were not immune to these changes and have pioneered projects that both support and highlight unique educational cultures. Technology and the dissemination of collections have not only engendered positive relationships with academics but have created wider opportunities for the use of collections in teaching, learning and research. This momentum, and an established commitment to the interoperability of data and standards, presented an opportunity to look for a global solution to collections management within the converged, cross-disciplinary environment. This included harnessing expertise within the University in systems development for large European projects and in wider project management. This session will explore how UoE became the first European contributor to the collections management tool ArchivesSpace, offer snapshots of a converged culture and of how 'archives' have benefited from it (including how 'techies' and 'archivists' worked together), and close with an upbeat look at what the team at UoE has achieved and is excited about for the future.

  19. Conflict Termination--Transitioning from Warrior to Constable: A Primer

    DTIC Science & Technology

    1992-04-15

    historical or artistic interest; works of art, manuscripts, books, and other objects of artistic, historical, or archaeological interest; scientific collections and important collections of books or archives, or reproductions of the property defined above. Buildings used for cultural or religious

  20. Giving Structure to Legacy Finding Aids before Conversion to EAD: The Case of the Archives Departementales des Pyrenees-Atlantiques, France

    ERIC Educational Resources Information Center

    Goulet, Anne; Maftei, Nicolas

    2005-01-01

    At the Archives Departementales des Pyrenees-Atlantiques, the encoding of more than forty legacy finding aids written between 1863 and 2000 is part of a program of digitization of the collections. Because of the size of the project, an external consultant, ArchProteus, has been brought in and specific management procedures have been put in place…

  1. Collecting, Preserving & Sharing Information in Micronesia. Proceedings of the Annual Pacific Islands Association of Libraries and Archives Conference (3rd, Saipan, Northern Mariana Islands, October 13-15, 1993).

    ERIC Educational Resources Information Center

    Pacific Islands Association of Libraries and Archives, Guam.

    Participants from Washington, Hawaii, Majuro, Palau, Guam and other points in the Northern Mariana Islands came together to share information relating to the functions of libraries and archives as information banks and as preservers of the cultural heritage of Micronesia. Papers presented were: (1) "Reading Motivation in the Pacific"…

  2. Accuracy assessment in the Large Area Crop Inventory Experiment

    NASA Technical Reports Server (NTRS)

    Houston, A. G.; Pitts, D. E.; Feiveson, A. H.; Badhwar, G.; Ferguson, M.; Hsu, E.; Potter, J.; Chhikara, R.; Rader, M.; Ahlers, C.

    1979-01-01

    The Accuracy Assessment System (AAS) of the Large Area Crop Inventory Experiment (LACIE) was responsible for determining the accuracy and reliability of LACIE estimates of wheat production, area, and yield, made at regular intervals throughout the crop season, and for investigating the various LACIE error sources, quantifying these errors, and relating them to their causes. Some results of using the AAS during the three years of LACIE are reviewed. As the program culminated, AAS was able not only to meet the goal of obtaining accurate statistical estimates of sampling and classification accuracy, but also the goal of evaluating component labeling errors. Furthermore, the ground-truth data processing matured from collecting data for one crop (small grains) to collecting, quality-checking, and archiving data for all crops in a LACIE small segment.

  3. Data Management Considerations for the International Polar Year

    NASA Astrophysics Data System (ADS)

    Parsons, M. A.; Weaver, R. L.; Duerr, R.; Barry, R. G.

    2004-12-01

    The legacy of the International Geophysical Year and past International Polar Years is in the scientific data collected. The upcoming IPY will result in an unprecedented collection of geophysical and social science data from the polar regions. To realize the full scientific and interdisciplinary utility of these data, it is essential to consider the design of data management systems early in the experimental planning process. This paper will present an array of high-level data management considerations for the IPY, including cross-disciplinary data access, essential documentation, system guidance, and long-term data archiving. Specific recommendations from relevant international organizations such as the Joint Committee on Antarctic Data Management and the WCRP Climate and Cryosphere Programme will be considered. The potential role of the Electronic Geophysical Year and other International Years will also be discussed.

  4. EOSDIS: Archive and Distribution Systems in the Year 2000

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Lake, Alla

    2000-01-01

    Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousands of non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to support approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive. It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.

  5. A new archival infrastructure for highly-structured astronomical data

    NASA Astrophysics Data System (ADS)

    Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo

    2018-03-01

    With the advent of the 2020 Radio Astronomy Telescopes era, the amount and format of radio astronomical data are becoming a massive, performance-critical challenge. Such an evolution of data models and data formats requires new data archiving techniques that allow massive and fast storage of data that can, at the same time, be processed efficiently. Useful expertise in efficient archiving has been obtained through the archiving of data from the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats, such as FITS, MBFITS, and VLBI's XML, including description and ancillary files. The modeling and architecture of the archive fulfill all the requirements of both data persistence and easy data discovery and exploitation. The presented archive already complies with the Virtual Observatory directives; therefore, future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.

  6. NASA Remote Sensing Data in Earth Sciences: Processing, Archiving, Distribution, Applications at the GES DISC

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.

    2005-01-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. In addition to providing data, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools (to avoid unnecessarily downloading all the data). The existing data management infrastructure at the GES DISC supports a wide spectrum of options, from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.

  7. Latin American Archives.

    ERIC Educational Resources Information Center

    Belsunce, Cesar A. Garcia

    1983-01-01

    Examination of the situation of archives in four Latin American countries--Argentina, Brazil, Colombia, and Costa Rica--highlights national systems, buildings, staff, processing of documents, accessibility and services to the public and publications and extension services. (EJS)

  8. Satellite and earth science data management activities at the U.S. geological survey's EROS data center

    USGS Publications Warehouse

    Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.

    1991-01-01

    The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development and archiving; data set preservation, maintenance and conversion to more durable and accessible archive medium; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.

  9. Archive of sediment data from vibracores collected in 2010 offshore of the Mississippi barrier islands

    USGS Publications Warehouse

    Kelso, Kyle W.; Flocks, James G.

    2015-01-01

    Selection of the core site locations was based on geophysical surveys conducted around the islands from 2008 to 2010. The surveys, using acoustic systems to image and interpret the near-surface stratigraphy, were conducted to investigate the geologic controls on island evolution. This data series serves as an archive of sediment data collected from August to September 2010, offshore of the Mississippi barrier islands. Data products, including descriptive core logs, core photographs, results of sediment grain-size analyses, sample location maps, and geographic information system (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata, can be downloaded from the data products and downloads page.

  10. Suomi NPP and JPSS Pre-Launch Test Data Collection and Archive

    NASA Astrophysics Data System (ADS)

    Denning, M.; Ullman, R.; Guenther, B.; Kilcoyne, H.; Chandler, C.; Adameck, J.

    2012-12-01

    During the development of each Suomi National Polar-orbiting Partnership (Suomi NPP) instrument, significant testing was performed, both in ambient and simulated orbital (thermal-vacuum) conditions, at the instrument factory, and again after integration with the spacecraft. The NPOESS Integrated Program Office (IPO), and later the NASA Joint Polar Satellite System (JPSS) Program Office, defined two primary objectives with respect to capturing instrument and spacecraft test data during these test events. The first objective was to disseminate test data and auxiliary documentation to an often distributed network of scientists to permit timely production of independent assessments of instrument performance, calibration, data quality, and test progress. The second goal was to preserve the data and documentation in a catalogued government archive for the life of the mission, to aid in the resolution of anomalies and to facilitate the comparison of on-orbit instrument operating characteristics to those observed prior to launch. In order to meet these objectives, Suomi NPP pre-launch test data collection, distribution, processing, and archive methods included adaptable support infrastructures to quickly and completely transfer test data and documentation from the instrument and spacecraft factories to sensor scientist teams on-site at the factory and around the country. These methods were unique, effective, and low in cost. These efforts supporting pre-launch instrument calibration permitted timely data quality assessments and technical feedback from contributing organizations within the government, academia, and industry, and were critical in supporting timely sensor development. Second, in parallel to data distribution to the sensor science teams, pre-launch test data were transferred and ingested into the central Suomi NPP calibration and validation (cal/val) system, known as the Government Resource for Algorithm Verification, Independent Testing, and Evaluation (GRAVITE), where they will reside for the life of the mission. As a result, data and documentation are available for query, analysis, and download by the cal/val community via the command-line GRAVITE Transfer Protocol (GTP) tool or via the NOAA-collaborative website "CasaNOSA". Instrument and spacecraft test data, telemetry, and ground support equipment information were collected and organized with detailed test procedures, logs, analyses, characterizations, and reports. This 45 Terabyte archive facilitates the comparison of on-orbit Suomi NPP operating characteristics with that observed prior to launch, and will serve as a resource to aid in the assessment of pre-launch JPSS-1 sensor performance. In summary, this paper will present the innovative pre-launch test data campaign infrastructures employed for Suomi NPP and planned for JPSS-1.

  11. Automated customized retrieval of radiotherapy data for clinical trials, audit and research.

    PubMed

    Romanchikova, Marina; Harrison, Karl; Burnet, Neil G; Hoole, Andrew Cf; Sutcliffe, Michael Pf; Parker, Michael Andrew; Jena, Rajesh; Thomas, Simon James

    2018-02-01

    To enable fast and customizable automated collection of radiotherapy (RT) data from tomotherapy storage. Human-readable data maps (TagMaps) were created to generate DICOM-RT (Digital Imaging and Communications in Medicine standard for Radiation Therapy) data from tomotherapy archives, and provided access to "hidden" information comprising delivery sinograms, positional corrections and adaptive-RT doses. 797 data sets totalling 25,000 scans were batch-exported in 31.5 h. All archived information was restored, including the data not available via commercial software. The exported data were DICOM-compliant and compatible with major commercial tools including RayStation, Pinnacle and ProSoma. The export ran without operator interventions. The TagMap method for DICOM-RT data modelling produced software that was many times faster than the vendor's solution, required minimal operator input and delivered high volumes of vendor-identical DICOM data. The approach is applicable to many clinical and research data processing scenarios and can be adapted to recover DICOM-RT data from other proprietary storage types such as Elekta, Pinnacle or ProSoma. Advances in knowledge: A novel method to translate data from proprietary storage to DICOM-RT is presented. It provides access to the data hidden in electronic archives, offers a working solution to the issues of data migration and vendor lock-in and paves the way for large-scale imaging and radiomics studies.
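
    A brief sketch of inspecting one of the exported DICOM-RT objects with pydicom is shown below; the file name is hypothetical and only standard RT Dose attributes are read, so this is an illustration rather than the TagMap tooling itself.

```python
# Inspect an exported RT Dose object; the file name is hypothetical.
import pydicom

ds = pydicom.dcmread("RD.1.2.826.0.1.example.dcm")
print(ds.Modality, ds.SOPClassUID)

if ds.Modality == "RTDOSE":
    dose_gy = ds.pixel_array * float(ds.DoseGridScaling)  # stored values to dose in Gy
    print("max dose:", dose_gy.max(), "Gy")
```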

  12. Depending on Partnerships to Manage NASA's Earth Science Data

    NASA Astrophysics Data System (ADS)

    Behnke, J.; Lindsay, F. E.; Lowe, D. R.

    2015-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of the NASA Earth observation program since the 1990s. The data collected by NASA's remote sensing instruments represent a significant public investment in research, providing access to a world-wide public research community. From the beginning, NASA employed a free, open, and non-discriminatory data policy to maximize the global utilization of the products derived from NASA's observational data and related analyses. EOSDIS is designed to ingest, process, archive, and distribute data in a multi-mission environment. The system supports a wide variety of Earth science disciplines, including cryosphere, land cover change, radiation budget, and atmosphere dynamics and composition, as well as inter-disciplinary research, including global climate change. To this end, EOSDIS has collocated NASA Earth science data and processing with centers of science discipline expertise located at universities, other government agencies, and NASA centers. Commercial industry is also part of this partnership, focusing on developing the EOSDIS cross-element infrastructure. The partnership to develop and operate EOSDIS has made for a robust, flexible system that evolves continuously to take advantage of technological opportunities. The central entry point to the NASA Earth science data collection can be found at http://earthdata.nasa.gov. A distributed architecture was adopted to ensure discipline-specific support for the science data, while also leveraging standards and establishing policies and tools to enable interdisciplinary research and analysis across multiple instruments. Today's EOSDIS is a loosely coupled, heterogeneous system designed to meet the requirements of both a diverse user community and a growing collection of data to be archived and distributed. The system has scaled to meet the ever-growing volume of data (currently ~10 petabytes) and the exponential increase in user demand that has occurred over the past 15 years. We will present how EOSDIS relies on partnerships to support the challenges of managing NASA's Earth Science data.
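
    For readers who want to explore the EOSDIS holdings programmatically, a hedged example follows using the public Common Metadata Repository (CMR) search API behind earthdata.nasa.gov; the endpoint and parameter names reflect the publicly documented API and should be checked against current documentation before use.

      # Hedged example: free-text search of EOSDIS collections via the CMR search API.
      import requests

      CMR_COLLECTIONS = "https://cmr.earthdata.nasa.gov/search/collections.json"

      def search_collections(keyword: str, limit: int = 5):
          """Return (title, concept id) pairs for collections matching a keyword."""
          resp = requests.get(CMR_COLLECTIONS,
                              params={"keyword": keyword, "page_size": limit},
                              timeout=30)
          resp.raise_for_status()
          entries = resp.json().get("feed", {}).get("entry", [])
          return [(e.get("title"), e.get("id")) for e in entries]

      if __name__ == "__main__":
          for title, concept_id in search_collections("sea surface temperature"):
              print(concept_id, title)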

  13. USAID Expands eMODIS Coverage for Famine Early Warning

    NASA Astrophysics Data System (ADS)

    Jenkerson, C.; Meyer, D. J.; Evenson, K.; Merritt, M.

    2011-12-01

    Food security in countries at risk is monitored by the U.S. Agency for International Development (USAID) through its Famine Early Warning Systems Network (FEWS NET) using many methods, including Moderate Resolution Imaging Spectroradiometer (MODIS) data processed by the U.S. Geological Survey (USGS) into eMODIS Normalized Difference Vegetation Index (NDVI) products. Near-real-time production is compared with trends derived from the eMODIS archive to operationally monitor vegetation anomalies indicating threatened cropland and rangeland conditions. eMODIS production over Central America and the Caribbean (CAMCAR) began in 2009 and generates 10-day NDVI composites every 5 days from surface reflectance inputs produced using predicted spacecraft and climatology information at the Land, Atmosphere Near real-time Capability for EOS (LANCE). These expedited eMODIS composites are backed by a parallel archive of precision-based NDVI calculated from surface reflectance data ordered through the Level 1 and Atmosphere Archive and Distribution System (LAADS). Success in the CAMCAR region led to the expansion of eMODIS production to include Africa in 2010 and Central Asia in 2011. Near-real-time 250-meter products are available for each region on the last day of an acquisition interval (generally before midnight) from an anonymous file transfer protocol (FTP) distribution site (ftp://emodisftp.cr.usgs.gov/eMODIS). The FTP site concurrently hosts the regional historical collections (2000 to present), which are also searchable using the USGS Earth Explorer (http://edcsns17.cr.usgs.gov/NewEarthExplorer). As eMODIS coverage continues to grow, these geographically gridded, georeferenced tagged image file format (GeoTIFF) NDVI composites become increasingly useful as tools for operational monitoring of near-real-time vegetation data against historical trends.
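
    The core arithmetic behind such composites is the NDVI ratio, NDVI = (NIR - red) / (NIR + red), combined over a compositing window. The sketch below (Python/NumPy) shows a generic maximum-value composite as an illustration only; the actual eMODIS compositing rules, filtering, and scaling are defined by USGS and are not reproduced here.

      # Generic per-pixel NDVI and a maximum-value composite over a window of days.
      import numpy as np

      def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
          """Per-pixel NDVI from surface reflectance, with divide-by-zero handled."""
          red = red.astype(np.float64)
          nir = nir.astype(np.float64)
          denom = nir + red
          out = np.full_like(denom, np.nan)
          np.divide(nir - red, denom, out=out, where=denom != 0)
          return out

      def max_value_composite(daily_ndvi: list) -> np.ndarray:
          """Maximum-value composite across the days of a compositing interval."""
          return np.nanmax(np.stack(daily_ndvi), axis=0)

      # Tiny synthetic example: three 2x2 daily scenes composited into one.
      days = [ndvi(np.random.rand(2, 2), np.random.rand(2, 2)) for _ in range(3)]
      print(max_value_composite(days))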

  14. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, motion-detect triggers, thumbnail images, pre-computed video frames, time-lapse movie generation, and storage of the resulting movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.
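
    The ROADNet/Antelope database schema itself is not reproduced in the abstract; the following is a hypothetical SQLite sketch of the kind of relational layout described, with cameras, timestamped images, and thumbnails. All table and column names are invented for illustration.

      # Hypothetical relational layout for timestamped image archiving.
      import sqlite3
      import time

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE camera (
              camera_id INTEGER PRIMARY KEY,
              name      TEXT NOT NULL,
              site      TEXT
          );
          CREATE TABLE image (
              image_id  INTEGER PRIMARY KEY,
              camera_id INTEGER NOT NULL REFERENCES camera(camera_id),
              epoch     REAL NOT NULL,   -- acquisition time, seconds since 1970
              path      TEXT NOT NULL,   -- location of the full-resolution file
              thumbnail BLOB             -- small preview for web display
          );
          CREATE INDEX image_time ON image(camera_id, epoch);
      """)

      conn.execute("INSERT INTO camera (name, site) VALUES (?, ?)",
                   ("ib_pier_cam", "Imperial Beach pier"))
      conn.execute("INSERT INTO image (camera_id, epoch, path) VALUES (1, ?, ?)",
                   (time.time(), "/archive/ib_pier_cam/2004/09/img_0001.jpg"))

      # Pull the most recent frame per camera, e.g. to feed a web display.
      rows = conn.execute("""
          SELECT c.name, MAX(i.epoch), i.path
          FROM image i JOIN camera c USING (camera_id)
          GROUP BY c.camera_id
      """).fetchall()
      print(rows)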

  15. Digitizing Villanova University's Eclipsing Binary Card Catalogue

    NASA Astrophysics Data System (ADS)

    Guzman, Giannina; Dalton, Briana; Conroy, Kyle; Prsa, Andrej

    2018-01-01

    Villanova University’s Department of Astrophysics and Planetary Science has years of hand-written archival data on eclipsing binaries at its disposal. This card catalog began at Princeton in the 1930s, with notable contributions from scientists such as Henry Norris Russell. During World War II, the archive was moved to the University of Pennsylvania, which was one of the world centers for eclipsing binary research; consequently, the contributions to the catalog during this time were immense. It was then moved to the University of Florida at Gainesville before being accepted by Villanova in the 1990s. The catalog has been kept in storage since then. The objective of this project is to digitize this archive and create a fully functional online catalog that contains the information available on the cards, along with scans of the actual cards. Our group has built a database using a Python-powered infrastructure to contain the collected data. The team has also built a prototype web-based searchable interface as a front-end to the catalog. Following the data-entry process, information such as right ascension and declination will be checked against SIMBAD, and any differences between values will be noted as part of the catalog. Information published online from the card catalog, and even discrepancies in the information for a star, could be a catalyst for new studies of these eclipsing binaries. Once completed, the database-driven interface will be made available to astronomers worldwide. The group will also compile, from the database, a list of referenced articles that have yet to be found online in order to further pursue their digitization. This list will consist of references on the cards that were found neither on ADS nor elsewhere online during the data-entry process. Pursuing the integration of these references into online queries such as ADS will be an ongoing process that will contribute to and further facilitate studies of eclipsing binaries.
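
    A minimal sketch of the described coordinate cross-check, assuming astropy is available: resolve a catalogued star name with astropy's name resolver (which consults services such as SIMBAD) and compare it with the position transcribed from a card. The card values shown are invented for illustration.

      # Flag cards whose transcribed position disagrees with the resolved position.
      from astropy.coordinates import SkyCoord
      import astropy.units as u

      def check_card(name: str, card_ra: str, card_dec: str, tol_arcsec: float = 60.0):
          """Compare a card's transcribed position with the resolved position."""
          resolved = SkyCoord.from_name(name)  # needs network access
          card = SkyCoord(card_ra, card_dec, unit=(u.hourangle, u.deg))
          sep = resolved.separation(card).to(u.arcsec).value
          return {"name": name, "separation_arcsec": round(sep, 1),
                  "flagged": sep > tol_arcsec}

      print(check_card("Algol", "03 08 10.1", "+40 57 20"))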

  16. The Expansion of the Astronomical Photographic Data Archive at PARI

    NASA Astrophysics Data System (ADS)

    Cline, J. Donald; Barker, Thurburn; Castelaz, Michael

    2017-01-01

    A diverse set of photometric, astrometric, spectral, and surface brightness data exist on decades of photographic glass plates. The Astronomical Photographic Data Archive (APDA) at the Pisgah Astronomical Research Institute (PARI) was established in November 2007 and is dedicated to collecting, restoring, preserving, and storing astronomical photographic data; PARI continues to accept collections. APDA is also tasked with scanning each image and establishing a database of images that can be accessed via the Internet by the global community of scientists, researchers, and students. APDA is a new type of astronomical observatory, one that harnesses analog data of the night sky taken for more than a century and makes those data available in a digital format. In 2016, APDA expanded from 50 collections with about 220,000 plates to more than 55 collections and more than 340,000 plates and films. These account for more than 30% of all astronomical photographic data in the United States. The largest of the new acquisitions are the astronomical photographic plates in the Yale University collection. We present details of the newly added collections and a review of other collections in APDA.

  17. On the Structure of Earth Science Data Collections

    NASA Astrophysics Data System (ADS)

    Barkstrom, B. R.

    2009-12-01

    While there has been substantial work in the IT community regarding metadata and file identifier schemas, there appears to be relatively little work on the organization of the file collections that constitute the preponderance of Earth science data. One symptom of this difficulty appears in the nomenclature describing collections: the terms `Data Product,' `Data Set,' and `Version' carry multiple meanings across communities. A particularly important aspect of this lack of standardization appears when the community attempts to develop a schema for data file identifiers. There are four candidate families of identifiers: ● Randomly assigned identifiers, such as GUIDs or UUIDs, ● Segmented numerical identifiers, such as OIDs or the prefixes for DOIs, ● Extensible URL-based identifiers, such as URNs, PURLs, ARKs, and similar schemas, and ● Text-based identifiers based on citations for papers and books, such as those suggested for the International Polar Year (IPY) citations. Unfortunately, these schema families appear to carry no content derived from the actual structures of Earth science data collections. In this paper, we consider an organization based on an industrial production paradigm that appears to provide the preponderance of Earth science data from satellites and in situ observations. This paradigm produces a hierarchical collection structure, similar to one discussed in Barkstrom [2003: Lecture Notes in Computer Science, 2649, pp. 118-133]. In this organization, three key collection types are ● a Data Product, which is a collection of files that share key parameters and the time interval of the included data, ● a Data Set, which is a collection of files within a Data Product that comes from a specified set of Data Sources, and ● a Data Set Version, which is a collection of files within a Data Set for which the data producer has attempted to ensure error homogeneity. Within a Data Set Version, files appear as a time series of instances that may be identified by the starting time of the data in the file. For data intended for climate uses, it seems appropriate to state this time in terms of Astronomical Julian Date, a long-standing international standard that provides continuity between current observations and paleo-climatic observations. Because this collection structure is hierarchical, it could be used with either of the two hierarchical identifier schema families, although it is probably easier to use with the OID/DOI family. This hierarchical collection structure also fits into the hierarchical structure of Archival Information Packages (AIPs) identified in the Open Archival Information System (OAIS) Reference Model. In that model, AIPs are subdivided into Archival Information Units (AIUs), which describe individual files, or Archival Information Collections (AICs). The latter can be hierarchically nested, leading to an OAIS RM-consistent collection structure that does not appear clearly in other metadata standards. This paper will also discuss the connection between these collection categories and other metadata, as well as the possible need for other organizational schemas to capture the full range of Earth science data collection structures.
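
    To make the proposed hierarchy concrete, the following Python sketch models Data Product, Data Set, and Data Set Version as nested containers whose files are keyed by the Julian Date of their first observation. The field names and lookup rule are illustrative assumptions, not taken from the paper.

      # Sketch of the Data Product -> Data Set -> Data Set Version -> file hierarchy.
      from dataclasses import dataclass, field

      @dataclass
      class DataFile:
          start_jd: float   # Astronomical Julian Date of first observation in the file
          path: str

      @dataclass
      class DataSetVersion:
          version: str
          files: list = field(default_factory=list)

          def file_for(self, jd: float) -> DataFile:
              """Latest file whose start time is not after the requested Julian Date."""
              eligible = [f for f in sorted(self.files, key=lambda f: f.start_jd)
                          if f.start_jd <= jd]
              if not eligible:
                  raise LookupError("no file covers that time")
              return eligible[-1]

      @dataclass
      class DataSet:
          sources: tuple
          versions: dict = field(default_factory=dict)

      @dataclass
      class DataProduct:
          name: str
          datasets: list = field(default_factory=list)

      v1 = DataSetVersion("Edition1", [DataFile(2451545.0, "granule_20000101.nc")])
      product = DataProduct("TOA_Fluxes", [DataSet(("Terra",), {"Edition1": v1})])
      print(product.datasets[0].versions["Edition1"].file_for(2451545.5).path)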

  18. A statistical analysis of IUE spectra of dwarf novae and nova-like stars

    NASA Technical Reports Server (NTRS)

    Ladous, Constanze

    1990-01-01

    First results of a statistical analysis of the International Ultraviolet Explorer (IUE) archive on dwarf novae and nova-like stars are presented. The archive contains approximately 2000 low-resolution spectra of somewhat over 100 dwarf novae and nova-like stars. Many of these have been examined individually, but so far the collective information content of this set of data has not been explored. The first results of this work are reported.

  19. WHOI and SIO (I): Next Steps toward Multi-Institution Archiving of Shipboard and Deep Submergence Vehicle Data

    NASA Astrophysics Data System (ADS)

    Detrick, R. S.; Clark, D.; Gaylord, A.; Goldsmith, R.; Helly, J.; Lemmond, P.; Lerner, S.; Maffei, A.; Miller, S. P.; Norton, C.; Walden, B.

    2005-12-01

    The Scripps Institution of Oceanography (SIO) and the Woods Hole Oceanographic Institution (WHOI) have joined forces with the San Diego Supercomputer Center to build a testbed for multi-institutional archiving of shipboard and deep submergence vehicle data. Support has been provided by the Digital Archiving and Preservation program funded by NSF/CISE and the Library of Congress. In addition to the more than 92,000 objects stored in the SIOExplorer Digital Library, the testbed will provide access to data, photographs, video images, and documents from WHOI ships, Alvin submersible and Jason ROV dives, and deep-towed vehicle surveys. An interactive digital library interface will allow combinations of distributed collections to be browsed, metadata inspected, and objects displayed or selected for download. The digital library architecture and the search and display tools of the SIOExplorer project are being combined with WHOI tools, such as the Alvin Framegrabber and the Jason Virtual Control Van, that have been designed using WHOI's GeoBrowser to handle the vast volumes of digital video and camera data generated by Alvin, Jason, and other deep submergence vehicles. Notions of scalability will be tested, as data volumes range from 3 CDs per cruise to 200 DVDs per cruise. Much of the scalability of this proposal comes from the ability to attach digital library data and metadata acquisition processes to diverse sensor systems. We are able to run an entire digital library from a laptop computer as well as from supercomputer-center-size resources. It can be used in the field, laboratory, or classroom, covering data from acquisition to archive using a single coherent methodology. The design is an open architecture, supporting applications through well-defined external interfaces maintained as an open-source effort for community inclusion and enhancement.

  20. Data hosting infrastructure for primary biodiversity data

    PubMed Central

    2011-01-01

    Background: Today, an unprecedented volume of primary biodiversity data are being generated worldwide, yet significant amounts of these data have been and will continue to be lost after the conclusion of the projects tasked with collecting them. To get the most value out of these data it is imperative to seek a solution whereby these data are rescued, archived, and made available to the biodiversity community. To this end, the biodiversity informatics community requires investment in processes and infrastructure to mitigate data loss and provide solutions for long-term hosting and sharing of biodiversity data. Discussion: We review the current state of biodiversity data hosting and investigate the technological and sociological barriers to proper data management. We further explore the rescuing and re-hosting of legacy data and the state of existing toolsets, and propose a future direction for the development of new discovery tools. We also explore the role of data standards and licensing in the context of data hosting and preservation. We provide five recommendations for the biodiversity community that will foster better data preservation and access: (1) encourage the community's use of data standards, (2) promote the public-domain licensing of data, (3) establish a community of those involved in data hosting and archival, (4) establish hosting centers for biodiversity data, and (5) develop tools for data discovery. Conclusion: The community's adoption of standards and development of tools to enable data discovery are essential to sustainable data preservation. Furthermore, the increased adoption of open content licensing, the establishment of data hosting infrastructure, and the creation of a data hosting and archiving community are all necessary steps towards ensuring that data archival policies become standardized. PMID:22373257
