Production of Previews and Advanced Data Products for the ESO Science Archive
NASA Astrophysics Data System (ADS)
Rité, C.; Slijkhuis, R.; Rosati, P.; Delmotte, N.; Rino, B.; Chéreau, F.; Malapert, J.-C.
2008-08-01
We present a project being carried out by the Virtual Observatory Systems Department/Advanced Data Products group in order to populate the ESO Science Archive Facility with image previews and advanced data products. The main goal is to provide users of the ESO Science Archive Facility with the possibility of viewing pre-processed images associated with instruments like WFI, ISAAC and SOFI before actually retrieving the data for full processing. The image processing is done using the ESO/MVM image reduction software developed at ESO to produce astrometrically calibrated FITS images, ranging from simple previews of single archive images to fully stacked mosaics. These data products can be accessed via the ESO Science Archive Query Form and can also be viewed with the VirGO browser {http://archive.eso.org/cms/virgo}.
On-the-fly Data Reprocessing and Analysis Capabilities from the XMM-Newton Archive
NASA Astrophysics Data System (ADS)
Ibarra, A.; Sarmiento, M.; Colomo, E.; Loiseau, N.; Salgado, J.; Gabriel, C.
2017-10-01
Since its latest release, the XMM-Newton Science Archive (XSA) offers the possibility to perform on-the-fly data processing with SAS through the Remote Interface for Science Analysis (RISA) server. It enables scientists to analyse data without downloading or installing any data or software. The analysis options presently available include extraction of spectra and light curves from user-defined EPIC source regions, and full reprocessing of data whose currently archived pipeline products were processed with older SAS versions or calibration files. The current pipeline is fully aligned with the most recent SAS and calibration, while the last full reprocessing of the archive was performed in 2013. The on-the-fly data processing functionality in this release is experimental, and we invite the community to test it and report their results. Known issues and workarounds are described in the 'Watchouts' section of the XSA web page. Feedback on how this functionality should evolve will be highly appreciated.
Migration of medical image data archived using mini-PACS to full-PACS.
Jung, Haijo; Kim, Hee-Joung; Kang, Won-Suk; Lee, Sang-Ho; Kim, Sae-Rome; Ji, Chang Lyong; Kim, Jung-Han; Yoo, Sun Kook; Kim, Ki-Hwang
2004-06-01
This study evaluated the migration to full-PACS of medical image data archived using mini-PACS at two hospitals of the Yonsei University Medical Center, Seoul, Korea. A major concern in the migration of medical data is matching the image data from the mini-PACS with the hospital OCS (Order Communication System). Prior to carrying out the actual migration process, the principles, methods, and anticipated results for the migration were evaluated with respect to both cost and effectiveness. Migration gateway workstations were established and a migration software tool was developed. The actual migration process was performed based on the results of several migration simulations. Our conclusion was that a migration plan should be carefully prepared and tailored to the individual hospital environment, because the server system, archive media, network, OCS, and policy for data management may be unique.
Lessons Learned While Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems
NASA Technical Reports Server (NTRS)
Pilone, Dan; Mclaughlin, Brett; Plofchan, Peter
2017-01-01
NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. NASA's Earth Observing System Data and Information System (EOSDIS) is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6000 data products spanning a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data across this multi-petabyte-scale archive of Earth science data products.
Remotely Sensed Imagery from USGS: Update on Products and Portals
NASA Astrophysics Data System (ADS)
Lamb, R.; Lemig, K.
2016-12-01
The USGS Earth Resources Observation and Science (EROS) Center has recently implemented a number of additions and changes to its existing suite of products and user access systems. Together, these changes will enhance the accessibility, breadth, and usability of the remotely sensed image products and delivery mechanisms available from USGS. As of late 2016, several new image products are available for public download at no charge from the USGS/EROS Center. These new products include: (1) global Level 1T (precision terrain-corrected) products from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), provided via NASA's Land Processes Distributed Active Archive Center (LP DAAC); and (2) Sentinel-2 Multispectral Instrument (MSI) products, available through a collaborative effort with the European Space Agency (ESA). Other new products are also planned to become available soon. In an effort to enable future scientific analysis of the full 40+ year Landsat archive, the USGS also introduced a new "Collection Management" strategy for all Landsat Level 1 products. This new archive and access schema involves quality-based tier designations that will support future time series analysis of the historic Landsat archive at the pixel level. Along with the quality tier designations, the USGS has also implemented a number of other Level 1 product improvements to support Landsat science applications, including: enhanced metadata, improved geometric processing, refined quality assessment information, and angle coefficient files. The full USGS Landsat archive is now being reprocessed in accordance with the new "Collection 1" specifications. Several USGS data access and visualization systems have also seen major upgrades. These user interfaces include a new version of the USGS LandsatLook Viewer, released in Fall 2017 to provide enhanced functionality and Sentinel-2 visualization and access support. A beta release of the USGS Global Visualization Tool ("GloVis Next") also became available in Fall 2017, with many new features including data visualization at full resolution. The USGS also introduced a time-enabled web mapping service (WMS) to support time-based access to the existing LandsatLook "natural color" full-resolution browse image services.
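Time-enabled WMS access of this kind follows the standard OGC pattern of adding a TIME parameter to a GetMap request. The sketch below builds such a request in Python; the endpoint URL and layer name are hypothetical placeholders, not the actual LandsatLook service details.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, time_range, size=(1024, 1024)):
    """Build a WMS 1.3.0 GetMap URL restricted to a time interval."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # WMS 1.3.0 + EPSG:4326 uses lat/lon axis order
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
        "TIME": time_range,  # ISO 8601 instant or interval, e.g. "2015-01-01/2015-12-31"
    }
    return base_url + "?" + urlencode(params)

# Hypothetical usage; the real service URL and layer name will differ.
print(wms_getmap_url("https://landsatlook.example.usgs.gov/wms",
                     "natural_color", (35.0, -110.0, 36.0, -109.0),
                     "2015-01-01/2015-12-31"))
```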
NASA Astrophysics Data System (ADS)
Purss, Matthew; Lewis, Adam; Edberg, Roger; Ip, Alex; Sixsmith, Joshua; Frankish, Glenn; Chan, Tai; Evans, Ben; Hurst, Lachlan
2013-04-01
Australia's Earth Observation Program has downlinked and archived satellite data acquired under the NASA Landsat mission for the Australian Government since the establishment of the Australian Landsat Station in 1979. Geoscience Australia maintains this archive and produces image products to aid the delivery of government policy objectives. Due to the labor-intensive nature of processing this data, few national-scale datasets have been created to date. To compile any Earth Observation product, the historical approach has been to select the required subset of data and process it "scene by scene" on an as-needed basis. As data volumes have increased over time, and the demand for the processed data has also grown, it has become increasingly difficult to rapidly produce these products and achieve satisfactory policy outcomes using these historic processing methods. The result is that we have been "drowning in a sea of uncalibrated data", and scientists, policy makers and the public have not been able to realize the full potential of the Australian Landsat Archive; its value is therefore significantly diminished. To overcome this critical issue, the Australian Space Research Program has funded the "Unlocking the Landsat Archive" (ULA) Project from April 2011 to June 2013 to improve the access and utilization of Australia's archive of Landsat data. The ULA Project is a public-private consortium led by Lockheed Martin Australia (LMA) and involving Geoscience Australia (GA), the Victorian Partnership for Advanced Computing (VPAC), the National Computational Infrastructure (NCI) at the Australian National University (ANU) and the Cooperative Research Centre for Spatial Information (CRC-SI). The outputs from the ULA project will become a fundamental component of Australia's eResearch infrastructure, with the Australian Landsat Archive hosted on the NCI and made openly available under a Creative Commons license. NCI provides access to researchers through significant HPC supercomputers, cloud infrastructure and data resources, along with a large catalogue of software tools that make it possible to fully explore the potential of this data. Under the ULA Project, Geoscience Australia has developed a data-intensive processing workflow on the NCI. This system has allowed us to successfully process 11 years of the Australian Landsat Archive (2000 to 2010 inclusive) into standardized, well-calibrated and sensor-independent data products, at a rate that allows for both bulk processing of the archive and near-real-time processing of newly acquired satellite data. These products are available as Optical Surface Reflectance 25 m (OSR25) and other derived products, such as Fractional Cover.
Status of the TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Twicken, Joseph D.; Campbell, Jennifer; Tenenbaum, Peter; Sanderfer, Dwight; Davies, Misty D.; Smith, Jeffrey C.; Morris, Rob; Mansouri-Samani, Masoud; Girouard, Forrest;
2017-01-01
The Transiting Exoplanet Survey Satellite (TESS) science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler Mission science pipeline. Like the Kepler pipeline, the TESS science pipeline will provide calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars, observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline will search through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline will generate a suite of diagnostic metrics for each transit-like signature discovered, and extract planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search will be modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST.
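To make the transit-search idea concrete, here is a toy Python sketch that scores the box-shaped flux dip a crossing planet produces over a grid of trial periods. It is an invented example, not SPOC pipeline code; the actual pipeline relies on far more sophisticated detrending and a wavelet-based matched filter.

```python
import numpy as np

def box_transit_snr(time, flux, period, duration, phase):
    """S/N of a box-shaped dip at the given trial period/duration/phase (days)."""
    folded = (time - phase) % period
    in_transit = folded < duration
    if in_transit.sum() < 3 or (~in_transit).sum() < 3:
        return 0.0
    depth = flux[~in_transit].mean() - flux[in_transit].mean()
    noise = flux[~in_transit].std() / np.sqrt(in_transit.sum())
    return depth / max(noise, 1e-12)

def search(time, flux, periods, duration=0.1):
    """Brute-force grid search; returns (period, phase, snr) of the strongest dip."""
    return max(((p, ph, box_transit_snr(time, flux, p, duration, ph))
                for p in periods
                for ph in np.arange(0.0, p, duration / 2)),
               key=lambda t: t[2])
```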
Simple, Script-Based Science Processing Archive
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Hegde, Mahabaleshwara; Barth, C. Wrandle
2007-01-01
The Simple, Scalable, Script-based Science Processing (S4P) Archive (S4PA) is a disk-based archival system for remote sensing data. It is based on the data-driven framework of S4P and is used for data transfer, data preprocessing, metadata generation, data archive, and data distribution. New data are automatically detected by the system. S4P provides services such as data access control, data subscription, metadata publication, data replication, and data recovery. It comprises scripts that control the data flow. The system detects the availability of data on an FTP (File Transfer Protocol) server, initiates data transfer, preprocesses data if necessary, and archives it on readily available disk drives with FTP and HTTP (Hypertext Transfer Protocol) access, allowing instantaneous data access. There are options for plug-ins for data preprocessing before storage. Publication of metadata to external applications such as the Earth Observing System Clearinghouse (ECHO) is also supported. S4PA includes a graphical user interface for monitoring the system operation and a tool for deploying the system. To ensure reliability, S4P continuously checks stored data for integrity. Further reliability is provided by tape backups of disks, made once a disk partition is full and closed. The system is designed for low maintenance, requiring minimal operator oversight.
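A minimal sketch of the data-driven pattern described above: poll an FTP server for new files, fetch them, record a checksum for later integrity sweeps, and move them into a disk archive. The host, paths, and helper names are hypothetical, not actual S4PA code.

```python
import ftplib, hashlib, os, shutil

ARCHIVE_ROOT = "/data/archive"   # hypothetical disk archive exposed via FTP/HTTP
seen = set()                     # in S4PA this state lives in station directories

def poll_once(host, remote_dir):
    """One polling pass: fetch any file we have not archived yet."""
    with ftplib.FTP(host) as ftp:
        ftp.login()                              # anonymous login
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            if name in seen:
                continue
            local = os.path.join("/tmp", name)
            with open(local, "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)
            archive_file(local, name)
            seen.add(name)

def archive_file(local, name):
    """Move the file into the archive and record a checksum for integrity sweeps."""
    with open(local, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    dest = os.path.join(ARCHIVE_ROOT, name)
    shutil.move(local, dest)
    with open(dest + ".md5", "w") as f:
        f.write(digest + "\n")
```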
Astro-WISE: Chaining to the Universe
NASA Astrophysics Data System (ADS)
Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.
2007-10-01
The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power across the high-energy physics community, while the international astronomical community is building an Internet-geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities, and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem, initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This breakthrough is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature, and thus so is the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by enforcing full backward and forward chaining in the data modelling.
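The backward/forward chaining idea can be sketched with a toy dependency model: every data item records its parents, so any result can be traced back to the raw experiment, and stale descendants are re-derived on demand. This illustrates the concept only and is not the Astro-WISE object model.

```python
class DataItem:
    def __init__(self, name, parents=(), make=None):
        self.name, self.parents, self.make = name, list(parents), make
        self.value, self.stale = None, True

    def backward_chain(self):
        """Full provenance: this item and everything it depends on."""
        chain = [self.name]
        for p in self.parents:
            chain.extend(p.backward_chain())
        return chain

    def get(self):
        """Forward chaining: (re)derive the value if any dependency changed."""
        if self.stale or any(p.stale for p in self.parents):
            inputs = [p.get() for p in self.parents]
            if self.make:
                self.value = self.make(inputs)
            self.stale = False
        return self.value

# Toy chain: raw frame -> flat-fielded frame -> source catalog.
raw = DataItem("raw_frame"); raw.value, raw.stale = [1, 2, 3], False
flat = DataItem("flatfielded", [raw], make=lambda vs: [x * 2 for x in vs[0]])
catalog = DataItem("source_catalog", [flat], make=lambda vs: sum(vs[0]))
print(catalog.get(), catalog.backward_chain())
```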
Costs and Benefits of Mission Participation in PDS4 Migrations
NASA Astrophysics Data System (ADS)
Mafi, J. N.; King, T. A.; Cecconi, B.; Faden, J.; Piker, C.; Kazden, D. P.; Gordon, M. K.; Joy, S. P.
2017-12-01
The Planetary Data System Version 4 (PDS4) standard was a major reworking of the previous PDS3 standard. According to PDS policy, "NASA missions confirmed for flight after [1 November 2011 were] required to archive their data according to PDS4 standards." Accordingly, NASA missions starting with LADEE (launched September 2013) and MAVEN (launched November 2013) have used the PDS4 standard. However, a large legacy of previously archived NASA planetary mission data already resides in the PDS archive in PDS3 and older formats. Plans to migrate the existing PDS archives to PDS4 have been discussed within PDS for some time, and have been reemphasized in the PDS Roadmap Study for 2017-2026 (https://pds.nasa.gov/roadmap/PlanetaryDataSystemRMS17-26_20jun17.pdf). Updating older PDS metadata to PDS4 would enable those data to take advantage of new capabilities offered by PDS4, and ensure the full compatibility of past archives with current and future PDS4 tools and services. Responsibility for performing the migration to PDS4 falls primarily upon the PDS discipline nodes, though some support by the active (or recently active) instrument teams would be required in order to help augment the existing metadata to include information that is unique to PDS4. However, there may be some value in mission data providers becoming more actively involved in the migration process. The upfront costs of this approach may be offset by the long-term benefits of the data providers' understanding of PDS4, their ability to take fuller advantage of PDS4 tools and services, and their preparation for producing PDS4 archives for future missions. This presentation will explore the costs and benefits associated with this approach.
Williams, R.S.; Lyons, T.R.; Ferrigno, J.G.; Quinn, M.C.
1984-01-01
Discusses the programme for reproducing the 1930s and early 1940s nitrate aerial photographs of large areas of the US onto stable-base safety film, and the proceedings of a February 1981 meeting at the National Archives and Records Service, General Services Administration, which discussed the programme and inspected the results of the new full-size (1:1), roll-to-roll conversions. The latter process was found to be acceptable to all current and envisaged future users of this photography. -R.House
Exploring Digisonde Ionogram Data with SAO-X and DIDBase
NASA Astrophysics Data System (ADS)
Khmyrov, Grigori M.; Galkin, Ivan A.; Kozlov, Alexander V.; Reinisch, Bodo W.; McElroy, Jonathan; Dozois, Claude
2008-02-01
A comprehensive suite of software tools for ionogram data analysis and archiving has been developed at UMLCAR to support the exploration of raw and processed data from the worldwide network of digisondes in a low-latency, user-friendly environment. Paired with the remotely accessible Digital Ionogram Data Base (DIDBase), the SAO Explorer software serves as an example of how an academic institution conscientiously manages its resident data archive while local experts continue to work on design of new and improved data products, all in the name of free public access to the full roster of acquired ionospheric sounding data.
A new dataset validation system for the Planetary Science Archive
NASA Astrophysics Data System (ADS)
Manaud, N.; Zender, J.; Heather, D.; Martinez, S.
2007-08-01
The Planetary Science Archive (PSA) is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets, which are formatted in conformance with the Planetary Data System (PDS) standards. The PI teams are responsible for analyzing and calibrating the instrument data, as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for the long-term data archiving and distribution to the scientific community, and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of the archive's content has been missing. An independent review board recently recommended that the completeness of the archive, as well as the consistency of the delivered data, should be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall make it possible to track anomalies in, and control the completeness of, datasets. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database. The validation software tool is a multi-mission tool designed to give the user the flexibility to define and implement various types of validation criteria, to validate datasets iteratively and incrementally, and to generate validation reports.
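To make the idea of configurable validation criteria concrete, here is a minimal Python sketch of a criteria-driven dataset validator. The criteria, field names and file names are invented placeholders; the actual PSA tool checks PDS-formatted datasets against its own, far richer criteria set.

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    errors: list = field(default_factory=list)
    def ok(self):
        return not self.errors

def check_required_files(dataset, report):
    # Illustrative structural criterion: some files must always be present.
    for fname in ("CATALOG", "INDEX", "DATA"):
        if fname not in dataset.get("files", []):
            report.errors.append(f"missing {fname}")

def check_completeness(dataset, report):
    # Completeness criterion: every product listed in the index must exist.
    missing = set(dataset.get("index", [])) - set(dataset.get("products", []))
    if missing:
        report.errors.append(f"{len(missing)} indexed products absent")

CRITERIA = [check_required_files, check_completeness]  # user-extensible list

def validate(dataset):
    report = Report()
    for criterion in CRITERIA:   # criteria can be re-run iteratively/incrementally
        criterion(dataset, report)
    return report

print(validate({"files": ["CATALOG", "INDEX"], "index": ["p1"], "products": []}).errors)
```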
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively the references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems
NASA Astrophysics Data System (ADS)
Pilone, D.
2016-12-01
As new, high data rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x to over 300 PB by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we will provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to:
• Highly scalable and extensible ingest and archive of EOSDIS data
• Going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in
• Rethinking data distribution and approaches to analysis in a cloud environment
• Incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant, operational environments.
NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6000 data products spanning a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data across this multi-petabyte-scale archive of Earth science data products.
Earth observation archive activities at DRA Farnborough
NASA Technical Reports Server (NTRS)
Palmer, M. D.; Williams, J. M.
1993-01-01
Space Sector, Defence Research Agency (DRA), Farnborough have been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of that of resources which are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spend at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we are seeing this to an even greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production-quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically-produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, whether as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements.

Table 1: Comparison of production software and production data.
Production Software | Production Data
End-user considerations | End-user considerations
Multiple coders: repository with check-in procedures; coding standards | Multiple producers/collectors: local archive with check-in procedure; metadata standards
Formal testing | Formal testing
Bug tracking and fixes | Bug tracking and fixes, QA/QC
Documentation | Documentation
Formal release process | Formal release process to external archive
License | Citation/usage statement

The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness, to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
Challenges of archiving science data from long duration missions: the Rosetta case
NASA Astrophysics Data System (ADS)
Heather, David
2016-07-01
Rosetta is the first mission designed to orbit and land on a comet. It consists of an orbiter, carrying 11 science experiments, and a lander, called 'Philae', carrying 10 additional instruments. Rosetta was launched on 2 March 2004, and arrived at the comet 67P/Churyumov-Gerasimenko on 6 August 2014. During its long journey, Rosetta has completed flybys of the Earth and Mars, and made two excursions to the main asteroid belt to observe (2867) Steins and (21) Lutetia. On 12 November 2014, the Philae probe soft landed on comet 67P/Churyumov-Gerasimenko, the first time in history that such an extraordinary feat has been achieved. After the landing, the Rosetta orbiter followed the comet through its perihelion in August 2015, and will continue to accompany 67P/Churyumov-Gerasimenko as it recedes from the Sun until the end of the mission. There are significant challenges in managing the science archive of a mission such as Rosetta. The first data were returned from Rosetta more than 10 years ago, and there have been flybys of several planetary bodies, including two asteroids from which significant science data were returned by many of the instruments. The scientific applications for these flyby data can be very different to those taken during the main science phase at the comet, but there are severe limitations on the changes that can be applied to the data pipelines managed by the various science teams as resources are scarce. The priority is clearly on maximising the potential science from the comet phase, so data formats and pipelines have been designed with that in mind, and changes limited to managing issues found during official archiving authority and independent science reviews. In addition, in the time that Rosetta has been operating, the archiving standards themselves have evolved. All Rosetta data are archived following version 3 of NASA's Planetary Data System (PDS) Standards. Currently, new and upcoming planetary science missions are delivering data following the new 'PDS4' standards, which are using a very different format and require significant changes to the archive itself to manage. There are no plans at ESA to convert the data to PDS4 formats, but the community may need this to be completed in the long term if we are to realise the full scientific potential of the mission. There is a Memorandum of Understanding between ESA and NASA that commits to there being a full copy of the Rosetta science data holdings both within the Planetary Science Archive (PSA) at ESA and with NASA's Planetary Data System, at the Small Bodies Node (SBN) in Maryland. The requirements from each archiving authority place sometimes contradictory restrictions on the formatting and structure of the data content, and there has also been a significant evolution of the archives on both sides of the Atlantic. The SBN have themselves expressed a desire to 'convert' the Rosetta data to PDS4 formats, so this will need to be carefully managed between the archiving authorities to ensure consistency in the Rosetta archive overall. Validation of the returned data to ensure full compliance with both the PSA and the PDS archives has required the development of a specific tool (DVal) that can be configured to manage the specificities of each instrument team's science data. Unlike the PDS, which comprises an affiliation of 'nodes', each specialising in a planetary science discipline, the PSA is a single archive designed to host data from all of ESA's planetary science missions.
There have been significant challenges in evolving the archive to meet Rosetta's needs as a long-term project, without compromising the service provided to the other ongoing missions. Partly in response to this, the PSA is currently implementing a number of significant changes, both to its web-based interface to the scientific community, and to its database structure. The newly designed PSA will aim to provide easier and more direct access to the Rosetta data (and all of ESA's planetary science data holdings), and will help to soften the impact of some of the issues that have arisen with managing missions such as Rosetta in the existing framework. Conclusions: Development and management of the Rosetta science archive has been a significant challenge, due in part to the long duration of the mission and the corresponding need for development of the archive infrastructure and of the archiving process to manage these changes. The definition of a single set of conventions to manage the diverse suite of instruments, targets and indeed archiving authorities on Rosetta over this time has been a major issue, as has the need to evolve the validation processes that allow the data to be fully ingested and released to the community. This presentation will discuss the many issues faced by the PSA in the archiving of data from Rosetta, and the approach taken to resolve them. Lessons learned will be presented along with recommendations for other archiving authorities who will in future have the need to design and operate a science archive for long duration and international missions.
Preservation and Access to Manuscript Collections of the Czech National Library.
ERIC Educational Resources Information Center
Karen, Vladimir; Psohlavec, Stanislav
In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. HTML coded description is added to the entire…
NASA Astrophysics Data System (ADS)
Scheers, B.; Bloemen, S.; Mühleisen, H.; Schellart, P.; van Elteren, A.; Kersten, M.; Groot, P. J.
2018-04-01
Coming high-cadence wide-field optical telescopes will image hundreds of thousands of sources per minute. Besides inspecting the near real-time data streams for transient and variability events, the accumulated data archive is a wealthy laboratory for making complementary scientific discoveries. The goal of this work is to optimise column-oriented database techniques to enable the construction of a full-source and light-curve database for large-scale surveys that is accessible by the astronomical community. We adopted LOFAR's Transients Pipeline as the baseline and modified it to enable the processing of optical images that have much higher source densities. The pipeline adds new source lists to the archive database, while cross-matching them with the known catalogued sources in order to build a full light-curve archive. We investigated several techniques of indexing and partitioning the largest tables, allowing for faster positional source look-ups in the cross-matching algorithms. We monitored all query run times in long-term pipeline runs in which we processed a subset of IPHAS data with image source density peaks over 170,000 per field of view (500,000 deg^-2). Our analysis demonstrates that horizontal table partitions of one-degree declination widths control the query run times. Use of an index strategy in which the partitions are densely sorted according to source declination yields a further improvement. Most queries run in sublinear time and a few (< 20%) run in linear time, because of dependencies on input source-list and result-set size. We observed that, for this logical database partitioning schema, the limiting cadence achieved by the pipeline when processing IPHAS data is 25 s.
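The declination-strip partitioning described above can be illustrated with a small in-memory sketch: sources are grouped into one-degree declination zones kept sorted by declination, so a positional look-up scans only a few short, sorted strips. The names below are illustrative; the actual pipeline realizes this as horizontal table partitions and sorted indexes in a column-oriented database.

```python
import bisect, math
from collections import defaultdict

zones = defaultdict(list)   # zone number -> [(dec, ra, source_id)], kept sorted by dec

def add_source(source_id, ra, dec):
    zone = math.floor(dec) + 90            # one-degree declination strips
    bisect.insort(zones[zone], (dec, ra, source_id))

def candidates(ra, dec, radius):
    """Coarse positional filter: sources whose declination lies in the search band."""
    hits = []
    for zone in range(math.floor(dec - radius) + 90, math.floor(dec + radius) + 91):
        strip = zones[zone]
        lo = bisect.bisect_left(strip, (dec - radius,))
        hi = bisect.bisect_right(strip, (dec + radius,))
        hits.extend(strip[lo:hi])          # a full match still checks RA and distance
    return hits

add_source(1, 120.0, 30.2)
add_source(2, 121.0, 31.5)
print(candidates(120.1, 30.25, 0.1))       # -> [(30.2, 120.0, 1)]
```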
Mars Observer data production, transfer, and archival: The data production assembly line
NASA Technical Reports Server (NTRS)
Childs, David B.
1993-01-01
This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.
NASA Astrophysics Data System (ADS)
Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles
2015-04-01
In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high-level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are: • Management processes that govern the operation of an archive. These include archival governance (preservation state management, selection of archival candidates and strategic management). • Operational processes that constitute the core activities of the archive and maintain the value of research assets: acquisition, ingestion, deletion, generation of metadata and preservation activities. • Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We discuss the following core model entities and their relationships: • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. • Risk entities, which act as drivers for change within the data lifecycle; these include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks. • Plan entities, which detail the actions to bring about change within an archive; these include Acquisition Plans, Preservation Plans and Monitoring Plans. • Result entities, which describe the successful outcomes of the executed plans; these include Acquisitions, Mitigations and Accepted Risks.
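As a concrete reading of the information model outlined above, the following Python dataclasses sketch the listed entity types and their relationships. The field names are illustrative interpretations, not the CEDA schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PreservationObjective:       # aspirational entity
    description: str

@dataclass
class DataEntity:                  # aspirational entity
    name: str
    objectives: List[PreservationObjective] = field(default_factory=list)

@dataclass
class Risk:                        # driver for change within the data lifecycle
    kind: str                      # e.g. "Technical", "Acquisitional", "Strategic"
    affects: DataEntity
    description: str

@dataclass
class Plan:                        # action to bring about change within the archive
    kind: str                      # e.g. "Acquisition", "Preservation", "Monitoring"
    mitigates: Risk

@dataclass
class Result:                      # successful outcome of an executed plan
    plan: Plan
    outcome: str                   # e.g. "Acquisition", "Mitigation", "Accepted Risk"
```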
12. Photocopy of photograph (original in Langley Research Center Archives, ...
12. Photocopy of photograph (original in Langley Research Center Archives, Hampton, VA LaRC) (L4496) AERIAL VIEW OF FULL-SCALE WIND TUNNEL UNDER CONSTRUCTION; c. 1930. NOTE SEAPLANE TOWING CHANNEL STRUCTURE IN BACKGROUND. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
Making SAR Data Accessible - ASF's ALOS PALSAR Radiometric Terrain Correction Project
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Arko, S. A.; Gens, R.
2015-12-01
While SAR data have proven valuable for a wide range of geophysical research questions, so far largely only the SAR-educated science communities have been able to fully exploit the information content of internationally available SAR archives. The main issues that have been preventing a more widespread utilization of SAR are related to (1) the diversity and complexity of SAR data formats, (2) the complexity of the processing flows needed to extract geophysical information from SAR, (3) the lack of standardization and automation of these processing flows, and (4) the often ignored geocoding procedures, which leave the data in image coordinate space. In order to improve upon this situation, ASF's radiometric terrain correction (RTC) project is generating uniformly formatted and easily accessible value-added products from the ASF Distributed Active Archive Center's (DAAC) five-year archive of JAXA's ALOS PALSAR sensor. Specifically, the project applies geometric and radiometric corrections to SAR data to allow for an easy and direct combination of obliquely acquired SAR data with remote sensing imagery acquired in nadir observation geometries. Finally, the value-added data are provided to the user in the broadly accepted GeoTIFF format, in order to support the easy integration of SAR data into GIS environments. The goal of ASF's RTC project is to make SAR data more accessible and more attractive to the broader SAR applications community, especially to those users that currently have limited SAR expertise. Production of RTC products commenced in October 2014 and will conclude late in 2015. As of July 2015, processing of 71% of ASF's ALOS PALSAR archive was complete. Adding to the utility of this dataset are recent changes to the data access policy that allow the full-resolution RTC products to be provided to the public without restriction. In this paper we will introduce the processing flow that was developed for the RTC project and summarize the calibration and validation procedures that were implemented to determine and monitor system performance. The paper will also show the current progress of RTC processing, provide examples of generated data sets, and demonstrate the benefit of the RTC archives for applications such as land-use classification and change detection.
The Rosetta Science Archive: Status and Plans for Enhancing the Archive Content
NASA Astrophysics Data System (ADS)
Heather, David; Barthelemy, Maud; Besse, Sebastien; Fraga, Diego; Grotheer, Emmanuel; O'Rourke, Laurence; Taylor, Matthew; Vallat, Claire
2017-04-01
On 30 September 2016, Rosetta completed its incredible mission by landing on the surface of Comet 67P/Churyumov-Gerasimenko. Although this marked an end to the spacecraft's active operations, intensive work is still ongoing with instrument teams preparing their final science data deliveries for ingestion into ESA's Planetary Science Archive (PSA). In addition, ESA is establishing contracts with some instrument teams to enhance their data and documentation in an effort to provide the best long-term archive possible for the Rosetta mission. Currently, the majority of teams have delivered all of their data from the nominal mission (end of 2015), and are working on their remaining increments from the 1-year mission extension. The aim is to complete the nominal archiving with data from the complete mission by the end of this year, when a full mission archive review will be held. This review will assess the complete data holdings from Rosetta and ensure that the archive is ready for the long-term. With the resources from the operational mission coming to an end, ESA has established a number of 'enhanced archiving' contracts to ensure that the best possible data are delivered to the archive before instrument teams disband. Updates are focused on key aspects of an instrument's calibration or the production of higher level data / information, and are therefore specific to each instrument's needs. These contracts are currently being kicked off, and will run for various lengths depending upon the activities to be undertaken. The full 'archive enhancement' process will run until September 2019, when the post operations activities for Rosetta will end. Within these contracts, most instrument teams will work on providing a Science User Guide for their data, as well as updating calibrations. Several teams will also be delivering higher level and derived products. For example, the VIRTIS team will be updating both their spectral and geometrical calibrations, and will aim to deliver mapping products to the final archive. Similarly, the OSIRIS team will be improving their calibrations and delivering data additionally in FITS format. The Rosetta Plasma Consortium (RPC) instruments will complete cross-calibrations and a number of activities individual to each instrument. The MIDAS team will also be working on cross-calibrations and will produce a dust particle catalog from the comet coma. GIADA will be producing dust environment maps, with products in 3D plus time. A contract also exists to produce and deliver data set(s) containing supporting ground-based observations from amateur astronomers. In addition to these contracts, the Rosetta ESA archiving team will produce calibrated data sets for the NAVCAM instrument, and will work to include the latest shape models from the comet into the final Rosetta archive. Work is also underway to provide a centralized solution to the problem of geometry on the comet. This presentation will outline the current status of the Rosetta archive, as well as highlighting some of the 'enhanced archiving' activities planned with the various instrument teams on Rosetta.
NASA Astrophysics Data System (ADS)
Kontoes, Charalampos; Papoutsis, Ioannis; Herekakis, Themistoklis; Michail, Dimitrios; Ieronymidi, Emmanuela
2013-04-01
Remote sensing tools for the accurate, robust and timely assessment of the damages inflicted by forest wildfires provide information that is of paramount importance to public environmental agencies and related stakeholders before, during and after the crisis. The Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing of the National Observatory of Athens (IAASARS/NOA) has developed a fully automatic single and/or multi-date processing chain that takes as input archived Landsat 4, 5 or 7 raw images and produces precise diachronic burnt area polygons and damage assessments over the Greek territory. The methodology consists of three fully automatic stages: 1) the pre-processing stage, where the metadata of the raw images are extracted, followed by the application of the LEDAPS software platform for calibration and mask production and the Automated Precise Orthorectification Package, developed by NASA, for image geo-registration and orthorectification; 2) the core BSM (Burn Scar Mapping) processing stage, which incorporates a published classification algorithm based on a series of physical indexes, the application of two filters for noise removal using graph-based techniques, and the grouping of pixels classified as burnt into appropriate pixel clusters before conversion from raster to vector; and 3) the post-processing stage, where the products are thematically refined and enriched using auxiliary GIS layers (underlying land cover/use, administrative boundaries, etc.) and human logic/evidence to suppress false alarms and omission errors. The established processing chain has been successfully applied to the entire archive of Landsat imagery over Greece spanning from 1984 to 2012, which has been collected and managed at IAASARS/NOA. The number of full Landsat frames processed in the framework of the study was 415. These burn scar mapping products are generated for the first time at such a temporal and spatial extent and are ideal for use in further environmental time series analyses, production of statistical indexes (frequency, geographical distribution and number of fires per prefecture) and applications including change detection and climate change models, urban planning, correlation with man-made activities, etc.
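As an illustration of the kind of physical index such a classification can build on, the sketch below computes the Normalized Burn Ratio (NBR) from Landsat near-infrared and shortwave-infrared bands and derives a multi-date burn mask from its difference (dNBR). The published BSM algorithm combines several indexes and adds filtering and clustering stages, so this is a simplified stand-in, and the threshold value is an illustrative assumption.

```python
import numpy as np

def nbr(nir, swir):
    """NBR = (NIR - SWIR) / (NIR + SWIR), computed per pixel."""
    nir, swir = nir.astype(float), swir.astype(float)
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)

def burn_mask(nir_pre, swir_pre, nir_post, swir_post, threshold=0.27):
    """Multi-date differencing: dNBR = NBR_pre - NBR_post; high dNBR = burnt.

    The threshold here is an illustrative choice, not the BSM chain's value.
    """
    dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    return dnbr > threshold   # raster mask; a full chain would then filter,
                              # cluster, and vectorise the clusters to polygons
```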
Bent, John M.; Faibish, Sorin; Grider, Gary
2015-06-30
Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
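A conceptual sketch of this middleware pattern, not actual PLFS or cloud-vendor code: archived files are chunked into objects with a log-structured index that maps byte ranges back to objects, and caller-supplied callables stand in for the object-store API.

```python
import os

CHUNK = 4 * 1024 * 1024   # 4 MiB objects, an arbitrary choice for this sketch

def file_to_objects(path):
    """Yield (object_key, bytes, index_record) for one archived file."""
    offset = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            key = f"{os.path.basename(path)}/{offset}"
            yield key, chunk, {"file": path, "offset": offset, "length": len(chunk)}
            offset += len(chunk)

def archive_to_cloud(paths, put_object, put_index):
    """put_object/put_index are caller-supplied callables for the object store."""
    index = []
    for path in paths:
        for key, data, record in file_to_objects(path):
            put_object(key, data)
            index.append(record)
    put_index(index)   # the log-structured index maps byte ranges back to objects
```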
Status of the TESS Science Processing Operations Center
NASA Astrophysics Data System (ADS)
Jenkins, Jon Michael; Caldwell, Douglas A.; Davies, Misty; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill
2018-06-01
The Transiting Exoplanet Survey Satellite (TESS) was selected by NASA’s Explorer Program to conduct a search for Earth’s closest cousins starting in 2018. TESS will conduct an all-sky transit survey of F, G and K dwarf stars between 4 and 12 magnitudes and M dwarf stars within 200 light years. TESS is expected to discover 1,000 small planets less than twice the size of Earth, and to measure the masses of at least 50 of these small worlds. The TESS science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler science pipeline. Like the Kepler pipeline, the TESS pipeline provides calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline searches through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline generates a suite of diagnostic metrics for each transit-like signature, and then extracts planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search are modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST. Synthetic sample data products are available at https://archive.stsci.edu/tess/ete-6.html. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
Do, Bao H; Wu, Andrew; Biswal, Sandip; Kamaya, Aya; Rubin, Daniel L
2010-11-01
Storing and retrieving radiology cases is an important activity for education and clinical research, but this process can be time-consuming. In the process of structuring reports and images into organized teaching files, incidental pathologic conditions not pertinent to the primary teaching point can be omitted, as when a user saves images of an aortic dissection case but disregards the incidental osteoid osteoma. An alternate strategy for identifying teaching cases is text search of reports in radiology information systems (RIS), but retrieved reports are unstructured, teaching-related content is not highlighted, and patient identifying information is not removed. Furthermore, searching unstructured reports requires sophisticated retrieval methods to achieve useful results. An open-source, RadLex®-compatible teaching file solution called RADTF, which uses natural language processing (NLP) methods to process radiology reports, was developed to create a searchable teaching resource from the RIS and the picture archiving and communication system (PACS). The NLP system extracts and de-identifies teaching-relevant statements from full reports to generate a stand-alone database, thus converting existing RIS archives into an on-demand source of teaching material. Using RADTF, the authors generated a semantic search-enabled, Web-based radiology archive containing over 700,000 cases with millions of images. RADTF combines a compact representation of the teaching-relevant content in radiology reports and a versatile search engine with the scale of the entire RIS-PACS collection of case material.
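As a toy illustration of the de-identification step described above, the sketch below strips a few identifier patterns from report text with regular expressions. RADTF's actual NLP pipeline is considerably more sophisticated; the patterns here are illustrative assumptions only.

```python
import re

# Illustrative identifier patterns; a real de-identifier covers many more cases.
PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),       # dates
    (re.compile(r"\bMRN[:\s]*\d+\b", re.I), "[MRN]"),             # record numbers
    (re.compile(r"\b(Dr|Mr|Mrs|Ms)\.?\s+[A-Z][a-z]+\b"), "[NAME]"),
]

def deidentify(report_text):
    """Replace identifier-like spans with placeholder tokens."""
    for pattern, token in PATTERNS:
        report_text = pattern.sub(token, report_text)
    return report_text

print(deidentify("Seen by Dr. Smith on 3/14/2009, MRN: 123456. Osteoid osteoma."))
```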
Archiving Mars Mission Data Sets with the Planetary Data System
NASA Technical Reports Server (NTRS)
Guinness, Edward A.
2006-01-01
This viewgraph presentation reviews the use of the Planetary Data System (PDS) to archive the datasets received from Mars missions. It presents an overview of the archiving process and reviews lessons learned from the perspectives of the projects, the data producers, and the data users.
Minimal Processing: Its Context and Influence in the Archival Community
ERIC Educational Resources Information Center
Gorzalski, Matt
2008-01-01
Since its publication in 2005, Mark A. Greene and Dennis Meissner's "More Product, Less Process: Revamping Traditional Archival Processing" has led to much discussion and self-examination within the archival community about working through backlogs. This article discusses the impact of Greene and Meissner's work and considers the questions and…
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids in supporting the decision-making process in water resources engineering and design problems. One of the main goals of solving an MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution at each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions that already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions for solving mathematical test problems and hydrologic model calibration problems.
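A minimal sketch of the rounding-based alternative described above (not the authors' code; the precision vector and test points are invented): objectives are quantized to the desired precision before a standard non-dominated archive update, so solutions that coincide at that precision collapse to a single entry.

```python
# Rounding-based archive update for minimization (illustrative sketch only).
import numpy as np

PRECISION = np.array([0.01, 0.1])   # assumed desired precision per objective

def dominates(a, b):
    """Pareto dominance for minimization."""
    return bool(np.all(a <= b) and np.any(a < b))

def update_archive(archive, f):
    f = np.round(np.asarray(f) / PRECISION) * PRECISION   # round first
    if any(dominates(a, f) or np.array_equal(a, f) for a in archive):
        return archive               # dominated or duplicate at this precision
    return [a for a in archive if not dominates(f, a)] + [f]

archive = []
for sol in [(0.013, 0.42), (0.017, 0.44), (0.251, 0.18)]:
    archive = update_archive(archive, sol)
# (0.013, 0.42) -> (0.01, 0.4); (0.017, 0.44) rounds to (0.02, 0.4) and is
# dominated by it; (0.251, 0.18) -> (0.25, 0.2) is non-dominated and kept.
print(archive)
```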
New Processing of Spaceborne Imaging Radar-C (SIR-C) Data
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Gracheva, V.; Arko, S. A.; Labelle-Hamer, A. L.
2017-12-01
The Spaceborne Imaging Radar-C (SIR-C) was a radar system, which successfully operated on two separate shuttle missions in April and October 1994. During these two missions, a total of 143 hours of radar data were recorded. SIR-C was the first multifrequency and polarimetric spaceborne radar system, operating in dual frequency (L- and C-band) and with quad-polarization. SIR-C had a variety of different operating modes, which are innovative even from today's point of view. Depending on the mode, it was possible to acquire data with different polarizations and carrier frequency combinations. Additionally, different swaths and bandwidths could be used during the data collection, and it was possible to receive data with two antennas in the along-track direction. The United States Geological Survey (USGS) distributes the synthetic aperture radar (SAR) images as single-look complex (SLC) and multi-look complex (MLC) products. Unfortunately, since June 2005 the SIR-C processor has been inoperable and not repairable. All acquired SLC and MLC images were processed with a coarse resolution of 100 m with the goal of generating a quick look. These images are, however, not well suited for scientific analysis. Only a small percentage of the acquired data has been processed as full-resolution SAR images, and the unprocessed high-resolution data cannot currently be processed at all. At the Alaska Satellite Facility (ASF) a new processor was developed to process binary SIR-C data to full-resolution SAR images. ASF is planning to process the entire recoverable SIR-C archive to full-resolution SLCs, MLCs and high-resolution geocoded image products. ASF will make these products available to the science community through their existing data archiving and distribution system. The final paper will describe the new processor and analyze the challenges of reprocessing the SIR-C data.
Ethics and Truth in Archival Research
ERIC Educational Resources Information Center
Tesar, Marek
2015-01-01
The complexities of the ethics and truth in archival research are often unrecognised or invisible in educational research. This paper complicates the process of collecting data in the archives, as it problematises notions of ethics and truth in the archives. The archival research took place in the former Czechoslovakia and its turbulent political…
77 FR 35430 - Advisory Committee on the Records of Congress; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... full range of programs, policies, and plans for the Center for Legislative Archives in the Office of Legislative Archives, Presidential Libraries, and Museum Services (LPM). The meeting is open to the public...
78 FR 71672 - Advisory Committee on the Records of Congress; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... advises NARA on the full range of programs, policies, and plans for the Center for Legislative Archives in the Office of Legislative Archives, Presidential Libraries, and Museum Services (LPM). DATES: The...
Lunar Data Node: Apollo Data Restoration and Archiving Update
NASA Technical Reports Server (NTRS)
Williams, David R.; Hills, Howard K.; Guinness, Edward A.; Taylor, Patrick T.; McBride, Marie Julia
2013-01-01
The Lunar Data Node (LDN) of the Planetary Data System (PDS) is responsible for the restoration and archiving of Apollo data. The LDN is located at the National Space Science Data Center (NSSDC), which holds much of the extant Apollo data on microfilm, microfiche, hard-copy documents, and magnetic tapes in older formats. The goal of the restoration effort is to convert the data into user-accessible PDS formats, create a full set of explanatory supporting data (metadata), archive the full data sets through PDS, and post the data online at the PDS Geosciences Node. This will both enable easy use of the data by current researchers and ensure that the data and metadata are securely preserved for future use. We are also attempting to locate and preserve Apollo data which were never archived at NSSDC. We will give a progress report on the data sets we have been restoring and future work.
Going, going, still there: using the WebCite service to permanently archive cited Web pages.
Eysenbach, Gunther
2006-01-01
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page.
The Kanzelhöhe Online Data Archive
NASA Astrophysics Data System (ADS)
Pötzi, W.; Hirtenfellner-Polanec, W.; Temmer, M.
The Kanzelhöhe Observatory provides high-cadence full-disk observations of solar activity phenomena like sunspots, flares and prominence eruptions on a regular basis. The data are available for download from KODA (the Kanzelhöhe Observatory Data Archive), which is freely accessible. The archive offers sunspot drawings back to 1950 and high-cadence H-α data back to 1973. Images from other instruments, like white-light and Ca II K, have been available since 2007 and 2010, respectively. In the following we describe how to access the archive and the format of the data.
Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle
NASA Technical Reports Server (NTRS)
Grosvenor, Sandy; Jones, Jeremy; Koratkar, Anuradha; Li, Connie; Mackey, Jennifer; Neher, Ken; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations more efficiently. The full-circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper examines the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what its successes and challenges have been.
Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)
2001-01-01
A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations. The full-circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper will examine the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what its successes and challenges have been.
Mission Exploitation Platform PROBA-V
NASA Astrophysics Data System (ADS)
Goor, Erwin
2016-04-01
VITO and partners developed an end-to-end solution to drastically improve the exploitation of the PROBA-V EO-data archive (http://proba-v.vgt.vito.be/), the past mission SPOT-VEGETATION and derived vegetation parameters by researchers, service providers and end-users. The analysis of time series of data (+1PB) is addressed, as well as the large-scale on-demand processing of near real-time data. From November 2015 an operational Mission Exploitation Platform (MEP) PROBA-V, as an ESA pathfinder project, will be gradually deployed at the VITO data center with direct access to the complete data archive. Several applications will be released to the users, e.g.:
- A time series viewer, showing the evolution of PROBA-V bands and derived vegetation parameters for any area of interest.
- Full-resolution viewing services for the complete data archive.
- On-demand processing chains, e.g. for the calculation of N-daily composites.
- A Virtual Machine with access to the data archive and tools to work with these data, e.g. various toolboxes and support for R and Python.
After an initial release in January 2016, a research platform will gradually be deployed allowing users to design, debug and test applications on the platform. From the MEP PROBA-V, access to Sentinel-2 and Landsat data will be addressed as well, e.g. to support the Cal/Val activities of the users. Users can make use of powerful web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components. The impact of this MEP on the user community will be high and will completely change the way of working with the data, and hence open the large time series to a larger community of users. The presentation will address these benefits for the users and discuss the technical challenges in implementing this MEP.
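As a toy illustration of the N-daily compositing mentioned in the processing-chain bullet above, the following NumPy sketch builds per-pixel maximum composites over 10-day windows; the real MEP chains run distributed on Hadoop/Spark over the full archive, and the data here are random placeholders.

```python
# Toy N-daily compositing step: per-pixel maximum over 10-day windows.
import numpy as np

daily = np.random.rand(30, 256, 256)          # 30 daily tiles (placeholder data)
n = 10
composites = [daily[i:i + n].max(axis=0)      # e.g. a max-value composite
              for i in range(0, daily.shape[0], n)]
print(len(composites), composites[0].shape)   # 3 composites of 256x256
```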
Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center
Jones, B.; Tolk, B.; ,
2002-01-01
The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER collects broad spectral coverage with high spatial resolution at near infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.
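One of the on-demand products listed above, the decorrelation stretch, has a compact classical formulation: rotate the bands into principal-component space, equalize the variances, and rotate back. The following NumPy sketch (simplified: no contrast clipping or per-band target scaling, and not the LP DAAC's actual implementation) shows the idea.

```python
# Simplified decorrelation stretch: whiten in PC space, restore overall scale.
import numpy as np

def decorrelation_stretch(cube):
    """cube: (bands, rows, cols) float array."""
    bands, rows, cols = cube.shape
    x = cube.reshape(bands, -1)
    mean = x.mean(axis=1, keepdims=True)
    cov = np.cov(x)
    evals, evecs = np.linalg.eigh(cov)
    # whitening transform built from the eigen-decomposition of the covariance
    stretch = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    y = np.sqrt(np.diag(cov).mean()) * (stretch @ (x - mean)) + mean
    return y.reshape(bands, rows, cols)

scene = np.random.rand(3, 128, 128)   # placeholder 3-band scene
out = decorrelation_stretch(scene)
```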
Land processes distributed active archive center product lifecycle plan
Daucsavage, John C.; Bennett, Stacie D.
2014-01-01
The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.
NASA Astrophysics Data System (ADS)
Holmdahl, P. E.; Ellis, A. B. E.; Moeller-Olsen, P.; Ringgaard, J. P.
1981-12-01
The basic requirements of the SAR ground segment of ERS-1 are discussed. A system configuration for the real time data acquisition station and the processing and archive facility is depicted. The functions of a typical SAR processing unit (SPU) are specified, and inputs required for near real time and full precision, deferred time processing are described. Inputs and the processing required for provision of these inputs to the SPU are dealt with. Data flow through the systems, and normal and nonnormal operational sequence, are outlined. Prerequisites for maintaining overall performance are identified, emphasizing quality control. The most demanding tasks to be performed by the front end are defined in order to determine types of processors and peripherals which comply with throughput requirements.
Practical holography III; Proceedings of the Meeting, Los Angeles, CA, Jan. 17, 18, 1989
NASA Astrophysics Data System (ADS)
Benton, Stephen A.
Various papers on practical holography are presented. Individual topics addressed include: design of large format commercial display holograms, design of a one-step full-color holographic recording system, color reflection holography, full color rainbow hologram using a photoresist plate, secondary effects in processing holograms, archival properties of holograms, survey of properties of volume holographic materials, image stability of DMP-128 holograms, activation monitor for DMP-128, microwave drying effects on dichromated gelatin holograms, sensitization process of dichromated gelatin, holographic optics for vision systems, holographic fingerprint sensor, cross-talk and cross-coupling in multiplexed holographic gratings, compact illuminators for transmission holograms, solar holoconcentrators in dichromated grains, three-dimensional display of scientific data, holographic liquid crystal displays, in situ swelling for holographic color control.
VLBA Archive &Distribution Architecture
NASA Astrophysics Data System (ADS)
Wells, D. C.
1994-01-01
Signals from the 10 antennas of NRAO's VLBA [Very Long Baseline Array] are processed by a Correlator. The complex fringe visibilities produced by the Correlator are archived on magnetic cartridges using a low-cost architecture which is capable of scaling and evolving. Archive files are copied to magnetic media to be distributed to users in FITS format, using the BINTABLE extension. Archive files are labelled using SQL INSERT statements, in order to bind the DBMS-based archive catalog to the archive media.
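The BINTABLE distribution format and SQL-labelled catalog described above lend themselves to a small illustration. In the Python sketch below, the table layout, column names and catalog schema are invented for illustration, not NRAO's actual ones: visibilities are written to a FITS binary table with astropy, and the kind of INSERT statement that could bind the file to a DBMS catalog is printed.

```python
# Write fringe visibilities to a FITS BINTABLE, then emit a catalog label.
# Table/column names and the catalog schema are illustrative assumptions.
import numpy as np
from astropy.io import fits

time = fits.Column(name="TIME", format="D", array=np.linspace(0, 1, 100))
vis = fits.Column(name="VISIBILITY", format="M",          # complex double
                  array=np.ones(100, dtype=complex))
hdu = fits.BinTableHDU.from_columns([time, vis], name="UV_DATA")
hdu.writeto("correlator_output.fits", overwrite=True)

label = ("INSERT INTO archive_catalog (file_name, n_rows, media_id) "
         "VALUES ('correlator_output.fits', 100, 'CART-0042');")
print(label)
```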
The Ethics of Archival Research
ERIC Educational Resources Information Center
McKee, Heidi A.; Porter, James E.
2012-01-01
What are the key ethical issues involved in conducting archival research? Based on examination of cases and interviews with leading archival researchers in composition, this article discusses several ethical questions and offers a heuristic to guide ethical decision making. Key to this process is recognizing the person-ness of archival materials.…
Meeting Students Where They Are: Advancing a Theory and Practice of Archives in the Classroom
ERIC Educational Resources Information Center
Saidy, Christina; Hannah, Mark; Sura, Tom
2011-01-01
This article uses theories of technical communication and archives to advance a pedagogy that includes archival production in the technical communication classroom. By developing and maintaining local classroom archives, students directly engage in valuable processes of appraisal, selection, collaboration, and retention. The anticipated outcomes…
NASA Astrophysics Data System (ADS)
Martinez, E.; Glassy, J. M.; Fowler, D. K.; Khayat, M.; Olding, S. W.
2014-12-01
The NASA Earth Science Data Systems Working Groups (ESDSWG) focus on improving technologies and processes related to science discovery and preservation. One particular group, the Data Preservation Practices group, is defining a set of guidelines to aid data providers in planning both what to submit for archival and when to submit artifacts, so that the archival process can begin early in a project's life cycle. This has the benefit of leveraging knowledge within the project before staff roll off to other work. In this poster we describe various project archival use cases and identify possible archival life cycles that map closely to the pace and flow of work. To understand "archival life cycles", i.e., distinct project phases that produce archival artifacts such as instrument capabilities, calibration reports, and science data products, the working group initially mapped the archival requirements defined in the Preservation Content Specification to the typical NASA project life cycle. As described in the poster, this work resulted in a well-defined archival life cycle, but only for some types of projects; it did not fit well for the condensed project life cycles experienced within airborne and balloon campaigns. To understand the archival process for projects with compressed cycles, the working group gathered use cases from various communities. This poster will describe selected use cases that provided insight into the unique flow of these projects, as well as proposing archival life cycles that map artifacts to projects with compressed timelines. Finally, the poster will conclude with some early recommendations for data providers, which will be captured in a formal Guidelines document - to be published in 2015.
Preservation and Enhancement of the Spacewatch Data Archives
NASA Technical Reports Server (NTRS)
Larsen, Jeffrey A.
2003-01-01
In March of 1998, the asteroid 1997 XF11 was announced to be potentially hazardous after being tracked over 90 days. A potential two-year wait for confirming observations was shortened to under 24 hours because of the existence of archived photographic prediscovery images. Spacewatch was a pioneer in using CCD scanning and possesses a valuable digital archive of its scans. Unfortunately these data are aging on magnetic tape and will soon be lost. Since 1990, the Spacewatch project gathered some 1.5 Terabytes of scan data covering roughly 75,000 degrees of sky to a limiting magnitude of V = 21.5. The data have not yet been mined for all of their asteroids for scientific studies and orbit determination. Spacewatch's real-time motion detection program MODP was constrained by the computers of the era to use simplified image processing algorithms at a reduced efficiency. Jedicke and Herron estimated MODP's efficiency at finding asteroids to be approximately 60 percent to V=18 and improving somewhat thereafter. This led to a substantial bias correction in their analyses. Larsen has developed a MODP replacement capable of better than 90 percent efficiency over the same range and able to push a magnitude fainter in completeness. We propose a program of post-processing and re-archiving Spacewatch data. Our scans would be transferred from tape to CD-ROMs and converted to FITS images -- establishing a consistent data format and media for both past and future Spacewatch observations. Larsen's MODP replacement would mine these data for previously undetected motions, which would be made available to the Minor Planet Center and our ongoing asteroid population studies. A searchable observation record would be made generally available for prediscovery work. We estimate the net asteroid yield of this proposal is equivalent to three full years of Spacewatch operations.
NASA Astrophysics Data System (ADS)
Bonano, Manuela; Buonanno, Sabatino; Ojha, Chandrakanta; Berardino, Paolo; Lanari, Riccardo; Zeni, Giovanni; Manunta, Michele
2017-04-01
The advanced DInSAR technique referred to as the Small BAseline Subset (SBAS) algorithm has already amply demonstrated its effectiveness in carrying out multi-scale and multi-platform surface deformation analyses relevant to both natural and man-made hazards. Thanks to its capability to generate displacement maps and long-term deformation time series at both regional (low-resolution analysis) and local (full-resolution analysis) spatial scales, it makes it possible to gain more insight into the spatial and temporal patterns of localized displacements relevant to single buildings and infrastructures over extended urban areas, with a key role in supporting risk mitigation and preservation activities. The extensive application of the multi-scale SBAS-DInSAR approach in many scientific contexts has gone hand in hand with new SAR satellite mission development, characterized by different frequency bands, spatial resolutions, revisit times and ground coverage. This has led to huge DInSAR data stacks that must be efficiently handled, processed and archived, with a strong impact on both the data storage and the computational requirements needed for generating the full-resolution SBAS-DInSAR results. Accordingly, innovative and effective solutions for the automatic processing of massive SAR data archives and for the operational management of the derived SBAS-DInSAR products need to be designed and implemented, exploiting the high efficiency (in terms of portability, scalability and computing performance) of new ICT methodologies. In this work, we present a novel parallel implementation of the full-resolution SBAS-DInSAR processing chain, aimed at investigating localized displacements affecting single buildings and infrastructures over very large urban areas, relying on parallelization strategies with different granularity levels. The image granularity level is applied in most steps of the SBAS-DInSAR processing chain and exploits multiprocessor systems with distributed memory. Moreover, in some computationally heavy processing steps, Graphics Processing Units (GPUs) are exploited to process blocks on a pixel-by-pixel basis, requiring substantial modifications to some key parts of the sequential full-resolution SBAS-DInSAR processing chain. GPU processing is implemented by efficiently exploiting parallel processing architectures (such as CUDA) to increase computing performance, in terms of optimizing the available GPU memory, as well as reducing the Input/Output operations on the GPU and the overall processing time of specific blocks with respect to the corresponding sequential implementation, which is particularly critical in the presence of huge DInSAR datasets. Moreover, to efficiently handle the massive amount of DInSAR measurements provided by the new-generation SAR constellations (CSK and Sentinel-1), we adopt a re-design strategy aimed at the robust assimilation of the full-resolution SBAS-DInSAR results into the web-based GeoNode platform of the Spatial Data Infrastructure, thus allowing the efficient management, analysis and integration of the interferometric results with different data sources.
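The granularity-level idea above can be caricatured in a few lines. The sketch below is a toy stand-in using Python multiprocessing over pixel blocks; the actual chain uses distributed-memory systems plus CUDA kernels, and the per-pixel "inversion" here is a placeholder reduction, not the SBAS inversion.

```python
# Toy block-granularity parallelism over an interferogram stack.
import numpy as np
from multiprocessing import Pool

def invert_block(block):
    # placeholder per-pixel time-series step: (n_ifg, n_pix) -> (n_pix,)
    return block.mean(axis=0)

if __name__ == "__main__":
    stack = np.random.rand(50, 1_000_000)          # 50 interferograms (fake)
    blocks = np.array_split(stack, 8, axis=1)      # split pixels into blocks
    with Pool(8) as pool:
        result = np.concatenate(pool.map(invert_block, blocks))
    print(result.shape)
```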
Digitizing and Securing Archived Laboratory Notebooks
ERIC Educational Resources Information Center
Caporizzo, Marilyn
2008-01-01
The Information Group at Millipore has been successfully using a digital rights management tool to secure the email distribution of archived laboratory notebooks. Millipore is a life science leader providing cutting-edge technologies, tools, and services for bioscience research and biopharmaceutical manufacturing. Consisting of four full-time…
NASA Astrophysics Data System (ADS)
Chatzistergos, Theodosios; Ermolli, Ilaria; Solanki, Sami K.; Krivova, Natalie A.
2018-01-01
Context. Historical Ca II K spectroheliograms (SHG) are unique in representing long-term variations of the solar chromospheric magnetic field. They usually suffer from numerous problems and lack photometric calibration. Thus accurate processing of these data is required to get meaningful results from their analysis. Aims: In this paper we aim at developing an automatic processing and photometric calibration method that provides precise and consistent results when applied to historical SHG. Methods: The proposed method is based on the assumption that the centre-to-limb variation of the intensity in quiet Sun regions does not vary with time. We tested the accuracy of the proposed method on various sets of synthetic images that mimic problems encountered in historical observations. We also tested our approach on a large sample of images randomly extracted from seven different SHG archives. Results: The tests carried out on the synthetic data show that the maximum relative errors of the method are generally <6.5%, while the average error is <1%, even if rather poor quality observations are considered. In the absence of strong artefacts the method returns images that differ from the ideal ones by <2% in any pixel. The method gives consistent values for both plage and network areas. We also show that our method returns consistent results for images from different SHG archives. Conclusions: Our tests show that the proposed method is more accurate than other methods presented in the literature. Our method can also be applied to process images from photographic archives of solar observations at other wavelengths than Ca II K.
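The core assumption above, a time-invariant quiet-Sun centre-to-limb variation (CLV), suggests a simple flattening step: estimate the CLV as a robust radial profile and divide it out. The sketch below is not the authors' code; the disk centre and radius are assumed known, and the ring-median estimator is a simplification of their approach.

```python
# Estimate the quiet-Sun CLV in annular rings and produce a contrast image.
import numpy as np

def flatten_clv(img, cx, cy, r_sun, n_rings=50):
    y, x = np.indices(img.shape)
    r = np.hypot(x - cx, y - cy) / r_sun          # fractional disk radius
    clv = np.full(img.shape, np.nan)
    for i in range(n_rings):                      # rings cover r in [0, 1)
        ring = (r >= i / n_rings) & (r < (i + 1) / n_rings)
        clv[ring] = np.median(img[ring])          # median is robust to plage
    return img / clv - 1.0                        # contrast; NaN off-disk

disk = np.random.rand(512, 512)                   # placeholder spectroheliogram
contrast = flatten_clv(disk, 256, 256, 240)
```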
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo
2014-05-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, M. N.; Stephens, A.; da Costa, E. D.
2013-12-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
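Both versions of this abstract describe wrapping Climate Data Operators behind a WPS so that only reduced results leave the archive. As a rough illustration, the sketch below chains two standard CDO operators from Python (assuming the cdo binary is installed; file names are placeholders): a WPS request would invoke an equivalent operation server-side.

```python
# Subset a region, then average over time, so only the small result file
# would ever need to cross the network. Input/output names are placeholders.
import subprocess

subprocess.run(
    ["cdo", "-timmean", "-sellonlatbox,-10,10,40,60",
     "tas_cmip5_model.nc", "tas_regional_mean.nc"],
    check=True,
)
```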
Water level ingest, archive and processing system - an integral part of NOAA's tsunami database
NASA Astrophysics Data System (ADS)
McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provides long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal-tide-gauge data from the National Ocean Service (NOS) network and tide-gauge data from the two National Weather Service (NWS) Tsunami Warning Centers (TWCs) regional networks. Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned focusing on data quality assessment in an operational manner. This poster focuses on data availability highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as the October 2012 hurricane Sandy, the Feb 06, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.
Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications
NASA Astrophysics Data System (ADS)
Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.
2016-12-01
The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to present, recently became available on Amazon cloud S3. This provides a new opportunity to rebuild the Hydro-NEXRAD software system that enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The cloud-based new system can eliminate prior challenges faced by Hydro-NEXRAD data acquisition and processing: (1) temporal and spatial limitation arising from the limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flow for the past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.
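For a sense of what "NEXRAD Level II on Amazon S3" means in practice, the sketch below lists a few objects from the public bucket with boto3. The bucket name and the YYYY/MM/DD/SITE key layout follow the commonly documented convention, but treat both as assumptions to verify; KDVN (Davenport, Iowa) is just an example site.

```python
# Anonymously list a handful of Level II volume files from the public bucket.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="noaa-nexrad-level2",
                          Prefix="2016/06/01/KDVN/", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"])          # roughly one key per ~5-minute volume scan
```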
Kalvelage, T.; Willems, Jennifer
2003-01-01
The design of the EOS Data and Information Systems (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community was discussed. A number of integrated retrieval, processing and distribution capabilities were explained. The value of these functions to users was described, and potential future improvements were laid out. Users were interested in having the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.
Archiving California’s historical duck nesting data
Ackerman, Joshua T.; Herzog, Mark P.; Brady, Caroline; Eadie, John M.; Yarris, Greg S.
2015-07-14
With the conclusion of this project, most duck nest data have been entered, but all nest-captured hen data and other breeding waterfowl data that were outside the scope of this project have still not been entered and electronically archived. Maintaining an up-to-date archive will require additional resources to archive and enter the new duck nest data each year in an iterative process. Further, data proofing should be conducted whenever possible, and also should be considered an iterative process as there was sometimes missing data that could not be filled in without more direct knowledge of specific projects. Despite these disclaimers, this duck data archive represents a massive and useful dataset to inform future research and management questions.
Radiologic image communication and archive service: a secure, scalable, shared approach
NASA Astrophysics Data System (ADS)
Fellingham, Linda L.; Kohli, Jagdish C.
1995-11-01
The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.
Historical Trends in Counsellor Education Dissertations
ERIC Educational Resources Information Center
Richards, Judith; Dykeman, Cass; Bender, Sara
2016-01-01
There exists a dearth of literature on the content, research method and research design trends of dissertations in education. Within one large subfield of education (i.e. counsellor education), an online and full-text archive of dissertations has become available. This archive contains over 200 dissertations produced in Oregon State University's…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doll, Stephanie R.; Cooke, Gary A.
The 222-S Laboratory blended supernate waste from Hanford Tanks 241-AN-101, 241-AN-106, 241-AP-105, 241-AP-106, 241-AP-107, and 241-AY-101 from the hot cell archive to create a bulk composite. The composite was blended with 600 mL 19.4 M NaOH, which brought the total volume to approximately 11.5 L (3 gal). The composite was filtered to remove solids and passed through spherical resorcinol-formaldehyde ion-exchange resin columns to remove cesium. The composite masses were tracked as a treatability study. Samples collected before, during, and after the ion-exchange process were characterized for a full suite of analytes (inorganic, organic, and radionuclides) to aid in the classification of the waste for shipping, receiving, treatment, and disposal determinations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doll, S. R.; Cooke, G. A.
The 222-S Laboratory blended supernate waste from Hanford Tanks 241-AN-101, 241-AN-106, 241-AP-105, 241-AP-106, 241-AP-107, and 241-AY-101 from the hot cell archive to create a bulk composite. The composite was blended with 600 mL 19.4 M NaOH, which brought the total volume to approximately 11.5 L (3 gal). The composite was filtered to remove solids and passed through spherical resorcinol-formaldehyde ion-exchange resin columns to remove cesium. The composite masses were tracked as a treatability study. Samples collected before, during, and after the ion exchange process were characterized for a full suite of analytes (inorganic, organic, and radionuclides) to aid in the classification of the waste for shipping, receiving, treatment, and disposal determinations.
2008-08-01
Requirements for UXO Discrimination: Paper prepared for UXO Location Workshop, Annapolis, May 2005. Billings, S. D., Pasion, L. R., Beran, L., Oldenburg, D... Discrimination Strategies for Application to Live Sites. Billings, S. D., Pasion, L. R. and Oldenburg, D., 2007b, Sky Research/University of British Columbia... moved RTS to reacquire SE1-12 which was lost by the DAS issue as well. o Talked with Len Pasion about the best approach for full coverage data given
Accessing Suomi NPP OMPS Products Through the GES DISC Online Data Services
NASA Astrophysics Data System (ADS)
Johnson, J. E.; Wei, J. C.; Garasimov, I.; Vollmer, B.
2017-12-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the primary archive of the latest versions of atmospheric composition data from the Suomi National Polar-orbiting Partnership (NPP) Ozone Mapping Profiler Suite (OMPS) mission. OMPS consists of three spectrometers: a Nadir Mapper (300-420 nm) with 50×50 km2 resolution and a 2600 km wide swath, a Nadir Profiler (250-310 nm) with a 250×250 km2 footprint, and a three-slit Limb Profiler (290-1000 nm) making 3 vertical profiles spaced about 250 km apart with 1-2 km vertical resolution up to 65 km altitude. OMPS measures primarily ozone, both total column and vertical profiles, but also includes measurements of NO2 and SO2 total and tropospheric columns, and aerosol extinction profiles. Also available from OMPS are the Level-1B calibrated and geolocated radiances. All data products are generated at the OMPS Science Investigator Processing System (SIPS) at NASA/GSFC. This presentation will provide an overview of the OMPS products available at the GES DISC archive, as well as demonstrate the various data services provided by the GES DISC. Traditionally users have accessed data by downloading data files using anonymous FTP. Although one may still download the full OMPS data products from the archive (using HTTPS instead), the GES DISC now also offers online data services that spare users from having to physically download the full data files to their desktop computers. Users can access the data through a desktop client tool (such as IDL, Matlab or Panoply) using OPeNDAP. Other data services include file subsetters (spatially, temporally, and/or by variable), as well as data visualization and exploration services for users to preview or quickly analyze the data. Since TOMS and EOS Aura data products are also available from the GES DISC archive, these can be easily accessed and compared with the OMPS data.
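The OPeNDAP route mentioned above can be sketched with the netCDF4 library: the client opens a remote URL and only the requested slab crosses the network. The URL and variable name below are placeholders for illustration, not a real GES DISC endpoint.

```python
# Open a remote granule via OPeNDAP and subset server-side; only the
# requested slab is transferred. URL and variable name are placeholders.
from netCDF4 import Dataset

url = "https://example.gesdisc.nasa.gov/opendap/OMPS/granule.nc4"
with Dataset(url) as ds:                      # DAP request, not a file download
    ozone = ds["ColumnAmountO3"][0:10, 0:10]  # just this 10x10 slab is fetched
    print(ozone.shape)
```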
The Land Processes Distributed Active Archive Center (LP DAAC)
Golon, Danielle K.
2016-10-03
The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.
OASIS: A Data Fusion System Optimized for Access to Distributed Archives
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Kong, M.; Good, J. C.
2002-05-01
The On-Line Archive Science Information Services (OASIS) is accessible as a java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.
Cluster Ion Spectrometry (CIS) Data Archiving in the CAA
NASA Astrophysics Data System (ADS)
Dandouras, I. S.; Barthe, A.; Penou, E.; Brunato, S.; Reme, H.; Kistler, L. M.; Blagau, A.; Facsko, G.; Kronberg, E.; Laakso, H. E.
2009-12-01
The Cluster Active Archive (CAA) aims at preserving the four Cluster spacecraft data, so that they are usable in the long-term by the scientific community as well as by the instrument team PIs and Co-Is. This implies that the data are filed together with the descriptive and documentary elements making it possible to select and interpret them. The CIS (Cluster Ion Spectrometry) experiment is a comprehensive ionic plasma spectrometry package onboard the four Cluster spacecraft, capable of obtaining full three-dimensional ion distributions (about 0 to 40 keV/e) with a time resolution of one spacecraft spin (4 sec) and with mass-per-charge composition determination. The CIS package consists of two different instruments, a Hot Ion Analyser (HIA) and a time-of-flight ion Composition Distribution Function (CODIF) analyser. For the archival of the CIS data a multi-level approach has been adopted. The CAA archival includes processed raw data (Level 1 data), moments of the ion distribution functions (Level 2 data), and calibrated high-resolution data in a variety of physical units (Level 3 data). The latter are 3-D ion distribution functions and 2-D pitch-angle distributions. In addition, a software package has been developed to allow the CAA user to interactively calculate partial or total moments of the ion distributions. Instrument cross-calibration has been an important activity in preparing the data for archival. The CIS data archive includes also experiment documentation, graphical products for browsing through the data, and data caveats. In addition, data quality indexes are under preparation, to help the user. Given the complexity of an ion spectrometer, and the variety of its operational modes, each one being optimised for a different magnetospheric region or measurement objective, consultation of the data caveats by the end user will always be a necessary step in the data analysis.
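As a toy version of the moment calculation the archive's software package exposes, the sketch below numerically integrates the zeroth moment (number density) of an isotropic Maxwellian, n = ∫ 4π v² f(v) dv. Real CIS moments integrate full 3-D distributions over energy and look directions; the grid and plasma parameters here are invented.

```python
# Zeroth moment (number density) of an isotropic distribution f(v).
import numpy as np

v = np.linspace(1e4, 2e6, 200)                    # speed grid [m/s]
vth, n_true = 3e5, 5e6                            # thermal speed, density [m^-3]
f = n_true * (np.pi * vth**2) ** -1.5 * np.exp(-(v / vth) ** 2)  # Maxwellian
n = np.trapz(4 * np.pi * v**2 * f, v)             # n = integral of 4*pi*v^2*f dv
print(f"recovered density: {n:.3e} m^-3 (true {n_true:.1e})")
```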
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information Systems (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community.The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC.The LP DAAC is the primary archive for the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control
NASA Astrophysics Data System (ADS)
He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive the astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control are the infrastructure for AstroCloud. Throughout the entire data life cycle, the data archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.
21 CFR 177.2400 - Perfluorocarbon cured elastomers.
Code of Federal Regulations, 2014 CFR
2014-04-01
... as are provided: Substances Limitations Carbon black (channel process of furnace combustion process... Park, MD 20740, or available for inspection at the National Archives and Records Administration (NARA....archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. (2) Thermogravimetry...
NASA Astrophysics Data System (ADS)
Neakrase, Lynn; Hornung, Danae; Sweebe, Kathrine; Huber, Lyle; Chanover, Nancy J.; Stevenson, Zena; Berdis, Jodi; Johnson, Joni J.; Beebe, Reta F.
2017-10-01
The Research and Analysis programs within NASA’s Planetary Science Division now require archiving of resultant data with the Planetary Data System (PDS) or an equivalent archive. The PDS Atmospheres Node is developing an online environment for assisting data providers with this task. The Educational Labeling System for Atmospheres (ELSA) is being designed with Django/Python coding to provide an easier environment that facilitates not only communication with the PDS node but also the process of learning, developing, submitting, and reviewing archive bundles under the new PDS4 archiving standard. Under the PDS4 standard, data are archived in bundles, collections, and basic products that form an organizational hierarchy of interconnected labels that describe the data and the relationships between the data and its documentation. PDS4 labels are implemented using Extensible Markup Language (XML), which is an international standard for managing metadata. Potential data providers entering the ELSA environment can learn more about PDS4, plan and develop label templates, and build their archive bundles. ELSA provides an interface to tailor label templates, aiding in the creation of required internal Logical Identifiers (URN - Uniform Resource Names) and Context References (missions, instruments, targets, facilities, etc.). The underlying structure of ELSA uses Django/Python code that makes maintaining and updating the interface easy for our undergraduate and graduate students. The ELSA environment will soon provide an interface for using the tailored templates in a pipeline to produce entire collections of labeled products, essentially building the user’s archive bundle. Once the pieces of the archive bundle are assembled, ELSA provides options for queuing the completed bundle for peer review. The peer review process has also been streamlined for online access and tracking to help make the archiving process with PDS as transparent as possible. We discuss the current status of ELSA and provide examples of its implementation.
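A minimal sketch of the kind of PDS4-style XML label such templates generate is shown below. The element set is heavily simplified and the logical identifier is hypothetical; the real PDS4 schema is far richer and validated against formal XML schemas.

```python
# Build a toy PDS4-like label with the standard library; all values invented.
import xml.etree.ElementTree as ET

ns = "http://pds.nasa.gov/pds4/pds/v1"
ET.register_namespace("", ns)
product = ET.Element(f"{{{ns}}}Product_Observational")
ident = ET.SubElement(product, f"{{{ns}}}Identification_Area")
ET.SubElement(ident, f"{{{ns}}}logical_identifier").text = (
    "urn:nasa:pds:example_bundle:data:granule_001")   # hypothetical LID
ET.SubElement(ident, f"{{{ns}}}version_id").text = "1.0"
ET.SubElement(ident, f"{{{ns}}}title").text = "Example atmospheric product"

print(ET.tostring(product, encoding="unicode"))
```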
Charting the Course: Life Cycle Management of Mars Mission Digital Information
NASA Technical Reports Server (NTRS)
Reiz, Julie M.
2003-01-01
This viewgraph presentation reviews the life cycle management of MER Project information. This process was an essential key to the successful launch of the MER Project rovers. Incorporating digital information archive requirements early in the project life cycle resulted in: design of an information system that included archive metadata; reduced risk of information loss through in-process appraisal; easier transfer of project information to the institutional online archive; and project appreciation for preserving information for reuse by future projects.
arXiv.org and Physics Education
ERIC Educational Resources Information Center
Ramlo, Susan
2007-01-01
The website arXiv.org (pronounced "archive") is a free online resource for full-text articles in the fields of physics, mathematics, computer science, nonlinear science, and quantitative biology that has existed for about 15 years. Available directly at http://www.arXiv.org, this e-print archive is searchable. As of Jan. 3, 2007, arXiv had open…
130. Photographic copy of drawing (undated, original drawing in Archives, ...
130. Photographic copy of drawing (undated, original drawing in Archives, Public Affairs Department, Sears Merchandise Group, Hoffman Estates, Illinois). Engineer unknown. Full size. Pneumatic tube cartridges used at the time of the Plant closing. 4' PNEUMATIC TUBE SYSTEM - 11' LEATHER CARRIER - Sears Roebuck & Company Mail Order Plant, Merchandise Building, 924 South Homan Avenue, Chicago, Cook County, IL
131. Photographic copy of drawing (undated, original drawing in Archives, ...
131. Photographic copy of drawing (undated, original drawing in Archives, Public Affairs Department, Sears Merchandise Group, Hoffman Estates, Illinois). Engineer unknown. Full size. Pneumatic tube cartridges used at the time of the Plant closing. 4' PNEUMATIC TUBE SYSTEM - 11' LEATHER CARRIER - Sears Roebuck & Company Mail Order Plant, Merchandise Building, 924 South Homan Avenue, Chicago, Cook County, IL
Enhancement of real-time EPICS IOC PV management for the data archiving system
NASA Astrophysics Data System (ADS)
Kim, Jae-Ha
2015-10-01
For the operation of a 100-MeV linear proton accelerator, the major driving values and experimental data need to be archived. Depending on the experimental conditions, different data are required, so functions for adding and deleting data in real time need to be implemented. In an Experimental Physics and Industrial Control System (EPICS) input/output controller (IOC), the values of process variables (PVs) are matched with the driving values and data. The PV values are archived in text-file format using the Channel Archiver; there is no need to set up a database (DB) server, only a large hard disk. Through the web, the archived data can be loaded, and new PVs can be archived without stopping the archive engine. The details of the implementation of a data archiving system with the Channel Archiver are presented, and some preliminary results are reported.
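As a rough sketch of the pattern described (PV values appended to a text file, with no database server), the following Python loop uses the pyepics client; the PV names and file layout are hypothetical, and the real system uses the EPICS Channel Archiver rather than hand-rolled code.

    import time
    from epics import caget  # pyepics; assumes a reachable EPICS IOC

    PVS = ["LINAC:BEAM:CURRENT", "LINAC:RF:PHASE"]  # hypothetical PV names

    def archive_once(path="archive.txt"):
        """Append one timestamped row of PV values to a plain text archive."""
        row = [time.strftime("%Y-%m-%d %H:%M:%S")]
        for pv in PVS:
            value = caget(pv, timeout=1.0)
            row.append("NaN" if value is None else str(value))
        with open(path, "a") as f:
            f.write("\t".join(row) + "\n")

    for _ in range(60):      # one minute of 1 Hz sampling
        archive_once()
        time.sleep(1.0)

Because the archive is a flat text file, adding a PV is just appending its name to the list, which matches the abstract's point that channels can be added or removed in real time without stopping the archive engine.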
Elliott, Paul; Peakman, Tim C
2008-04-01
UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of middle and old age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected, and how they are processed and stored, will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml of blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used, appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed, and aliquots of urine, plasma, serum, white cells and red cells are stored in ultra-low-temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality-assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants demonstrates that the adopted methods and technologies are fit for purpose and robust.
Challenges of the science data processing, analysis and archiving approach in BepiColombo
NASA Astrophysics Data System (ADS)
Martinez, Santa
BepiColombo is a joint mission of the European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) to the planet Mercury. It comprises two separate orbiters: the Mercury Planetary Orbiter (MPO) and the Mercury Magnetospheric Orbiter (MMO). After approximately 7.5 years of cruise, BepiColombo will arrive at Mercury in 2024 and will gather data during a 1-year nominal mission, with a possible 1-year extension. The approach selected for BepiColombo for the processing, analysis and archiving of the science data represents a significant change with respect to previous ESA planetary missions. Traditionally, Instrument Teams are responsible for processing, analysing and preparing their science data for the long-term archive; in BepiColombo, however, the Science Ground Segment (SGS), located in Madrid, Spain, will play a key role in these activities. Fundamental aspects of this approach include: the involvement of the SGS in the definition, development and operation of the instrument processing pipelines; the production of ready-to-archive science products compatible with NASA's Planetary Data System (PDS) standards in all processing steps; the joint development of a quick-look analysis system to monitor deviations between planned and executed observations and feed the results back into the different planning cycles when possible; and a mission archive providing access to the scientific products and to the operational data throughout the different phases of the mission (from the early development phase to the legacy phase). To achieve these goals, the SGS will need to overcome a number of challenges. The proposed approach requires a flexible infrastructure able to cope with a distributed data processing system, residing in different locations but designed as a single entity. For this, all aspects related to the integration of software developed by different Instrument Teams, and the alignment of their development schedules, will need to be considered. In addition, the SGS is taking full responsibility for the production of the first level of science data (un-calibrated), with the associated operational implications. An additional difficulty impacting the processing strategies relates to the various spacecraft data downlink mechanisms available for the MPO and their associated data latency. With regard to archiving, the main challenges include: the use of a new version of the PDS standards (so-called PDS4), implemented for the first time in an ESA planetary mission; the use of external standards (CDF, FITS); and the implementation of interoperability protocols that aim to make all data (from both MPO and MMO) globally accessible to end-users through a distributed archive. For the definition of the quick-look analysis system, it is very important to understand and harmonise the different views and expectations of the science team. Given the long duration of the cruise phase, and the many years between the design of the system and the nominal mission, it might be difficult for some Instrument Teams to accurately define their needs so long before operations. In particular, new scientific discoveries over the coming years by the MESSENGER spacecraft, currently orbiting Mercury, may influence how the Instrument Teams on BepiColombo define their operations and their reduction and analysis techniques.
In addition, because of the long duration of the mission, it is not always possible or practical to document all accumulated knowledge on paper, so when personnel leave, some of their knowledge is lost as well. This is key, particularly for the Instrument Teams. By taking a pro-active role in collecting the requirements and expectations of the science team, defining clear guidelines early in the mission, and developing close collaboration with the Instrument Teams, the SGS will be able to identify how best to exploit the expertise on both sides and to guarantee that the necessary support is provided when needed. This contribution details the main challenges and advantages associated with the data processing, analysis and archiving approach in BepiColombo, and summarises the ongoing efforts to guarantee that the scientific requirements of the mission and the expectations of the science team are fulfilled. Future ESA planetary missions (e.g. ExoMars, JUICE) will follow a similar approach, adapting the effort to the profile of each mission.
Semi-automated Data Set Submission Work Flow for Archival with the ORNL DAAC
NASA Astrophysics Data System (ADS)
Wright, D.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Eby, P.; Heinz, S. L.; Hook, L. A.; McMurry, B. F.; Shanafield, H. A.; Sill, D.; Santhana Vannan, S.; Wei, Y.
2013-12-01
The ORNL DAAC archives and publishes, free of charge, data and information relevant to biogeochemical, ecological, and environmental processes. The ORNL DAAC primarily archives data produced by NASA's Terrestrial Ecology Program; however, any data pertinent to the biogeochemical and ecological community are of interest. The data set submission process at the ORNL DAAC has recently been updated and semi-automated to provide a consistent data-provider experience and to create a uniform data product. Data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. If the ORNL DAAC is the appropriate archive for a data set, the data provider is sent an email with several URL links that guide them through the submission process. The data provider is asked to fill out a short online form to help the ORNL DAAC staff better understand the data set. The questions cover information about the data set, a description of the data set, its temporal and spatial characteristics, and how the data were prepared and delivered. The questionnaire is generic and has been designed to gather input on the diverse data sets the ORNL DAAC archives. A data upload module and a metadata editor further guide the data provider through the submission process. For submission purposes, a complete data set includes data files, document(s) describing the data, supplemental files, metadata record(s), and the online form. The ORNL DAAC performs five major functions while archiving data: 1) ingestion, the ORNL DAAC side of submission, in which data are checked, metadata records are compiled, and files are converted to archival formats; 2) making metadata records and data set documentation searchable and assigning the data set a permanent URL; 3) publishing the data set, assigning it a DOI, and advertising it; 4) providing long-term post-project support for the data set; and 5) stewardship, ensuring the data are stored on state-of-the-art computer systems with reliable backups.
PACS archive upgrade and data migration: clinical experiences
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Sarti, Dennis A.; Huang, H. K.; Donnelly, John
2002-05-01
Saint John's Health Center PACS data volumes have increased dramatically since the hospital became filmless in April of 1999. This is due in part to continuous image accumulation and the integration of a new multi-slice detector CT scanner into PACS. The original PACS archive would not be able to handle the distribution and archiving load and capacity in the near future. Furthermore, there was no secondary copy backup of the archived PACS image data for disaster recovery purposes. The purpose of this paper is to present a clinical and technical process template for upgrading and expanding the PACS archive, migrating existing PACS image data to the new archive, and providing a backup and disaster recovery function not previously available. The technical and clinical pitfalls and challenges involved in this process are discussed as well. The server hardware configuration was upgraded and a secondary backup implemented for disaster recovery. The upgrade included new software versions, database reconfiguration, and installation of a new tape jukebox to replace the existing MOD jukebox. Upon completion, all PACS image data from the original MOD jukebox were migrated to the new tape jukebox and verified. The migration was performed continuously in the background during clinical operation. Once the data migration was completed, the MOD jukebox was removed. All newly acquired PACS exams are now archived to the new tape jukebox. All PACS image data residing on the original MOD jukebox were successfully migrated into the new archive. In addition, a secondary backup of all PACS image data was implemented for disaster recovery and verified using disaster scenario testing. No PACS image data were lost during the entire process, and there was very little clinical impact during the upgrade and data migration. Pitfalls and challenges during the upgrade included hardware reconfiguration of the original archive server, the clinical downtime involved in the upgrade, and data migration planning to minimize the impact on clinical workflow. The impact was minimized with a downtime contingency plan.
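The paper does not give implementation details, but the essential safeguard it describes (migrating in the background and verifying every object before retiring the old jukebox) can be sketched as a checksum-verified copy; the mount points and directory layout below are hypothetical.

    import hashlib
    import shutil
    from pathlib import Path

    def sha256(path: Path) -> str:
        """Compute a file's SHA-256 digest in 1 MB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def migrate(src_root: Path, dst_root: Path) -> None:
        """Copy every file from the old archive to the new one, verifying each."""
        for src in src_root.rglob("*"):
            if not src.is_file():
                continue
            dst = dst_root / src.relative_to(src_root)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            if sha256(src) != sha256(dst):
                raise IOError("verification failed for %s" % src)

    migrate(Path("/archive/mod_jukebox"), Path("/archive/tape_jukebox"))  # hypothetical mounts

Running such a loop at low priority is what lets the migration proceed "continuously in the background" while clinical reads keep hitting whichever copy is authoritative.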
Huh, Sun
2013-01-01
ScienceCentral, a free or open-access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, extensible markup language files in all languages can be presented, according to Unicode Transformation Format 8-bit (UTF-8) encoding. It is comparable to PubMed Central; however, there are two distinct differences: first, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step for free-access or open-access academic scientific journals of all languages to leap to the world, including scientific journals from Croatia.
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
Photocopy of original drawings (original located at the National Archives, ...
Photocopy of original drawings (original located at the National Archives, San Bruno, California, Navy # 104-A-5). Depts. Yards & docks, U.S. Navy Mare Island, Cal., "full sized detail of fretwork panels in back of organ recess, St. Peter's Chapel, Mare Island, Cal., December 1904. - Mare Island Naval Shipyard, St. Peter's Chapel, Walnut Street & Cedar Parkway, Vallejo, Solano County, CA
Proba-V Mission Exploitation Platform
NASA Astrophysics Data System (ADS)
Goor, E.
2017-12-01
VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation by researchers, service providers (e.g. the EC Copernicus Global Land Service) and end-users of the EO data archive of Proba-V (an EC Copernicus contributing mission), of the past SPOT-VEGETATION mission, and of derived vegetation parameters. The platform addresses the analysis of time series of data (in the petabyte range), as well as large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. New features are still being developed, but the platform has been fully operational since November 2016 and offers: a time series viewer (browser web client and API), showing the evolution of Proba-V bands and derived vegetation parameters for any country, region, pixel or polygon defined by the user; full-resolution viewing services for the complete data archive; on-demand processing chains on a powerful Hadoop/Spark back-end; and Virtual Machines that users can request with access to the complete data archive and pre-configured tools to work with the data, e.g. various toolboxes and support for R and Python. This allows users to work with the data immediately, without having to install tools or download data, and also to design, debug and test applications on the platform. Jupyter Notebooks are available, with example Python and R projects worked out to show the potential of the data. Today the platform is already used by several international third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. From the Proba-V MEP, access to other data sources such as Sentinel-2 and Landsat data is also addressed. Selected components of the MEP are also deployed on public cloud infrastructures in various R&D projects. Users can make use of powerful web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO, with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo) which we integrate with several open-source components (e.g. GeoTrellis).
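A minimal sketch of the kind of region-of-interest time-series query a Hadoop/Spark back-end like the MEP's can serve, written with PySpark; the table layout, column names and storage path are assumptions for illustration, not the platform's actual schema.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("probav-timeseries").getOrCreate()

    # Hypothetical per-pixel observation table: (date, lat, lon, ndvi).
    df = spark.read.parquet("hdfs:///probav/ndvi")  # illustrative path

    # Mean NDVI per date over a user-defined bounding box, i.e. a time series
    # for a region of interest such as the viewer exposes.
    roi = df.where(F.col("lat").between(50.0, 51.0) &
                   F.col("lon").between(4.0, 5.0))
    series = (roi.groupBy("date")
                 .agg(F.avg("ndvi").alias("mean_ndvi"))
                 .orderBy("date"))
    series.show()

Pushing the aggregation to the cluster is the point of the architecture: only the small per-date series leaves the platform, not the petabyte-range archive.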
A Toolkit For CryoSat Investigations By The ESRIN EOP-SER Altimetry Team
NASA Astrophysics Data System (ADS)
Dinardo, Salvatore; Bruno, Lucas; Benveniste, Jerome
2013-12-01
The scope of this work is to present a new tool for the exploitation of CryoSat data, designed and developed entirely by the Altimetry Team at ESRIN EOP-SER (Earth Observation - Exploitation, Research and Development). The tool framework is composed of two separate components: the first handles data collection and management, the second is the processing toolkit. The CryoSat FBR (Full Bit Rate) data are downlinked uncompressed from the satellite and contain un-averaged individual echoes. These data are made available on the Kiruna CalVal server in a 10-day rolling archive. Daily, all the CryoSat FBR data in SAR and SARin mode (around 30 gigabytes) are downloaded at ESRIN, catalogued, and archived on local ESRIN EOP-SER workstations. As of March 2013, the total amount of FBR data exceeds 9 terabytes, with CryoSat acquisition dates spanning January 2011 to February 2013 (with some gaps). This archive was built by merging partial datasets available at ESTEC and NOAA, which were kindly made available to the EOP-SER team. On-demand access to this low-level data is restricted to expert users with validated ESA P.I. credentials. Currently the main users of the archiving functionality are the team members of the project CP4O (STSE CryoSat Plus for Oceans), CNES and NOAA. The second component of the service is the processing toolkit. The EOP-SER workstations host internally and independently developed software that can process the FBR data in SAR/SARin mode to generate multi-looked echoes (Level 1B) and subsequently re-track them in SAR and SARin mode (Level 2) over the open ocean, exploiting the SAMOSA model and other internally developed models. The processing segment is used for research and development purposes: supporting awarded development contracts by checking the deliverables submitted to ESA, on-site demonstrations and training for selected users, cross-comparison against third-party products (for instance the CLS/CNES CPP products), preparation for the Sentinel-3 mission, publications, etc. Samples of these experimental SAR/SARin L1b/L2 products can be provided on request to the scientific community for comparison with self-processed data. So far, the processing has been designed and optimized for open-ocean studies and is fully functional only over this kind of surface, but there are plans to extend this processing capacity to coastal zones, inland waters and land, with a view to maximizing the exploitation of the upcoming Sentinel-3 topographic mission over all surfaces. There are also plans to make the toolkit fully accessible through software "gridification", so that it runs in the ESRIN G-POD (Grid Processing on Demand) service, and to extend the tool's functionalities to support the Sentinel-3 mission (both simulated and real data). Graphs and statistics about the spatial coverage and amount of FBR data archived on the EOP-SER workstations, and some scientific results, are shown in this paper, along with the tests designed and performed to validate the products (tests against CryoSat Kiruna PDGS products and against transponder data).
122. FULL STARBOARD VIEW UNDERWAY, IN CAMOUFLAGE PAINT SCHEME. 5 ...
122. FULL STARBOARD VIEW UNDERWAY, IN CAMOUFLAGE PAINT SCHEME. 5 SEPTEMBER 1944. (NATIONAL ARCHIVES NO. 80-G-284087) - U.S.S. HORNET, Puget Sound Naval Shipyard, Sinclair Inlet, Bremerton, Kitsap County, WA
NASA Astrophysics Data System (ADS)
Dupac, X.; Arviset, C.; Fernandez Barreiro, M.; Lopez-Caniego, M.; Tauber, J.
2015-12-01
In 2015 the Planck Collaboration released its second major dataset through the Planck Legacy Archive (PLA). It includes cosmological, extragalactic and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters and Galactic cold clumps), time-ordered data and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe. A new web-based PLA user interface has been public since Dec. 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are also being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. This will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.
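For orientation, the full-sky maps served by the PLA are HEALPix FITS files and can be read with standard tools such as healpy; the product filename below follows the 2015-release naming for the SMICA CMB map but should be checked against the archive.

    import numpy as np
    import healpy as hp

    # field=0 selects the temperature (intensity) column of the IQU map.
    cmb_t = hp.read_map("COM_CMB_IQU-smica_1024_R2.02_full.fits", field=0)

    nside = hp.get_nside(cmb_t)          # HEALPix resolution parameter
    print(nside, cmb_t.mean())

    # Temperature toward one sky direction (co-latitude, longitude in radians).
    pix = hp.ang2pix(nside, np.radians(60.0), np.radians(45.0))
    print(cmb_t[pix])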
NASA Astrophysics Data System (ADS)
Pötzi, Werner; Temmer, Manuela; Veronig, Astrid; Hirtenfellner-Polanec, Wolfgang; Baumgartner, Dietmar
2013-04-01
Kanzelhöhe Observatory (KSO; kso.ac.at), located in southern Austria, is part of the Institute of Physics of the University of Graz. Since the early 1940s, the Sun has been observed there in various layers and wavelengths. Currently, KSO provides high-cadence full-disk observations of the Sun in three wavelengths: the H-alpha line, the Ca II K line, and white light. Real-time images are published online. For scientific use, the data are processed and made available to the scientific community immediately after each observing day via the Kanzelhöhe Online Data Archive (KODA; kanzelhohe.uni-graz.at). KSO is part of the Global H-Alpha Network and is also one of the contributing stations for the international sunspot number. In the frame of ESA's Space Situational Awareness programme, methods are currently under development for near-real-time image recognition of solar flares and filaments. These data products will give valuable complementary information on the solar sources of space weather.
NASA Technical Reports Server (NTRS)
Lewis, Adam; Lymburner, Leo; Purss, Matthew B. J.; Brooke, Brendan; Evans, Ben; Ip, Alex; Dekker, Arnold G.; Irons, James R.; Minchin, Stuart; Mueller, Norman
2015-01-01
The effort and cost required to convert satellite Earth Observation (EO) data into meaningful geophysical variables has prevented the systematic analysis of all available observations. To overcome these problems, we utilise an integrated High Performance Computing and Data environment to rapidly process, restructure and analyse the Australian Landsat data archive. In this approach, the EO data are assigned to a common grid framework that spans the full geospatial and temporal extent of the observations - the EO Data Cube. This approach is pixel-based and incorporates geometric and spectral calibration and quality assurance of each Earth surface reflectance measurement. We demonstrate the utility of the approach with rapid time-series mapping of surface water across the entire Australian continent using 27 years of continuous, 25 m resolution observations. Our preliminary analysis of the Landsat archive shows how the EO Data Cube can effectively liberate high-resolution EO data from their complex sensor-specific data structures and revolutionise our ability to measure environmental change.
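A toy sketch of the pixel-based time-series idea using xarray follows: it classifies water with a simple NDWI threshold, a stand-in for the more elaborate per-pixel classification behind the continental surface-water analysis, and runs on random data instead of calibrated Landsat reflectances.

    import numpy as np
    import xarray as xr

    # Toy data cube: (time, y, x) surface reflectance for green and NIR bands.
    # In the real Data Cube these come from calibrated Landsat observations.
    green = xr.DataArray(np.random.rand(27, 100, 100), dims=("time", "y", "x"))
    nir = xr.DataArray(np.random.rand(27, 100, 100), dims=("time", "y", "x"))

    # Normalized Difference Water Index; open water typically has NDWI > 0.
    ndwi = (green - nir) / (green + nir)
    water = ndwi > 0.0

    # Per-pixel fraction of observations classified as water across the series.
    water_frequency = water.mean(dim="time")
    print(water_frequency.shape)

Because every observation sits on the common grid, the per-pixel reduction over 27 years of data is a single vectorised operation, which is exactly the "liberation" from sensor-specific structures the abstract describes.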
Archiving Software Systems: Approaches to Preserve Computational Capabilities
NASA Astrophysics Data System (ADS)
King, T. A.
2014-12-01
A great deal of effort is made to preserve scientific data, not only because data are knowledge, but also because they are often costly to acquire and are sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: the technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
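As one concrete route to the container-based preservation described, the Docker SDK for Python can export a software environment to a tar archive stored alongside the data; the image name is hypothetical, and a real archive would also record the image digest, build recipe and base-OS metadata.

    import docker  # Docker SDK for Python; assumes a running Docker daemon

    client = docker.from_env()

    # Pull the image bundling the legacy software with its technology stack
    # (hypothetical name; a registry pull is shown for concreteness).
    image = client.images.pull("example.org/legacy-analysis", tag="1.0")

    # Export the image to a tar archive for long-term storage alongside the
    # data it was used to process.
    with open("legacy-analysis-1.0.tar", "wb") as f:
        for chunk in image.save():
            f.write(chunk)

The exported tar freezes the whole user-space stack, so a future system only needs a compatible container runtime rather than the original operating system and hardware.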
NASA Astrophysics Data System (ADS)
Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa
2016-04-01
We have established and operate an integrated data system for managing, archiving and sharing marine geology and geophysical data around Korea, produced from various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First, to keep the data system consistent under continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, and DB maintenance. The system comprises two databases: ARCHIVE DB, which stores archived data in the original forms and formats received from data providers, and GIS DB, which manages all compilation, processed and reproduction data and information for data services and GIS application services. Oracle 11g was adopted as the relational database management system, and open-source GIS techniques were applied for the GIS services: OpenLayers for the user interface, GeoServer for the application server, and PostGIS on PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search for data through the GIS user interface and save the results as a report.
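For the SEG-Y access underlying such a viewer, an open-source reader like segyio exposes traces as NumPy arrays; the file path is illustrative, and this is a generic sketch rather than the system's actual viewer code.

    import segyio  # open-source SEG-Y reader

    # ignore_geometry skips inline/crossline sorting, which suits quick viewing.
    with segyio.open("survey_line_01.segy", "r", ignore_geometry=True) as f:
        print("traces:", f.tracecount)
        print("samples per trace:", len(f.samples))
        first = f.trace[0]                   # numpy array of amplitudes
        print("first trace mean amplitude:", first.mean())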
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
ERIC Educational Resources Information Center
Bratslavsky, Lauren Michelle
2013-01-01
The dissertation offers a historical inquiry about how television's material traces entered archival spaces. Material traces refer to both the moving image products and the assortment of documentation about the processes of television as industrial and creative endeavors. By identifying the development of television-specific archives and…
NASA Technical Reports Server (NTRS)
Young, Millennia; Van Baalen, Mary
2016-01-01
This session is intended to give HRP IWS attendees instant feedback on archived astronaut data, including such topics as the content of the archives, access, request processing, and data format. Members of the LSAH and LSDA teams will be available at a 'help desk' during the poster sessions to answer questions from researchers.
A Counter-Proposal for Process: Toward the Development of Online Writing Archives
ERIC Educational Resources Information Center
Jensen, Kyle
2009-01-01
This dissertation advances an alternate vision for research and teaching in rhetoric and composition studies that centers on the development of online writing archives. To justify the need for this alternate vision, it assesses the limitations of the field's predominant research and teaching program: process theory. More specifically, it examines…
The Convergence of Information Technology, Data, and Management in a Library Imaging Program
ERIC Educational Resources Information Center
France, Fenella G.; Emery, Doug; Toth, Michael B.
2010-01-01
Integrating advanced imaging and processing capabilities in libraries, archives, and museums requires effective systems and information management to ensure that the large amounts of digital data about cultural artifacts can be readily acquired, stored, archived, accessed, processed, and linked to other data. The Library of Congress is developing…
NASA Astrophysics Data System (ADS)
Vision, T. J.
2010-12-01
Many datasets collected by academic research teams, despite being difficult or impossible to reproduce, are never shared with the wider community, even after the findings based upon them appear in print. This limits the extent to which published scientific findings can be verified and cuts short the opportunity for data to realize their full potential value through reuse and repurposing. While many scientists perceive data to be public goods that should be made available upon publication, they also perceive limited incentive for doing so themselves. This, combined with the lack of mandates for data archiving and the absence of a trusted public repository that can host any kind of data, means that the practice of data archiving is rare. When data are shared post-publication, it is often through ad hoc mechanisms and under terms that present obstacles to reuse. When data are archived, it is generally through routes that do not ensure preservation or discoverability. To address this mix of technical and sociocultural obstacles to data reuse, a consortium of journals in ecology and evolutionary biology recently launched a digital data repository (Dryad) and developed a joint policy mandating data archiving at the time of publication. Dryad has a number of features specifically designed to make universal data archiving achievable with low burden and low cost at the time of publication. These include a streamlined submission process that exchanges metadata with the manuscript processing system, handshaking with more specialized data repositories, and metadata curation assisted by automated generation of cataloging terms. To directly benefit data depositors, data are treated as a citable scholarly product through the assignment of trackable data DOIs. The data are permanently linked from the original article and are made freely available with an explicit waiver of restrictions on reuse. The Dryad Consortium, which includes both society-owned and publisher-owned journals, is responsible for governing and sustaining the repository. For scientists, Dryad provides a rich source of data for confirmation of findings, tests of new methodology, and synthetic studies. It also provides the means for depositors to tangibly increase the scientific impact of their work. For journals, Dryad archives data in a more permanent, feature-rich, and cost-effective way than supplementary online materials. Despite its biological origins, Dryad provides a discipline-neutral model for bringing data fully within the fold of scholarly communication.
[Development of a medical equipment support information system based on PDF portable document].
Cheng, Jiangbo; Wang, Weidong
2010-07-01
In accordance with the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow was integrated to ensure that medical engineering data are collected effectively, accurately and comprehensively, and kept in electronic archives. The workflow of medical equipment support work was analysed, and all work processes are recorded in portable electronic documents. Using XML middleware technology and an SQL Server database, the system implements process management, data calculation, submission, storage and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process, making it standardized and digital, automated and efficient, orderly and controllable. A medical equipment support information system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide full and accurate digital data.
An Archive of Spectra from the Mayall Fourier Transform Spectrometer at Kitt Peak
NASA Astrophysics Data System (ADS)
Pilachowski, C. A.; Hinkle, K. H.; Young, M. D.; Dennis, H. B.; Gopu, A.; Henschel, R.; Hayashi, S.
2017-02-01
We describe the SpArc science gateway for spectral data obtained using the Fourier Transform Spectrometer (FTS) in operation at the Mayall 4-m telescope at the Kitt Peak National Observatory during the period from 1975 through 1995. SpArc is hosted by Indiana University Bloomington and is available for public access. The archive includes nearly 10,000 individual spectra of more than 800 different astronomical sources including stars, nebulae, galaxies, and solar system objects. We briefly describe the FTS instrument itself and summarize the conversion of the original interferograms into spectral data and the process for recovering the data into FITS files. The architecture of the archive is discussed and the process for retrieving data from the archive is introduced. Sample use cases showing typical FTS spectra are presented.
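In schematic form, recovering a spectrum from an FTS interferogram is a Fourier transform followed by writing the result to FITS; the sketch below uses a synthetic interferogram and omits the apodization and phase correction a production pipeline such as SpArc's conversion would apply, and the header keywords are illustrative.

    import numpy as np
    from astropy.io import fits

    # Synthetic interferogram standing in for a digitized FTS scan.
    n = 4096
    x = np.arange(n)
    interferogram = np.cos(2 * np.pi * 0.05 * x) * np.exp(-x / 2000.0)

    # The spectrum is the Fourier transform of the interferogram
    # (magnitude shown here for simplicity).
    spectrum = np.abs(np.fft.rfft(interferogram))

    # Write the recovered spectrum to a FITS file with minimal metadata.
    hdu = fits.PrimaryHDU(spectrum.astype(np.float32))
    hdu.header["OBJECT"] = "EXAMPLE"
    hdu.header["INSTRUME"] = "FTS"
    hdu.writeto("spectrum.fits", overwrite=True)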
Emulsions for pulsed holography: new and improved processing schemes
NASA Astrophysics Data System (ADS)
Rodin, Alexey M.; Taylor, Rob
2003-05-01
Recent improvements in the processing of commercially available holographic recording materials for pulsed holography are reviewed. Harmonics of pulsed Nd:YLF/Nd:phosphate glass, Nd:YLF and Nd:YAG lasers, and the fundamental wavelength of a pulsed ruby laser, were used as radiation sources for recording transmission and reflection holographic gratings. It is shown that ultra-fine-grain materials such as PFG-03C and Ultimate-15 can be successfully applied to small and medium format pulsed holography applications. These small-grain emulsions are especially important in artistic archival portraiture and contact Denisyuk micro-holography of living objects, where noiseless image reconstruction is a primary concern. It is suggested that HOEs, such as full-color image projection screens, may be successfully recorded on PFG-03C holographic emulsions using a pulsed RGB laser. A range of commercial RGB pulsed lasers suitable for these applications is introduced; the visible wavelengths currently produced by these lasers cover the range 440-660 nm. The latest developments in the full range of pulsed holographic camera systems manufactured by GEOLA, suitable for medium and large format portraiture, medical imaging, museum artifact archival recording, and other types of holography, are also reviewed, with particular reference to new integrated digital mastering features. Finally, the initial commercial production of a new photopolymer film with a sensitivity range of 625-680 nm is introduced. Initial CW exposure energies at 633 nm were 30-50 mJ/cm2, with diffraction efficiencies of 75-80% observed with this new material.
Globally Gridded Satellite observations for climate studies
Knapp, K.R.; Ansari, S.; Bain, C.L.; Bourassa, M.A.; Dickinson, M.J.; Funk, Chris; Helms, C.N.; Hennon, C.C.; Holmes, C.D.; Huffman, G.J.; Kossin, J.P.; Lee, H.-T.; Loew, A.; Magnusdottir, G.
2011-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them that no central archive of geostationary data for all international satellites exists, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multisatellite climate studies. The International Satellite Cloud Climatology Project (ISCCP) set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at ~10-km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in Network Common Data Format (netCDF) using standards that permit a wide variety of tools and libraries to process the data quickly and easily. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
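Reading a GridSat granule takes only a few lines with the netCDF4 library; the filename follows the published GridSat-B1 naming convention and the variable name is the infrared-window record, but both should be verified against the actual product files.

    import numpy as np
    from netCDF4 import Dataset  # netCDF4-python

    with Dataset("GRIDSAT-B1.1983.01.01.00.v02r01.nc") as nc:
        irwin = nc.variables["irwin_cdr"][:]   # IR-window brightness temperature
        lat = nc.variables["lat"][:]
        lon = nc.variables["lon"][:]

    print(irwin.shape, float(np.nanmin(irwin)), float(np.nanmax(irwin)))

Because the data are on a fixed lat/lon grid in self-describing netCDF, a user can subset by index arithmetic alone, which is the convenience the remapping effort was meant to buy.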
Globally Gridded Satellite (GridSat) Observations for Climate Studies
NASA Technical Reports Server (NTRS)
Knapp, Kenneth R.; Ansari, Steve; Bain, Caroline L.; Bourassa, Mark A.; Dickinson, Michael J.; Funk, Chris; Helms, Chip N.; Hennon, Christopher C.; Holmes, Christopher D.; Huffman, George J.;
2012-01-01
Geostationary satellites have provided routine, high temporal resolution Earth observations since the 1970s. Despite the long period of record, use of these data in climate studies has been limited for numerous reasons, among them: there is no central archive of geostationary data for all international satellites, full temporal and spatial resolution data are voluminous, and diverse calibration and navigation formats encumber the uniform processing needed for multi-satellite climate studies. The International Satellite Cloud Climatology Project set the stage for overcoming these issues by archiving a subset of the full-resolution geostationary data at approximately 10 km resolution at 3-hourly intervals since 1983. Recent efforts at NOAA's National Climatic Data Center to provide convenient access to these data include remapping the data to a standard map projection, recalibrating the data to optimize temporal homogeneity, extending the record of observations back to 1980, and reformatting the data for broad public distribution. The Gridded Satellite (GridSat) dataset includes observations from the visible, infrared window, and infrared water vapor channels. Data are stored in the netCDF format using standards that permit a wide variety of tools and libraries to quickly and easily process the data. A novel data layering approach, together with appropriate satellite and file metadata, allows users to access GridSat data at varying levels of complexity based on their needs. The result is a climate data record already in use by the meteorological community. Examples include reanalysis of tropical cyclones, studies of global precipitation, and detection and tracking of the intertropical convergence zone.
Thermal Infrared Radiometric Calibration of the Entire Landsat 4, 5, and 7 Archive (1982-2010)
NASA Technical Reports Server (NTRS)
Schott, John R.; Hook, Simon J.; Barsi, Julia A.; Markham, Brian L.; Miller, Jonathan; Padula, Francis P.; Raqueno, Nina G.
2012-01-01
Landsat's continuing record of the thermal state of the Earth's surface represents the only long-term (1982 to the present) global record with spatial scales appropriate for human-scale studies (i.e., tens of meters). Temperature drives many of the physical and biological processes that impact the global and local environment. As our knowledge of, and interest in, the role of temperature in these processes has grown, the value of Landsat data to monitor trends and processes has also grown. The value of the Landsat thermal data archive will continue to grow as we develop more effective ways to study the long-term processes and trends affecting the planet. However, in order to take proper advantage of the thermal data, we need to be able to convert the data to surface temperatures. A critical step in this process is to have the entire archive completely and consistently calibrated into absolute radiance, so that it can be atmospherically compensated to surface-leaving radiance and then to surface radiometric temperature. This paper addresses the methods and procedures that have been used to perform the radiometric calibration of the earliest sizable thermal data set in the archive (Landsat 4 data). The completion of this effort, along with the updated calibration of the earlier (1985-1999) Landsat 5 data, also reported here, concludes a comprehensive calibration of the Landsat thermal archive from 1982 to the present.
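The calibration chain described, digital numbers to absolute radiance to radiometric temperature, reduces per band to two linear coefficients and two thermal constants; the sketch below uses the commonly quoted Landsat 5 TM band 6 values, which in practice should be taken from each scene's own calibration metadata.

    import numpy as np

    # Commonly quoted Landsat 5 TM band 6 calibration values (verify per scene).
    GAIN, BIAS = 0.055376, 1.18243    # W / (m^2 sr um) per DN, and offset
    K1, K2 = 607.76, 1260.56          # thermal conversion constants

    def dn_to_radiance(dn):
        """Convert raw digital numbers to at-sensor spectral radiance."""
        return GAIN * np.asarray(dn, dtype=float) + BIAS

    def radiance_to_brightness_temp(radiance):
        """Invert the Planck-like calibration to brightness temperature (K)."""
        return K2 / np.log(K1 / radiance + 1.0)

    dn = np.array([[100, 120], [140, 160]])
    print(radiance_to_brightness_temp(dn_to_radiance(dn)))

Atmospheric compensation to surface-leaving radiance would follow this step; the sketch stops at at-sensor brightness temperature, which is what a consistent archive-wide calibration directly enables.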
ModelArchiver—A program for facilitating the creation of groundwater model archives
Winston, Richard B.
2018-03-01
ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step by step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions; descriptions can also be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in the creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which is also described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file, generated automatically by ModelMuse, can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.
Historical Time-Domain: Data Archives, Processing, and Distribution
NASA Astrophysics Data System (ADS)
Grindlay, Jonathan E.; Griffin, R. Elizabeth
2012-04-01
The workshop on Historical Time-Domain Astronomy (TDA) was attended by a near-capacity gathering of ~30 people. From information provided in turn by those present, an up-to-date overview was created of available plate archives, progress in their digitization, the extent of actual processing of those data, and plans for data distribution. Several recommendations were made for prioritising the processing and distribution of historical TDA data.
EOSDIS: Archive and Distribution Systems in the Year 2000
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; Lake, Alla
2000-01-01
Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousands of non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to support approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive. It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.
Diagnosis and prediction of neuroendocrine liver metastases: a protocol of six systematic reviews.
Arigoni, Stephan; Ignjatovic, Stefan; Sager, Patrizia; Betschart, Jonas; Buerge, Tobias; Wachtl, Josephine; Tschuor, Christoph; Limani, Perparim; Puhan, Milo A; Lesurtel, Mickael; Raptis, Dimitri A; Breitenstein, Stefan
2013-12-23
Patients with hepatic metastases from neuroendocrine tumors (NETs) benefit from an early diagnosis, which is crucial for the optimal therapy and management. Diagnostic procedures include morphological and functional imaging, identification of biomarkers, and biopsy. The aim of six systematic reviews discussed in this study is to assess the predictive value of Ki67 index and other biomarkers, to compare the diagnostic accuracy of morphological and functional imaging, and to define the role of biopsy in the diagnosis and prediction of neuroendocrine tumor liver metastases. An objective group of librarians will provide an electronic search strategy to examine the following databases: MEDLINE, EMBASE and The Cochrane Library (Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials (CENTRAL), Database of Abstracts of Reviews of Effects). There will be no restriction concerning language and publication date. The qualitative and quantitative synthesis of the systematic review will be conducted with randomized controlled trials (RCT), prospective and retrospective comparative cohort studies, and case-control studies. Case series will be collected in a separate database and only used for descriptive purposes. This study is ongoing and presents a protocol of six systematic reviews to elucidate the role of histopathological and biochemical markers, biopsies of the primary tumor and the metastases as well as morphological and functional imaging modalities for the diagnosis and prediction of neuroendocrine liver metastases. These systematic reviews will assess the value and accuracy of several diagnostic modalities in patients with NET liver metastases, and will provide a basis for the development of clinical practice guidelines. The systematic reviews have been prospectively registered with the International Prospective Register of Systematic Reviews (PROSPERO): CRD42012002644; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2644 (Archived by WebCite at http://www.webcitation.org/6LzCLd5sF), CRD42012002647; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2647 (Archived by WebCite at http://www.webcitation.org/6LzCRnZnO), CRD42012002648; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2648 (Archived by WebCite at http://www.webcitation.org/6LzCVeuVR), CRD42012002649; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2649 (Archived by WebCite at http://www.webcitation.org/6LzCZzZWU), CRD42012002650; http://www.metaxis.com/prospero/full_doc.asp?RecordID=2650 (Archived by WebCite at http://www.webcitation.org/6LzDPhGb8), CRD42012002651; http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42012002651#.UrMglPRDuVo (Archived by WebCite at http://www.webcitation.org/6LzClCNff).
A new archival infrastructure for highly-structured astronomical data
NASA Astrophysics Data System (ADS)
Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo
2018-03-01
With the advent of the 2020 era of radio astronomy telescopes, the amount and format of radio-astronomical data are becoming a massive and performance-critical challenge. This evolution of data models and data formats requires new data archiving techniques that allow massive and fast storage of data that can at the same time be efficiently processed. Useful expertise in efficient archiving has been gained through the data archiving of the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats such as FITS, MBFITS, and VLBI's XML, including description and ancillary files. The modeling and architecture of the archive fulfill the requirements of both data persistence and easy data discovery and exploitation. The archive already complies with Virtual Observatory directives, so future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.
Archive & Data Management Activities for ISRO Science Archives
NASA Astrophysics Data System (ADS)
Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.
2012-07-01
ISRO has taken a step ahead by extending its remote sensing missions to planetary and astronomical exploration, starting with Chandrayaan-1, which successfully completed its imaging of the Moon during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next Moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive and disseminate the acquired science data to the user community for analysis and scientific use. The science missions will last from a few months to a few years, but the data received must be archived to specified standards, kept interoperable, and remain seamlessly accessible to the user community for the future. ISRO has laid out definite plans to archive these data sets in specified standards and to develop the access tools needed to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore; it is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 is the first of the planetary missions launched by ISRO, and we took up the challenge of developing a system for the archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer in the Planetary Data System (PDS 3.0) standard through an automated pipeline. But a stored dataset is of little use unless it is made public, which requires a web-based dissemination system accessible to all the planetary scientists and data users working in this field. Towards this, a web-based browse and dissemination system has been developed, wherein users can register, search for their area of interest, and view the data archived for TMC & HYSI with relevant browse chips and metadata. Users can also order the data and receive it on their desktop in PDS format. For the other AO payloads, users can view the metadata, and the data are available through an FTP site. The same archival and dissemination strategy will be extended to the next Moon mission, Chandrayaan-2. ASTROSAT will be the first multi-wavelength astronomical mission for which the data are archived at ISSDC. It consists of five astronomical payloads that allow simultaneous multi-wavelength observations of astronomical objects from the X-ray to the ultraviolet (UV). It is planned to archive these data sets in FITS format. The ASTROSAT archive will be maintained in the archive layer at ISSDC, with browsing available through the ISDA (Indian Science Data Archive) web site. The browse system will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on a request basis via an FTP site after the lock-in period is over. It is also planned to make the Level-2 pipeline software and the various modules for processing the data sets available on the web site. This paper describes the archival procedure for Chandrayaan-1 and the archive plans for ASTROSAT, Chandrayaan-2 and other future ISRO missions, including a discussion of data management activities.
NASA Astrophysics Data System (ADS)
Raugh, Anne; Henneken, Edwin
The Planetary Data System (PDS) is actively involved in designing both metadata and interfaces to make the assignment of Digital Object Identifiers (DOIs) to archival data a part of the archiving process for all data creators. These DOIs will be registered through DataCite, a non-profit organization whose members are all deeply concerned with archival research data, provenance tracking through the literature, and proper acknowledgement of the various types of efforts that contribute to the creation of an archival reference data set. Making the collection of citation metadata and its ingestion into the DataCite DOI database easy - and easy to do correctly - is in the best interests of all stakeholders: the data creators; the curators; the indexing organizations like the Astrophysics Data System (ADS); and the data users. But in order to realize the promise of DOIs, there are three key issues to address: 1) How do we incorporate the metadata collection process simply and naturally into the PDS archive creation process; 2) How do we encourage journal editors to require references to previously published data with the same rigor with which they require references to previously published research and analysis; and finally, 3) How can we change the culture of academic and research employers to recognize that the effort required to prepare a PDS archival data set is a career achievement on par with contributing to a refereed article in the professional literature. Data archives and scholarly publications are the long-term return on investment that funding agencies and the science community expect in exchange for research spending. The traceability and reproducibility ensured by the integration of DOIs and their related metadata into indexing and search services is an essential part of providing and optimizing that return.
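To make the citation-metadata step concrete, here is a minimal sketch of assembling a DataCite-style DOI record in the JSON shape used by DataCite's REST API. All field values, the DOI prefix, and the helper name are illustrative assumptions, not the PDS's actual registration pipeline.

```python
# Hypothetical sketch of building a DataCite-style metadata payload.
import json

def datacite_payload(doi_suffix, title, creators, year):
    """Build a JSON:API payload in the shape DataCite's REST API expects."""
    return {
        "data": {
            "type": "dois",
            "attributes": {
                "doi": f"10.17189/{doi_suffix}",  # prefix is illustrative
                "titles": [{"title": title}],
                "creators": [{"name": c} for c in creators],
                "publisher": "NASA Planetary Data System",
                "publicationYear": year,
                "types": {"resourceTypeGeneral": "Dataset"},
            },
        }
    }

payload = datacite_payload("example-bundle-1", "Example PDS Archive Bundle",
                           ["Doe, J.", "Roe, R."], 2018)
print(json.dumps(payload, indent=2))
```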
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory G.
2005-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. In addition to providing data, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools (to avoid unnecessary downloading of all the data). The existing data management infrastructure at the GES DISC supports a wide spectrum of options, from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.
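Of the processing forms listed above, subsetting is the simplest to illustrate. The sketch below extracts one parameter over a latitude/longitude box with xarray; the granule path, variable name, and coordinate names are hypothetical, not a GES DISC interface.

```python
# A sketch of spatial/parameter subsetting near the archive; names are assumed.
import xarray as xr

def subset(path, variable, lat_bounds, lon_bounds):
    """Return one parameter over a lat/lon box instead of the full granule."""
    ds = xr.open_dataset(path)
    return ds[variable].sel(
        lat=slice(*lat_bounds),   # assumes coordinates named lat/lon,
        lon=slice(*lon_bounds),   # stored in ascending order
    )

# Hypothetical granule and variable:
# aod = subset("granule.nc", "aerosol_optical_depth", (10.0, 40.0), (-120.0, -80.0))
# print(aod.shape)
```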
129. FULL AERIAL VIEW SHOWING FORWARD PORT QUARTER, ENTERING PEARL ...
129. FULL AERIAL VIEW SHOWING FORWARD PORT QUARTER, ENTERING PEARL HARBOR AFTER APOLLO 11 RECOVERY. 26 JULY 1969. (NATIONAL ARCHIVES NO. 428-KN-18090) - U.S.S. HORNET, Puget Sound Naval Shipyard, Sinclair Inlet, Bremerton, Kitsap County, WA
ERIC Educational Resources Information Center
Belsunce, Cesar A. Garcia
1983-01-01
Examination of the situation of archives in four Latin American countries--Argentina, Brazil, Colombia, and Costa Rica--highlights national systems, buildings, staff, processing of documents, accessibility and services to the public, and publications and extension services. (EJS)
DOE PAGES offers free public access to the best available full-text version and distributed content, with PAGES maintaining a permanent archive of all full text and metadata.
Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.
1991-01-01
The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development, and archiving; data set preservation, maintenance, and conversion to more durable and accessible archive media; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.
Restoration of Apollo Data by the Lunar Data Project/PDS Lunar Data Node: An Update
NASA Technical Reports Server (NTRS)
Williams, David R.; Hills, H. Kent; Taylor, Patrick T.; Grayzeck, Edwin J.; Guinness, Edward A.
2016-01-01
The Apollo 11, 12, and 14 through 17 missions orbited and landed on the Moon, carrying scientific instruments that returned data from all phases of the missions, including the long-lived Apollo Lunar Surface Experiments Packages (ALSEPs) deployed by the astronauts on the lunar surface. Much of these data were never archived, and some of the archived data were on media and in formats that are outmoded, or were deposited with little or no useful documentation to aid outside users. This is particularly true of the ALSEP data returned autonomously for many years after the Apollo missions ended. The purpose of the Lunar Data Project and the Planetary Data System (PDS) Lunar Data Node is to take data collections already archived at the NASA Space Science Data Coordinated Archive (NSSDCA) and prepare them for archiving through PDS, and to locate lunar data that were never archived, bring them into NSSDCA, and then archive them through PDS. Preparing these data for archiving involves reading the data from the original media, be it magnetic tape, microfilm, microfiche, or hard-copy document; converting the outmoded, often binary, formats when necessary; putting them into a standard digital form accepted by PDS; collecting the necessary ancillary data and documentation (metadata) to ensure that the data are usable and well-described; summarizing the metadata in documentation to be included in the data set; adding other information such as references, mission and instrument descriptions, contact information, and related documentation; and packaging the results in a PDS-compliant data set. The data set is then validated and reviewed by a group of external scientists as part of the PDS final archive process. We present a status report on some of the data sets that we are processing.
Supporting the Use of GPM-GV Field Campaign Data Beyond Project Scientists
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Smith, D. K.; Sinclair, L.; Bugbee, K.
2017-12-01
The Global Precipitation Measurement (GPM) Mission Ground Validation (GV) program consisted of a collection of field campaigns at various locations, each focusing on particular aspects of precipitation. Data collected during GPM-GV are necessary for better understanding the instruments and algorithms used to monitor water resources, study the global hydrologic cycle, understand climate variability, and improve weather prediction. The GPM-GV field campaign data have been archived at the NASA Global Hydrology Resource Center (GHRC) Distributed Active Archive Center (DAAC). These data consist of a heterogeneous collection of observations that require careful handling, full descriptive user guides, and helpful instructions for data use. These actions are part of the data archival process. In addition, the GHRC focuses on expanding the use of GPM-GV data beyond the validation and instrument researchers who participated in the field campaigns. To accomplish this, GHRC ties together the similarities and differences between the various field campaigns with the goal of improving user documents to be more easily read by those outside the field of research. In this poster, the authors will describe the GPM-GV datasets, discuss data use among the broader community, outline the types of problems and issues with these datasets, demonstrate which tools support data visualization and use, and highlight the outreach materials developed to educate both younger and general audiences about the data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoopman, J. D.
This report documents Livermore Computing (LC) activities in support of ASC L2 milestone 5589: Modernization and Expansion of LLNL Archive Disk Cache, due March 31, 2016. The full text of the milestone is included in Attachment 1. The description of the milestone is: Configuration of archival disk cache systems will be modernized to reduce fragmentation, and new, higher capacity disk subsystems will be deployed. This will enhance archival disk cache capability for ASC archive users, enabling files written to the archives to remain resident on disk for many (6-12) months, regardless of file size. The milestone was completed in three phases. On August 26, 2015, subsystems with 6 PB of disk cache were deployed for production use in LLNL's unclassified HPSS environment. Following that, on September 23, 2015, subsystems with 9 PB of disk cache were deployed for production use in LLNL's classified HPSS environment. On January 31, 2016, the milestone was fully satisfied when the legacy Data Direct Networks (DDN) archive disk cache subsystems were fully retired from production use in both LLNL's unclassified and classified HPSS environments, and only the newly deployed systems remained in use.
SENTINEL-2 Services Library - efficient way for exploration and exploitation of EO data
NASA Astrophysics Data System (ADS)
Milcinski, Grega; Batic, Matej; Kadunc, Miha; Kolaric, Primoz; Mocnik, Rok; Repse, Marko
2017-04-01
With more than 1.5 million scenes available, covering over 11 billion square kilometers and containing half a quadrillion pixels, Sentinel-2 is becoming one of the most important MSI datasets in the world. However, the vast amount of data makes it difficult to work with, which is certainly an important reason why the number of Sentinel-based applications is not as high as it could be at this point. We will present a Copernicus Award [1] winning service for archiving, processing, and distribution of Sentinel data, Sentinel Hub [2]. It makes it easy for anyone to tap into the global Sentinel archive and exploit its rich multi-sensor data to observe changes in the land. We will demonstrate how one is able not just to observe imagery all over the world but also to create one's own statistical analyses in a matter of seconds, comparing different sensors across various time segments. The result can be immediately observed in any GIS tool or exported as a raster file for post-processing. All of these actions can be performed on the full, worldwide, S-2 archive (multi-temporal and multi-spectral). To demonstrate the technology, we created a publicly accessible web application called "Sentinel Playground" [3], which makes it possible to query Sentinel-2 data anywhere in the world, and an experts-oriented tool, "EO Browser" [4], where it is also possible to observe land changes over longer periods by using historical Landsat data as well. [1] http://www.copernicus-masters.com/index.php?anzeige=press-2016-03.html [2] http://www.sentinel-hub.com [3] http://apps.sentinel-hub.com/sentinel-playground/ [4] http://apps.eocloud.sentinel-hub.com/eo-browser/
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; Doescher, Chris
2015-01-01
This presentation discusses 25 years of interactions between NASA and the USGS in managing a Land Processes Distributed Active Archive Center (LPDAAC) for the purpose of providing users access to NASA's rich collection of Earth science data. The presentation addresses challenges, efforts, and performance metrics.
15. Photocopy of photograph (original in the Langley Research Center ...
15. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L4933) VIEW NORTHWEST OF THE FULL-SCALE WIND TUNNEL, c. 1932. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
23. Photocopy of photograph (original in the Langley Research Center ...
23. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L73-5028) MODEL OF SUPERSONIC TRANSPORT IN FULL-SCALE WIND TUNNEL. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
26. Photocopy of photograph (original in the Langley Research Center ...
26. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L64792) ALBACORE SUBMARINE DRAG TESTS IN THE FULL-SCALE WIND TUNNEL. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
17. Photocopy of photograph (original in the Langley Research Center ...
17. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L79-7343) AERIAL VIEW OF THE FULL-SCALE WIND TUNNEL, 1979. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
Content-based retrieval of historical Ottoman documents stored as textual images.
Saykol, Ediz; Sinop, Ali Kemal; Güdükbay, Ugur; Ulusoy, Ozgür; Cetin, A Enis
2004-03-01
There is an accelerating demand to access the visual content of documents stored in historical and cultural archives. The availability of electronic imaging tools and effective image processing techniques makes it feasible to process the multimedia data in large databases. In this paper, a framework for content-based retrieval of historical documents in the Ottoman Empire archives is presented. The documents are stored as textual images, which are compressed by constructing a library of symbols occurring in a document; the symbols in the original image are then replaced with pointers into the codebook to obtain a compressed representation of the image. Features in the wavelet and spatial domains, based on the angular and distance span of shapes, are used to extract the symbols. To perform content-based retrieval in historical archives, a query is specified as a rectangular region in an input image, and the same symbol-extraction process is applied to the query region. The queries are processed against the codebook of documents, and the query images are identified in the resulting documents using the pointers in the textual images. The querying process does not require decompression of the images. The new content-based retrieval framework is also applicable to many other document archives using different scripts.
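The codebook idea at the heart of this scheme can be illustrated with a toy example: each distinct symbol is stored once, every occurrence becomes a pointer, and queries are matched against the codebook without decompressing the page. Real symbols are small bitmaps matched by shape features; plain strings stand in for them here.

```python
# Toy illustration of codebook compression and decompression-free querying.

def compress(symbols):
    codebook, pointers = [], []
    index = {}
    for s in symbols:
        if s not in index:           # first occurrence: add to the library
            index[s] = len(codebook)
            codebook.append(s)
        pointers.append(index[s])    # every occurrence becomes a pointer
    return codebook, pointers

def query(codebook, pointers, symbol):
    """Match a query symbol against the codebook only; no decompression."""
    target = codebook.index(symbol)
    return [i for i, p in enumerate(pointers) if p == target]

book, ptrs = compress(["alif", "lam", "mim", "lam", "alif"])
print(book, ptrs, query(book, ptrs, "lam"))  # positions 1 and 3
```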
STARS 2.0: 2nd-generation open-source archiving and query software
NASA Astrophysics Data System (ADS)
Winegar, Tom
2008-07-01
The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in the 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model of taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in the 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can only be transferred once, in-house, immediately, with little status and error reporting and no error recovery, to a stored search result that can be monitored and transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
Reference Model for an Open Archival Information System
NASA Technical Reports Server (NTRS)
1997-01-01
This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinite long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context, and it should promote greater vendor awareness of, and support for, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification of this document may occur. This report is therefore subject to CCSDS document management and change control procedures.
Avrin, D E; Andriole, K P; Yin, L; Gould, R G; Arenson, R L
2001-03-01
A hierarchical storage management (HSM) scheme for cost-effective on-line archival of image data using lossy compression is described. This HSM scheme also provides an off-site tape backup mechanism and disaster recovery. The full-resolution image data are initially viewed for primary diagnosis, then losslessly compressed and sent off site to a tape backup archive. In addition, the original data are wavelet lossy compressed (at approximately 25:1 for computed radiography, 10:1 for computed tomography, and 5:1 for magnetic resonance) and stored on a large RAID device for maximally cost-effective on-line storage and immediate retrieval of images for review and comparison. This HSM scheme provides a solution to four problems in image archiving, namely cost-effective on-line storage, disaster recovery of data, off-site tape backup for the legal record, and maximum intermediate storage and retrieval through the use of on-site lossy compression.
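A minimal sketch of the tiering policy just described, assuming the modality-specific ratios quoted in the abstract; the routing function and identifiers are placeholders, not the authors' implementation.

```python
# Hypothetical sketch of the two-tier archival plan per image.
LOSSY_RATIOS = {"CR": 25, "CT": 10, "MR": 5}  # approximate ratios from the abstract

def archive_plan(image_id, modality):
    """Route one study: lossless copy to tape, lossy copy to the on-line RAID."""
    return {
        "tape_backup": (image_id, "lossless"),  # legal record, disaster recovery
        "online_raid": (image_id, f"lossy {LOSSY_RATIOS[modality]}:1"),
    }

print(archive_plan("study-0042", "CT"))
```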
Code of Federal Regulations, 2010 CFR
2010-07-01
... actions and decisions in a manner that facilitates archival processing for public access. Central agency... Defense Other Regulations Relating to National Defense INFORMATION SECURITY OVERSIGHT OFFICE, NATIONAL ARCHIVES AND RECORDS ADMINISTRATION CLASSIFIED NATIONAL SECURITY INFORMATION Declassification § 2001.34...
The what, why, and how of born-open data.
Rouder, Jeffrey N
2016-09-01
Although many researchers agree that scientific data should be open to scrutiny to ferret out poor analyses and outright fraud, most raw data sets are not available on demand. There are many reasons researchers do not open their data, and one is technical. It is often time consuming to prepare and archive data. In response, my laboratory has automated the process such that our data are archived the night they are created without any human approval or action. All data are versioned, logged, time stamped, and uploaded including aborted runs and data from pilot subjects. The archive is GitHub, github.com, the world's largest collection of open-source materials. Data archived in this manner are called born open. In this paper, I discuss the benefits of born-open data and provide a brief technical overview of the process. I also address some of the common concerns about opening data before publication.
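The nightly automation the paper describes can be approximated with a short script that stages, commits, and pushes the day's data to a local clone of the GitHub archive. This is a sketch under assumed paths and remote names, not the laboratory's actual tooling.

```python
# Hypothetical sketch of a nightly "born open" archiving job.
import subprocess
from datetime import datetime, timezone

DATA_DIR = "/lab/data"  # assumed local clone of the GitHub archive repository

def nightly_archive():
    stamp = datetime.now(timezone.utc).isoformat()
    # Stage everything, including aborted runs and pilot data; versioning keeps all.
    subprocess.run(["git", "-C", DATA_DIR, "add", "--all"], check=True)
    # --allow-empty so the job still succeeds on days with no new data.
    subprocess.run(["git", "-C", DATA_DIR, "commit", "--allow-empty",
                    "-m", f"nightly archive {stamp}"], check=True)
    subprocess.run(["git", "-C", DATA_DIR, "push", "origin", "main"], check=True)

if __name__ == "__main__":
    nightly_archive()  # typically scheduled via cron
```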
18. Photocopy of photograph (original in the Langley Research Center ...
18. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L83-8341) VIEW OF FANS IN FULL-SCALE WIND TUNNEL, c. 1960s. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
13. Photocopy of photograph (original in the Langley Research Center ...
13. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (NACA 4655) VIEW LOOKING NORTH AT THE FULL-SCALE WIND TUNNEL UNDER CONSTRUCTION. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
16. Photocopy of photograph (original in the Langley Research Center ...
16. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L89-07075) AERIAL VIEW LOOKING NORTHWEST AT THE FULL-SCALE WIND TUNNEL, 1989. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
19. Photocopy of photograph (original in the Langley Research Center ...
19. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L5925) LOENING SCL-1 SEAPLANE IN THE FULL-SCALE WIND TUNNEL, OCTOBER 1931. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
ASF archive issues: Current status, past history, and questions for the future
NASA Technical Reports Server (NTRS)
Goula, Crystal A.; Wales, Carl
1994-01-01
The Alaska SAR Facility (ASF) collects, processes, archives, and distributes data from synthetic aperture radar (SAR) satellites in support of scientific research. ASF has been in operation since 1991 and presently holds an archive of over 100 terabytes of data. ASF is performing an analysis of its magnetic tape storage system to ensure long-term preservation of this archive. Future satellite missions could double or triple the amount of data that ASF acquires. ASF is examining the current data systems and high-volume storage, and exploring future concerns and solutions.
ESO science data product standard for 1D spectral products
NASA Astrophysics Data System (ADS)
Micol, Alberto; Arnaboldi, Magda; Delmotte, Nausicaa A. R.; Mascetti, Laura; Retzlaff, Joerg
2016-07-01
The ESO Phase 3 process allows the upload, validation, storage, and publication of reduced data through the ESO Science Archive Facility. Since its introduction, 2 million data products have been archived and published; 80% of them are one-dimensional extracted and calibrated spectra. Central to Phase 3 is the ESO science data product standard that defines the metadata and data format of any product. This contribution describes the ESO data standard for 1D spectra, its adoption by the reduction pipelines of selected instrument modes for in-house generation of reduced spectra, and the enhanced legacy value of the archive. Archive usage statistics are provided.
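For a sense of what the 1D standard buys a user in practice, such products are FITS binary tables whose cells hold the full arrays. The sketch below reads one with astropy, assuming the commonly used WAVE/FLUX/ERR column names and single-row layout; the file name is hypothetical.

```python
# Sketch of reading a Phase 3 style 1D spectrum; column names are assumptions.
from astropy.io import fits

def read_sdp_spectrum(path):
    """Return wavelength, flux, and error arrays from a 1D spectral product."""
    with fits.open(path) as hdul:
        row = hdul[1].data[0]          # single row; each cell holds a full array
        return row["WAVE"], row["FLUX"], row["ERR"]

# Hypothetical archive product:
# wave, flux, err = read_sdp_spectrum("ADP.spectrum.fits")
```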
ROSETTA: How to archive more than 10 years of mission
NASA Astrophysics Data System (ADS)
Barthelemy, Maud; Heather, D.; Grotheer, E.; Besse, S.; Andres, R.; Vallejo, F.; Barnes, T.; Kolokolova, L.; O'Rourke, L.; Fraga, D.; A'Hearn, M. F.; Martin, P.; Taylor, M. G. G. T.
2018-01-01
The Rosetta spacecraft was launched in 2004 and, after several planetary and two asteroid fly-bys, arrived at comet 67P/Churyumov-Gerasimenko in August 2014. After escorting the comet for two years and executing its scientific observations, the mission ended on 30 September 2016 with a touchdown on the comet's surface. This paper describes how the Planetary Science Archive (PSA) and the Planetary Data System - Small Bodies Node (PDS-SBN) worked with the Rosetta instrument teams to prepare the science data collected over the course of the Rosetta mission for inclusion in the science archive. As Rosetta is an international mission, a collaboration between ESA and NASA, all science data from the mission are fully archived within both the PSA and the PDS. The Rosetta archiving process, supporting tools, archiving systems, and their evolution throughout the mission are described, along with a discussion of a number of the challenges faced during the Rosetta implementation. The paper then presents the current status of the archive for each of the science instruments, before looking to the improvements planned both for the archive itself and for the Rosetta data content. The lessons learned from the first 13 years of archiving on Rosetta are finally discussed with an aim to help future missions plan and implement their science archives.
Planetary Data Archiving Plan at JAXA
NASA Astrophysics Data System (ADS)
Shinohara, Iku; Kasaba, Yasumasa; Yamamoto, Yukio; Abe, Masanao; Okada, Tatsuaki; Imamura, Takeshi; Sobue, Shinichi; Takashima, Takeshi; Terazono, Jun-Ya
After the successful rendezvous of Hayabusa with the small body Itokawa and the successful launch of Kaguya to the Moon, the Japanese planetary community has obtained its own full-scale data. At this moment, however, these datasets are only available from data sites managed by each mission team. The databases are individually constructed in different formats, and the user interfaces of these data sites are not compatible with foreign databases. To improve the usability of the planetary archives at JAXA and to enable smooth international data exchange, we are investigating the construction of a new planetary database. Within the coming decade, Japan will have fruitful datasets in the planetary science field: Venus (Planet-C), Mercury (BepiColombo), and several missions in the planning phase (small bodies). In order to strongly support international scientific collaboration using these mission archive data, the planned planetary data archive at JAXA should be managed in a unified manner, and the database should be constructed in the international planetary database standard style. In this presentation, we will show the current status and future plans of planetary data archiving at JAXA.
Solomon, April D; Hytinen, Madison E; McClain, Aryn M; Miller, Marilyn T; Dawson Cruz, Tracey
2018-01-01
DNA profiles have been obtained from fingerprints, but there is limited knowledge regarding DNA analysis from archived latent fingerprints: touch DNA "sandwiched" between adhesive and paper. Thus, this study sought to comparatively analyze a variety of collection and analytical methods in an effort to find an optimized workflow for this specific sample type. Untreated and treated archived latent fingerprints were utilized to compare different biological sampling techniques, swab diluents, DNA extraction systems, DNA concentration practices, and post-amplification purification methods. Archived latent fingerprints disassembled and sampled via direct cutting, followed by DNA extraction using the QIAamp® DNA Investigator Kit and concentration with Centri-Sep™ columns, increased the odds of obtaining an STR profile. Using the recommended DNA workflow, 9 of the 10 samples provided STR profiles, which included 7-100% of the expected STR alleles and two full profiles. Thus, with carefully selected procedures, archived latent fingerprints can be a viable DNA source for criminal investigations, including cold and postconviction cases. © 2017 American Academy of Forensic Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apyan, A.; Badillo, J.; Cruz, J. Diaz
The CMS experiment at the LHC relies on 7 Tier-1 centres of the WLCG to perform the majority of its bulk processing activity and to archive its data. During the first run of the LHC, these two functions were tightly coupled, as each Tier-1 was constrained to process only the data archived on its hierarchical storage. This lack of flexibility in the assignment of processing workflows occasionally resulted in uneven resource utilisation and in increased latency in the delivery of results to the physics community. The long shutdown of the LHC in 2013-2014 was an opportunity to revisit this mode of operations, disentangling the processing and archive functionalities of the Tier-1 centres. The storage services at the Tier-1s were redeployed, breaking the traditional hierarchical model: each site now provides a large disk storage to host input and output data for processing, and an independent tape storage used exclusively for archiving. Movement of data between the tape and disk endpoints is not automated, but is triggered externally through the WLCG transfer management systems. With this new setup, CMS operations actively controls at any time which data are available on disk for processing and which data should be sent to archive. Thanks to the high-bandwidth connectivity guaranteed by the LHCOPN, input data can be freely transferred between disk endpoints as needed to take advantage of free CPU, turning the Tier-1s into a large pool of shared resources. The output data can be validated before being archived permanently, and temporary data formats can be produced without wasting valuable tape resources. Lastly, the data hosted on disk at Tier-1s can now be made available for user analysis as well, since there is no longer any risk of triggering chaotic staging from tape. In this contribution, we describe the technical solutions adopted for the new disk and tape endpoints at the sites, and we report on the commissioning and scale testing of the service. We detail the procedures implemented by CMS computing operations to actively manage data on disk at Tier-1 sites, and we give examples of the benefits brought to CMS workflows by the additional flexibility of the new system.
Recent advances and plans in processing and geocoding of SAR data at the DFD
NASA Technical Reports Server (NTRS)
Noack, W.
1993-01-01
Because of the needs of future projects like ENVISAT and the experience gained with the current operational ERS-1 facilities, a radical change in synthetic aperture radar (SAR) processing scenarios can be predicted for the coming years. At the German PAF, several new developments have been initiated, driven mainly either by user needs or by system and operational constraints ('lessons learned'). The end result will be a major simplification and unification of all the computer systems used. In particular, the following changes are likely to be implemented at the German PAF: transcription before archiving; processing of all standard products with high throughput directly at the receiving stations; processing of special 'high-value' products at the PAF; use of a single type of processor hardware; implementation of a large and fast on-line data archive; and an improved and unified fast data network between the processing and archiving facilities. A short description of the current operational SAR facilities as well as the future implementations is given.
Water isotope systematics: Improving our palaeoclimate interpretations
Jones, M. D.; Dee, S.; Anderson, L.; Baker, A.; Bowen, G.; Noone, D.
2016-01-01
The stable isotopes of oxygen and hydrogen, measured in a variety of archives, are widely used proxies in Quaternary Science. Understanding the processes that control δ18O change has long been a focus of research (e.g. Shackleton and Opdyke, 1973; Talbot, 1990; Leng, 2006). Both the dynamics of water isotope cycling and the appropriate interpretation of geological water-isotope proxy time series remain subjects of active research and debate. It is clear that achieving a complete understanding of the isotope systematics for any given archive type, and ideally each individual archive, is vital if these palaeo-data are to be used to their full potential, including comparison with climate model experiments of the past. Combining information from modern monitoring and process studies, climate models, and proxy data is crucial for improving our statistical constraints on reconstructions of past climate variability. As climate models increasingly incorporate stable water isotope physics, this common language should aid quantitative comparisons between proxy data and climate model output. Water-isotope palaeoclimate data provide crucial metrics for validating GCMs, whereas GCMs provide a tool for exploring the climate variability dominating signals in the proxy data. Several of the studies in this set of papers highlight how collaborations between palaeoclimate experimentalists and modellers may serve to expand the usefulness of palaeoclimate data for climate prediction in future work. This collection of papers follows the session on Water Isotope Systematics held at the 2013 AGU Fall Meeting in San Francisco. Papers in that session, the breadth of which is represented here, discussed such issues as: understanding sub-GNIP-scale (Global Network for Isotopes in Precipitation (IAEA/WMO, 2006)) variability in isotopes in precipitation from different regions; detailed examination of the transfer of isotope signals from precipitation to geological archives; and the implications of advances in understanding in these areas for the interpretation of palaeo records and proxy data–climate model comparison. Here, we briefly review these areas of research and discuss challenges for the water isotope community in improving our ability to partition climate vs. auxiliary signals in palaeoclimate data.
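For readers outside the field, the δ18O quantity discussed throughout is the standard delta notation: the sample's heavy-to-light oxygen isotope ratio expressed relative to the VSMOW reference water, in per mil. This is the conventional definition, not a formula specific to this collection:

```latex
% Standard delta notation (per mil, relative to the VSMOW reference water):
\delta^{18}\mathrm{O} =
\left(
  \frac{\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{sample}}}
       {\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{VSMOW}}}
  - 1
\right) \times 1000\ \text{‰}
```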
New Technology Changing The Face of Mobile Seismic Networks
NASA Astrophysics Data System (ADS)
Brisbourne, A.; Denton, P.; Seis-Uk
SEIS-UK, a seismic equipment pool and data management facility run by a consortium of four UK universities (Leicester, Leeds, Cambridge and Royal Holloway, London), completed its second phase in 2001. To complement the existing broadband equipment pool, which has been deployed to full capacity to date, the consortium undertook a tender evaluation process for low-power, lightweight sensors and recorders for use on both controlled-source and passive seismic experiments. The preferred option, selected by the consortium, was the Guralp CMG-6TD system, with 150 systems ordered. The CMG-6TD system is a new concept in temporary seismic equipment. A 30 s to 100 Hz force-feedback sensor, an integral 24-bit digitiser, and 3-4 Gbyte of solid-state memory are all housed in a single unit. Use of the most recent technologies has kept the power consumption below 1 W and the weight to 3.5 kg per unit. The disk-swap procedure for retrieving data from the field has been superseded by a fast data download technique using FireWire technology. This allows rapid station servicing, essential when 150 stations are in use, and also ensures the environmental integrity of the system by removing the need for a disk access port and an environmentally exposed data disk. The system therefore meets the criteria for controlled-source and passive seismic experiments: (1) the single-unit concept and low weight are designed for rapid deployment on short-term projects; (2) the low power consumption reduces the power-supply requirements, facilitating deployment; (3) the low self-noise and bandwidth of the sensor make it applicable to passive experiments involving natural sources. Further to this acquisition process, in collaboration with external groups, the SEIS-UK data management procedures have been streamlined with the integration of Guralp GCF format data into the PASSCAL PDB software. This allows rapid dissemination of field data and the production of archive-ready datasets, reducing the time between field recording and data archiving. The archiving procedure for SEIS-UK datasets has been established, with data from experiments carried out with the broadband equipment already on the permanent continuous data archive at the IRIS DMC.
NASA Technical Reports Server (NTRS)
White, Nicholas (Technical Monitor); Murray, Stephen S.
2003-01-01
(1) Chandra Archive: SAO has maintained the interfaces through which HEASARC gains access to the Chandra Data Archive. At HEASARC's request, we have implemented an anonymous ftp copy of a major part of the public archive and we keep that archive up-to-date. SAO has participated in the ADEC interoperability working group, establishing guidelines for interoperability standards and prototyping such interfaces. We have provided an NVO-based prototype interface, intended to serve the HEASARC-led NVO demo project. HEASARC's Astrobrowse interface was maintained and updated. In addition, we have participated in design discussions surrounding HEASARC's Caldb project. We have attended the HEASARC Users Group meeting and presented CDA status and developments. (2) Chandra CALDB: SAO has maintained and expanded the Chandra CALDB by including four new data file types, defining the corresponding CALDB keyword/identification structures. We have provided CALDB upgrades for the public (CIAO) and for Standard Data Processing. Approximately 40 new files have been added to the CALDB in these version releases. In the past year there have been ten of these CALDB upgrades, each with unique index configurations. In addition, with inputs from software, archive, and calibration scientists, as well as CIAO/SDP software developers, we have defined a generalized expansion of the existing CALDB interface and indexing structure. The purpose of this is to make the CALDB more generally applicable and useful in new and future missions that will be supported archivally by HEASARC. The generalized interface will identify additional configurational keywords and permit more extensive calibration parameter and boundary condition specifications for unique file selection. HEASARC scientists and developers from SAO and GSFC have become involved in this work, which is expected to produce a new interface for general use within the current year. (3) DS9: One of the decisions that came from last year's HEADCC meeting was to make the ds9 image display program the primary vehicle for displaying line graphics (as well as images). The first step required to make this possible was to enhance the line graphics capabilities of ds9. SAO therefore spent considerable effort upgrading ds9 to use Tcl 8.4 so that the BLT line graphics package could be built and imported into ds9 from source code, rather than from a pre-built (and generally outdated) shared library. This task, which is nearly complete, allows us to extend BLT as needed for the HEAD community. Following the HEADCC discussion concerning archiving and the display of archived data, we extended ds9 to support full access to many astronomical Web-based archive sites, including HEASARC, MAST, CHANDRA, SKYVIEW, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. Using ds9's new internal Web access capabilities, these archives can be accessed via their Web pages. FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed directly and easily in ds9. For more information, see: http://hea-www.harvard.edu/saord/ds9. Also following the HEADCC discussion concerning region filtering, we extended the Funtools sample implementation of region filtering as described in: http://hea-www.harvard.edu/saord/funtools/regions.html. In particular, we added several new composite regions for event and image filtering, including elliptical and box annuli. We also extended the panda (Pie AND Annulus) region support to include box pandas and elliptical pandas.
These new composite regions are especially useful in programs that need to count photons in each separate region using only a single pass through the data. Support for these new regions was added to ds9. In the same vein, we developed new region support for filtering images using simple FITS image masks, i.e. 8-bit or 16-bit FITS images where the value of a pixel is the region id number for that pixel. Other important enhancements to DS9 this year include support for multiple world coordinate systems, three-dimensional event file binning, image smoothing, region groups and tags, the ability to save images in a number of image formats (such as JPEG, TIFF, PNG, FITS), improvements in support for integrating external analysis tools, and support for the virtual observatory. In particular, a full-featured web browser has been implemented within DS9. This provides support for full access to HEASARC archive sites such as SKYVIEW and W3BROWSE, in addition to other astronomical archive sites such as MAST, CHANDRA, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. From within DS9, the archives can be searched, and FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed. The web browser provides the basis for the built-in help facility. All DS9 documentation, including the reference manual, FAQ, Known Features, and contact information, is now available to the user without the need for external display applications. New versions of DS9 may be downloaded and installed using this facility. Two important features used in the analysis of high-energy astronomical data have been implemented in the past year. The first is support for binning photon event data in three dimensions. By binning the third dimension in time or energy, users are easily able to detect variable X-ray sources and identify other physical properties of their data. Second, a number of fast smoothing algorithms have been implemented in DS9, which allow users to smooth their data in real time. Algorithms for boxcar, tophat, and gaussian smoothing are supported.
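The mask-based region filtering described here lends itself to a compact illustration: if pixel values in a mask image are region ids, photons can be tallied per region in a single pass. A minimal numpy sketch with synthetic data, not Funtools/DS9 code:

```python
# Single-pass per-region photon counting via an image mask (synthetic data).
import numpy as np

def counts_per_region(mask, xs, ys):
    """mask: 2-D array of region ids (0 = outside); xs, ys: photon pixel coords."""
    ids = mask[ys, xs]                       # region id under each photon
    return np.bincount(ids, minlength=mask.max() + 1)

mask = np.zeros((100, 100), dtype=int)
mask[20:40, 20:40] = 1                       # region 1
mask[60:80, 60:80] = 2                       # region 2
rng = np.random.default_rng(0)
xs, ys = rng.integers(0, 100, 500), rng.integers(0, 100, 500)
print(counts_per_region(mask, xs, ys))       # index i = photons in region i
```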
The Growth of the User Community of the La Silla Paranal Observatory Science Archive
NASA Astrophysics Data System (ADS)
Romaniello, M.; Arnaboldi, M.; Da Rocha, C.; De Breuck, C.; Delmotte, N.; Dobrzycki, A.; Fourniol, N.; Freudling, W.; Mascetti, L.; Micol, A.; Retzlaff, J.; Sterzik, M.; Sequeiros, I. V.; De Breuck, M. V.
2016-03-01
The archive of the La Silla Paranal Observatory has grown steadily into a powerful science resource for the ESO astronomical community. Established in 1998, the Science Archive Facility (SAF) stores both the raw data generated by all ESO instruments and selected processed (science-ready) data. The growth of the SAF user community is analysed through access and publication statistics. Statistics are presented for archival users, who do not contribute to observing proposals, and contrasted with regular and archival users, who are successful in competing for observing time. Archival data from the SAF contribute to about one paper out of four that use data from ESO facilities. This study reveals that the blend of users constitutes a mixture of the traditional ESO community making novel use of the data and of a new community being built around the SAF.
Proba-V Mission Exploitation Platform
NASA Astrophysics Data System (ADS)
Goor, Erwin; Dries, Jeroen
2017-04-01
VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the Proba-V (a Copernicus contributing mission) EO data archive (http://proba-v.vgt.vito.be/), the past SPOT-VEGETATION mission, and derived vegetation parameters by researchers, service providers, and end-users. The analysis of time series of data (+1 PB) is addressed, as well as large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. Furthermore, data from the Copernicus Global Land Service are in scope of the platform. From November 2015, an operational Proba-V MEP environment, as an ESA operations service, has been gradually deployed at the VITO data center with direct access to the complete data archive. Since autumn 2016 the platform has been operational, and several applications have already been released to users, e.g.: - A time series viewer, showing the evolution of Proba-V bands and derived vegetation parameters from the Copernicus Global Land Service for any area of interest. - Full-resolution viewing services for the complete data archive. - On-demand processing chains on a powerful Hadoop/Spark backend, e.g. for the calculation of N-daily composites (see the sketch following this abstract). - Virtual Machines provided with access to the data archive and tools to work with the data, e.g. various toolboxes (GDAL, QGIS, GrassGIS, SNAP toolbox, …) and support for R and Python. This allows users to work with the data immediately, without having to install tools or download data, as well as to design, debug, and test applications on the platform. - A prototype of Jupyter Notebooks, with some worked examples showing the potential of the data. Today the platform is used by several third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. In parallel, the platform is being further improved and extended. Access to Sentinel-2 and Landsat data will also soon be available from the Proba-V MEP. Users can make use of powerful Web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used, and a distributed processing environment has been built based on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo, etc.) which we integrate with several open-source components (e.g. Geotrellis). The impact of this MEP on the user community will be high and will completely change the way of working with the data, opening the large time series to a larger community of users. The presentation will address these benefits for users and discuss the technical challenges in implementing this MEP. Furthermore, demonstrations will be given. Platform URL: https://proba-v-mep.esa.int/
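As a toy stand-in for the N-daily compositing mentioned above, the sketch below builds a maximum-value composite from a stack of daily tiles with plain numpy; the real MEP runs such chains at scale on its Hadoop/Spark backend, and the data here are synthetic.

```python
# Toy N-daily maximum-value composite over synthetic daily tiles.
import numpy as np

def n_daily_composite(daily_tiles):
    """Per pixel, keep the best observation of the window; NaN = cloud/missing."""
    stack = np.stack(daily_tiles)            # shape (n_days, rows, cols)
    return np.nanmax(stack, axis=0)

rng = np.random.default_rng(1)
days = [rng.random((4, 4)) for _ in range(10)]
days[3][1, 1] = np.nan                       # simulate a cloudy pixel on one day
print(n_daily_composite(days))
```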
The TESS Transiting Planet Search Predicted Recovery and Reliability Rates
NASA Astrophysics Data System (ADS)
Smith, Jeffrey C.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill
2018-06-01
The Transiting Exoplanet Survey Satellite (TESS) will search for transiting planet signatures via the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. We report on predicted transit recovery and reliability rates for planetary signatures. These estimates are based on simulated runs of the pipeline using realistic stellar models and transiting planet populations along with best estimates for instrumental noise, thermal induced focus changes, instrumental drift and stochastic artifacts in the light curve data. Key sources of false positives are identified and summarized. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
Henderson, Michael L; Dayhoff, Ruth E; Titton, Csaba P; Casertano, Andrew
2006-01-01
As part of its patient care mission, the U.S. Veterans Health Administration performs diagnostic imaging procedures at 141 medical centers and 850 outpatient clinics. VHA's VistA Imaging Package provides a full archival, display, and communications infrastructure and interfaces to radiology and other HIS modules, as well as to modalities and a worklist provider. In addition, various medical center entities within VHA have elected to install commercial picture archiving and communications systems to enable image organization and interpretation. To evaluate interfaces between commercial PACS, the VistA hospital information system, and imaging modalities, VHA has built a fully constrained specification based on the Radiology Technical Framework (Rad-TF) of Integrating the Healthcare Enterprise (IHE). The Health Level Seven normative conformance mechanism was applied to the IHE Rad-TF and agency requirements to arrive at a baseline set of message specifications. VHA provides a thorough implementation and testing process to promote the adoption of standards-based interoperability by all PACS vendors that want to interface with VistA Imaging.
NASA Astrophysics Data System (ADS)
Ayres, Thomas
2009-07-01
This is a Calibration Archival proposal to develop, implement, and test enhancements to the pipeline wavelength scales of STIS echelle spectra, to take full advantage of the extremely high performance of which the instrument is capable. The motivation is a recent extensive investigation--The Deep Lamp Project--which identified systematic wavelength distortions in all 44 primary and secondary settings of the four STIS echelle modes: E140M, E140H, E230M, and E230H. The method was to process deep exposures of the onboard Pt/Cr-Ne calibration source as if they were science images, and measure deviations of the lamp lines from their laboratory wavelengths. An approach has been developed to correct the distortions post facto, but it would be preferable to implement a more robust dispersion model in the pipeline itself. The proposed study will examine a more extensive set of WAVECALs than in the exploratory Deep Lamp effort, and will benefit from a new laboratory line list specifically for the STIS lamps. Ironing out the wrinkles in the STIS wavelength scales will impact many diverse science investigations, especially the Legacy Archival project "StarCAT."
20. Photocopy of photograph (original in the Langley Research Center ...
20. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L15337) DRAG-CLEANUP STUDIES OF THE BREWSTER BUFFALO IN THE FULL SCALE WIND TUNNEL, 1938. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
24. Photocopy of photograph (original in the Langley Research Center ...
24. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L75-734) MODEL OF SUPERSONIC TRANSPORT IN FULL-SCALE WIND TUNNEL FROM ENTRANCE CONE. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
25. Photocopy of photograph (original in the Langley Research Center ...
25. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L81-7333) RUTAN'S VARI-EZE ADVANCED CONCEPTS AIRCRAFT IN THE FULL-SCALE WIND TUNNEL. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
Using natural archives to track sources and long-term trends of pollution: an introduction
Jules Blais,; Rosen, Michael R.; John Smol,
2015-01-01
This book explores the myriad ways that environmental archives can be used to study the distribution and long-term trajectories of contaminants. The volume first focuses on reviews that examine the integrity of the historic record, including factors related to hydrology, post-depositional diffusion, and mixing processes. This is followed by a series of chapters dealing with the diverse archives available for long-term studies of environmental pollution.
National Space Science Data Center Information Model
NASA Astrophysics Data System (ADS)
Bell, E. V.; McCaslin, P.; Grayzeck, E.; McLaughlin, S. A.; Kodis, J. M.; Morgan, T. H.; Williams, D. R.; Russell, J. L.
2013-12-01
The National Space Science Data Center (NSSDC) was established by NASA in 1964 to provide for the preservation and dissemination of scientific data from NASA missions. It has evolved to support the distributed, active archives that were established in the Planetary, Astrophysics, and Heliophysics disciplines through a series of Memoranda of Understanding. The disciplines took over responsibility for working with new projects to acquire and distribute data for community researchers, while the NSSDC remained vital as a deep archive. Since 2000, NSSDC has been using the Archival Information Package to preserve data over the long term. As part of its effort to streamline the ingest of data into the deep archive, the NSSDC developed and implemented a data model of desired and required metadata in XML. This process, in use for roughly five years now, has been successfully used to support the identification and ingest of data into the NSSDC archive, most notably data from the Planetary Data System (PDS) submitted under PDS3. A series of software packages (X-ware) was developed to handle the submission of data from the PDS nodes utilizing a volume structure. An XML submission manifest is generated at the PDS provider site prior to delivery to NSSDC; the manifest ensures the fidelity of PDS data delivered to NSSDC. Preservation metadata are captured in an XML object when NSSDC archives the data. With the recent adoption by the PDS of the XML-based PDS4 data model, there is an opportunity for the NSSDC to provide additional services to the PDS, such as the preservation, tracking, and restoration of individual products (e.g., a specific data file or document), which was infeasible in the previous PDS3 system. The NSSDC is modifying and further streamlining its data ingest process to take advantage of the PDS4 model, an important consideration given the ever-increasing amount of data being generated and archived by orbiting missions at the Moon and Mars, other active projects such as BRRISON, LADEE, MAVEN, InSight, OSIRIS-REx, and ground-based observatories. Streamlining the ingest process also benefits the continued processing of PDS3 data. We will report on our progress and status.
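The submission-manifest step lends itself to a small illustration. The sketch below writes an XML manifest with per-file sizes and MD5 checksums so the receiving archive can verify fidelity; the element names and example path are illustrative, not NSSDC's actual schema.

```python
# Hypothetical sketch of a checksum manifest for an archive delivery.
import hashlib
import xml.etree.ElementTree as ET
from pathlib import Path

def build_manifest(paths):
    """One <file> entry per delivered file, with size and MD5 for verification."""
    root = ET.Element("submissionManifest")
    for p in map(Path, paths):
        entry = ET.SubElement(root, "file", name=p.name)
        ET.SubElement(entry, "sizeBytes").text = str(p.stat().st_size)
        ET.SubElement(entry, "md5").text = hashlib.md5(p.read_bytes()).hexdigest()
    return ET.tostring(root, encoding="unicode")

# Hypothetical delivery:
# print(build_manifest(["volume/data/product_001.dat"]))
```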
76 FR 26317 - Advisory Committee on Presidential Library-Foundation Partnerships
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-06
... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Advisory Committee on Presidential Library-Foundation... Library-Foundation Partnerships. The meeting will be held to discuss the reorganization of the National Archives as they relate to Presidential Libraries, Social Media Initiatives, Processing of Presidential...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Citrus Industry, Part 1, Chapter 20-13 Market Classification, Maturity Standards and Processing or... 2065-S, 14th and Independence Ave., Washington, DC 20250 or at the National Archives and Records...: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr_locations.html. ...
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.
2017-12-01
This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of the data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate this process and realize efficiencies; a sketch of the idea follows this abstract. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
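To illustrate the tracking aspect of such a generalized workflow, here is a minimal sketch that walks a dataset through an ordered list of curation steps and records per-step status. Step names paraphrase the list above; the implementations are placeholders, not the DAACs' actual utilities.

```python
# Hypothetical sketch of per-step status tracking in a publication workflow.
STEPS = [
    "verify_integrity", "check_quality", "assemble_metadata",
    "setup_access", "register_catalogues", "assign_doi",
]

# Placeholder implementations; a real system would do checksums, QA, etc.
HANDLERS = {step: (lambda dataset_id: None) for step in STEPS}

def run_workflow(dataset_id):
    status = {}
    for step in STEPS:
        try:
            HANDLERS[step](dataset_id)
            status[step] = "done"
        except Exception as err:
            status[step] = f"failed: {err}"
            break                      # later steps depend on earlier ones
    return {dataset_id: status}

print(run_workflow("hypothetical-dataset-001"))
```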
Shaping ESO2020+ Together: Feedback from the Community Poll
NASA Astrophysics Data System (ADS)
Primas, F.; Ivison, R.; Berger, J.-P.; Caselli, P.; De Gregorio-Monsalvo, I.; Alonso Herrero, A.; Knudsen, K. K.; Leibundgut, B.; Moitinho, A.; Saviane, I.; Spyromilio, J.; Testi, L.; Vennes, S.
2015-09-01
A thorough evaluation and prioritisation of the ESO science programme into the 2020+ timeframe took place under the auspices of a working group, comprising astronomers drawn from ESO’s advisory structure and from within ESO. This group reported to ESO’s Scientific Technical Committee, and to ESO Council, concluding the exercise with the publication of a report, “Science Priorities at ESO”. A community poll and a dedicated workshop, held in January 2015, formed part of the information gathering process. The community poll was designed to probe the demographics of the user community, its scientific interests, use of observing facilities and plans for use of future telescopes and instruments, its views on types of observing programmes and on the provision of data processing and archiving. A total of 1775 full responses to the poll were received and an analysis of the results is presented here. Foremost is the importance of regular observing programmes on all ESO observing facilities, in addition to Large Programmes and Public Surveys. There was also a strong community requirement for ESO to process and archive data obtained at ESO facilities. Other aspects, especially those related to future facilities, are more challenging to interpret because of biases related to the distribution of science expertise and favoured wavelength regime amongst the targeted audience. The results of the poll formed a fundamental component of the report and provide useful data to guide the evolution of ESO’s science programme.
The ExoMars Science Data Archive: Status and Plans
NASA Astrophysics Data System (ADS)
Heather, David; Barbarisi, Isa; Brumfitt, Jon; Lim, Tanya; Metcalfe, Leo; Villacorta, Antonio
2017-04-01
The ExoMars program is a co-operation between ESA and Roscosmos comprising two missions: the first, launched on 14 March 2016, included the Trace Gas Orbiter and Schiaparelli lander; the second, due for launch in 2020, will be a Rover and Surface Platform (RSP). The archiving and management of the science data to be returned from ExoMars will require a significant development effort for the new Planetary Science Archive (PSA). These are the first data in the PSA to be formatted according to the new PDS4 Standards, and there are also significant differences in the way in which a scientist will want to query, retrieve, and use data from a suite of rover instruments as opposed to remote sensing instrumentation from an orbiter. NASA has a strong user community interaction for their rovers, and a similar approach to their 'Analyst's Notebook' will be needed for the future PSA. In addition to the archiving interface itself, there are differences with the overall archiving process being followed for ExoMars compared to previous ESA planetary missions. The first level of data processing for the 2016 mission, from telemetry to raw, is completed by ESA at ESAC in Madrid, where the archive itself resides. Data flow continuously and directly to the PSA, where, after the given proprietary period, they will be released to the community via the user interfaces. For the rover mission, the data pipelines are being developed by European industry, in close collaboration with ESA PSA experts and with the instrument teams. The first level of data processing will be carried out for all instruments at ALTEC in Turin, where the pipelines are developed and from where the rover operations will also be run. This presentation will focus on the challenges involved in archiving the data from the ExoMars Program, and will outline the plans and current status of the system being developed to respond to the needs of the missions.
14. Photocopy of photograph (original in the Langley Research Center ...
14. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L4776) VIEW SOUTH THROUGH ENTRANCE CONE OF FULL-SCALE WIND TUNNEL UNDER CONSTRUCTION, SEPTEMBER 12, 1930. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
21. Photocopy of photograph (original in the Langley Research Center ...
21. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L-9850) ANNUAL AIRCRAFT ENGINEERING CONFERENCE IN FULL-SCALE WIND TUNNEL; GROUP PHOTOGRAPH OF PARTICIPANTS, MAY 23, 1934. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
News from the ESO Science Archive Facility
NASA Astrophysics Data System (ADS)
Dobrzycki, A.; Arnaboldi, M.; Bierwirth, T.; Boelter, M.; Da Rocha, C.; Delmotte, N.; Forchì, V.; Fourniol, N.; klein Gebbinck, M.; Lange, U.; Mascetti, L.; Micol, A.; Moins, C.; Munte, C.; Pluciennik, C.; Retzlaff, J.; Romaniello, M.; Rosse, N.; Sequeiros, I. V.; Vuong, M.-H.; Zampieri, S.
2015-09-01
ESO Science Archive Facility (SAF) - one of the world's biggest astronomical archives - combines two roles: operational (ingest, tallying, safekeeping and distribution to observers of raw data taken with ESO telescopes and processed data generated both internally and externally) and scientific (publication and delivery of all flavours of data to external users). This paper presents the “State of the SAF.” SAF, as a living entity, is constantly implementing new services and upgrading the existing ones. We present recent and future developments related to the Archive's Request Handler and metadata handling as well as performance and usage statistics and trends. We also discuss the current and future datasets on offer at SAF.
Archiving of interferometric radio and mm/submm data at the National Radio Astronomy Observatory
NASA Astrophysics Data System (ADS)
Lacy, Mark
2018-06-01
Modern radio interferometers such as ALMA and the VLA are capable of producing ~1TB/day of data for processing into image products of comparable size. Besides the sheer volume of data, the products themselves can be complicated and are sometimes hard to map into standard astronomical archive metadata. We also face similar issues to those faced by archives at other wavelengths, namely the role of archives as the basis of reprocessing platforms and facilities, and the validation and ingestion of user-derived products. In this talk I shall discuss the plans of NRAO in these areas over the next decade.
Remediation of the protein data bank archive.
Henrick, Kim; Feng, Zukang; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Doreleijers, Jurgen F; Dutta, Shuchismita; Flippen-Anderson, Judith L; Ionides, John; Kamada, Chisa; Krissinel, Eugene; Lawson, Catherine L; Markley, John L; Nakamura, Haruki; Newman, Richard; Shimizu, Yukiko; Swaminathan, Jawahar; Velankar, Sameer; Ory, Jeramia; Ulrich, Eldon L; Vranken, Wim; Westbrook, John; Yamashita, Reiko; Yang, Huanwang; Young, Jasmine; Yousufuddin, Muhammed; Berman, Helen M
2008-01-01
The Worldwide Protein Data Bank (wwPDB; wwpdb.org) is the international collaboration that manages the deposition, processing and distribution of the PDB archive. The online PDB archive at ftp://ftp.wwpdb.org is the repository for the coordinates and related information for more than 47 000 structures, including proteins, nucleic acids and large macromolecular complexes that have been determined using X-ray crystallography, NMR and electron microscopy techniques. The members of the wwPDB, namely the RCSB PDB (USA), MSD-EBI (Europe), PDBj (Japan), and BMRB (USA), have remediated this archive to address inconsistencies that have been introduced over the years. The scope and methods used in this project are presented.
Status of worldwide Landsat archive
Warriner, Howard W.
1987-01-01
In cooperation with the International Landsat community, and through the Landsat Technical Working Group (LTWG), NOAA is assembling information about the status of the Worldwide Landsat Archive. During LTWG 9, member nations agreed to participate in a survey of international Landsat data holdings and of their archive experiences with Landsat data. The goal of the effort was two-fold: first, to document the Landsat archive to date, and second, to ensure that specific nations' experiences with long-term Landsat archival problems were available to others. The survey requested details such as the amount of data held; the format of the archive holdings by spacecraft/sensor and acquisition years; the estimated costs to accumulate, process, and replace the data (if necessary); the storage space required; and any member nation's plans that would ensure continuing quality. As a group, the LTWG nations are concerned about the characteristics and reliability of long-term magnetic media storage. Each nation's experience with older data retrieval is solicited in the survey. This information will allow nations to anticipate and plan for required changes to their archival holdings. Also solicited were reports of any upgrades to a nation's archival system that are currently planned and all results of attempts to reduce archive holdings, including methodology, current status, and the planned access rates and product support that are anticipated for responding to future archival usage.
77 FR 29391 - Advisory Committee on the Presidential Library-Foundation Partnerships
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-17
... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Advisory Committee on the Presidential Library... Presidential Library-Foundation Partnerships. The meeting will be held to discuss the National Archives and Records Administration's budget and its strategic planning process as it relates to Presidential Libraries...
Production and Distribution of NASA MODIS Remote Sensing Products
NASA Technical Reports Server (NTRS)
Wolfe, Robert
2007-01-01
The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and for MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and large data production, archive, and distribution systems have allowed for the development of a new suite of high-quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science Team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance on the large and complex multi-discipline processing system. Land, Ocean, and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), with archive and distribution of the Land products to the user community by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Overview
NASA Astrophysics Data System (ADS)
Cui, C.; Yu, C.; Xiao, J.; He, B.; Li, C.; Fan, D.; Wang, C.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Cao, Z.; Wang, J.; Yin, S.; Fan, Y.; Wang, J.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Tasks such as proposal submission, proposal peer review, data archiving, data quality control, data release and open access, and cloud-based data processing and analysis will all be supported on the platform. It will act as a full lifecycle management system for astronomical data and telescopes. Achievements from international Virtual Observatories and cloud computing are adopted heavily. In this paper, the background of the project, key features of the system, and the latest progress are introduced.
NASA Technical Reports Server (NTRS)
Heard, Pamala D.
1998-01-01
The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. It also attempts to acquire knowledge on how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.
RNAV STAR Procedural Adherence
NASA Technical Reports Server (NTRS)
Stewart, Michael J.; Matthews, Bryan L.
2017-01-01
In this exploratory archival study we mined the performance of 24 major US airports' area navigation standard terminal arrival routes (RNAV STARs) over the preceding three years. Overlaying radar track data on top of RNAV STAR routes provided a comparison between aircraft flight paths and the waypoint positions and altitude restrictions. NASA Ames supercomputing resources were utilized to perform the data mining and processing. We investigated STARs by lateral transition path (full-lateral), vertical restrictions (full-lateral/full-vertical), and skipped waypoints (skips). In addition, we graphed altitudes and their frequencies of occurrence for altitude restrictions. Full-lateral compliance was generally greater than full-lateral/full-vertical, but the delta between the rates was not always consistent. Full-lateral/full-vertical usage medians of the 2016 procedures ranged from 0 in KDEN (Denver) to 21 in KMEM (Memphis). Waypoint skips ranged from 0 to nearly 100 for specific waypoints. Altitude restrictions were sometimes missed by systemic amounts in 1000 ft increments from the restriction, creating multi-modal distributions. Other times, altitude misses looked to be more normally distributed around the restriction. This work is a preliminary investigation into the objective performance of instrument procedures and provides a framework to track how procedural concepts and design interventions function. In addition, this tool may aid in providing acceptability metrics as well as risk assessment information.
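The core compliance check is simple to state: compare the observed crossing altitude with the waypoint's restriction window and bin the misses. A minimal sketch, with an assumed (floor, ceiling) encoding of restrictions rather than the study's actual data model:

```python
from collections import Counter

def altitude_miss(observed_ft, restriction):
    """Signed miss (ft) against a waypoint restriction given as (floor, ceiling).

    Use None for an unbounded side, e.g. (11000, None) means 'at or above
    11,000 ft'. A return value of 0 means the crossing was compliant.
    """
    floor, ceiling = restriction
    if floor is not None and observed_ft < floor:
        return observed_ft - floor      # negative: crossed below the floor
    if ceiling is not None and observed_ft > ceiling:
        return observed_ft - ceiling    # positive: crossed above the ceiling
    return 0

# Bin non-compliant misses to the nearest 1000 ft to expose the systemic,
# multi-modal pattern described above (an 'at 11,000 ft' restriction assumed).
crossings = [10980, 10000, 11000, 12010, 9000]
misses = [altitude_miss(a, (11000, 11000)) for a in crossings]
print(Counter(round(m / 1000) * 1000 for m in misses if m != 0))
# Counter({0: 1, -1000: 1, 1000: 1, -2000: 1})
```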
TESS Ground System Operations and Data Products
NASA Astrophysics Data System (ADS)
Glidden, Ana; Guerrero, Natalia; Fausnaugh, Michael; TESS Team
2018-01-01
We describe the ground system operations for processing data from the Transiting Exoplanet Survey Satellite (TESS), highlighting the role of the Science Operations Center (SOC). TESS is a space-based (nearly) all-sky mission, designed to find small planets around nearby bright stars using the transit method. We detail the flow of data from pixel measurements on the instrument to final products available at the Mikulski Archive for Space Telescopes (MAST). The ground system relies on a host of players to process the data, including the Payload Operations Center at MIT, the Science Processing Operations Center at NASA Ames, and the TESS Science Office, led by the Harvard-Smithsonian Center for Astrophysics and MIT. Together, these groups will deliver the TESS Input Catalog, instrument calibration models, calibrated target pixels and full frame images, threshold crossing event reports, two-minute light curves, and the TESS Objects of Interest List.
Geostationary Lightning Mapper: Lessons Learned from Post Launch Test
NASA Astrophysics Data System (ADS)
Edgington, S.; Tillier, C. E.; Demroff, H.; VanBezooijen, R.; Christian, H. J., Jr.; Bitzer, P. M.
2017-12-01
Pre-launch calibration and algorithm design for the GOES Geostationary Lightning Mapper resulted in a successful and trouble-free on-orbit activation and post-launch test sequence. Within minutes of opening the GLM aperture door on January 4th, 2017, lightning was detected across the entire field of view. During the six-month post-launch test period, numerous processing parameters on board the instrument and in the ground processing algorithms were fine-tuned. Demonstrated on-orbit performance exceeded pre-launch predictions. We provide an overview of the ground calibration sequence, on-orbit tuning of the instrument, tuning of the ground processing algorithms (event filtering and navigation). We also touch on new insights obtained from analysis of a large and growing archive of raw GLM data, containing 3e8 flash detections derived from over 1e10 full-disk images of the Earth.
Adaptability in the Development of Data Archiving Services at Johns Hopkins University
NASA Astrophysics Data System (ADS)
Petters, J.; DiLauro, T.; Fearon, D.; Pralle, B.
2015-12-01
Johns Hopkins University (JHU) Data Management Services provides archiving services for institutional researchers through the JHU Data Archive, thereby increasing the access to and use of their research data. From its inception our unit's archiving service has evolved considerably. While some of these changes have been internally driven so that our unit can archive quality data collections more efficiently, we have also developed archiving policies and procedures on the fly in response to researcher needs. Providing our archiving services for JHU research groups from a variety of research disciplines has surfaced different sets of expectations and needs. We have used each interaction to help us refine our services and quickly satisfy the researchers we serve (following the first agile principle). Here we discuss the development of our newest archiving service model, its implementation over the past several months, and the processes by which we have continued to refine and improve our archiving services since its implementation. Through this discussion we will illustrate the benefits of planning, structure, and flexibility in the development of archiving services that maximize the potential value of research data. We will describe interactions with research groups, including those from environmental engineering and international health, and how we were able to rapidly modify and develop our archiving services to meet their needs (e.g., in an 'agile' way). For example, our interactions with both of these research groups led first to discussion in regular standing meetings and eventually to the development of new archiving policies and procedures. These policies and procedures centered on limiting access to archived research data while associated manuscripts progress through peer review and publication.
Interim report on Landsat national archive activities
NASA Technical Reports Server (NTRS)
Boyd, John E.
1993-01-01
The Department of the Interior (DOI) has the responsibility to preserve and to distribute most Landsat Thematic Mapper (TM) and Multispectral Scanner (MSS) data that have been acquired by the five Landsat satellites operational since July 1972. Data that are still covered by exclusive marketing rights, which were granted by the U.S. Government to the commercial Landsat operator, cannot be distributed by the DOI. As the designated national archive for Landsat data, the U.S. Geological Survey's EROS Data Center (EDC) has initiated two new programs to protect and make available any of the 625,000 MSS scenes currently archived and the 200,000 TM scenes to be archived at EDC by 1995. A specially configured system has begun converting Landsat MSS data from obsolete high density tapes (HDT's) to more dense digital cassette tapes. After transcription, continuous satellite swaths are (1) divided into standard scenes defined by a world reference system, (2) geographically located by latitude and longitude, and (3) assessed for overall quality. Digital browse images are created by subsampling the full-resolution swaths. Conversion of the TM HDT's will begin in the fourth quarter of 1992 and will be conducted concurrently with MSS conversion. Although the TM archive is three times larger than the entire MSS archive, conversion of data from both sensor systems and consolidation of the entire Landsat archive at EDC will be completed by the end of 1994. Some MSS HDT's have deteriorated, primarily as a result of hydrolysis of the pigment binder. Based on a small sample of the 11 terabytes of post-1978 MSS data and the 41 terabytes of TM data to be converted, it appears that to date, less than 2 percent of the data have been lost. The data loss occurs within small portions of some scenes; few scenes are lost entirely. Approximately 10,000 pre-1979 MSS HDT's have deteriorated to such an extent, as a result of hydrolysis, that the data cannot be recovered without special treatment of the tapes. An independent consulting division of a major tape manufacturer has analyzed affected tapes and is confident that restorative procedures can be applied to the HDT's to permit one pass to reproduce the data on another recording medium. A system to distribute minimally processed Landsat data will be procured in 1992 and will be operational by mid-1994. Any TM or MSS data in the national archive that are not restricted by exclusive marketing rights will be reproduced directly from the archive media onto user-specified computer-compatible media. TM data will be produced either at a raw level (radiometrically and geometrically uncorrected) or at an intermediate level (radiometrically corrected and geometrically indexed). MSS data will be produced to an intermediate level or to a fully corrected level (radiometrically corrected and geometrically transformed to an Oblique Mercator projection). The system will be capable of providing ordered scenes within 48 hours of receipt of order.
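Browse generation by subsampling, as described above, amounts to decimating the full-resolution raster. A minimal sketch with an assumed subsampling factor and dummy dimensions, not the EDC system's actual parameters:

```python
import numpy as np

def browse_image(full_res, factor=8):
    """Subsample a full-resolution scene by keeping every Nth pixel.

    A crude stand-in for the digital browse generation described above; the
    factor of 8 is illustrative, and real browse products may also rescale
    radiometry before display.
    """
    return full_res[::factor, ::factor]

scene = np.zeros((2400, 3200), dtype=np.uint8)  # dummy full-resolution scene
print(browse_image(scene).shape)                # (300, 400)
```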
NASA CDDIS: Next Generation System
NASA Astrophysics Data System (ADS)
Michael, B. P.; Noll, C. E.; Woo, J. Y.; Limbacher, R. I.
2017-12-01
The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to make space geodesy and geodynamics related data and derived products available in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. As the techniques and data volume have increased, the CDDIS has evolved to offer a broad range of data ingest services, including data upload, quality control, documentation, metadata extraction, and ancillary information. As a major step taken to improve services, the CDDIS has transitioned to a new hardware system and implemented incremental upgrades to a new software system to meet these goals while increasing automation. This new system increases the ability of the CDDIS to consistently track errors and issues associated with data and derived product files uploaded to the system and to perform post-ingest checks on all files received for the archive. In addition, software to process new data sets and changes to existing data sets has been implemented to handle new formats and any issues identified during the ingest process. In this poster, we will discuss the CDDIS archive in general as well as review and contrast the system structures and quality control measures employed before and after the system upgrade. We will also present information about new data sets and changes to existing data and derived products archived at the CDDIS.
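The post-ingest checks mentioned above can be illustrated with a short sketch; the filename pattern and checksum comparison are assumptions for illustration, since the abstract does not detail the actual CDDIS checks:

```python
import hashlib
import re

def post_ingest_check(path, expected_md5, name_pattern=r"^[a-z0-9_]+\.\w+$"):
    """Hypothetical post-ingest checks: naming convention plus checksum.

    The abstract does not publish the actual CDDIS checks; the filename
    pattern and the MD5 comparison here are illustrative only.
    """
    issues = []
    filename = path.rsplit("/", 1)[-1]
    if not re.match(name_pattern, filename):
        issues.append("filename does not match the archive convention")
    with open(path, "rb") as f:
        if hashlib.md5(f.read()).hexdigest() != expected_md5:
            issues.append("checksum mismatch with the upload manifest")
    return issues  # empty list means the file passed both checks
```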
NASA Astrophysics Data System (ADS)
Yu, Kaifeng; Hartmann, Kai; Nottebaum, Veit; Stauch, Georg; Lu, Huayu; Zeeden, Christian; Yi, Shuangwen; Wünnemann, Bernd; Lehmkuhl, Frank
2016-04-01
Geochemical characteristics have been intensively used to assign sediment properties to paleoclimate and provenance. Nonetheless, in particular concerning the arid context, bulk geochemistry of different sediment archives and the corresponding process interpretations have hitherto remained elusive. The Ejina Basin, with its suite of different sediment archives, is known as one of the main sources for the loess accumulation on the Chinese Loess Plateau. In order to understand mechanisms along this supra-regional sediment cascade, it is crucial to decipher the archive characteristics and formation processes. To address these issues, five profiles in different geomorphological contexts were selected. Analyses of X-ray fluorescence and diffraction, grain size, optically stimulated luminescence, and radiocarbon dating were performed. Robust factor analysis was applied to reduce the attribute space to the process space of sedimentation history. Five sediment archives from three lithologic units exhibit geochemical characteristics as follows: (i) aeolian sands have high contents of Zr and Hf, whereas only Hf can be regarded as a valuable indicator to discriminate the coarse sand proportion; (ii) sandy loess has high Ca and Sr contents, which both exhibit broad correlations with the medium to coarse silt proportions; (iii) lacustrine clays have high contents of felsic, ferromagnesian and mica source elements, e.g., K, Fe, Ti, V, and Ni; (iv) fluvial sands have high contents of Mg, Cl and Na, which may be enriched in evaporite minerals; (v) alluvial gravels have high contents of Cr, which may originate from nearby Cr-rich bedrock. Temporal variations can be illustrated by four robust factors: weathering intensity, silicate-bearing mineral abundance, saline/alkaline magnitude, and quasi-constant aeolian input. In summary, the bulk composition of the late Quaternary sediments in this arid context is governed by the nature of the source terrain, weak chemical weathering, authigenic minerals, and aeolian sand input, whereas pedogenesis and diagenesis exert only limited influences. Hence, this study demonstrates a practical geochemical strategy, supplemented by grain size and mineralogical data, to discriminate sediment archives and thereby enhance our ability to offer more intriguing information about the sedimentary processes in arid central Asia.
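For readers unfamiliar with the dimensionality-reduction step, the sketch below shows the shape of such an analysis on a dummy element-concentration matrix. The paper used a robust factor analysis; the standard scikit-learn FactorAnalysis here is only a simplified stand-in:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Dummy element-concentration matrix: rows are samples, columns are elements
# (e.g. Zr, Hf, Ca, Sr, K, Fe, Ti, Cr). The paper applied a *robust* factor
# analysis; plain FactorAnalysis is used here only as a simplified stand-in.
rng = np.random.default_rng(0)
X = rng.lognormal(size=(120, 8))

Xs = StandardScaler().fit_transform(np.log(X))
fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
scores = fa.fit_transform(Xs)

print(fa.components_.shape, scores.shape)  # (4, 8) (120, 4)
```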
Cassini Archive Tracking System
NASA Technical Reports Server (NTRS)
Conner, Diane; Sayfi, Elias; Tinio, Adrian
2006-01-01
The Cassini Archive Tracking System (CATS) is a computer program that enables tracking of scientific data transfers from originators to the Planetary Data System (PDS) archives. Without CATS, there is no systematic means of locating products in the archive process or ensuring their completeness. By keeping a database of transfer communications and status, CATS enables the Cassini Project and the PDS to efficiently and accurately report on archive status. More importantly, problem areas are easily identified through customized reports that can be generated on the fly from any Web-enabled computer. A Web-browser interface and clearly defined authorization scheme provide safe distributed access to the system, where users can perform functions such as create customized reports, record a transfer, and respond to a transfer. CATS ensures that Cassini provides complete science archives to the PDS on schedule and that those archives are available to the science community by the PDS. The three-tier architecture is loosely coupled and designed for simple adaptation to multimission use. Written in the Java programming language, it is portable and can be run on any Java-enabled Web server.
NASA Technical Reports Server (NTRS)
Thompson, Susan E.; Fraquelli, Dorothy; Van Cleve, Jeffrey E.; Caldwell, Douglas A.
2016-01-01
A description of Kepler, its design, performance, and operational constraints may be found in the Kepler Instrument Handbook (KIH; Van Cleve & Caldwell 2016). Kepler calibration and data processing are described in the Kepler Data Processing Handbook (KDPH; Jenkins et al. 2016; Fanelli et al. 2011). Science users should also consult the special ApJ Letters issue devoted to early Kepler results and mission design (April 2010, ApJL, Vol. 713, L79-L207). Additional technical details regarding the data processing and data quality can be found in the Kepler Data Characteristics Handbook (KDCH; Christiansen et al. 2013) and the Data Release Notes (DRN). This archive manual specifically documents the file formats as they exist for the last data release of Kepler, Data Release 25 (KSCI-19065-002). The earlier versions of the archive manual and data release notes act as documentation for the earlier versions of the data files.
Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.
2007-01-01
In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.
2007-01-01
In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Marine ferromanganese encrustations: Archives of changing oceans
Koschinsky, Andrea; Hein, James
2017-01-01
Marine iron–manganese oxide coatings occur in many shallow and deep-water areas of the global ocean and can form in three ways: 1) Fe–Mn crusts can precipitate from seawater onto rocks on seamounts; 2) Fe–Mn nodules can form on the sediment surface around a nucleus by diagenetic processes in sediment pore water; 3) encrustations can precipitate from hydrothermal fluids. These oxide coatings have been growing for thousands to tens of millions of years. They represent a vast archive of how oceans have changed, including variations of climate, ocean currents, geological activity, erosion processes on land, and even anthropogenic impact. A growing toolbox of age-dating methods and element and isotopic signatures are being used to exploit these archives.
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components: the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
NASA Astrophysics Data System (ADS)
Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan
2015-09-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars, by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level on these detections and the completeness of our candidate search.
NASA Astrophysics Data System (ADS)
Dlesk, A.; Raeva, P.; Vach, K.
2018-05-01
Processing of analog photogrammetric negatives using current methods brings new challenges and possibilities, for example, the creation of a 3D model from archival images, which enables the comparison of the historical and current states of cultural heritage objects. The main purpose of this paper is to present the possibilities of processing archival analog images captured by the photogrammetric camera Rollei 6006 metric. In 1994, the Czech company EuroGV s.r.o. carried out photogrammetric measurements of the former limestone quarry Great America, located in the Central Bohemian Region of the Czech Republic. All the negatives of the photogrammetric images, complete documentation, coordinates of geodetically measured ground control points, calibration reports, and the external orientation of images calculated in the Combined Adjustment Program are preserved and were available for the current processing. The negatives were scanned and processed using the structure from motion (SfM) method. The result of the research is a statement of what accuracy can be expected from Rollei metric images originally obtained for terrestrial intersection photogrammetry when adhering to the proposed methodology.
NASA Astrophysics Data System (ADS)
Arko, S. A.; Hogenson, R.; Geiger, A.; Herrmann, J.; Buechler, B.; Hogenson, K.
2016-12-01
In the coming years there will be an unprecedented amount of SAR data available on a free and open basis to research and operational users around the globe. The Alaska Satellite Facility (ASF) DAAC hosts, through an international agreement, data from the Sentinel-1 spacecraft and will be hosting data from the upcoming NASA-ISRO SAR (NISAR) mission. To more effectively manage and exploit these vast datasets, ASF DAAC has begun moving portions of the archive to the cloud and utilizing cloud services to provide higher-level processing on the data. The Hybrid Pluggable Processing Pipeline (HyP3) project is designed to support higher-level data processing in the cloud and extend the capabilities of researchers to larger scales. Built upon a set of core Amazon cloud services, the HyP3 system allows users to request data processing using a number of canned algorithms or their own algorithms once they have been uploaded to the cloud. The HyP3 system automatically accesses the ASF cloud-based archive through the DAAC RESTful application programming interface and processes the data on Amazon's Elastic Compute Cloud (EC2). Final products are distributed through Amazon's Simple Storage Service (S3) and are available for user download. This presentation will provide an overview of ASF DAAC's activities moving the Sentinel-1 archive into the cloud and developing the integrated HyP3 system, covering both the benefits and difficulties of working in the cloud. Additionally, we will focus on the utilization of HyP3 for higher-level processing of SAR data. Two example algorithms, for sea-ice tracking and change detection, will be discussed, as well as the mechanism for integrating new algorithms into the pipeline for community use.
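The request pattern described, submitting a processing job against the cloud-hosted archive through a RESTful API, might look roughly like the following. The endpoint, payload fields, and job-polling detail are assumptions, since the abstract does not specify the actual HyP3 interface:

```python
import requests

# Hypothetical endpoint and payload: the abstract describes a RESTful API for
# requesting higher-level processing, but its real routes and fields are not
# given here, so this only sketches the submit-then-poll pattern.
API = "https://hyp3.example.org/jobs"

def submit_job(granule, process, token):
    """Request processing of one archived granule; return the job identifier."""
    resp = requests.post(
        API,
        json={"granule": granule, "process": process},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]  # poll the job, then fetch products from S3

# Example (not executed here): request change detection on a Sentinel-1 granule.
# job_id = submit_job("S1A_IW_GRDH_1SDV_...", "change_detection", token="<token>")
```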
Cardio-PACs: a new opportunity
NASA Astrophysics Data System (ADS)
Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary
2000-05-01
It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.
Microfilm Permanence and Archival Quality
ERIC Educational Resources Information Center
Avedon, Don M.
1972-01-01
The facts about microfilm permanence and archival quality are presented in simple terms. The major factors, including the film base material, the film emulsion, processing, and storage conditions are reviewed. The designations on the edge of the film are explained and a list of references provided. (14 references) (Author)
22. Photocopy of photograph (original in the Langley Research Center ...
22. Photocopy of photograph (original in the Langley Research Center Archives, Hampton, VA LaRC) (L27056) LOCKHEED YP-38 IN THE FULL-SCALE WIND TUNNEL; THIS WAS THE PROTOTYPE OF THE P-38 (LOCKHEED LIGHTNING); c. 1941. - NASA Langley Research Center, Full-Scale Wind Tunnel, 224 Hunting Avenue, Hampton, Hampton, VA
Pooling the resources of the CMS Tier-1 sites
NASA Astrophysics Data System (ADS)
Apyan, A.; Badillo, J.; Diaz Cruz, J.; Gadrat, S.; Gutsche, O.; Holzman, B.; Lahiff, A.; Magini, N.; Mason, D.; Perez, A.; Stober, F.; Taneja, S.; Taze, M.; Wissing, C.
2015-12-01
The CMS experiment at the LHC relies on 7 Tier-1 centres of the WLCG to perform the majority of its bulk processing activity, and to archive its data. During the first run of the LHC, these two functions were tightly coupled as each Tier-1 was constrained to process only the data archived on its hierarchical storage. This lack of flexibility in the assignment of processing workflows occasionally resulted in uneven resource utilisation and in an increased latency in the delivery of the results to the physics community. The long shutdown of the LHC in 2013-2014 was an opportunity to revisit this mode of operations, disentangling the processing and archive functionalities of the Tier-1 centres. The storage services at the Tier-1s were redeployed breaking the traditional hierarchical model: each site now provides a large disk storage to host input and output data for processing, and an independent tape storage used exclusively for archiving. Movement of data between the tape and disk endpoints is not automated, but triggered externally through the WLCG transfer management systems. With this new setup, CMS operations actively controls at any time which data is available on disk for processing and which data should be sent to archive. Thanks to the high-bandwidth connectivity guaranteed by the LHCOPN, input data can be freely transferred between disk endpoints as needed to take advantage of free CPU, turning the Tier-1s into a large pool of shared resources. The output data can be validated before archiving them permanently, and temporary data formats can be produced without wasting valuable tape resources. Finally, the data hosted on disk at Tier-1s can now be made available also for user analysis since there is no risk any longer of triggering chaotic staging from tape. In this contribution, we describe the technical solutions adopted for the new disk and tape endpoints at the sites, and we report on the commissioning and scale testing of the service. We detail the procedures implemented by CMS computing operations to actively manage data on disk at Tier-1 sites, and we give examples of the benefits brought to CMS workflows by the additional flexibility of the new system.
Operating a petabyte class archive at ESO
NASA Astrophysics Data System (ADS)
Suchar, Dieter; Lockhart, John S.; Burrows, Andrew
2008-07-01
The challenges of setting up and operating a Petabyte Class Archive will be described in terms of computer systems within a complex Data Centre environment. The computer systems, including the ESO Primary and Secondary Archive and the associated computational environments such as relational databases will be explained. This encompasses the entire system project cycle, including the technical specifications, procurement process, equipment installation and all further operational phases. The ESO Data Centre construction and the complexity of managing the environment will be presented. Many factors had to be considered during the construction phase, such as power consumption, targeted cooling and the accumulated load on the building structure to enable the smooth running of a Petabyte class Archive.
Kokaram, Anil C
2004-03-01
Image sequence restoration has been steadily gaining in importance with the increasing prevalence of visual digital media. The demand for content increases the pressure on archives to automate their restoration activities for preservation of the cultural heritage that they hold. There are many defects that affect archived visual material and one central issue is that of Dirt and Sparkle, or "Blotches." Research in archive restoration has been conducted for more than a decade and this paper places that material in context to highlight the advances made during that time. The paper also presents a new and simpler Bayesian framework that achieves joint processing of noise, missing data, and occlusion.
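The paper's joint Bayesian treatment is beyond a short example, but the underlying intuition, that a blotch is a temporal spike absent from both neighbouring frames, can be sketched with a deliberately simple detector and median repair (no motion compensation, unlike the actual framework):

```python
import numpy as np

def detect_blotches(prev, cur, nxt, threshold=30):
    """Flag pixels that differ strongly from BOTH temporal neighbours.

    A deliberately simple spike detector in the spirit of blotch detection;
    the paper's Bayesian framework jointly models noise, missing data, and
    occlusion (with motion), which this sketch does not attempt.
    """
    d_prev = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
    d_next = np.abs(cur.astype(np.int16) - nxt.astype(np.int16))
    return (d_prev > threshold) & (d_next > threshold)

def repair(prev, cur, nxt, mask):
    """Replace flagged pixels with the temporal median of the three frames."""
    out = cur.copy()
    median = np.median(np.stack([prev, cur, nxt]), axis=0).astype(cur.dtype)
    out[mask] = median[mask]
    return out
```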
International Ultraviolet Explorer Final Archive
NASA Technical Reports Server (NTRS)
1997-01-01
CSC processed IUE images through the Final Archive Data Processing System. Raw images were obtained from both NDADS and the IUEGTC optical disk platters for processing on the Alpha cluster, and from the IUEGTC optical disk platters for DECstation processing. Input parameters were obtained from the IUE database. Backup tapes of data to send to VILSPA were routinely made on the Alpha cluster. IPC handled more than 263 requests for priority NEWSIPS processing during the contract. Staff members also answered various questions and requests for information and sent copies of IUE documents to requesters. CSC implemented new processing capabilities into the NEWSIPS processing systems as they became available. In addition, steps were taken to improve efficiency and throughput whenever possible. The node TORTE was reconfigured as the I/O server for Alpha processing in May. The number of Alpha nodes used for the NEWSIPS processing queue was increased to a maximum of six in measured fashion in order to understand the dependence of throughput on the number of nodes and to be able to recognize when a point of diminishing returns was reached. With Project approval, generation of the VD FITS files was dropped in July. This action not only saved processing time but, even more significantly, also drastically reduced the archive storage media requirements and the time required to perform the archiving. The throughput of images verified through CDIVS and processed through NEWSIPS for the contract period is summarized below. The number of images of a given dispersion type and camera that were processed in any given month reflects several factors, including the availability of the required NEWSIPS software system, the availability of the corresponding required calibrations (e.g., the LWR high-dispersion ripple correction and absolute calibration), and the occurrence of reprocessing efforts such as that conducted to incorporate the updated SWP sensitivity-degradation correction in May.
The COSMO-SkyMed ground and ILS and OPS segments upgrades for full civilian capacity exploitation
NASA Astrophysics Data System (ADS)
Fasano, L.; De Luca, G. F.; Cardone, M.; Loizzo, R.; Sacco, P.; Daraio, M. G.
2015-10-01
COSMO-SkyMed (CSK) is an Earth Observation joint program between the Agenzia Spaziale Italiana (Italian Space Agency, ASI) and the Italian Ministry of Defense (It-MoD). It consists of a constellation of four X-band Synthetic Aperture Radar (SAR) satellites, the first of which was launched in June 2007. Today the full constellation is fully qualified and in an operational phase. The COSMO-SkyMed System includes three segments: the Space Segment, the Ground Segment, and the Integrated Logistic Support and Operations Segment (ILS and OPS). A series of activities has been performed as part of a more complex re-engineering process aimed at improving the expected constellation lifetime, fully exploiting several system capabilities, managing obsolescence, reducing maintenance costs, and exploiting the entire constellation capability for civilian users. In the coming months these activities are planned to be completed and to become operational, enabling the programming, planning, acquisition, raw processing, and archiving of all the images that the constellation can acquire.
The ESA Gaia Archive: Data Release 1
NASA Astrophysics Data System (ADS)
Salgado, J.; González-Núñez, J.; Gutiérrez-Sánchez, R.; Segovia, J. C.; Durán, J.; Hernández, J. L.; Arviset, C.
2017-10-01
The ESA Gaia mission is producing the most accurate source catalogue in astronomy to date. This represents a challenge in archiving: making the information and data accessible to astronomers in an efficient way, given the size and complexity of the data. Also, new astronomical missions, taking larger and larger volumes of data, are reinforcing this change in the development of archives. Archives, as simple applications to access data, are evolving into complex data centre structures where computing power services are available for users and data mining tools are integrated into the server side. In the case of astronomy missions that involve the use of large catalogues, such as Gaia (or Euclid to come), the common ways of working on the data need to change to the following paradigm: "move the code close to the data". This implies that data mining functionalities are becoming a must to allow for the maximum scientific exploitation of the data. To enable these capabilities, a TAP+ interface, crossmatch capabilities, full catalogue histograms, and serialisation of intermediate results in cloud resources such as VOSpace have been implemented for the Gaia Data Release 1 (DR1), to enable the exploitation of these science resources by the community without any bottlenecks in the connection bandwidth. We present the architecture, infrastructure, and tools already available in the Gaia Archive DR1 (http://archives.esac.esa.int/gaia/) and describe their capabilities.
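Under the "move the code close to the data" paradigm, a user expresses work as an ADQL query executed server-side rather than downloading the catalogue. A small example using the third-party astroquery package against the public DR1 table, shown as one plausible client; the archive also exposes its TAP+ endpoint directly:

```python
from astroquery.gaia import Gaia  # third-party package: astroquery

# A small ADQL query against the DR1 source table; table and column names
# follow the public Gaia DR1 schema. Heavy queries can instead be submitted
# asynchronously (launch_job_async) and their results kept server-side.
job = Gaia.launch_job(
    "SELECT TOP 5 source_id, ra, dec, phot_g_mean_mag "
    "FROM gaiadr1.gaia_source "
    "ORDER BY phot_g_mean_mag"
)
print(job.get_results())
```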
Integrating the ODI-PPA scientific gateway with the QuickReduce pipeline for on-demand processing
NASA Astrophysics Data System (ADS)
Young, Michael D.; Kotulla, Ralf; Gopu, Arvind; Liu, Wilson
2014-07-01
As imaging systems improve, the size of astronomical data has continued to grow, making the transfer and processing of data a significant burden. To solve this problem for the WIYN Observatory One Degree Imager (ODI), we developed the ODI-Portal, Pipeline, and Archive (ODI-PPA) science gateway, integrating the data archive, data reduction pipelines, and a user portal. In this paper, we discuss the integration of the QuickReduce (QR) pipeline into PPA's Tier 2 processing framework. QR is a set of parallelized, stand-alone Python routines accessible to all users, and operators who can create master calibration products and produce standardized calibrated data, with a short turn-around time. Upon completion, the data are ingested into the archive and portal, and made available to authorized users. Quality metrics and diagnostic plots are generated and presented via the portal for operator approval and user perusal. Additionally, users can tailor the calibration process to their specific science objective(s) by selecting custom datasets, applying preferred master calibrations or generating their own, and selecting pipeline options. Submission of a QuickReduce job initiates data staging, pipeline execution, and ingestion of output data products all while allowing the user to monitor the process status, and to download or further process/analyze the output within the portal. User-generated data products are placed into a private user-space within the portal. ODI-PPA leverages cyberinfrastructure at Indiana University including the Big Red II supercomputer, the Scholarly Data Archive tape system and the Data Capacitor shared file system.
Bookbinding and the Conservation of Books. A Dictionary of Descriptive Terminology.
ERIC Educational Resources Information Center
Roberts, Matt T.; Etherington, Don
Intended for bookbinders and conservators of library and archival material and for those working in related fields, such as bibliography and librarianship, this dictionary contains definitions for the nomenclature of bookbinding and the conservation of archival material, illustrations of bookbinding equipment and processes, and biographical…
The Archivists' Toolkit: Another Step toward Streamlined Archival Processing
ERIC Educational Resources Information Center
Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason
2006-01-01
The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…
Ocean color - Availability of the global data set
NASA Technical Reports Server (NTRS)
Feldman, Gene; Kuring, Norman; Ng, Carolyn; Esaias, Wayne; Mcclain, Chuck; Elrod, Jane; Maynard, Nancy; Endres, Dan
1989-01-01
The use of satellite observations of ocean color to provide reliable estimates of marine phytoplankton biomass on synoptic scales is examined. An overview is given of the Coastal Zone Color Scanner data processing system. The archiving and distribution of ocean color data are discussed, and NASA-sponsored archive sites are listed.
Facilities Requirements for Archives and Special Collections Department.
ERIC Educational Resources Information Center
Brown, Charlotte B.
The program of the Archives and Special Collections Department at Franklin and Marshall College requires the following function areas to be located in the Shadek-Fackenthal Library: (1) Reading Room; (2) Conservation Laboratory; (3) Isolation Room; (4) storage for permanent collection; (5) storage for high security materials; (6) Processing Room;…
Incorporating Oracle on-line space management with long-term archival technology
NASA Technical Reports Server (NTRS)
Moran, Steven M.; Zak, Victor J.
1996-01-01
The storage requirements of today's organizations are exploding. As computers continue to escalate in processing power, applications grow in complexity and data files grow in size and in number. As a result, organizations are forced to procure more and more megabytes of storage space. This paper focuses on how to expand the storage capacity of a Very Large Database (VLDB) cost-effectively within an Oracle7 data warehouse system by integrating long-term archival storage subsystems with traditional magnetic media. The Oracle architecture described in this paper was based on an actual proof of concept for a customer looking to store archived data on optical disks yet still have access to this data without user intervention. The customer had a requirement to maintain 10 years' worth of data on-line. Data less than a year old still had the potential to be updated and thus will reside on conventional magnetic disks. Data older than a year will be considered archived and will be placed on optical disks. The ability to archive data to optical disk and still have access to that data provides the system a means to retain large amounts of data that are readily accessible, yet significantly reduces the cost of total system storage. Therefore, the cost benefits of archival storage devices can be incorporated into the Oracle storage medium and I/O subsystem without losing any of the functionality of transaction processing, while at the same time providing an organization access to all of its data.
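The behaviour described, recent rows on magnetic disk and older rows on optical media while queries stay transparent, can be approximated with a view that unions two identically shaped tables placed in different storage areas. The sketch below is schematic: all object names are invented, SQLite is used only as a stand-in engine so the example runs end-to-end, and the "optical" tablespace in the comment is an assumption standing in for the paper's archival subsystem.

```python
# Schematic sketch of age-based data placement behind a transparent view.
# All object names are invented; SQLite stands in for Oracle so the example
# is runnable. In Oracle, the archived table would live in a tablespace
# whose datafiles reside on the optical storage subsystem.
import sqlite3

ddl = """
CREATE TABLE orders_current  (order_id INTEGER, order_date TEXT, amount REAL);
CREATE TABLE orders_archived (order_id INTEGER, order_date TEXT, amount REAL);
-- Oracle equivalent (assumption): CREATE TABLE orders_archived (...)
--                                 TABLESPACE optical_ts;
CREATE VIEW orders AS
    SELECT * FROM orders_current
    UNION ALL
    SELECT * FROM orders_archived;
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.execute("INSERT INTO orders_current  VALUES (1, '2024-05-01', 9.5)")
conn.execute("INSERT INTO orders_archived VALUES (2, '2014-05-01', 4.0)")
# Applications query the view; where each row physically lives is invisible.
print(conn.execute("SELECT * FROM orders ORDER BY order_date").fetchall())
```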
NASA Astrophysics Data System (ADS)
Kantor, J.
During LSST observing, transient events will be detected and alerts generated at the LSST Archive Center at NCSA in Champaign, Illinois. As a very high rate of alerts is expected, approaching ~10 million per night, we plan for VOEvent-compliant Distributor/Brokers (http://voevent.org) to be the primary end-points of the full LSST alert streams. End users will then use these Distributor/Brokers to classify and filter events on the stream for those fitting their science goals. These Distributor/Brokers are envisioned to be operated as a community service by third parties who will have signed MOUs with LSST. The exact identification of Distributor/Brokers to receive alerts will be determined as LSST approaches full operations and may change over time, but it is in our interest to identify and coordinate with them as early as possible. LSST will also operate a limited Distributor/Broker with a filtering capability at the Archive Center, to allow alerts to be sent directly to a limited number of entities that for some reason need a more direct connection to LSST. This might include, for example, observatories with significant follow-up capabilities whose observing may temporarily be more directly tied to LSST observing. It will let astronomers create simple filters that limit which alerts are ultimately forwarded to them. These user-defined filters will be specifiable using an SQL-like declarative language or short snippets of (likely Python) code. We emphasize that this LSST-provided capability will be limited, and is not intended to satisfy the wide variety of use cases that a full-fledged public Event Distributor/Broker could. End users will not be able to subscribe to full, unfiltered alert streams coming directly from LSST. In this session, we will discuss anticipated LSST data rates and capabilities for alert processing and distribution/brokering. We will clarify what the LSST Observatory will provide versus what we anticipate will be a community effort.
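For a flavour of the user-defined filters mentioned above, here is a hypothetical Python snippet of the kind the text envisions. The alert is modelled as a plain dictionary, and the field names (diaSource, magpsf, snr, decl) are invented for illustration rather than taken from the actual LSST alert schema.

```python
# Hypothetical user filter of the kind described above. The alert structure
# and field names are invented; the real LSST alert schema may differ.
def filter_alert(alert: dict) -> bool:
    """Keep only bright, high-significance transients near a target field."""
    src = alert.get("diaSource", {})
    bright_enough = src.get("magpsf", 99.0) < 20.0     # apparent magnitude cut
    significant = src.get("snr", 0.0) > 10.0           # detection significance
    in_field = abs(src.get("decl", 0.0) + 30.0) < 5.0  # crude positional cut
    return bright_enough and significant and in_field

# A broker would apply the filter to each alert arriving on the stream:
alerts = [
    {"diaSource": {"magpsf": 18.7, "snr": 24.1, "decl": -28.9}},
    {"diaSource": {"magpsf": 21.3, "snr": 6.2, "decl": -29.5}},
]
forwarded = [a for a in alerts if filter_alert(a)]
print(f"{len(forwarded)} of {len(alerts)} alerts forwarded")
```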
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Duerr, R. E.; Webster, D.; Korn, D.
2010-12-01
Data Stewardship is becoming a principal part of a data manager’s work at NSIDC. It is vitally important that our organization makes a commitment to both current and long-term goals of data management and the preservation of our scientific data. Data must be available to researchers not only during active missions, but long after missions end. This includes maintaining accurate documentation, data tools, and a knowledgeable user support staff. NSIDC is preparing for long-term support of the ICESat mission data. Though ICESat has seen its last operational day, the data is still being improved and NSIDC is scheduled to archive the final release, Release 33, starting late in 2010. This release will include the final adjustments to the processing algorithms and will produce the best possible products to date. Along with the higher-level data sets, all supporting documentation will be archived at NSIDC. For the long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are reproducible. The processing algorithms along with the Level 0 and ancillary products used to create the higher-level products will be archived and made available to users. This can enable users to examine production history, to derive revised products and to create their own products. Also contained in the long-term archive will be pre-launch, calibration/validation, and test data. These data are an important part of the provenance which must be preserved. For longevity, we’ll need to archive the data and documentation in formats that will be supported in the years to come.
Synergy with HST and JWST Data Management Systems
NASA Astrophysics Data System (ADS)
Greene, Gretchen; Space Telescope Data Management Team
2014-01-01
The data processing and archive systems for the JWST will contain a petabyte of science data, and the best news is that users will have fast access to the latest calibrations through a variety of new services. With a synergistic approach currently underway in STScI science operations between the Hubble Space Telescope and James Webb Space Telescope data management subsystems (DMS), operational verification is right around the corner. Next year the HST archive will provide scientists with on-demand, fully calibrated data products via the Mikulski Archive for Space Telescopes (MAST), which takes advantage of an upgraded DMS. This enhanced system, developed jointly with the JWST DMS, is based on a new CONDOR distributed processing system capable of reprocessing data using a prioritization queue which runs in the background. A Calibration Reference Data System manages the latest optimal configuration for each scientific instrument pipeline. Science users will be able to search and discover the growing MAST archive of calibrated datasets from these missions along with the other multiple mission holdings, both local to MAST and available through the Virtual Observatory. JWST data systems will build upon the successes and lessons learned from the HST legacy and move us forward into the next generation of multi-wavelength archive research.
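The abstract names CONDOR (today's HTCondor) with a background prioritization queue. A minimal sketch of how a background reprocessing task with a reduced priority might be queued through the HTCondor Python bindings is shown below; the executable and arguments are placeholders, and the exact binding call signatures vary between HTCondor releases.

```python
# Minimal sketch: queue a background reprocessing job at reduced priority
# via the HTCondor Python bindings. Executable and arguments are
# placeholders; binding APIs differ somewhat between HTCondor versions.
import htcondor

submit = htcondor.Submit({
    "executable": "/opt/pipeline/recalibrate.sh",  # placeholder pipeline step
    "arguments": "--dataset hst_obs_12345",
    "output": "recalibrate.out",
    "error": "recalibrate.err",
    "log": "recalibrate.log",
    "priority": "-10",  # lower priority: reprocessing runs in the background
})

schedd = htcondor.Schedd()
result = schedd.submit(submit)  # newer bindings; older ones use transactions
print("queued cluster", result.cluster())
```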
Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.
2007-01-01
In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
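Besides the Seismic Unix route mentioned above, the archived SEG-Y traces can also be inspected in Python with the open-source segyio package. A minimal sketch follows; the filename is a placeholder, and ignore_geometry=True treats the file as a simple collection of traces, which suits 2-D boomer/CHIRP lines rather than regular 3-D volumes.

```python
# Minimal sketch: read archived SEG-Y trace data with the open-source
# segyio package. The filename is a placeholder; ignore_geometry=True
# treats the file as a flat collection of traces (2-D CHIRP lines).
import numpy as np
import segyio

with segyio.open("example_line.sgy", "r", ignore_geometry=True) as f:
    n_traces = f.tracecount
    samples = f.samples            # per-trace sample times from the headers
    data = f.trace.raw[:]          # all traces as a 2-D numpy array

print(f"{n_traces} traces, {len(samples)} samples per trace")
print("peak amplitude:", np.abs(data).max())
```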
Appraisal of the papers of biomedical scientists and physicians for a medical archives.
Anderson, P G
1985-01-01
Numerous medical libraries house archival collections. This article discusses criteria for selecting personal papers of biomedical scientists and physicians for a medical archives and defines key terms, such as appraisal, manuscripts, papers, records, and series. Appraisal focuses on both collection and series levels. Collection-level criteria include the significance of a scientist's career and the uniqueness, coverage, and accessibility of the manuscripts. Series frequently found among medically related manuscripts are enumerated and discussed. Types of organizational records and the desirability of accessioning them along with manuscripts are considered. Advantages of direct communication with creators of manuscripts are described. The initial appraisal process is not the last word: reevaluation of materials must take place during processing and can be resumed long afterwards. PMID:4052673
Big data challenges for large radio arrays
NASA Astrophysics Data System (ADS)
Jones, D. L.; Wagstaff, K.; Thompson, D. R.; D'Addario, L.; Navarro, R.; Mattmann, C.; Majid, W.; Lazio, J.; Preston, J.; Rebbapragada, U.
2012-03-01
Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields. The Jet Propulsion Laboratory is developing technologies to address big data issues, with an emphasis on three areas: 1) lower-power digital processing architectures to make high-volume data generation operationally affordable, 2) data-adaptive machine learning algorithms for real-time analysis (or "data triage") of large data volumes, and 3) scalable data archive systems that allow efficient data mining and allow remote user code to run locally where the data are stored.
Wen, Xuejiao; Qiu, Xiaolan; Han, Bing; Ding, Chibiao; Lei, Bin; Chen, Qi
2018-05-07
Range ambiguity is one of the factors affecting SAR image quality. Alternately transmitting up- and down-chirp modulation pulses is one method used to suppress range ambiguity. However, the defocused range-ambiguous signal can still have a stronger backscattering intensity than the mainlobe imaging area in some cases, which has a severe impact on visual effects and subsequent applications. In this paper, a novel hybrid range ambiguity suppression method for up and down chirp modulation is proposed. The method can obtain an image of the ambiguity area and appropriately reduce the ambiguity signal power by applying pulse compression with the contrary modulation rate and a CFAR detection method. The effectiveness and correctness of the approach are demonstrated by processing archive images acquired by the Chinese Gaofen-3 SAR sensor in full-polarization mode.
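To make the core idea concrete, the numpy sketch below builds an up-chirp and a down-chirp and correlates both against an up-chirp matched filter: energy with the matching rate focuses sharply while the contrary-rate chirp stays defocused, which is the handle the method uses to separate and image the ambiguous signal. The pulse parameters are arbitrary, and the CFAR detection stage is omitted.

```python
# Illustrative sketch of up/down chirp pulse compression. A matched filter
# built for one modulation rate focuses that chirp and leaves the
# contrary-rate chirp defocused. Pulse parameters are arbitrary; the CFAR
# detection stage of the method is omitted here.
import numpy as np

fs, T, B = 100e6, 10e-6, 30e6             # sample rate, pulse width, bandwidth
t = np.arange(int(fs * T)) / fs
k = B / T                                 # chirp (modulation) rate

up_chirp = np.exp(1j * np.pi * k * t**2)      # transmitted up-chirp
down_chirp = np.exp(-1j * np.pi * k * t**2)   # ambiguous contrary-rate pulse

matched_up = np.conj(up_chirp[::-1])          # matched filter for the up-chirp

focused = np.abs(np.convolve(up_chirp, matched_up))
defocused = np.abs(np.convolve(down_chirp, matched_up))

print("up-chirp peak / down-chirp peak:",
      round(focused.max() / defocused.max(), 1))
# Running the filter with the contrary rate instead would focus the
# ambiguous energy so it can be imaged and then suppressed.
```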
MODIS land data at the EROS data center DAAC
Jenkerson, Calli B.; Reed, B.C.
2001-01-01
The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) Land Data collected from the Earth Observing System (EOS) satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services support the use of MODIS Land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.
Ionospheric characteristics for archiving at the World Data Centers. Technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamache, R.R.; Reinisch, B.W.
1990-12-01
A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop in Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by the World Data Center A for solar terrestrial physics for archiving characteristics. Here the database has been slightly refined for the application, and programs were written to generate these database files using as input Digisonde 256 ARTIST data, post-processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program as well as supplemental programs developed for this task are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high-latitude Digisonde network, the AWS DISS and the international Digisonde networks, and other ionospheric sounding networks.
[A new concept for integration of image databanks into a comprehensive patient documentation].
Schöll, E; Holm, J; Eggli, S
2001-05-01
Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.
Applying the SERENITY Methodology to the Domain of Trusted Electronic Archiving
NASA Astrophysics Data System (ADS)
Porekar, Jan; Klobučar, Tomaž; Šaljič, Svetlana; Gabrijelčič, Dušan
We present the application of the SERENITY methodology to the domain of long-term trusted electronic archiving, sometimes also referred to as trusted digital notary services. We address the SERENITY approach from the point of view of a company providing security solutions in the mentioned domain and adopt the role of a solution developer. In this chapter we show a complete vertical slice through the trusted archiving domain, providing: (i) the relevant S&D properties, (ii) the S&D classes and S&D patterns on both the organizational and technical levels, and (iii) a description of how S&D patterns are integrated into a trusted long-term archiving service using the SERENITY Run-Time Framework (SRF). At the end of the chapter we put in perspective what a solution developer can learn from the process of capturing security knowledge according to the SERENITY methodology, and we discuss how existing implementations of archiving services can benefit from the SERENITY approach in the future.
NASA Technical Reports Server (NTRS)
Mah, G. R.; Myers, J.
1993-01-01
The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAACs) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65,000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California; Cuprite, Nevada; and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data and existing digital elevation models (DEMs) for the topographic information.
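Band simulation of this kind is essentially a weighted integration of narrow hyperspectral channels under the target instrument's spectral response. The numpy sketch below synthesizes one broad band from synthetic narrow-band radiances using an assumed Gaussian response; the wavelengths, widths, and radiances are invented for illustration and are not the actual ASTER response functions.

```python
# Illustrative band simulation: integrate narrow hyperspectral channels
# under an assumed Gaussian spectral response to synthesize one broad band.
# Wavelengths, widths, and radiances are invented for illustration.
import numpy as np

wl = np.arange(0.40, 2.50, 0.01)          # channel centers, microns
radiance = 100.0 * np.exp(-wl)            # synthetic per-pixel spectrum

center, fwhm = 0.56, 0.08                 # assumed broad-band response
sigma = fwhm / 2.355
response = np.exp(-0.5 * ((wl - center) / sigma) ** 2)

# The response-weighted average approximates the broad-band radiance.
simulated = np.sum(radiance * response) / np.sum(response)
print(f"simulated band radiance: {simulated:.2f}")
```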
Building A Cloud Based Distributed Active Data Archive Center
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin
2017-01-01
NASA's Earth Science Data System (ESDS) Program facilitates the implementation of NASA's Earth Science strategic plan, which is committed to the full and open sharing of Earth science data obtained from NASA instruments with all users. The Earth Science Data and Information System (ESDIS) project manages the Earth Observing System Data and Information System (EOSDIS). Data within EOSDIS are held at Distributed Active Archive Centers (DAACs). One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data.
Automatic indexing of scanned documents: a layout-based approach
NASA Astrophysics Data System (ADS)
Esser, Daniel; Schuster, Daniel; Muthmann, Klemens; Berger, Michael; Schill, Alexander
2012-01-01
Archiving official written documents such as invoices, reminders and account statements, in both business and private contexts, is becoming more and more important. Creating appropriate index entries for document archives, such as the sender's name, creation date or document number, is a tedious manual task. We present a novel approach to automatic indexing of documents based on generic positional extraction of index terms. For this purpose we apply the knowledge of document templates stored in a common full-text search index to find index positions that were successfully extracted in the past.
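A toy version of this positional-extraction idea: match an incoming document to the closest stored template by word overlap, then read the index value at the position where that template's term was found historically. All data structures below are invented for illustration; the actual system uses a full-text search index rather than an in-memory dictionary.

```python
# Toy sketch of layout-based positional indexing: pick the closest known
# template by vocabulary overlap, then extract values at the token offsets
# seen for that template in the past. All data here is invented.
TEMPLATES = {
    "acme_invoice": {
        "vocabulary": {"acme", "invoice", "total", "due"},
        "positions": {"document_number": 2},  # token offset seen historically
    },
}

def best_template(tokens):
    """Return the template whose vocabulary overlaps the document most."""
    return max(TEMPLATES.items(),
               key=lambda kv: len(kv[1]["vocabulary"] & set(tokens)))

def extract(text):
    tokens = text.lower().split()
    name, tpl = best_template(tokens)
    return {field: tokens[pos] for field, pos in tpl["positions"].items()}

print(extract("ACME Invoice 2012-0047 total due 31.01.2012"))
# -> {'document_number': '2012-0047'}
```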
NASA Astrophysics Data System (ADS)
Redmon, R. J.; Loto'aniu, P. T. M.; Boudouridis, A.; Chi, P. J.; Singer, H. J.; Kress, B. T.; Rodriguez, J. V.; Abdelqader, A.; Tilton, M.
2017-12-01
The era of NOAA observations of the geomagnetic field started with SMS-1 in May 1974 and continues to this day with GOES-13-16 (on-orbit). We describe the development of a new 20+ year archive of science-quality, high-cadence geostationary measurements of the magnetic field from eight NOAA spacecraft (GOES-8 through GOES-15), the status of GOES-16 and new scientific results using these data. GOES magnetic observations provide an early warning of impending space weather, are the core geostationary data set used for the construction of magnetospheric magnetic models, and can be used to estimate electromagnetic wave power in frequency bands important for plasma processes. Many science grade improvements are being made across the GOES archive to unify the format and content from GOES-8 through the new GOES-R series (with the first of that series launched on November 19, 2016). A majority of the 2-Hz magnetic observations from GOES-8-12 have never before been publicly accessible due to processing constraints. Now, a NOAA Big Earth Data Initiative project is underway to process these measurements starting from original telemetry records. Overall the new archive will include vector measurements in geophysically relevant coordinates (EPN, GSM, and VDH), comprehensive documentation, highest temporal cadence, best calibration parameters, recomputed means, updated quality flagging, full spacecraft ephemeris information, a unified standard format and public access. We are also developing spectral characterization tools for estimating power in standard frequency bands (up to 1 Hz for G8-15), and detecting ULF waves related to field-line resonances. We present the project status and findings, including in-situ statistical and extreme ULF event properties, and case studies where the ULF oscillations along the same field line were observed simultaneously by GOES near the equator in the magnetosphere, the ST-5 satellites at low altitudes, and ground magnetometer stations. For event studies, we find that the wave amplitude of poloidal oscillations is amplified at low altitudes but attenuated on the ground, confirming the theoretical predictions of wave propagation from the magnetosphere to the ground. We include examples of GOES-16 particle flux and magnetic field observations illustrating complex particle dynamics.
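The spectral characterization mentioned above amounts to integrating a power spectral density over standard ULF bands. A minimal sketch with scipy's Welch estimator follows, using a synthetic 2-Hz magnetometer series; the Pc5 band limits used here (roughly 1.7 to 6.7 mHz) are the conventional values, taken as an assumed example band.

```python
# Minimal sketch: estimate wave power in a ULF band from 2-Hz magnetometer
# samples using Welch's PSD. The series is synthetic; the Pc5 band limits
# (~1.7-6.7 mHz) are the conventional values, used as an example.
import numpy as np
from scipy.signal import welch

fs = 2.0                                 # 2-Hz magnetic cadence
t = np.arange(0, 6 * 3600, 1 / fs)       # six hours of samples
b = 5.0 * np.sin(2 * np.pi * 3e-3 * t) + np.random.normal(0, 1, t.size)

freqs, psd = welch(b, fs=fs, nperseg=2**14)

lo, hi = 1.7e-3, 6.7e-3                  # assumed Pc5 band, Hz
band = (freqs >= lo) & (freqs <= hi)
pc5_power = np.trapz(psd[band], freqs[band])  # integrate PSD over the band
print(f"Pc5 band power: {pc5_power:.2f} nT^2")
```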
XMM-Newton On-demand Reprocessing Using SaaS Technology
NASA Astrophysics Data System (ADS)
Ibarra, A.; Fajersztejn, N.; Loiseau, N.; Gabriel, C.
2014-05-01
We present here the architectural design of the new on-the-fly reprocessing capabilities that will soon be developed and implemented in the new XMM-Newton Science Operation Centre. The planned inclusion of processing capabilities in the archive will be possible thanks to the recent refurbishment of the XMM-Newton science archive, its alignment with the latest web technologies, and the XMM-Newton Remote Interface for Science Analysis (RISA), a revolutionary idea of providing processing capabilities through internet services.
Digital Archiving: Where the Past Lives Again
NASA Astrophysics Data System (ADS)
Paxson, K. B.
2012-06-01
The process of digital archiving of variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools, including a Step Magnitude Calculator and a Julian Date Calculator, for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and in the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition (OCR) software is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
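A Julian Date calculator of the kind described reduces to the standard Fliegel-Van Flandern integer arithmetic plus a time-of-day fraction; a minimal Python version is sketched below.

```python
# Minimal Julian Date calculator using the standard Fliegel-Van Flandern
# algorithm (valid for Gregorian calendar dates).
def julian_date(year, month, day, ut_hours=0.0):
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    jdn = (day + (153 * m + 2) // 5 + 365 * y
           + y // 4 - y // 100 + y // 400 - 32045)
    # The JDN refers to noon; subtract 0.5 so the day starts at midnight UT.
    return jdn - 0.5 + ut_hours / 24.0

# Example: 1911 January 1, 0h UT, near the start of the AAVSO database era.
print(julian_date(1911, 1, 1))  # 2419037.5
```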
U.S. Geological Survey archived data recovery in Texas, 2008-11
Wehmeyer, Loren L.; Reece, Brian D.
2011-01-01
The 2008–11 data rescue and recovery efforts by the U.S. Geological Survey (USGS) Texas Water Science Center resulted in an efficient workflow process, database, and Web user interface for scientists and citizens to access archived environmental information with practical applications. Much of this information is unique and has never been readily available to the public. The methods developed and lessons learned during this effort are now being applied to facilitate recovering archived information requested by USGS scientists, cooperators, and the general public.
The Future of the Protein Data Bank
Berman, Helen M.; Kleywegt, Gerard J.; Nakamura, Haruki; Markley, John L.
2013-01-01
The Worldwide Protein Data Bank (wwPDB) is the international collaboration that manages the deposition, processing and distribution of the PDB archive. The wwPDB’s mission is to maintain a single archive of macromolecular structural data that are freely and publicly available to the global community. Its members [RCSB PDB (USA), PDBe (Europe), PDBj (Japan), and BMRB (USA)] host data-deposition sites and mirror the PDB ftp archive. To support future developments in structural biology, the wwPDB partners are addressing organizational, scientific, and technical challenges. PMID:23023942
Automated Reduction and Calibration of SCUBA Archive Data Using ORAC-DR
NASA Astrophysics Data System (ADS)
Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N.; Robson, E. I.; Tilanus, R. P. J.; Holland, W. S.
The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used for investigating instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data with particular emphasis on the pointing observations. This is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on UKIRT and the JCMT.
A Robust, Low-Cost Virtual Archive for Science Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Vollmer, Bruce
2005-01-01
Despite their expense, tape silos are still often the only affordable option for petabyte-scale science data archives, particularly when other factors such as data reliability, floor space, power and cooling load are accounted for. However, the complexity, management software, hardware reliability and access latency of tape silos make online data storage ever more attractive. Drastic reductions in the cost of mass-market PC disk drives (approx. $1/GB) help to make this more affordable, but such systems are challenging to scale to the petabyte range and of questionable reliability for archival use. On the other hand, if much of the science archive could be "virtualized", i.e., produced on demand when requested by users, we would need to store only a fraction of the data online, perhaps bringing an online-only system into affordable range. Radiance data from the satellite-borne Moderate Resolution Imaging Spectroradiometer (MODIS) instrument provide a good opportunity for such a virtual archive: the raw data amount to 140 GB/day, small relative to the 550 GB/day making up the radiance products. These data are routinely processed as inputs for geophysical parameter products and then archived on tape at the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) for distribution to users. Virtualizing them would immediately and significantly reduce the amount of data stored in the tape archives and would provide more customizable products. A prototype of such a virtual archive is being developed to prove the concept and to develop ways of incorporating the robustness that a science data archive requires.
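The essence of the virtual archive is a compute-on-miss policy: keep the compact raw data online and regenerate the bulky products only when a request arrives. The sketch below shows the idea; produce_radiances is a placeholder standing in for the actual MODIS Level-1 processing chain, and everything here is schematic rather than the prototype's design.

```python
# Schematic compute-on-miss policy behind a virtual archive: raw granules
# stay online, bulky radiance products are regenerated only when requested.
# produce_radiances() is a placeholder for the real Level-1 processing.
from pathlib import Path

CACHE = Path("/tmp/virtual_archive")
CACHE.mkdir(exist_ok=True)

def produce_radiances(granule_id: str) -> bytes:
    """Placeholder for the Level-0 -> Level-1 radiance processing chain."""
    return f"radiance product for {granule_id}".encode()

def fetch_product(granule_id: str) -> bytes:
    cached = CACHE / f"{granule_id}.l1b"
    if cached.exists():                       # hit: serve the stored product
        return cached.read_bytes()
    product = produce_radiances(granule_id)   # miss: regenerate on demand
    cached.write_bytes(product)               # optionally keep for reuse
    return product

print(len(fetch_product("MOD021KM.A2005001.0000")), "bytes served")
```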
Use of Archived Information by the United States National Data Center
NASA Astrophysics Data System (ADS)
Junek, W. N.; Pope, B. M.; Roman-Nieves, J. I.; VanDeMark, T. F.; Ichinose, G. A.; Poffenberger, A.; Woods, M. T.
2012-12-01
The United States National Data Center (US NDC) is responsible for monitoring international compliance with nuclear test ban treaties, acquiring data and data products from the International Data Center (IDC), and distributing data according to established policy. The archive of automated and reviewed event solutions residing at the US NDC is a valuable resource for assessing and improving the performance of signal detection, event formation, location, and discrimination algorithms. Numerous research initiatives are currently underway that are focused on optimizing these processes using historic waveform data and alphanumeric information. Identification of optimum station processing parameters is routinely performed through the analysis of archived waveform data. Station-specific detector tuning studies produce and compare receiver operating characteristics for multiple detector configurations (e.g., detector type, filter passband) to identify an optimum set of processing parameters with an acceptable false alarm rate. Large aftershock sequences can inundate automated phase association algorithms with numerous detections that are closely spaced in time, which increases the number of false and/or mixed associations in automated event solutions and increases analyst burden. Archived waveform data and alphanumeric information are being exploited to develop an aftershock processor that will construct association templates to assist the Global Association (GA) application, reduce the number of false and merged phase associations, and lessen analyst burden. Statistical models are being developed and evaluated for potential use by the GA application for identifying and rejecting unlikely preliminary event solutions. Other uses of archived data at the US NDC include improved event locations using empirical travel time corrections, and discrimination via a statistical framework known as the event classification matrix (ECM).
Smith, D.V.; Drenth, B.R.; Fairhead, J.D.; Lei, K.; Dark, J.A.; Al-Bassam, K.
2011-01-01
Aeromagnetic data belonging to the State Company of Geology and Mining of Iraq (GEOSURV) have been recovered from magnetic tapes and early paper maps. In 1974 a national airborne survey was flown by the French firm Compagnie General de Geophysique (CGG). Following the survey, the magnetic data were stored on magnetic tapes within an air-conditioned archive run by GEOSURV. In 1990, the power supply to the archive was cut, resulting in the present-day poor condition of the tapes. Frontier Processing Company and the U.S. Geological Survey (USGS) have been able to recover over 99 percent of the original digital data from the CGG tapes. Preliminary reprocessing of the data yielded a total magnetic field anomaly map that reveals fine structures not evident in available published maps. Successful restoration of these comprehensive, high-quality digital datasets obviates the need to resurvey the entire country, thereby saving considerable time and money. These data were delivered to GEOSURV in a standard format for further analysis and interpretation. A parallel effort by GETECH concentrated on recovering the legacy gravity data from the original field data sheets archived by IPC (Iraq Petroleum Company). These data have been compiled with more recent GEOSURV-sponsored surveys, thus allowing for the first time a comprehensive, digital and unified national gravity database to be constructed with full principal facts. Figure 1 shows the final aeromagnetic and gravity data coverage of Iraq. The only part of Iraq lacking gravity and aeromagnetic data coverage is the mountainous areas of the Kurdish region of northeastern Iraq. Joint interpretation of the magnetic and gravity data will help guide future geophysical investigations by GEOSURV, whose ultimate aim is to discover economical mineral and energy resources. © 2011 Society of Exploration Geophysicists.
The Operation and Architecture of the Keck Observatory Archive
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Gelino, C. R.; Laity, A.; Kong, M.; Swain, M.; Holt, J.; Goodrich, R.; Mader, J.; Tran, H. D.
2014-05-01
The Infrared Processing and Analysis Center (IPAC) and the W. M. Keck Observatory (WMKO) are collaborating to build an archive for the twin 10-m Keck Telescopes, located near the summit of Mauna Kea. The Keck Observatory Archive (KOA) takes advantage of IPAC's long experience with managing and archiving large and complex data sets from active missions and serving them to the community, and of the Observatory's knowledge of the operation of its sophisticated instrumentation and the organization of the data products. By the end of 2013, KOA will contain data from all eight active observatory instruments, with an anticipated volume of 28 TB. The data include raw science observations, quick-look products, weather information, and, for some instruments, reduced and calibrated products. The goal of including data from all instruments has driven a rapid expansion of the archive's holdings, and data from four new instruments have already been added since October 2012. One more active instrument, the integral field spectrograph OSIRIS, is scheduled for ingestion in December 2013. After preparation for ingestion into the archive, the data are transmitted electronically from WMKO to IPAC for curation in the physical archive. This process includes validation of the scientific content of the data and verification that data were not corrupted in transmission. The archived data include both newly acquired observations and all previously acquired observations. The older data extend back to the date of instrument commissioning; for some instruments, such as HIRES, these data can extend as far back as 1994. KOA will continue to ingest all newly obtained observations, at an anticipated volume of 4 TB per year, and plans to ingest data from two decommissioned instruments. Access to these data is governed by a data use policy that guarantees Principal Investigators (PIs) exclusive access to their data for at least 18 months, and allows for extensions as granted by institutional Selecting Officials. Approximately one half of the data in the archive are public. The archive architecture takes advantage of existing software and is designed for sustainability. The data preparation and quality assurance software exploits the software infrastructure at WMKO, and the physical archive at IPAC re-uses the portable component-based architecture developed originally for the Infrared Science Archive, with custom extensions for KOA as needed. We will discuss the science services available to end-users. These include web and program query interfaces, interactive tabulation of data and metadata, association of files with science files, and interactive visualization of data products. We will discuss how the growth in the archive holdings has led to a growth in usage and published science results. Finally, we will discuss the future of KOA, including the provision of data reduction pipelines and interoperability with worldwide archives and data centers, including VO-compliant services.
A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability
NASA Astrophysics Data System (ADS)
Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis
2007-03-01
During the last 4 years, the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform query/retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven to be convenient for managing the study workflow [1, 2], has been extended to include DICOM viewing capability on the PDA. With this new feature, users can take a quick look at DICOM images, gaining mobility and convenience at the same time. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that are capable of working as a PDA and have access to broadband wireless services. With the extension of the application to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.
NASA Technical Reports Server (NTRS)
Lee, L. R.; Montague, K. A.; Charvat, J. M.; Wear, M. L.; Thomas, D. M.; Van Baalen, M.
2016-01-01
Since the 2010 NASA directive to make the Life Sciences Data Archive (LSDA) and Lifetime Surveillance of Astronaut Health (LSAH) data archives more accessible to the research and operational communities, demand for astronaut medical data has increased greatly. LSAH and LSDA personnel are working with the Human Research Program on many fronts to improve data access and decrease the lead time for release of data. Some examples include the following: feasibility reviews for NASA Research Announcement (NRA) data mining proposals; improved communication, support for researchers, and process improvements for retrospective Institutional Review Board (IRB) protocols; supplemental data sharing for flight investigators versus purely retrospective studies; and work with the Multilateral Human Research Panel for Exploration (MHRPE) to develop acceptable data sharing and crew consent processes and to organize inter-agency data coordinators to facilitate requests for international crewmember data. Current metrics on data requests and crew consenting will be presented, along with limitations on contacting crew to obtain consent. Categories of medical monitoring data available for request will be presented, as well as flow diagrams detailing data request processing and approval steps.
NASA Astrophysics Data System (ADS)
Divine, D. V.; Godtliebsen, F.; Rue, H.
2012-01-01
The paper proposes an approach to assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximation of the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For a particular case of a linear accumulation model and absolutely dated tie points an analytical solution is found suggesting the Beta-distributed probability density on age estimates along the length of a proxy archive. In a general situation of uncertainties in the ages of the tie points the proposed method employs MCMC simulations of age-depth profiles yielding empirical confidence intervals on the constructed piecewise linear best guess timescale. It is suggested that the approach can be further extended to a more general case of a time-varying expected accumulation between the tie points. The approach is illustrated by using two ice and two lake/marine sediment cores representing the typical examples of paleoproxy archives with age models based on tie points of mixed origin.
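To illustrate the flavour of the method, the sketch below draws Monte Carlo realizations of a Gamma accumulation process pinned at two absolutely dated tie points and reads off empirical confidence intervals on the age at an intermediate depth. The shape parameter and core geometry are invented, and the sketch is a simplification of the paper's model, whose analytical solution for this pinned linear case is the Beta-distributed age density mentioned above.

```python
# Illustrative Monte Carlo for a Gamma-process age model between two
# absolutely dated tie points. The shape parameter and geometry are
# invented; this simplifies the model described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

depth = np.linspace(0, 10, 101)   # core depth, m (tie points at both ends)
t0, t1 = 0.0, 5000.0              # tie-point ages, years
shape = 4.0                       # assumed Gamma shape per depth step

n_sims = 5000
increments = rng.gamma(shape, 1.0, size=(n_sims, depth.size - 1))
cum = np.cumsum(increments, axis=1)
# Pin each realization to the dated tie points: the ratio of partial to
# total accumulation is Beta-distributed, matching the analytical result.
ages = t0 + (t1 - t0) * np.column_stack([np.zeros(n_sims),
                                         cum / cum[:, -1:]])

mid = depth.size // 2             # age uncertainty at mid-core depth
lo, med, hi = np.percentile(ages[:, mid], [2.5, 50, 97.5])
print(f"age at {depth[mid]:.1f} m: {med:.0f} yr (95% CI {lo:.0f}-{hi:.0f})")
```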
Disaster Preparedness Manual and Workbook for Pennsylvania Libraries and Archives.
ERIC Educational Resources Information Center
Swan, Elizabeth, Ed.; And Others
This document suggests components for a sound disaster plan for libraries and archives. The planning process includes four steps which are covered in this manual: educating the staff about disaster preparedness literature; planning to prevent disasters; preparing to respond to an emergency and minimize its effects; and planning how to restore…
Organizational Structures, Processes, and Problems: A Literature Review and Taxonomy
1984-05-01
Supporting Student Research with Semantic Technologies and Digital Archives
ERIC Educational Resources Information Center
Martinez-Garcia, Agustina; Corti, Louise
2012-01-01
This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…
The Apache OODT Project: An Introduction
NASA Astrophysics Data System (ADS)
Mattmann, C. A.; Crichton, D. J.; Hughes, J. S.; Ramirez, P.; Goodale, C. E.; Hart, A. F.
2012-12-01
Apache OODT is a science data system framework born over the past decade, with hundreds of FTEs of investment, tens of sponsoring agencies (NASA, NIH/NCI, DoD, NSF, universities, etc.), and hundreds of projects and science missions that it powers every day to their success. At its core, Apache OODT carries two fundamental classes of software services and components. The first deals with information integration from existing science data repositories and archives, which themselves have already-in-use business processes and models for populating those archives. Information integration allows search, retrieval, and dissemination across these heterogeneous systems, and ultimately rapid, interactive data access and retrieval. The other suite of services and components within Apache OODT handles population and processing of those data repositories and archives. Workflows, resource management, crawling, remote data retrieval, curation and ingestion, along with science data algorithm integration, are all part of these Apache OODT software elements. In this talk, I will provide an overview of the use of Apache OODT to unlock and populate information from science data repositories and archives. We'll cover the basics, along with some advanced use cases and success stories.
Commercial imagery archive product development
NASA Astrophysics Data System (ADS)
Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate imagery in 'push' or 'pull' modes, using commercially available components and adapting and developing its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data at bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, from both archived images and in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
NASA Astrophysics Data System (ADS)
Cogliati, M.; Tonelli, E.; Battaglia, D.; Scaioni, M.
2017-12-01
Archive aerial photos represent a valuable heritage, providing information about land content and topography in past years. Today, the availability of low-cost and open-source solutions for photogrammetric processing of close-range and drone images offers the chance to produce outputs such as DEMs and orthoimages in an easy way. This paper is aimed at demonstrating how, and to which level of accuracy, digitized archive aerial photos may be used within such low-cost software (Agisoft Photoscan Professional®) to generate photogrammetric outputs. Different steps of the photogrammetric processing workflow are presented and discussed. The main conclusion is that this procedure may provide some final products, which however do not feature the high accuracy and resolution obtainable with high-end photogrammetric software packages specifically designed for aerial survey projects. In the last part, a case study is presented on the use of a four-epoch archive of aerial images to analyze an area where a tunnel is to be excavated.
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Architecture
NASA Astrophysics Data System (ADS)
Xiao, J.; Yu, C.; Cui, C.; He, B.; Li, C.; Fan, D.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Zhang, H.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). The ultimate goal of this project is to provide a comprehensive end-to-end astronomy research environment in which several independent systems seamlessly collaborate to support the full lifecycle of modern observational astronomy based on big data, from proposal submission, to data archiving, data release, and in-situ data analysis and processing. In this paper, the architecture and key designs of the AstroCloud platform are introduced, including the data access middleware, the access control and security framework, the extendible proposal workflow, and the system integration mechanism.
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2015-01-01
This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements that generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent POR from each archive extends from April 2000 to December 2009. MSFC NE applies separate but similar QC processes to each of the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data provided that data exist in the non- QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km from the composite profiles' archive, and approximately 5,000- 27,000 complete profiles exist from an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments versus using balloon output, but also provides flexibility to simulate different DOL situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering commands, provided the program applies the procedures that this report describes to new DRWP data on DOL. Decker et al. (2015) details how SLS is proposing to use DRWP data and splicing techniques on DOL. Although automation could enhance the current DOL 50-MHz DRWP QC process and could streamline any future DOL 915-MHz DRWP QC and splicing process, the DOL community would still require manual intervention to ensure that the vehicle only uses valid profiles. If a program desires to use high spatial resolution profiles, then the algorithm could randomly add high-frequency components to the DRWP profiles. The spliced DRWP database provides lots of flexibility in how one performs DOL simulations, and the algorithms that this report provides will assist the aerospace and atmospheric communities that are interested in utilizing the DRWP.
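A schematic of the splicing step: take the 915-MHz profile in the boundary layer, the 50-MHz profile aloft, and blend linearly across an overlap region. The altitudes and blend window below are placeholders, not the values used by the actual MSFC NE algorithm, and the wind profiles are synthetic.

```python
# Schematic profile splice: 915-MHz winds low, 50-MHz winds high, with a
# linear blend across an overlap window. Altitudes, blend window, and wind
# profiles are placeholders, not the actual MSFC NE algorithm's values.
import numpy as np

alt = np.arange(0.20, 18.50, 0.05)            # km, common vertical grid
u_915 = np.where(alt <= 4.0, 8.0 + alt, np.nan)        # synthetic low winds
u_50 = np.where(alt >= 2.0, 6.0 + 1.5 * alt, np.nan)   # synthetic high winds

blend_lo, blend_hi = 2.0, 4.0                 # assumed overlap window, km
w = np.clip((alt - blend_lo) / (blend_hi - blend_lo), 0.0, 1.0)

spliced = np.where(alt < blend_lo, u_915,
           np.where(alt > blend_hi, u_50,
                    (1 - w) * u_915 + w * u_50))   # linear blend in overlap

complete = ~np.isnan(spliced)
print(f"vertically complete: {complete.all()}, grid points: {alt.size}")
```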
NASA Astrophysics Data System (ADS)
Yaqoob, T.
2005-12-01
We describe a public WWW archive (HotGAS) containing data products from Chandra observations using the High Energy Transmission Grating Spectrometer (HETGS). Spectral products are available from the archive in various formats and are suitable for use by non-experts and experts alike. Lightcurves and cross-dispersion profiles are also available. Easy and user-friendly access for non-X-ray astronomers to reprocessed, publishable-quality grating data products should help to promote interdisciplinary and multi-wavelength research on active galactic nuclei (AGN). The archive will also be useful to X-ray astronomers who have not yet had experience with high-resolution X-ray spectroscopy, as well as experienced X-ray astronomers who need quick access to clean and ready-to-go data products. Theoreticians may find the archive useful for testing their models without having to deal with the fine details of data processing and reduction. We also anticipate that the archive will be useful for training graduate students in high-resolution X-ray spectroscopy and for providing a resource for high-school and graduate student projects. We plan to eventually expand the archive to include AGN data from the Chandra Low Energy Transmission Grating Spectrometer (LETGS) and the XMM-Newton Reflection Grating Spectrometer (RGS). Further in the future, we plan to extend the archive to include data from other astrophysical sources aside from AGN. The project thus far is funded by an archival Chandra grant.
Intrusive Thought and Relativity Associated with Task Performance
1995-01-23
Creating a web-based digital photographic archive: one hospital library's experience.
Marshall, Caroline; Hobbs, Janet
2017-04-01
Cedars-Sinai Medical Center is a nonprofit community hospital based in Los Angeles. Its history spans over 100 years, and its growth and development from the merging of 2 Jewish hospitals, Mount Sinai and Cedars of Lebanon, is also part of the history of Los Angeles. The medical library collects and maintains the hospital's photographic archive, to which retiring physicians, nurses, and an active Community Relations Department have donated photographs over the years. The collection was growing rapidly, it was impossible to display all the materials, and much of the collection was inaccessible to patrons. The authors decided to make the photographic collection more accessible to medical staff and researchers by purchasing a web-based digital archival package, Omeka. We decided what material should be digitized by analyzing archival reference requests and considering the institution's plan to create a Timeline Wall documenting and celebrating the history of Cedars-Sinai. Within 8 months, we digitized and indexed over 500 photographs. The digital archive now allows patrons and researchers to access the history of the hospital and enables the library to process archival references more efficiently.
Comprehensive planning of data archive in Japanese planetary missions
NASA Astrophysics Data System (ADS)
Yamamoto, Yukio; Shinohara, Iku; Hoshino, Hirokazu; Tateno, Naoki; Hareyama, Makoto; Okada, Naoki; Ebisawa, Ken
Japan Aerospace Exploration Agency (JAXA) provides HAYABUSA and KAGUYA data as planetary data archives. These archives, however, were prepared independently; as a result, data formats are inconsistent and the knowledge gained from archiving activities has not been inherited. Recently, discussion of comprehensive planning of data archives has started in preparation for upcoming planetary missions, which indicates that a comprehensive plan for data archiving is required in several steps. The framework of the comprehensive plan is divided into four items: Preparation, Evaluation, Preservation, and Service.
1. PREPARATION FRAMEWORK: Data are classified into several types: raw data; level-0, 1 and 2 processed data; ancillary data; etc. The preparation of mission data is the responsibility of the instrument teams, but preparation beyond the mission data and support for data management are essential to establish unified conventions and formats across instruments within a mission, and across missions.
2. EVALUATION FRAMEWORK: Evaluation has two meanings: format and quality. Format evaluation is often discussed within the preparation framework. Data quality evaluation, often called quality assurance (QA) or quality control (QC), must be performed by a third party apart from the preparation teams. An instrument team has the initiative for the preparation itself, and a third-party group is organized to evaluate the instrument team's activity.
3. PRESERVATION FRAMEWORK: The main topics of this framework are document management, archiving structure, and a simple access method. A mission produces many documents in the course of its development, and instrument development is no exception. During the long-term development of a mission, many documents are made obsolete and updated repeatedly. A smart system will help instrument teams reduce the burden of document management and archiving. JAXA attempts to follow PDS conventions for this management, since PDS has a highly sophisticated archiving structure. In addition, the access method for archived data must remain simple and standard well over a decade.
4. SERVICE FRAMEWORK: The service framework, including the planetary data access protocol (PDAP), has been developed to share stored data effectively. A sophisticated service framework will serve not only publication data but also low-level data. JAXA's data query services are under development based on PDAP, which means that low-level data can be published in the same manner as level-2 data.
In this presentation, we report the detailed structure of these four frameworks as adopted for the upcoming Planet-C (Venus Climate Orbiter) mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terwilliger, Thomas C.
2012-06-04
Within the ICSTI Insights Series we offer three articles on the 'living publication' that is already available to practitioners in the important field of crystal structure determination and analysis. While the specific examples are drawn from this particular field, we invite readers to draw parallels in their own fields of interest. The first article describes the present state of the crystallographic living publication, already recognized by an ALPSP (Association of Learned and Professional Society Publishers) Award for Publishing Innovation in 2006. The second article describes the potential impact on the record of science as greater post-publication analysis becomes more common within currently accepted data deposition practices, using processed diffraction data as the starting point. The third article outlines a vision for the further improvement of crystallographic structure reports within potentially achievable enhanced data deposition practices, based upon raw (unprocessed) diffraction data. The IUCr in its Commissions and Journals has for many years emphasized the importance of publications being accompanied by data and the interpretation of the data in terms of atomic models. This has been followed as policy by numerous other journals in the field and its cognate disciplines. This practice has been well served by databases and archiving institutions such as the Protein Data Bank (PDB), the Cambridge Crystallographic Data Centre (CCDC), and the Inorganic Crystal Structure Database (ICSD). Normally the models that are archived are interpretations of the data, consisting of atomic coordinates with their displacement parameters, along with processed diffraction data from X-ray, neutron or electron diffraction studies. In our current online age, a reader can not only consult the printed word, but can display and explore the results with molecular graphics software of exceptional quality. Furthermore, the routine availability of processed diffraction data allows readers to perform direct calculations of the electron density (using X-rays and electrons as probes) or the nuclear density (using neutrons as probe) on which the molecular models are directly based. This current community practice is described in our first article. There are various ways that these data and tools can be used to further analyze the molecules that have been crystallized. Notably, once a set of results is announced via the publication, the research community can start to interact directly with the data and models. This gives the community the opportunity not only to read about the structure, but to examine it in detail, and even generate subsequent improved models. These improved models could, in principle, be archived along with the original interpretation of the data and can represent a continuously improving set of interpretations of a set of diffraction data. The models could improve both by correction of errors in the original interpretation and by the use of new representations of molecules in crystal structures that more accurately represent the contents of a crystal. These possible developments are described in our second article. A current, significant, thrust for the IUCr is whether it would be advantageous for the crystallographic community to require, rather than only encourage, the archiving of the raw (unprocessed) diffraction data images measured from a crystal, a fibre or a solution. This issue is being evaluated in detail by an IUCr Working Group (see http://forums.iucr.org).
Such archived raw data would be linked to and from any associated publications. The archiving of raw diffraction data could allow as yet undeveloped processing methods to have access to the originally measured data. The debate within the community about this much larger proposed archiving effort revolves around the issue of 'cost versus benefit'. Costs can be minimized by preserving the raw data in local repositories, either at centralized synchrotron and neutron research institutes, or at research universities. Archiving raw data is also perceived as being more effective than archiving only processed data in countering scientific fraud, which exists in our field, albeit at a very low rate of occurrence. In parallel developments, sensitivities to avoiding research malpractice are encouraging universities to establish their own data repositories for research and academic staff. These various 'raw data archives' would complement the existing processed data archives. These archives could, however, have gaps in their coverage arising from a lack of resources. Nevertheless, we believe that a sufficiently large raw data archive, with reasonable global coverage, could be encouraged and would have major benefits. These possible developments, costs and benefits, are described in our third and final article on 'The living publication'.
NASA Astrophysics Data System (ADS)
Saunier, Sébastien; Northrop, Amy; Lavender, Samantha; Galli, Luca; Ferrara, Riccardo; Mica, Stefano; Biasutti, Roberto; Goryl, Philippe; Gascon, Ferran; Meloni, Marco
2017-10-01
Whilst recent years have witnessed the development and exploitation of operational Earth Observation (EO) satellite constellation data, the valorisation of historical archives has been a challenge. The European Space Agency (ESA) Landsat Multi Spectral Scanner (MSS) products, covering Greenland, Iceland, Continental Europe and North Africa, represent an archive of over 600,000 processed Level 1 (L1) scenes that will accompany around 1 million ESA Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) products already available. ESA began acquiring MSS data in 1975, and it is well known that this dataset can be degraded by missing data and a loss in accuracy. For these reasons, the content of the product format has been reviewed and the ESA Landsat processing baseline significantly updated to ensure products are fit for user purposes. This paper presents the new MSS product format, including the updated metadata parameters for error traceability and the specification of the Quality Assurance Band (BQA), engineered to allow best pixel selection and the application of image restoration techniques. This paper also discusses major improvements applied to the radiometric and geometric processing. For the benefit of the community, ESA is now able to maximize the number of L1 MSS products that can be generated from the raw Level 0 (L0) data and to ensure the highest possible data quality is reached. Moreover, by improving the product format and processing and adding a pixel-based quality band, the MSS archive becomes interoperable with recently reprocessed Landsat data and with data from live missions, since product quality is assured on a per-pixel basis.
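To make the "best pixel selection" idea concrete, the following is a minimal sketch of per-pixel quality masking with a QA band. The bit assignments used here are illustrative assumptions, not the actual ESA MSS BQA specification.

```python
# Hedged sketch of per-pixel quality masking with a QA band: keep only pixels
# whose QA bits do not flag them as fill or saturated. Bit positions below
# are illustrative assumptions, not the ESA MSS BQA layout.
import numpy as np

FILL_BIT = 0          # assumed: bit 0 set => fill / missing data
SATURATION_BIT = 1    # assumed: bit 1 set => radiometric saturation

def best_pixel_mask(bqa):
    """Boolean mask of pixels not flagged as fill or saturated."""
    fill = (bqa >> FILL_BIT) & 1
    saturated = (bqa >> SATURATION_BIT) & 1
    return (fill == 0) & (saturated == 0)

bqa = np.array([[0b00, 0b01], [0b10, 0b00]], dtype=np.uint16)
print(best_pixel_mask(bqa))   # [[ True False] [False  True]]
```

A downstream compositing or restoration step would then keep, for each pixel, only the values where this mask is true.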
NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.
2011-12-01
Data stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations commit to both the current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, additional information from the product generation and science teams is needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIP Federation meeting, NASA is developing a set of Earth science data and information content requirements for long-term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, usable, and reproducible. Our experience suggests data centers know what to preserve in most cases; i.e., the processing algorithms, along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products, will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we describe the application of the information content requirements, guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and the Science Investigator-led Processing System to the Distributed Active Archive Center.
NASA Astrophysics Data System (ADS)
Divine, D.; Godtliebsen, F.; Rue, H.
2012-04-01
Detailed knowledge of past climate variations is of high importance for gaining better insight into possible future climate scenarios. The relative shortness of available high-quality instrumental climate records necessitates the use of various climate proxy archives for making inferences about past climate evolution. This, however, requires an accurate assessment of timescale errors in proxy-based paleoclimatic reconstructions. Here we propose an approach to assessing timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, yielding a Beta-distributed probability density on age estimates along the length of a proxy archive. In the general situation of uncertainties in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. The approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. The approach is illustrated using two ice cores and two lake/marine sediment cores, representing typical examples of paleoproxy archives with age models constructed using tie points of mixed origin.
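As a rough illustration of the simulation approach described above, here is a minimal Monte Carlo sketch assuming a linear accumulation model between two absolutely dated tie points; all parameter names and values are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal sketch: Monte Carlo age-depth simulation between two absolutely
# dated tie points, approximating accumulation by a random Gamma process.
rng = np.random.default_rng(42)
n_layers = 100                       # depth increments between the tie points
age_top, age_bottom = 0.0, 1000.0    # tie-point ages in years (assumed known)
shape = 2.0                          # Gamma shape parameter (prior assumption)

n_sims = 5000
ages = np.empty((n_sims, n_layers + 1))
for i in range(n_sims):
    # random accumulation increments, rescaled so the profile hits both tie points
    inc = rng.gamma(shape, 1.0, size=n_layers)
    cum = np.concatenate(([0.0], np.cumsum(inc)))
    ages[i] = age_top + (age_bottom - age_top) * cum / cum[-1]

# empirical 95% confidence band on the timescale at each depth index
lo, hi = np.percentile(ages, [2.5, 97.5], axis=0)
```

With fixed tie points, normalizing the cumulative Gamma increments gives the interior ages a Beta-type marginal distribution, consistent with the analytical result quoted in the abstract.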
NASA Astrophysics Data System (ADS)
Malik, M. A.; Cantwell, K. L.; Reser, B.; Gray, L. M.
2016-02-01
Marine researchers and managers routinely rely on interdisciplinary data sets collected using hull-mounted sonars, towed sensors, or submersible vehicles. These data sets can be broadly categorized into acoustic remote sensing, imagery-based observations, water property measurements, and physical samples. The resulting raw data sets are overwhelmingly large and complex, and often require specialized software and training to process. To address these challenges, NOAA's Office of Ocean Exploration and Research (OER) is developing tools to improve the discoverability of raw data sets and the integration of quality-controlled processed data in order to facilitate re-use of archived oceanographic data. The majority of recently collected OER raw oceanographic data can be retrieved from national data archives (e.g., NCEI and the NOAA Central Library). Merging disparate data sets by scientists with diverse expertise, however, remains problematic. Initial efforts at OER have focused on merging geospatial acoustic remote sensing data with imagery and water property measurements that typically lack direct geo-referencing. OER has developed 'smart' ship and submersible tracks that provide a synopsis of the geospatial coverage of various data sets. Tools under development enable scientists to quickly assess the relevance of archived OER data to their respective research or management interests, and enable quick access to the desired raw and processed data sets. Pre-processing and visualization to combine various data sets also offer benefits by streamlining data quality assurance and quality control efforts.
Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.
2007-01-01
In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
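For readers who prefer Python over Seismic Unix, a hedged sketch of reading one of the archived SEG-Y trace files with ObsPy follows. The file name is hypothetical, and the band-pass corners must be adjusted to the data's actual sample rate.

```python
# Hedged sketch: reading an archived SEG-Y trace file with ObsPy.
# "example_line.sgy" is a hypothetical name; actual names come from the archive.
from obspy import read

stream = read("example_line.sgy", format="SEGY")  # one Trace per seismic trace
tr = stream[0]
print(tr.stats.segy.trace_header)  # SEG-Y trace header fields

# Simple band-pass comparable in spirit to the archive's "filtered" images;
# corner frequencies are illustrative and must respect the data's Nyquist.
tr.filter("bandpass", freqmin=0.5, freqmax=200.0)
```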
Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2009-01-01
From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
The Challenges Facing Science Data Archiving on Current Mass Storage Systems
NASA Technical Reports Server (NTRS)
Peavey, Bernard; Behnke, Jeanne (Editor)
1996-01-01
This paper discusses the desired characteristics of a tape-based petabyte science data archive and retrieval system required to store and distribute several terabytes (TB) of data per day over an extended period of time, probably more than 15 years, in support of programs such as the Earth Observing System Data and Information System (EOSDIS). These characteristics take into consideration not only cost-effective and affordable storage capacity, but also rapid access to selected files and the reading rates needed to satisfy thousands of retrieval transactions per day. It seems that where rapid random access to files is not crucial, the tape medium, magnetic or optical, continues to offer cost-effective data storage and retrieval solutions, and is likely to do so for many years to come. However, in environments like EOS these tape-based archive solutions provide less than full user satisfaction. Therefore, the objective of this paper is to describe the performance and operational enhancements that need to be made to current tape-based archival systems in order to achieve greater acceptance by the EOS and similar user communities.
Multi-provider architecture for cloud outsourcing of medical imaging repositories.
Godinho, Tiago Marques; Bastião Silva, Luís A; Costa, Carlos; Oliveira, José Luís
2014-01-01
Over the last few years, the extended usage of medical imaging procedures has drawn the medical community's attention towards the optimization of its workflows. More recently, the federation of multiple institutions into a seamless distribution network has brought hope of higher-quality healthcare services along with more efficient resource management. As a result, medical institutions are constantly looking for the best infrastructure on which to deploy their imaging archives. In this scenario, public cloud infrastructures arise as major candidates, as they offer elastic storage space and optimal data availability, without large maintenance costs or IT personnel requirements, in a pay-as-you-go model. However, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. This document proposes a multi-provider architecture for the integration of outsourced archives with in-house PACS resources, taking advantage of foreign providers to store medical imaging studies without disregarding security. It enables the retrieval of images from multiple archives simultaneously, improving performance and data availability while avoiding the vendor lock-in problem. Moreover, it enables load balancing and caching techniques.
NASA Astrophysics Data System (ADS)
Herr, J.; Bhatnagar, T.; Goldfarb, S.; Irrer, J.; McKee, S.; Neal, H. A.
2008-07-01
Large scientific collaborations as well as universities have a growing need for multimedia archiving of meetings and courses. Collaborations need to disseminate training and news to their wide-ranging members, and universities seek to provide their students with more useful study tools. The University of Michigan ATLAS Collaboratory Project has been involved in the recording and archiving of multimedia lectures since 1999. Our software and hardware architecture has been used to record events for CERN, ATLAS, many units inside the University of Michigan, Fermilab, the American Physical Society, and the International Conference on Systems Biology at Harvard. Until 2006 our group functioned primarily as a small research/development team with special commitments to the archiving of certain ATLAS events. In 2006 we formed the MScribe project, using a larger-scale, highly automated recording system to record and archive eight University courses in a wide array of subjects. Several robotic carts are wheeled around campus by unskilled student helpers to automatically capture audio, video, slides, and chalkboard images and post them to the Web. The advances the MScribe project has made in automating these processes, including a robotic camera operator and automated video processing, are now being used to record ATLAS Collaboration events, making them available more quickly than before and enabling the recording of more events.
ERIC Educational Resources Information Center
Michelson, Avra; Rothenberg, Jeff
1993-01-01
The report considers the interaction of trends in information technology and trends in research practices and the policy implications for archives. The information is divided into 4 sections. The first section, an "Overview of Information Technology Trends," discusses end-user computing, which includes ubiquitous computing, end-user…
ERIC Educational Resources Information Center
Nicewarner, Metta
1988-01-01
Description of the microfilming of a women's studies archive at the Texas Woman's University Library discusses: (1) project background; (2) criteria for equipment purchase; (3) equipment selected; (4) recommended resources; (5) indexing and layout decisions; (6) the filming process; and (7) the pros and cons of in-house microreproduction. (three…
Improving the Quality of Backup Process for Publishing Houses and Printing Houses
NASA Astrophysics Data System (ADS)
Proskuriakov, N. E.; Yakovlev, B. S.; Pries, V. V.
2018-04-01
We analyse the main types of data threats faced by print media and their influence on the vitality and security of information. The influence of archiving program settings and of the type of file manager on the backup process is also analysed. We propose a simple and economical practical implementation of the backup process consisting of four components: the command-line interpreter, the 7z archiver, the Robocopy utility, and network storage. We recommend creating backup copies consisting of three local copies of the data and two network copies.
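A minimal sketch of the described backup chain, driven from Python rather than a raw cmd script, might look as follows; all paths and the date-stamped naming scheme are illustrative assumptions.

```python
# Minimal sketch of the described backup chain: pack data with 7z, then
# mirror the local backup folder to network storage with Robocopy.
# Paths are illustrative assumptions; 7z and robocopy must be on PATH.
import subprocess
from datetime import date

src = r"C:\prepress\jobs"                         # data to protect (assumed)
local_archive = rf"C:\backup\jobs_{date.today():%Y%m%d}.7z"
nas = r"\\nas01\backup\prepress"                  # network share (assumed)

# 1) pack the working directory into a compressed 7z archive
subprocess.run(["7z", "a", "-mx=5", local_archive, src], check=True)

# 2) mirror the local backup folder to the NAS; /Z = restartable copies
subprocess.run(["robocopy", r"C:\backup", nas, "/Z", "/NP"],
               check=False)  # robocopy exit codes 0-7 indicate success
```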
The ExoMars science data archive: status and plans
NASA Astrophysics Data System (ADS)
Heather, David
2016-07-01
The ExoMars programme, a cooperation between ESA and Roscosmos, comprises two missions: the Trace Gas Orbiter, to be launched in 2016, and a rover and surface platform due for launch in 2018. This will be the first time ESA has operated a rover, and the archiving and management of the science data to be returned will require a significant development effort for the new Planetary Science Archive (PSA). The ExoMars mission data will also be formatted according to the new PDS4 standards, based on XML, and this will be the first data of that format to be archived in the PSA. There are significant differences in the way a scientist will want to query, retrieve, and use data from a suite of rover instruments as opposed to remote sensing instrumentation on an orbiter. The PSA data holdings and the accompanying services are currently geared more towards the management of remote sensing data, so some significant changes will be needed. Among them will be a much closer link to operational information than is currently available for our missions. NASA has strong user-community interaction with its Analyst's Notebook, which provides detailed operational information to explain why, where, and when operations took place. A similar approach will be needed for the future PSA, which is currently being designed. In addition to the archiving interface itself, there are differences in the overall archiving process being followed for ExoMars compared to previous ESA planetary missions. The Trace Gas Orbiter data pipelines for the first level of processing, from telemetry to raw data, will be hosted directly by ESA's ground segment at ESAC in Madrid, where the archive itself resides. Data will flow continuously and directly to the PSA where, after the given proprietary period, they will be released to the community via the new user interface. For the rover mission, the data pipelines are being developed by European industry in close collaboration with ESA PSA experts and with the instrument teams. The first level of data processing will be carried out for all instruments at ALTEC in Turin, where the pipelines are developed and from where the rover operations will also be run. The PDS4 data will be produced directly and used for planning purposes within the operations centre before being passed on to the PSA for long-term archiving. While this has clear long-term advantages for the timely population of the archive with at least the first level of data, the outsourcing of the pipelines to industry introduces complications. Firstly, it is difficult to have the necessary expertise on hand to train the individuals designing the pipelines and to define the archiving conventions needed to meet the scientific needs of the mission. It also introduces issues in terms of driving the schedule, as industry is committed to making deliveries within fixed budgets and time frames that may not necessarily be in line with the needs of archiving, and may not be able to respond well to the ongoing evolution of the PDS4 standards. This presentation will focus on the challenges involved in archiving rover data for the PSA, and will outline the plans and current status of the system being developed to respond to the needs of the mission.
Getting from then to now: Sustaining the Lesbian Herstory Archives as a lesbian organization.
Smith-Cruz, Shawn ta; Rando, Flavia; Corbman, Rachel; Edel, Deborah; Gwenwald, Morgan; Nestle, Joan; Thistlethwaite, Polly
2016-01-01
This article is a compilation of six narratives written by collective members of the volunteer-run Lesbian Herstory Archives, the oldest and largest collection of lesbian material in the world. Narratives draw on a yearlong series of conversations, which culminated in a panel discussion at the 40th Anniversary celebration. Authors' narratives detail the significance of the Lesbian Herstory Archives as a successful and sustainable lesbian organization. Topics covered span four decades and include: the organization's history and practice, founding and activism, the acquisition of the current space, community engagement, and processing of special collections.
NASA Technical Reports Server (NTRS)
2002-01-01
TRMM has acquired more than four years of data since its launch in November 1997. All TRMM standard products are processed by the TRMM Science Data and Information System (TSDIS) and archived and distributed to general users by the GES DAAC. Table 1 shows the total archive and distribution as of February 28, 2002. The Utilization Ratio (UR), defined as the ratio of the number of distributed files to the number of archived files, of the TRMM standard products has been steadily increasing since 1998 and is currently at 6.98.
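The Utilization Ratio is a simple quotient; a toy computation under assumed file counts:

```python
# Illustrative computation of the Utilization Ratio (UR) defined above;
# the counts are hypothetical, chosen only to reproduce UR = 6.98.
archived_files = 1_000_000
distributed_files = 6_980_000
ur = distributed_files / archived_files
print(f"UR = {ur:.2f}")  # 6.98: each archived file distributed ~7 times on average
```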
Data archiving and serving system implementation in CLEP's GRAS Core System
NASA Astrophysics Data System (ADS)
Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai
2017-04-01
The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management, and application, and it is also the operation control centre during satellite in-orbit and payload operation management. Chang'E-1, Chang'E-2, and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming it to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution stores data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem, and the Scientific Application & Research Subsystem. The serving solution serves data to the various business systems, scientific researchers, and public users. Data-driven and component-clustering methods were adopted in this system; the former is used to provide real-time data archiving and data persistence services, the latter to maintain continuous archiving and serving support for new data from the Chang'E missions. This approach also saves software development cost.
NASA Astrophysics Data System (ADS)
Evans, Robert D.; Petropavlovskikh, Irina; McClure-Begley, Audra; McConville, Glen; Quincy, Dorothy; Miyagawa, Koji
2017-10-01
The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content of ozone (the thickness of the ozone layer) in the atmosphere was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States' National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Due to recent changes in the data processing software, the entire dataset was re-evaluated. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of the observations at the individual stations, the instruments used for NOAA network monitoring at each station, the method for reducing zenith-sky observations to total ozone, and the calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to the 14 station records is also provided.
REMS Wind Sensor Preliminary Results
NASA Astrophysics Data System (ADS)
De La Torre Juarez, M.; Gomez-Elvira, J.; Navarro, S.; Marin, M.; Torres, J.; Rafkin, S. C.; Newman, C. E.; Pla-García, J.
2015-12-01
The REMS instrument is part of the Mars Science Laboratory payload. It is a sensor suite distributed over several parts of the rover. The wind sensor, which is composed of two booms equipped with a set of hot-plate anemometers, is installed on the Rover Sensing Mast (RSM). During landing, most of the hot plates of one boom were damaged, most likely by pebbles lifted by the Sky Crane thrusters. The loss of one wind boom necessitated a full review of the data processing strategy. Different algorithms have been tested on the readings of the first Mars year, and these results are now archived in the Planetary Data System (PDS). The presentation will include a description of the data processing methods and of the resulting products, including the typical evolution of wind speed and direction session-by-session, hour-by-hour, and other statistics. A review of the wind readings over the first Mars year will also be presented.
Representation of viruses in the remediated PDB archive
Lawson, Catherine L.; Dutta, Shuchismita; Westbrook, John D.; Henrick, Kim; Berman, Helen M.
2008-01-01
A new scheme has been devised to represent viruses and other biological assemblies with regular noncrystallographic symmetry in the Protein Data Bank (PDB). The scheme describes existing and anticipated PDB entries of this type using generalized descriptions of deposited and experimental coordinate frames, symmetry and frame transformations. A simplified notation has been adopted to express the symmetry generation of assemblies from deposited coordinates and matrix operations describing the required point, helical or crystallographic symmetry. Complete correct information for building full assemblies, subassemblies and crystal asymmetric units of all virus entries is now available in the remediated PDB archive. PMID:18645236
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
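As an illustration of the kind of scaled cross-correlation detector described above, here is a minimal sliding-window sketch in Python; the threshold and the brute-force loop are illustrative (a production detector would use FFT-based correlation for speed).

```python
# Hedged sketch of a normalized ("scaled") cross-correlation detector that
# slides a template event over continuous data. The threshold is illustrative.
import numpy as np

def xcorr_detect(continuous, template, threshold=0.8):
    """Return (offset, correlation) pairs where the normalized correlation
    between the template and the continuous data exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(continuous) - n):
        w = continuous[i:i + n]
        cc = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)  # Pearson-type CC
        if cc > threshold:
            detections.append((i, cc))
    return detections
```

Lowering the threshold trades more detections against more false alarms, which is the essence of pushing the detection threshold below that of an STA/LTA detector.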
A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.
Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos
2016-01-01
Web-based technologies have been increasingly used in picture archive and communication systems (PACS), in services related to the storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and the associated bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. It includes an innovative cache system based on the splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results obtained show that it outperforms conventional approaches, reducing both remote access latency and the required cache storage space.
Sentinel-1 Interferometry from the Cloud to the Scientist
NASA Astrophysics Data System (ADS)
Garron, J.; Stoner, C.; Johnston, A.; Arko, S. A.
2017-12-01
Big data problems and solutions are growing in the technological and scientific sectors daily. Cloud computing is a vertically and horizontally scalable solution, available now, for archiving and processing large volumes of data quickly without significant on-site computing hardware costs. Be that as it may, the migration of scientific data processors to these powerful platforms requires not only proof of concept but a demonstration of credibility in an operational setting. The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC), in partnership with NASA's Jet Propulsion Laboratory, is exploring the functional architecture of the Amazon Web Services cloud computing environment for the processing, distribution, and archiving of Synthetic Aperture Radar data in preparation for the NASA-ISRO Synthetic Aperture Radar (NISAR) mission. Leveraging built-in AWS services for logging, monitoring, and dashboarding, the GRFN (Getting Ready for NISAR) team has built a scalable processing, distribution, and archival system for Sentinel-1 L2 interferograms produced using the ISCE algorithm. This cloud-based functional prototype provides interferograms over selected global land deformation features (volcanoes, land subsidence, seismic zones), which are accessible to scientists for direct download via NASA's EarthData Search client and the ASF DAAC's primary SAR interface, Vertex. The interferograms are produced using nearest-neighbor logic to identify pairs of granules for interferometric processing, creating deep stacks of BETA products from almost every satellite orbit for scientists to explore. This presentation highlights the functional lessons learned to date from this exercise, including a cost analysis of various data lifecycle policies as implemented through AWS. While demonstrating the architecture choices that support efficient big science data management, we invite feedback and questions about the process and products from the InSAR community.
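A toy sketch of nearest-neighbor pair selection for interferometric processing is shown below; the granule records and field names are assumptions for illustration, not the GRFN schema.

```python
# Illustrative nearest-neighbor pairing: for each frame, pair consecutive
# acquisitions in time so each pair can feed interferogram generation.
from datetime import datetime

granules = [
    {"id": "S1A_0001", "frame": 112, "time": datetime(2017, 1, 3)},
    {"id": "S1A_0002", "frame": 112, "time": datetime(2017, 1, 15)},
    {"id": "S1A_0003", "frame": 112, "time": datetime(2017, 1, 27)},
]

def nearest_neighbor_pairs(granules):
    by_frame = {}
    for g in granules:
        by_frame.setdefault(g["frame"], []).append(g)
    pairs = []
    for gs in by_frame.values():
        gs.sort(key=lambda g: g["time"])
        # each consecutive acquisition pair over a frame becomes one interferogram
        pairs.extend((a["id"], b["id"]) for a, b in zip(gs, gs[1:]))
    return pairs

print(nearest_neighbor_pairs(granules))
# [('S1A_0001', 'S1A_0002'), ('S1A_0002', 'S1A_0003')]
```

Applying this to every orbit is what produces the "deep stacks" of products the abstract mentions.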
NASA Technical Reports Server (NTRS)
Van Cleve, Jeffrey (Editor); Jenkins, Jon; Caldwell, Doug; Allen, Christopher L.; Batalha, Natalie; Bryson, Stephen T.; Chandrasekaran, Hema; Clarke, Bruce D.; Cote, Miles T.; Dotson, Jessie L.;
2010-01-01
The Data Analysis Working Group has released long- and short-cadence materials, including FFIs and dropped targets, for the public. The Kepler Science Office considers Data Release 4 to provide "browse quality" data. These notes have been prepared to give Kepler users of the Multimission Archive at STScI (MAST) a summary of how the data were collected and prepared, and how well the data processing pipeline is functioning on flight data. They will be updated for each release of data to the public archive and placed on MAST along with other Kepler documentation, at http://archive.stsci.edu/kepler/documents.html. Data Release 3 is meant to give users the opportunity to examine the data for possibly interesting science and to involve the users in improving the pipeline for future data releases. To perform the latter service, users are encouraged to notice and document artifacts, either in the raw or processed data, and report them to the Science Office.
Planetary Data Archiving Activities of ISRO
NASA Astrophysics Data System (ADS)
Gopala Krishna, Barla; D, Rao J.; Thakkar, Navita; Prashar, Ajay; Manthira Moorthi, S.
ISRO launched its first planetary mission to the Moon, Chandrayaan-1, on October 22, 2008. The mission carried eleven instruments, and a wealth of science data was collected during its mission life (November 2008 to August 2009), which is archived at the Indian Space Science Data Centre (ISSDC). The data centre ISSDC is responsible for the ingest, storage, processing, archiving, and dissemination of the payload and related ancillary data, in addition to real-time spacecraft operations support. ISSDC is designed to provide high computation power and large storage, and to host a variety of applications necessary to support all the planetary and space science missions of ISRO. The state-of-the-art architecture of ISSDC provides the facility to automatically ingest the raw payload data of all the science payloads of the science satellites, process the raw data into payload-specific outputs, generate higher-level products, and disseminate the data sets to principal investigators, guest observers, payload operations centres (POCs), and the general public. The data archive makes use of the well-proven archive standards of the Planetary Data System (PDS). The long-term archive (LTA) for five payloads of Chandrayaan-1 (TMC, HySI, SARA, M3, and MiniSAR) was released from ISSDC on 19 April 2013 (http://www.issdc.gov.in) to the users. Additionally, DEMs generated from available passes of Chandrayaan-1 TMC stereo data and sample map sheets of the Lunar Atlas are also archived and released from ISSDC along with the LTA. The Mars Orbiter Mission (MOM) is the most recent planetary mission, launched on November 5, 2013, and currently en route to Mars, carrying five instruments (http://www.isro.org): the Mars Color Camera (MCC) to map various morphological features on Mars at varying resolutions and scales using the unique elliptical orbit; the Methane Sensor for Mars (MSM) to measure the total column of methane in the Martian atmosphere; the Thermal Infrared Imaging Spectrometer (TIS) to map the surface composition and mineralogy of Mars; the Mars Exospheric Neutral Composition Analyser (MENCA) to study the composition and density of the Martian neutral atmosphere; and the Lyman Alpha Photometer (LAP) to investigate the loss process of water in the Martian atmosphere, towards fulfilling the mission objectives. An active archive created in PDS format for some of the instrument data acquired during the Earth phase of the mission is being analysed by the PIs. The Mars science data from the onboard instruments are expected during September 2014. The next planetary mission planned to the Moon is Chandrayaan-2, which consists of an orbiter with five instruments (http://www.isro.org), namely (i) an Imaging IR Spectrometer (IIRS) for mineral mapping, (ii) TMC-2 for topographic mapping, (iii) MiniSAR to detect water ice in the permanently shadowed regions of the lunar poles to a depth of a few metres, (iv) a Large Area Soft X-ray Spectrometer (CLASS) and Solar X-ray Monitor (XSM) for mapping the major elements present on the lunar surface, and (v) a Neutral Mass Spectrometer (ChACE-2) for a detailed study of the lunar exosphere, together with a rover for specific experiments and a lander for technology experiments and demonstration. The data are planned to be archived to PDS standards.
An Ontology-Based Archive Information Model for the Planetary Science Community
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris
2008-01-01
The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de-facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members have captured the PDS information model in an ontology modeling tool. Using the resulting knowledge-base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.
Zong, W; Wang, P; Leung, B; Moody, G B; Mark, R G
2002-01-01
The advent of implantable cardioverter defibrillators (ICDs) has resulted in significant reductions in mortality in patients at high risk of sudden cardiac death. Extensive related basic research and clinical investigation continue. ICDs typically record intracardiac electrograms and inter-beat intervals, along with device settings, during episodes of device-delivered therapy. Researchers wishing to study these data further have until now been limited to viewing paper plots. In support of multi-center clinical studies of patients with ICDs, we have developed a web-based, searchable ICD data archiving system that allows users to use a web browser to upload ICD data from diskettes to a server, where the data are automatically processed and archived. Users can view and download the archived ICD data directly via the web. The entire system is built from open-source software. At present, more than 500 patient ICD data sets have been uploaded to and archived in the system. This project will be of value not only to those who wish to conduct research using ICD data, but also to clinicians who need to archive and review ICD data collected from their patients.
2016-12-01
The project team draws on analysis from multiple sources, including the GSBPP exit survey, archived GSBPP capstones, faculty advisement data, faculty interviews, and a new GSBPP student survey, in order to detail the capstone's process, content, and value to multiple stakeholders. The project team also employs the Plan-Do…
A New Archive and Internet Search Engine May Change the Nature of On-Line Research.
ERIC Educational Resources Information Center
Selingo, Jeffrey
1998-01-01
In the process of trying to preserve Internet history by archiving it, a company has developed a powerful Internet search engine that provides information on Web site usage patterns, which can act as a relatively objective source of information about information sources and can link sources that a researcher might otherwise miss. However, issues…
Commercial imagery archive, management, exploitation, and distribution project development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-10-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze, and disseminate, in 'push' or 'pull' modes, imagery, data, and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) the ability to ingest, process, and disseminate data volumes with bandwidths ranging up to multiple gigabits per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds, from archived images or in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.
NASA Astrophysics Data System (ADS)
Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.
2010-12-01
Space geodetic science and other disciplines using geodetic products have benefited immensely from the open sharing of data and metadata from global and regional archives. Ten years ago, the Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO, and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and the distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC, and UNAVCO) have been funded to expand the original GSAC capability to multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into its daily GNSS data processing scheme. The effort will include new methods for the quality control of current and legacy data, a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example by identifying stations with large seasonal effects. This effort will enhance the ability of very large networks to obtain complete data sets for processing.
Diagnostic report acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Brooks, Everett G.; Rothman, Melvyn L.
1991-07-01
The Mayo Clinic and IBM Rochester have jointly developed a picture archive and control system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. One of the challenges of developing a useful PACS is integrating the diagnostic reports with the electronic images so they can be displayed simultaneously. By the time a diagnostic report is generated for a particular case, its images have already been captured and archived by the PACS. To integrate the report with the images, the authors developed an IBM Personal System/2 (PS/2) based diagnostic report acquisition unit (RAU). A typed copy of the report is transmitted via facsimile to the RAU, where it is stacked electronically with other reports that have been sent previously but not yet processed. By processing these reports at the RAU, the information they contain is integrated with the image database, and a copy of the report is archived electronically on an IBM Application System/400 (AS/400). When a user requests a set of images for viewing, the report is automatically integrated with the image data; using a hot key, the user can toggle the report on or off on the display screen. This report describes the process, hardware, and software employed to integrate the diagnostic report information into the PACS, including how the report images are captured, transmitted, and entered into the AS/400 database. Also described is how the archived reports and their associated medical images are located and merged for retrieval and display. The methods used to detect and process error conditions are also discussed.
Towards integrated modelling: full image simulations for WEAVE
NASA Astrophysics Data System (ADS)
Dalton, Gavin; Ham, Sun Jeong; Trager, Scott; Abrams, Don Carlos; Bonifacio, Piercarlo; Aguerri, J. A. L.; Middleton, Kevin; Benn, Chris; Rogers, Kevin; Stuik, Remko; Carrasco, Esperanza; Vallenari, Antonella; Jin, Shoko; Lewis, Jim
2016-08-01
We present an integrated end-to-end simulation of the spectral images that will be obtained by the WEAVE spectrograph, which aims to include full modelling of all effects from the top of the atmosphere to the detector. These data are based on input spectra from a combination of library spectra and synthetic models, and will be used to provide inputs for an end-to-end test of the full WEAVE data pipeline and archive systems, prior to first light of the instrument.
NASA Astrophysics Data System (ADS)
Poser, Kathrin; Peters, Steef; Hommersom, Annelies; Giardino, Claudia; Bresciani, Mariano; Cazzaniga, Ilaria; Schenk, Karin; Heege, Thomas; Philipson, Petra; Ruescas, Ana; Bottcher, Martin; Stelzer, Kerstin
2015-12-01
The GLaSS project develops a prototype infrastructure to ingest and process large amounts of Sentinel-2 and Sentinel-3 data for lakes and reservoirs. To demonstrate the value of satellite observations for the management of aquatic ecosystems, global case studies are performed addressing different types of lakes with their respective problems and management questions. One of these case studies concentrates on deep, clear lakes worldwide. Its aim is to evaluate trends in chlorophyll-a concentration (Chl-a), as a proxy for trophic status, based on the MERIS full-resolution data archive. Some preliminary results of this case study are presented here.
Scalable Data Mining and Archiving for the Square Kilometre Array
NASA Astrophysics Data System (ADS)
Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.
2011-12-01
As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks, like cataloging both data and contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research, present significant software engineering challenges when considered at the scale of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address these data archiving and distribution challenges through a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO) and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata, including temporal bounds and job-relevant processing information. A remote content acquisition (push/pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as continuum imaging and spectral-line cube generation, executes the algorithm, and ingests its output along with relevant extracted metadata. In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7 and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open-source software, represent a concerted effort to find a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, while ensuring that the data, with its richly annotated catalog of metadata, remains a viable resource for research into the future.
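A hedged sketch of one crawler step like the one described, identifying a file's MIME type and pulling out simple job-relevant metadata before ingestion, is given below; the metadata fields and directory layout are illustrative assumptions, not Apache OODT's API.

```python
# Illustrative crawler step: guess a file's MIME type and collect simple
# metadata before ingestion. Field names and paths are assumptions.
import mimetypes
import os
from datetime import datetime, timezone

def extract_metadata(path):
    mime, _ = mimetypes.guess_type(path)
    stat = os.stat(path)
    return {
        "file": os.path.basename(path),
        "mime_type": mime or "application/octet-stream",
        "size_bytes": stat.st_size,
        # temporal-bound proxy: file modification time in UTC
        "modified": datetime.fromtimestamp(stat.st_mtime,
                                           tz=timezone.utc).isoformat(),
    }

for name in os.listdir("raw_output"):   # e.g., an instrument output directory
    print(extract_metadata(os.path.join("raw_output", name)))
```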
Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive
NASA Astrophysics Data System (ADS)
Baker, Scott; Meertens, Charles; Crosby, Christopher
2017-04-01
UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The Seamless Synthetic Aperture Radar Archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observing System (EOS) science data products and provides data structures for radar-geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structured data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE, as well as examples for reading and converting the data products into other formats. Digital object identifiers (DOIs) have been incorporated into the InSAR archive, allowing users to assign a permanent location to their processed results and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query, providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data, which is particularly important for geohazards and event response.
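A minimal h5py sketch of the pattern described, a geocoded grid product stored in HDF5 with internal compression and a DOI recorded as a file attribute, follows; the group and dataset names are illustrative, not the archive's actual layout.

```python
# Minimal sketch: write a geocoded ("Grid") product into HDF5 with internal
# gzip compression and record a DOI as a file-level attribute.
# Group/dataset names and the DOI string are illustrative placeholders.
import h5py
import numpy as np

unwrapped = np.zeros((1200, 1200), dtype=np.float32)  # placeholder phase grid

with h5py.File("insar_product.h5", "w") as f:
    grid = f.create_group("HDFEOS/GRIDS/interferogram")   # assumed layout
    dset = grid.create_dataset("unwrapped_phase", data=unwrapped,
                               compression="gzip", compression_opts=4)
    dset.attrs["units"] = "radians"
    # permanent identifier recorded when a DOI is minted for the product
    f.attrs["digital_object_identifier"] = "10.XXXX/example-doi"
```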
WFIRST Science Operations at STScI
NASA Astrophysics Data System (ADS)
Gilbert, Karoline; STScI WFIRST Team
2018-06-01
With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST Mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.
Remote-Sensing Data Distribution and Processing in the Cloud at the ASF DAAC
NASA Astrophysics Data System (ADS)
Stoner, C.; Arko, S. A.; Nicoll, J. B.; Labelle-Hamer, A. L.
2016-12-01
The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC) has been tasked to archive and distribute data from both SENTINEL-1 satellites and from the NASA-ISRO Synthetic Aperture Radar (NISAR) satellite in a cost-effective manner. In order to best support processing and distribution of these large data sets for users, the ASF DAAC enhanced its data system in a number of ways that will be detailed in this presentation. The SENTINEL-1 mission comprises a constellation of two polar-orbiting satellites operating day and night and performing C-band Synthetic Aperture Radar (SAR) imaging, enabling them to acquire imagery regardless of the weather. SENTINEL-1A was launched by the European Space Agency (ESA) in April 2014; SENTINEL-1B is scheduled to launch in April 2016. The NISAR satellite is designed to observe and take measurements of some of the planet's most complex processes, including ecosystem disturbances, ice-sheet collapse, and natural hazards such as earthquakes, tsunamis, volcanoes, and landslides. NISAR will employ radar imaging, polarimetry, and interferometry techniques using the SweepSAR technology for full-resolution wide-swath imaging. NISAR data files are large, making storage and processing a challenge for conventional store-and-download systems. To effectively process, store, and distribute petabytes of data in a high-performance computing environment, ASF took a long view with regard to technology choices and picked a path of maximum flexibility and software re-use. To that end, this software tools and services presentation will cover Web Object Storage (WOS) and the ability to move seamlessly from local sunk-cost hardware to a public cloud such as Amazon Web Services (AWS). A prototype SENTINEL-1A system in AWS, as well as a local hardware solution, will be examined to explain the pros and cons of each. In preparation for NISAR files, which will be even larger than SENTINEL-1A's, ASF has embarked on a number of cloud initiatives, including processing in the cloud at scale, processing data on-demand, and processing end-user computations on DAAC data in the cloud.
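A minimal sketch of the local-to-cloud staging step described above, using boto3; the granule name and bucket are hypothetical, and a real pipeline would add checksums and error handling.

```python
# Hedged sketch: staging a granule from local storage to an S3 bucket, the
# kind of local-to-AWS move the presentation describes.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="S1A_IW_GRDH_20160101.zip",   # hypothetical SENTINEL-1A granule
    Bucket="asf-daac-prototype",           # hypothetical bucket name
    Key="sentinel-1a/S1A_IW_GRDH_20160101.zip",
)
# boto3 switches to multipart upload automatically for large files, which
# matters for NISAR-scale granules.
```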
NASA Technical Reports Server (NTRS)
Carroll, Mary Anne; Emmons, Louisa
1995-01-01
The compilation and archiving of NO(x) and NO(y) measurements began in mid-March 1994. Since the submission of the first report, data summaries have been obtained for the TROPOZ 2, STRATOZ 3, OCTA and TOR/Schauinsland campaigns, and the full data sets will become a part of this archive in the near future. Climatologies of NO(x) and NO(y) have been developed from these and previously archived data sets, including the available GTE campaigns (ABLE-2A, B, -3A, B, CITE-2, -3, TRACE-A, PEM WEST-A) and AASE 1 and 2. The data have been grouped by season and altitude (boundary layer and 3 km ranges in the free troposphere). Maps showing median values of midday NO, NO(x) and NO(y) have been produced for each season for the boundary layer and 3 km ranges of the free troposphere. The statistics of the data (median, mean, and standard deviation, central 67% and 90%) have also been determined, and are shown in representative figures included in this report.
Using Firefly Tools to Enhance Archive Web Pages
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
2013-10-01
Astronomy web developers are looking for fast and powerful HTML5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table displays, and spectrum plots on their web pages with a minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created at IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects, including Spitzer, Planck, WISE, PTF, LSST, and others.
The EXOSAT database and archive
NASA Technical Reports Server (NTRS)
Reynolds, A. P.; Parmar, A. N.
1992-01-01
The EXOSAT database provides on-line access to the results and data products (spectra, images, and lightcurves) from the EXOSAT mission, as well as access to data and logs from a number of other missions (such as EINSTEIN, COS-B, ROSAT, and IRAS). In addition, a number of familiar optical, infrared, and X-ray catalogs, including the Hubble Space Telescope (HST) guide star catalog, are available. The complete database is located at the EXOSAT observatory at ESTEC in the Netherlands and is accessible remotely via a captive account. The database management system was specifically developed to access the database efficiently and to allow the user to perform statistical studies on large samples of astronomical objects, as well as to retrieve scientific and bibliographic information on single sources. The system was designed to be mission independent and includes timing, image processing, and spectral analysis packages, as well as software to allow the easy transfer of analysis results and products to the user's own institute. The archive at ESTEC comprises a subset of the EXOSAT observations, stored on magnetic tape. Observations of particular interest were copied in compressed format to an optical jukebox, allowing users to retrieve and analyze selected raw data entirely from their terminals. Such analysis may be necessary if the user's needs are not accommodated by the products contained in the database (in terms of time resolution, spectral range, and the finesse of the background subtraction, for instance). Long-term archiving of the full final observation data is taking place at ESRIN in Italy as part of the ESIS program, again using optical media, and ESRIN has now assumed responsibility for distributing the data to the community. Tests showed that raw observational data (typically several tens of megabytes for a single target) can be transferred via the existing networks in reasonable time.
Integration of EGA secure data access into Galaxy.
Hoogstrate, Youri; Zhang, Chao; Senf, Alexander; Bijlard, Jochem; Hiltemann, Saskia; van Enckevort, David; Repo, Susanna; Heringa, Jaap; Jenster, Guido; J A Fijneman, Remond; Boiten, Jan-Willem; A Meijer, Gerrit; Stubbs, Andrew; Rambla, Jordi; Spalding, Dylan; Abeln, Sanne
2016-01-01
High-throughput molecular profiling techniques are routinely generating vast amounts of data for translational medicine studies. Secure access-controlled systems are needed to manage, store, transfer, and distribute these data due to their personally identifiable nature. The European Genome-phenome Archive (EGA) was created to facilitate access to, and management of, long-term archives of bio-molecular data. Each data provider is responsible for ensuring a Data Access Committee is in place to grant access to data stored in the EGA. Moreover, the transfer of data during upload and download is encrypted. ELIXIR, a European research infrastructure for life-science data, initiated a project (2016 Human Data Implementation Study) to understand and document the ELIXIR requirements for secure management of controlled-access data. As part of this project, a full ecosystem was designed to connect archived raw experimental molecular profiling data with interpreted data and the computational workflows, using the CTMM Translational Research IT (CTMM-TraIT) infrastructure http://www.ctmm-trait.nl as an example. Here we present the first outcomes of this project, a framework to enable the download of EGA data to a Galaxy server in a secure way. Galaxy provides an intuitive user interface for molecular biologists and bioinformaticians to run and design data analysis workflows. More specifically, we developed a tool, ega_download_streamer, that can download data securely from EGA into a Galaxy server, where it can subsequently be further processed. This tool allows a user to run, within the browser, an entire analysis containing sensitive data from EGA, and to make this analysis available to other researchers in a reproducible manner, as shown with a proof-of-concept study. The tool ega_download_streamer is available in the Galaxy tool shed: https://toolshed.g2.bx.psu.edu/view/yhoogstrate/ega_download_streamer.
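A rough sketch of the streaming-download idea behind such a tool appears below; the endpoint, accession, token, and destination path are all hypothetical, and the real ega_download_streamer additionally handles EGA's authentication and decryption.

```python
# Hedged sketch: stream a controlled-access file over HTTPS into a
# Galaxy-side path in chunks, without holding it in memory.
import requests

def stream_to_galaxy(file_id, token, dest_path):
    url = f"https://ega.example.org/files/{file_id}"      # assumed endpoint
    headers = {"Authorization": f"Bearer {token}"}
    with requests.get(url, headers=headers, stream=True) as r:
        r.raise_for_status()
        with open(dest_path, "wb") as out:
            for chunk in r.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
                out.write(chunk)

# Hypothetical accession, token, and Galaxy dataset path:
stream_to_galaxy("EGAF00000000001", "my-session-token", "/galaxy/datasets/raw.bam")
```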
ERIC Educational Resources Information Center
Roff, Sandra
2007-01-01
Treasures await students and researchers on the shelves of libraries and archives across the country, but unfortunately they often remain unknown to the "modern" researcher who limits his/her research to using the Internet. The process of physically going to the library stacks and browsing the shelves in a subject area is on the decline…
Supplementing the Digitized Sky Survey for UV-Mission Planning
NASA Technical Reports Server (NTRS)
McLean, Brian
2004-01-01
The Space Telescope Science Institute worked on a project to augment the Digitized Sky Survey archive by completing the scanning and processing of the POSS-I blue survey. This provides an additional valuable resource to support UV-mission planning. All of the data will be made available through the NASA optical/UV archive (MAST) at STScI. The activities completed during this project are summarized.
SAM 2 and SAGE data management and processing
NASA Technical Reports Server (NTRS)
Osborn, M. T.; Trepte, C. R.
1987-01-01
The data management and processing supplied by ST Systems Corporation (STX) for the Stratospheric Aerosol Measurement 2 (SAM 2) and Stratospheric Aerosol and Gas Experiment (SAGE) experiments for the years 1983 to 1986 are described. Included are discussions of data validation, documentation, and scientific analysis, as well as the archival schedule met by the operational reduction of SAM 2 and SAGE data. Work under this contract resulted in the archiving of the first seven years of SAM 2 data and all three years of SAGE data. A list of publications and presentations supported was also included.
Astro-H Data Analysis, Processing and Archive
NASA Technical Reports Server (NTRS)
Angelini, Lorella; Terada, Yukikatsu; Loewenstein, Michael; Miller, Eric D.; Yamaguchi, Hiroya; Yaqoob, Tahir; Krimm, Hans; Harrus, Ilana; Takahashi, Hiromitsu; Nobukawa, Masayoshi;
2016-01-01
Astro-H (Hitomi) is an X-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. The payload consists of four different instruments (SXS, SXI, HXI, and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive and user support. These activities have been a collaborative effort shared between scientists and software engineers working at several institutes in Japan and the USA.
The TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd;
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in late 2017. TESS will discover approximately 1,000 small planets and measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NAS Pleiades supercomputer. The SPOC will search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes.
The PDS-based Data Processing, Archiving and Management Procedures in Chang'e Mission
NASA Astrophysics Data System (ADS)
Zhang, Z. B.; Li, C.; Zhang, H.; Zhang, P.; Chen, W.
2017-12-01
PDS is adopted as the standard format for scientific data and is the foundation of all data-related procedures in the Chang'e mission. Unlike the geographically distributed nature of the Planetary Data System, all procedures of data processing, archiving, management, and distribution are carried out at the headquarters of the Ground Research and Application System of the Chang'e mission in a centralized manner. The raw data acquired by the ground stations are transmitted to and processed by the data preprocessing subsystem (DPS) for the production of PDS-compliant Level 0 to Level 2 data products using established algorithms, with each product file being well described using an attached label; all products with the same orbit number are then put together into a scheduled task for archiving, along with an XML archive list file recording all product files' properties such as file name, file size, etc. After receiving the archive request from the DPS, the data management subsystem (DMS) is invoked to parse the XML list file, to validate all the claimed files and their compliance with PDS using a prebuilt data dictionary, and then to extract the metadata of each data product file from its PDS label and the fields of its normalized filename. Various requirements of data management, retrieval, distribution, and application can be well met using flexible combinations of the rich metadata empowered by the PDS. In the forthcoming CE-5 mission, the design of all data structures and procedures will be updated from PDS version 3, used in the previous CE-1, CE-2, and CE-3 missions, to the new version 4. The main changes will be: 1) a dedicated detached XML label will be used to describe the corresponding scientific data acquired by the 4 instruments carried; the XML parsing framework used in archive list validation will be reused for the label after some necessary adjustments; 2) all the image data acquired by the panorama camera, landing camera, and lunar mineralogical spectrometer will use an Array_2D_Image/Array_3D_Image object to store image data and a Table_Character object to store the image frame header; the tabulated data acquired by the lunar regolith penetrating radar will use a Table_Binary object to store measurements.
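Because PDS4 labels are plain XML, the label-parsing step described above can be sketched with the Python standard library. The label file name here is hypothetical; the namespace is the standard PDS4 one, and the exact element paths in a real CE-5 label should be checked against its schema.

```python
# Hedged sketch: extracting identification metadata from a detached PDS4
# XML label, analogous to the parsing step the DMS performs.
import xml.etree.ElementTree as ET

NS = {"pds": "http://pds.nasa.gov/pds4/pds/v1"}   # standard PDS4 namespace

tree = ET.parse("ce5_pcam_image.xml")             # hypothetical CE-5 label
root = tree.getroot()

lid = root.find(".//pds:logical_identifier", NS).text
files = [f.text for f in root.findall(".//pds:file_name", NS)]
print("product:", lid)
print("data files:", files)
```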
HRSCview: a web-based data exploration system for the Mars Express HRSC instrument
NASA Astrophysics Data System (ADS)
Michael, G.; Walter, S.; Neukum, G.
2007-08-01
The High Resolution Stereo Camera (HRSC) on the ESA Mars Express spacecraft has been orbiting Mars since January 2004. By spring 2007 it had returned around 2 terabytes of image data, covering around 35% of the Martian surface in stereo and colour at a resolution of 10-20 m/pixel. HRSCview provides a rapid means to explore these images up to their full resolution, with the data-subsetting, sub-sampling, stretching and compositing being carried out on-the-fly by the image server. It is a joint website of the Free University of Berlin and the German Aerospace Center (DLR). The system operates by on-the-fly processing of the six HRSC level-4 image products: the map-projected ortho-rectified nadir panchromatic and four colour channels, and the stereo-derived DTM (digital terrain model). The user generates a request via the web page for an image with several parameters: the centre of the view in surface coordinates, the image resolution in metres/pixel, the image dimensions, and one of several colour modes. If there is HRSC coverage at the given location, the necessary segments are extracted from the full orbit images, resampled to the required resolution, and composited according to the user's choice. In all modes the nadir channel, which has the highest resolution, is included in the composite so that the maximum detail is always retained. The images are stretched according to the current view: this applies to the elevation colour scale, as well as the nadir brightness and the colour channels. There are modes for raw colour, stretched colour, enhanced colour (exaggerated colour differences), and a synthetic 'Mars-like' colour stretch. A colour ratio mode is given as an alternative way to examine colour differences (R=IR/R, G=R/G and B=G/B). The final image is packaged as a JPEG file and returned to the user over the web. Each request requires approximately 1 second to process. A link is provided from each view to a data product page, where header items describing the full map-projected science data product are displayed, and a direct link to the archived data products on the ESA Planetary Science Archive (PSA) is provided. At present the majority of the elevation composites are derived from the HRSC Preliminary 200m DTMs generated at the German Aerospace Center (DLR), which will not be available as separately downloadable data products. These DTMs are being progressively superseded by systematically generated higher resolution archival DTMs, also from DLR, which will become available for download through the PSA, and be similarly accessible via HRSCview. At the time of writing this abstract (May 2007), four such high resolution DTMs are available for download via the HRSCview data product pages (for images from orbits 0572, 0905, 1004, and 2039).
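The colour-ratio mode lends itself to a compact sketch. Assuming co-registered numpy arrays for the infrared, red, green, and blue channels (the HRSCview server code itself is not described beyond the abstract), the composite might be computed like this:

```python
# Hedged sketch of the colour-ratio mode (R=IR/R, G=R/G, B=G/B) with a
# percentile stretch standing in for the view-dependent stretch.
import numpy as np

def colour_ratio(ir, r, g, b, eps=1e-6):
    """Composite three band ratios into an 8-bit RGB image."""
    ratios = np.stack([ir / (r + eps), r / (g + eps), g / (b + eps)], axis=-1)
    lo = np.percentile(ratios, 2, axis=(0, 1))    # per-band stretch limits
    hi = np.percentile(ratios, 98, axis=(0, 1))
    scaled = np.clip((ratios - lo) / (hi - lo), 0, 1)
    return (scaled * 255).astype(np.uint8)
```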
Satellite Imagery Production and Processing Using Apache Hadoop
NASA Astrophysics Data System (ADS)
Hill, D. V.; Werpy, J.
2011-12-01
The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework that can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity for anyone doing large-scale image processing today. The session will cover a description of the problem space, an evaluation of alternatives, a feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations, and finally challenges and ongoing activities. It will also describe how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
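Hadoop Streaming lets per-scene work of this kind be written as a simple stdin/stdout filter. A minimal mapper sketch follows, where process_scene is a hypothetical stand-in for the real ECV production code; such a script would be launched with the standard hadoop-streaming jar, with scene identifiers as input records.

```python
#!/usr/bin/env python
# Hedged sketch of a Hadoop Streaming mapper: each input line names a
# Landsat scene to process; output is a tab-separated key/value record.
import sys

def process_scene(scene_id):
    """Placeholder for per-scene ECV production (calibration, masking, ...)."""
    return f"{scene_id}\tprocessed"

for line in sys.stdin:
    scene_id = line.strip()
    if scene_id:
        print(process_scene(scene_id))
```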
One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving
NASA Astrophysics Data System (ADS)
Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge
1987-10-01
A method for converting paper-written electrocardiograms to one-dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the background noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing; and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.
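Steps (2) and (3) can be sketched compactly: binarize the digitized chart and trace the waveform as one y-value per column. This sketch assumes the grid has already been removed, which the paper's preprocessing handles; the threshold is an illustrative default.

```python
# Hedged sketch: binarize a chart image and extract a 1-D trace as the
# column-wise centroid of ink pixels.
import numpy as np

def trace_ecg(image, threshold=128):
    """image: 2-D uint8 array (dark ink on light paper) -> 1-D signal in pixel rows."""
    binary = image < threshold                      # True where there is ink
    rows = np.arange(image.shape[0])[:, None]
    counts = binary.sum(axis=0).astype(float)
    counts[counts == 0] = np.nan                    # columns with no trace
    return (rows * binary).sum(axis=0) / counts     # NaN marks broken trace
```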
Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.
2016-03-29
The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images, processed and gained using SU software, and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
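Reading the 3,200-byte card-image header from one of these files takes only a few lines, since this archive stores it as ASCII rather than EBCDIC; the file name below is illustrative.

```python
# Hedged sketch: print the SEG Y rev. 0 textual header as 40 card images
# of 80 characters each.
with open("line01.sgy", "rb") as f:           # hypothetical SEG Y file
    header = f.read(3200).decode("ascii", errors="replace")

for i in range(0, 3200, 80):
    print(header[i:i + 80].rstrip())
```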
The LCOGT Science Archive and Data Pipeline
NASA Astrophysics Data System (ADS)
Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.
2013-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas, and at CTIO, Chile, with more to come at SAAO, South Africa, and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline, and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC), and show some of the new features the Science Archive provides.
NASA Astrophysics Data System (ADS)
Kiebuzinski, A. B.; Bories, C. M.; Kalluri, S.
2002-12-01
As part of its Earth Observing System (EOS), NASA supports operations for several satellites including Landsat 7, Terra, and Aqua. ECS (EOSDIS Core System) is a vast archival and distribution system and includes several Distributed Active Archive Centers (DAACs) located around the United States. EOSDIS reached a milestone in February when its data holdings exceeded one petabyte (1,000 terabytes) in size. It has been operational since 1999 and was originally intended to serve a large community of Earth science researchers studying global climate change. The Synergy Program was initiated in 2000 with the purpose of exploring and expanding the use of remote sensing data beyond the traditional research community to the applications community, including natural resource managers, disaster/emergency managers, urban planners, and others. This included facilitating data access at the DAACs to enable non-researchers to exploit the data for their specific applications. The combined volume of data archived daily across the DAACs is of the order of three terabytes. These archived data are made available to the research community and to general users of ECS data. Currently, the average data volume distributed daily is two terabytes, which, combined with an ever-increasing need for timely access to these data, taxes ECS processing and archival resources with more real-time use than the system was originally intended to support. As a result, the delivery of data sets to users was in many cases being delayed to unacceptable limits. Raytheon, under the auspices of the Synergy Program, investigated methods of making data more accessible at a lower cost in resources (processing and archival) at the DAACs. Large on-line caches (as big as 70 terabytes) of data were determined to be a solution that would allow users who require contemporary data to access them without having to pull them from the archive. These on-line caches are referred to as "Data Pools." In the Data Pool concept, data is inserted via subscriptions based on ECS events, for example, the arrival of data matching a specific spatial context. Upon acquisition, these data are written to the Data Pools as well as to the permanent archive. The data is then accessed via a public Web interface, which provides a drill-down search using data group, spatial, temporal, and other flags. The result set is displayed as a list of FTP links to the data, which the user can click to download directly. Data Pool holdings are continuously renewed as data is allowed to expire and is replaced by more current insertions. In addition, the Data Pool may also house data sets that, though not contemporary, receive significant user attention, e.g., a Chernobyl-type incident, a flood, or a forest fire. The benefits are that users who require contemporary data can access the data immediately (within 24 hours of acquisition) through a much improved access technique. Users not requiring contemporary data benefit from the Data Pools by having greater archival and processing resources (and a shorter processing queue) made available to them. All users now benefit from the capability to have standing data orders for data matching a geographic context (spatial subscription), a capability also developed under the Synergy program. The Data Pools are currently being installed and checked at each of the DAACs.
Additionally, several improvements to the search capabilities, data manipulation tools and overall storage capacity are being developed and will be installed in the First Quarter of 2003.
Status of the Landsat thematic mapper and multispectral scanner archive conversion system
Werner, Darla J.
1993-01-01
The U.S. Geological Survey's EROS Data Center (EDC) manages the National Satellite Land Remote Sensing Data Archive. This archive includes Landsat thematic mapper (TM) and multispectral scanner (MSS) data acquired since 1972. The Landsat archive is an important resource for global change research. To ensure long-term availability of Landsat data from the archive, the EDC specified requirements for a Thematic Mapper and Multispectral Scanner Archive Conversion System (TMACS) that would preserve the data by transcribing it to a more durable medium. The media conversion hardware and software were installed at EDC in July 1992. In December 1992, the EDC began converting Landsat MSS data from high-density, open reel instrumentation tapes to digital cassette tapes. Almost 320,000 MSS images acquired since 1979 and more than 200,000 TM images acquired since 1982 will be converted to the new medium during the next 3 years. During the media conversion process, several high-density tapes have exhibited severe binder degradation. Even though these tapes have been stored in environmentally controlled conditions, hydrolysis has occurred, resulting in "sticky oxide shed". Using a thermostatically controlled oven built at EDC, tape "baking" has been 100 percent successful and actually improves the quality of some images.
Improving Access to NASA Earth Science Data through Collaborative Metadata Curation
NASA Astrophysics Data System (ADS)
Sisco, A. W.; Bugbee, K.; Shum, D.; Baynes, K.; Dixon, V.; Ramachandran, R.
2017-12-01
The NASA-developed Common Metadata Repository (CMR) is a high-performance metadata system that currently catalogs over 375 million Earth science metadata records. It serves as the authoritative metadata management system of NASA's Earth Observing System Data and Information System (EOSDIS), enabling NASA Earth science data to be discovered and accessed by a worldwide user community. The size of the EOSDIS data archive is steadily increasing, and the ability to manage and query this archive depends on the input of high quality metadata to the CMR. Metadata that does not provide adequate descriptive information diminishes the CMR's ability to effectively find and serve data to users. To address this issue, an innovative and collaborative review process is underway to systematically improve the completeness, consistency, and accuracy of metadata for approximately 7,000 data sets archived by NASA's twelve EOSDIS data centers, or Distributed Active Archive Centers (DAACs). The process involves automated and manual metadata assessment of both collection and granule records by a team of Earth science data specialists at NASA Marshall Space Flight Center. The team communicates results to DAAC personnel, who then make revisions and reingest improved metadata into the CMR. Implementation of this process relies on a network of interdisciplinary collaborators leveraging a variety of communication platforms and long-range planning strategies. Curating metadata at this scale and resolving metadata issues through community consensus improves the CMR's ability to serve current and future users and also introduces best practices for stewarding the next generation of Earth Observing System data. This presentation will detail the metadata curation process, its outcomes thus far, and also share the status of ongoing curation activities.
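The CMR's public search API makes the automated-assessment side of this workflow easy to prototype. A sketch using its JSON collection search follows; treat the exact parameters and response fields as assumptions to be checked against current CMR documentation.

```python
# Hedged sketch: query the public CMR search API and flag collection
# records with missing summaries as curation candidates.
import requests

resp = requests.get(
    "https://cmr.earthdata.nasa.gov/search/collections.json",
    params={"keyword": "sea surface temperature", "page_size": 5},
    timeout=30,
)
resp.raise_for_status()
for entry in resp.json()["feed"]["entry"]:
    print(entry["id"], "summary present:", bool(entry.get("summary")))
```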
Nonsurgical Strategies in Patients With NET Liver Metastases: A Protocol of Four Systematic Reviews.
Limani, Perparim; Tschuor, Christoph; Gort, Laura; Balmer, Bettina; Gu, Alexander; Ceresa, Christos; Raptis, Dimitri Aristotle; Lesurtel, Mickael; Puhan, Milo; Breitenstein, Stefan
2014-03-07
Patients diagnosed with neuroendocrine tumors (NETs) with hepatic metastases generally have a worse prognosis compared with patients with nonmetastasized NETs. Due to tumor location and distant metastases, a surgical approach is often not possible and nonsurgical therapeutic strategies may apply. The aim of these systematic reviews is to evaluate the role of nonsurgical therapy options for patients with nonresectable liver metastases of NETs. An objective group of librarians will provide an electronic search strategy to examine the MEDLINE, EMBASE, and The Cochrane Library (Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Cochrane Central Register of Controlled Trials [CENTRAL]) databases. There will be no restriction concerning language and publication date. The qualitative and quantitative synthesis of the systematic review will be conducted with randomized controlled trials (RCTs), prospective and retrospective comparative cohort studies, and case-control studies. Case series will be collected in a separate database and used only for descriptive purposes. This study is ongoing and presents a protocol of four systematic reviews to assess the role of nonsurgical treatment options in patients with neuroendocrine liver metastases. These systematic reviews, performed according to this protocol, will assess the value of noninvasive therapy options for patients with nonresectable liver metastases of NETs in combination with invasive techniques, such as percutaneous liver-directed techniques and local ablation techniques. International Prospective Register of Systematic Reviews (PROSPERO): CRD42012002657; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2657 (Archived by WebCite at http://www.webcitation.org/6NDlYi37O); CRD42012002658; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2658 (Archived by WebCite at http://www.webcitation.org/6NDlfWSuD); CRD42012002659; www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2659 (Archived by WebCite at http://www.webcitation.org/6NDlmWAFM); and CRD42012002660; http://www.metaxis.com/PROSPERO/full_doc.asp?RecordID=2660 (Archived by WebCite at http://www.webcitation.org/6NDmnylzp).
BOREAS TE-18, 30-m, Radiometrically Rectified Landsat TM Imagery
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Knapp, David
2000-01-01
The BOREAS TE-18 team used a radiometric rectification process to produce standardized DN values for a series of Landsat TM images of the BOREAS SSA and NSA in order to compare images that were collected under different atmospheric conditions. The images for each study area were referenced to an image that had very clear atmospheric qualities. The reference image for the SSA was collected on 02-Sep-1994, while the reference image for the NSA was collected on 21-Jun-1995. The 23 rectified images cover the period from 07-Jul-1985 to 18-Sep-1994 in the SSA and from 22-Jun-1984 to 09-Jun-1994 in the NSA. Each of the reference scenes had coincident atmospheric optical thickness measurements made by RSS-11. The radiometric rectification process is described in more detail by Hall et al. (1991). The original Landsat TM data were received from CCRS for use in the BOREAS project. The data are stored in binary image-format files. Due to the nature of the radiometric rectification process and copyright issues, these full-resolution images may not be publicly distributed. However, a spatially degraded 60-m resolution version of the images is available on the BOREAS CD-ROM series. See Sections 15 and 16 for information about how the full-resolution data may be acquired. Information about the full-resolution images is provided in an inventory listing on the CD-ROMs. The data files are available on a CD-ROM (see document number 20010000884) or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
NASA ARCH- A FILE ARCHIVAL SYSTEM FOR THE DEC VAX
NASA Technical Reports Server (NTRS)
Scott, P. J.
1994-01-01
The function of the NASA ARCH system is to provide a permanent storage area for files that are infrequently accessed. The NASA ARCH routines were designed to provide a simple mechanism by which users can easily store and retrieve files. The user treats NASA ARCH as the interface to a black box where files are stored. There are only five NASA ARCH user commands, even though NASA ARCH employs standard VMS directives and the VAX BACKUP utility. Special care is taken to provide the security needed to ensure file integrity over a period of years. The archived files may exist in any of three storage areas: a temporary buffer, the main buffer, and a magnetic tape library. When the main buffer fills up, it is transferred to permanent magnetic tape storage and deleted from disk. Files may be restored from any of the three storage areas. A single file, multiple files, or entire directories can be stored and retrieved. Archived entities retain the same name, extension, version number, and VMS file protection scheme as they had in the user's account prior to archival. NASA ARCH is capable of handling up to 7 directory levels. Wildcards are supported. User commands include TEMPCOPY, DISKCOPY, DELETE, RESTORE, and DIRECTORY. The DIRECTORY command searches a directory of savesets covering all three archival areas, listing matches according to area, date, filename, or other criteria supplied by the user. The system manager commands include 1) ARCHIVE, to transfer the main buffer to duplicate magnetic tapes; 2) REPORT, to determine when the main buffer is full enough to archive; 3) INCREMENT, to back up the partially filled main buffer; and 4) FULLBACKUP, to back up the entire main buffer. On-line help files are provided for all NASA ARCH commands. NASA ARCH is written in DEC VAX DCL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.X. This program was developed in 1985.
NASA Astrophysics Data System (ADS)
Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.
2011-12-01
Currently, the SCEDC archives continuous and triggered data from nearly 8,400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: the SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content; they also improve our ability to manage and update the site. New data holdings: post-processing on the El Mayor Cucapah 7.2 sequence continues; to date, 11,847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP SAC output now includes picks from the SCSN. New archival methods: the SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression; time gaps between time series were padded with null values, which substantially increases search efficiency by making the records uniform in length.
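Event-associated waveforms retrieved this way can be read with standard seismology tooling. A sketch using ObsPy follows, with a hypothetical miniSEED file name; the merge call echoes the null-padding strategy mentioned above.

```python
# Hedged sketch: read event-associated miniSEED waveforms of the kind STP
# delivers. obspy.read autodetects the miniSEED format.
from obspy import read

st = read("CI.PASC.--.BHZ.ms")     # hypothetical SCEDC waveform file
st.merge(fill_value=0)             # pad gaps so records are contiguous
print(st)
st.plot()                          # quick look at the traces (needs matplotlib)
```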
Landsat-4 and Landsat-5 thematic mapper band 6 historical performance and calibration
Barsi, J.A.; Chander, G.; Markham, B.L.; Higgs, N.; ,
2005-01-01
Launched in 1982 and 1984, respectively, the Landsat-4 and -5 Thematic Mappers (TM) are the backbone of an extensive archive of moderate-resolution Earth imagery. However, these sensors and their data products were not subjected to the type of intensive monitoring that has been part of the Landsat-7 system since its launch in 1999. With Landsat-4's 11-year and Landsat-5's 20+ year data records, there is a need to understand the historical behavior of the instruments in order to verify the scientific integrity of the archive and processed products. Performance indicators for the Landsat-4 and -5 thermal bands have recently been extracted from a processing system database, allowing for a more complete study of thermal band characteristics and calibration than was previously possible. The database records responses to the internal calibration system, instrument temperatures, and applied gains and offsets for each band for every scene processed through the National Landsat Archive Production System (NLAPS). Analysis of this database has allowed for greater understanding of the calibration and improvement in the processing system. This paper will cover the trends in the Landsat-4 and -5 thermal bands, the effect of the changes seen in the trends, and how these trends affect the use of the thermal data.
Recognition of Time Stamps on Full-Disk Hα Images Using Machine Learning Methods
NASA Astrophysics Data System (ADS)
Xu, Y.; Huang, N.; Jing, J.; Liu, C.; Wang, H.; Fu, G.
2016-12-01
Observation and understanding of the physics of the 11-year solar activity cycle and 22-year magnetic cycle are among the most important research topics in solar physics. The solar cycle is responsible for magnetic field and particle fluctuations in the near-Earth environment that have been found increasingly important in affecting human life in the modern era. A systematic study of large-scale solar activities, as made possible by our rich data archive, will further help us to understand the global-scale magnetic fields that are closely related to solar cycles. The long-time-span data archive includes both full-disk and high-resolution Hα images. Prior to the widespread use of CCD cameras in the 1990s, 35-mm films were the major medium for storing images. The research group at NJIT recently finished the digitization of film data obtained by the National Solar Observatory (NSO) and Big Bear Solar Observatory (BBSO) covering the period 1953 to 2000. The total volume of data exceeds 60 TB. To make this huge database scientifically valuable, some processing and calibration are required. One of the most important steps is to read the time stamps on all of the 14 million images, which is almost impossible to do manually. We implemented three different methods to recognize the time stamps automatically: Optical Character Recognition (OCR), classification trees, and TensorFlow. The latter two are machine learning approaches that are now widely used in pattern recognition. We will present some sample images and the results of clock recognition from all three methods.
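A toy version of the TensorFlow approach is sketched below, reduced to a small Keras classifier trained on MNIST as a stand-in for cropped clock digits; the real pipeline must first locate and segment each stamp on the film frame, which this sketch omits.

```python
# Hedged sketch: a minimal digit classifier, illustrating (not reproducing)
# the TensorFlow method used for time-stamp recognition.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # digits 0-9
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, verbose=2)
print("test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```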
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.
2016-12-01
The matched filtering technique, involving the cross-correlation of a waveform of interest with archived signals from a template library, has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced-dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished using randomized k-dimensional trees. We used the approach to search for matches to each of 2,700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P, and 15% regional phases (Pn, Pg, Sn, and Lg). Analyses performed on a standard desktop computer show that the proposed approach searches the large template library about 20 times faster than a standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches, and the correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P, and 30% were regional phases.
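The core ANN idea can be sketched on synthetic data: embed each waveform by its correlation against a random subset of templates, then search the reduced space with a k-d tree. SciPy's single cKDTree stands in here for the randomized k-d trees used in the study, and the data are toy arrays.

```python
# Hedged sketch: correlation-based embedding plus k-d tree candidate search.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
templates = rng.standard_normal((5000, 400))     # toy library: 5000 waveforms
basis = templates[rng.choice(len(templates), 32, replace=False)]  # random subset

def embed(w, basis):
    """Normalized correlation of w against each basis waveform -> coordinates."""
    w = (w - w.mean()) / w.std()
    b = (basis - basis.mean(axis=1, keepdims=True)) / basis.std(axis=1, keepdims=True)
    return b @ w / w.size

coords = np.array([embed(t, basis) for t in templates])
tree = cKDTree(coords)

query = templates[123] + 0.1 * rng.standard_normal(400)   # noisy repeat of #123
dist, idx = tree.query(embed(query, basis), k=5)
print("candidate template indices:", idx)
```

Candidates returned by the tree would then be verified with full cross-correlation, confining the expensive step to a handful of templates per query.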
Rendering an archive in three dimensions
NASA Astrophysics Data System (ADS)
Leiman, David A.; Twose, Claire; Lee, Teresa Y. H.; Fletcher, Alex; Yoo, Terry S.
2003-05-01
We examine the requirements for a publicly accessible, online collection of three-dimensional biomedical image data, including those yielded by radiological processes such as MRI, ultrasound, and others. As a repository and distribution mechanism for such medical data, we created the National Online Volumetric Archive (NOVA) as a case study aimed at identifying the multiple issues involved in realizing a large-scale digital archive. In the paper we discuss such factors as current legal and health information privacy policy affecting the collection of human medical images, retrieval and management of information, and technical implementation. This project culminated in the launching of a website that includes downloadable datasets and a prototype data submission system.
NASA Technical Reports Server (NTRS)
Tilmes, Curt
2004-01-01
In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics started the construction of a Science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI), which will launch on the Aura platform in mid-2004. The Ozone Monitoring Instrument (OMI) is a contribution of the Netherlands Agency for Aerospace Programs (NIVR), in collaboration with the Finnish Meteorological Institute (FMI), to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectroradiometer (MODIS) Data Processing System (MODAPS), which has been in full operations since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system, managing faster-than-real-time reprocessings of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher-level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the Earth Observing System (EOS) Data and Information System (EOSDIS) for long-term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and software already developed for MODIS. We made some changes in the hardware system organization, database, and software to adapt the system for OMI. We replaced the fundamental database system, Sybase, with an open source RDBMS called PostgreSQL, and based the entire OMIDAPS on a cluster of Linux-based commodity computers rather than the large SGI servers that MODAPS uses. Rather than relying on a central I/O server host, the new system distributes its data archive among multiple server hosts in the cluster. The OMI team is also customizing the graphical user interfaces and reporting structure to more closely meet the needs of the OMI Science Team. Prior to 2003, simulated OMI data and the science algorithms were not ready for production testing. We initially constructed a prototype system and tested it using a 25-year dataset of Total Ozone Mapping Spectrometer (TOMS) and Solar Backscatter Ultraviolet Instrument (SBUV) data. This prototype system provided a platform to support the adaptation of the algorithms for OMI and provided reprocessing of the historical data, aiding in its analysis. In a recent reanalysis of the TOMS data, the OMIDAPS processed 108,000 full orbits of data through 4 processing steps per orbit, producing about 800,000 files (400 GiB) of Level 2 and higher data. More recently, we have installed two instances of the OMIDAPS for integration and testing of OMI science processes as they are delivered from the Science Team. A test instance of the OMIDAPS has also supported a series of "Interface Confidence Tests" (ICTs) and end-to-end ground system tests to ensure the launch readiness of the system.
This paper will discuss the high-level hardware, software, and database organization of the OMIDAPS and how it builds on the MODAPS heritage system. It will also provide an overview of the testing and implementation of the production OMIDAPS.
NASA Astrophysics Data System (ADS)
Wäppling, Roger
2004-01-01
Physicists generally, and our readers in particular, are only too aware that the availability of scientific material on the Internet has both advantages and disadvantages. The ease with which a scientist can retrieve information from his/her office has greatly assisted the publication process, since references, for example, can be searched for, checked for relevance, or cross-referenced with increasing ease. At the same time, however, it has become much easier to use materials without giving credit to the originators, and this form of scientific misconduct is of growing concern to the publication process. With this in mind, I would like to mention that the facility for retrieving information via the Internet is developing further, so that major search engines like Google will be directly usable for retrieving, for example, a Physica Scripta article. Non-subscribers will gain access only to the title and abstract, whilst subscribers can access the full text in the same way as previously, through libraries and publishers. Physica Scripta has been in the vanguard of electronic development and has many thousands of accesses per day to its full on-line archive. These developments, together with some recent cases of scientific fraud, have led to an increased demand for guidelines for proper ethical conduct in the process of science publishing and, to this end, the International Union of Pure and Applied Physics, IUPAP, is working on a recommendation that I expect to be able to display here once adopted.
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2010-08-03
This document describes the data quality objectives used to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.
Science data archives of Indian Space Research Organisation (ISRO): Chandrayaan-1
NASA Astrophysics Data System (ADS)
Gopala Krishna, Barla; Singh Nain, Jagjeet; Moorthi, Manthira
The Indian Space Research Organisation (ISRO) has started a new initiative to launch dedicated scientific satellites earmarked for planetary exploration, astronomical observation, and space sciences. The Chandrayaan-1 mission to the Moon is one of the approved missions of this new initiative. The basic objective of the Chandrayaan-1 mission, scheduled for launch in mid-2008, is photoselenological and chemical mapping of the Moon with better spatial and spectral resolution. Consistent with this scientific objective, the following baseline payloads are included in this mission: (i) a terrain mapping stereo camera (TMC) with 20 km swath (400-900 nm band) for 3D imaging of the lunar surface at a spatial resolution of 5 m; (ii) a hyperspectral imager in the 400-920 nm band with 64 channels and a spatial resolution of 80 m (20 km swath) for mineralogical mapping; (iii) a high-energy X-ray (30-270 keV) spectrometer having a footprint of 40 km for the study of volatile transport on the Moon; and (iv) a laser ranging instrument with a vertical resolution of 5 m. ISRO offered the international scientific community the opportunity to participate in the Chandrayaan-1 mission, and six payloads that complement the basic objective of the mission have been selected and included, viz., (i) a miniature imaging radar instrument (Mini-SAR) from APL, NASA, to look for the presence of ice in the polar regions; (ii) a near-infrared spectrometer (SIR-2) from the Max Planck Institute, Germany; (iii) a Moon Mineralogy Mapper (M3) from JPL, NASA, for mineralogical mapping in the infrared region (0.7-3.0 micron); (iv) a sub-keV atom reflecting analyzer (SARA) from Sweden, India, Switzerland, and Japan for detection of low-energy neutral atoms emanating from the lunar surface; (v) a radiation dose monitor (RADOM) from Bulgaria for monitoring energetic particle flux in the lunar environment; and (vi) a collimated low-energy (1-10 keV) X-ray spectrometer (C1XS) with a field of view of 20 km for chemical mapping of the lunar surface, from RAL, UK. Science data from the Chandrayaan-1 instruments are planned to be archived by the combined efforts of all the instrument and Payload Operations Centre (POC) teams, the Indian Space Science Data Centre (ISSDC), and the Chandrayaan-1 Spacecraft Control Centre (SCC). The Chandrayaan-1 Science Data Archive (CSDA) is planned at the ISSDC, the primary data center for the payload data archives of Indian space science missions. This data center is responsible for the ingest, archive, and dissemination of the payload and related ancillary data for space science missions like Chandrayaan-1. The archiving process includes the design, generation, validation, and transfer of the data archive. The archive will include raw and reduced data, calibration data, auxiliary data, higher-level derived data products, documentation, and software. The CSDA will make use of the well-proven archive standards of the Planetary Data System (PDS) and is planned to follow IPDA guidelines. This is to comply with global standards for long-term preservation of the data, maintain their usability, and provide the scientific community with high-quality data for analysis. The primary users of this facility will initially, during the lock-in period, be the principal investigators of the science payloads. After this, the data will be made accessible to scientists from other institutions and also to the general public.
The raw payload data received through the data reception stations are further processed to generate Level-0 and Level-1 data products, which are stored in the CSDA for subsequent dissemination. According to the well-documented Chandrayaan-1 archive plan agreed upon by the experiment teams, the data collection period is set at six months, and the first data delivery to the long-term CSDA archive, after peer review, is expected eighteen months after launch. At present, the Experimenter-to-Archive Interface Control Documents (ICDs) for the instrument data are under review.
Intelligent Systems Technologies and Utilization of Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.
2004-01-01
The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management, and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.
The imaging node for the Planetary Data System
Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.
1996-01-01
The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects, with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and through on-line remote-access tools provided via Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies of virtually all the planetary systems in our solar system (except Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.
NASA Astrophysics Data System (ADS)
Chalmers, Alex
2004-09-01
To increase the security and throughput of ISO traffic through international terminals, more technology must be applied to the problem. A transnational central archive of inspection records is discussed that can be accessed by national agencies as ISO containers approach their borders; the intent is to improve the throughput and security of the cargo inspection process. A review of currently available digital media archiving technologies is presented, along with their possible application to the tracking of international ISO container shipments. Specific image formats employed by current x-ray inspection systems are discussed, and sample x-ray data from systems in use today are shown that could be entered into such a system. Data from other inspection technologies are shown to be easily integrated, as is the creation of database records suitable for interfacing with other computer systems. Overall system performance requirements are discussed in terms of security, response time and capacity. Suggestions are also made for pilot projects based on existing border inspection processes.
NASA Astrophysics Data System (ADS)
Logan, T. A.; Arko, S. A.; Rosen, P. A.
2013-12-01
To demonstrate the feasibility of orbital remote sensing for global ocean observations, NASA launched Seasat on June 27, 1978. As the first spaceborne SAR mission, Seasat produced the most detailed SAR images of Earth from space ever seen to that point in time. While much of the data collected in the USA was processed optically, a mere 150 scenes had been digitally processed by March 1980; in fact, only an estimated 3% of Seasat data was ever digitally processed. Thus, for over three decades, the majority of the SAR data from this historic mission has been dormant, virtually unavailable to scientists in the 21st century. Over the last year, researchers at the Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC) have processed the Seasat SAR archives into imagery products. A telemetry decoding system was created and the data were filtered into readily processable signal files. Due to nearly 35 years of bit rot, the bit error rate (BER) for the ASF DAAC Seasat archives was on the order of 1 in 100 to 1 in 100,000. This extremely high BER initially seemed to make much of the data undecodable: because the minor frame numbers are just 7 bits and no range line numbers exist in the telemetry, even the 'simple' tasks of tracking the minor frame number or locating the start of each range line proved difficult. Eventually, using 5 frame numbers in sequence and a handful of heuristics, the data were successfully decoded into full range lines. Concurrently, all metadata were stored into external files. Recovery of this metadata was also problematic, the BER making the information highly suspect and, initially at least, unusable in any sort of automated fashion. Because of the BER, all of the single-bit metadata fields proved unreliable. Even fields that should be constant for a data take (e.g. receiving station, day of the year) showed high variability, each requiring a median filter to be usable. The most challenging, however, were the supposedly 'steadily' changing millisecond (MSEC) timing values; the elevated BER made even a basic linear fit difficult. In addition, the MSEC field often shows a 'stair step' function, assumed to be a spacecraft clock malfunction. To fix these issues, three separate levels of time filtering were applied. After the initial three-pass time filter, a fourth procedure located and removed discontinuities - missing data sections that occurred randomly throughout the data takes - by inserting random-valued lines into the affected data file and repeated-value lines into the corresponding header file. Finally, a fifth pass through the metadata was required to fix remaining start time anomalies. Once the data were filtered, all times were linearly increasing, and all discontinuities were filled, images could finally be formed. ASF DAAC utilized a custom version of ROI, the Repeat Orbit Interferometric SAR processor, to focus the data. Special focusing tasks for Seasat included dealing with Doppler ambiguity issues and filtering out 'spikes' in the power spectra. Once these obstacles were overcome via additional pre-processing software developed in house, well-focused SAR imagery was obtained from approximately 80% of the ASF DAAC archives. These focused products, packaged in either HDF5 or GeoTIFF formats with XML metadata, are downloadable from ASF DAAC free of charge.
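A minimal sketch of the kind of metadata repair described above, assuming the decoded fields are already available as NumPy arrays; the window size, residual threshold, and field handling are illustrative, not the actual ASF DAAC filters:

```python
import numpy as np

def median_filter_field(values, window=11):
    """Smooth a nominally constant or slowly varying metadata field
    (e.g. receiving station, day of year) with a running median,
    suppressing isolated bit errors."""
    half = window // 2
    padded = np.pad(np.asarray(values, dtype=float), half, mode="edge")
    return np.array([np.median(padded[i:i + window])
                     for i in range(len(values))])

def repair_msec_times(msec, max_residual=50.0):
    """Fit a line to the millisecond-of-day (MSEC) field, flag samples
    whose residual exceeds max_residual (bit errors, 'stair steps'),
    and replace them with fitted values."""
    msec = np.asarray(msec, dtype=float)
    lines = np.arange(msec.size)
    coeffs = np.polyfit(lines, msec, 1)              # first-pass fit
    bad = np.abs(msec - np.polyval(coeffs, lines)) > max_residual
    coeffs = np.polyfit(lines[~bad], msec[~bad], 1)  # refit on inliers
    repaired = np.where(bad, np.polyval(coeffs, lines), msec)
    return repaired, bad
```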
The combined EarthScope data set at the IRIS DMC
NASA Astrophysics Data System (ADS)
Trabant, C.; Sharer, G.; Benson, R.; Ahern, T.
2007-12-01
The IRIS Data Management Center (DMC) is the perpetual archive and access point for an ever-increasing variety of geophysical data in terms of volume, geographic distribution and scientific value. A particular highlight is the combined data set produced by the EarthScope project. The DMC archives data from each of the primary components: USArray, the Plate Boundary Observatory (PBO) and the San Andreas Fault Observatory at Depth (SAFOD). Growing at over 4.6 gigabytes per day, the USArray data set currently totals approximately 5 terabytes. Composed of four separate sub-components - the Permanent, Transportable, Flexible and Magnetotelluric Arrays - the USArray data set provides a multi-scale view of the western United States at present and of the conterminous United States when it is completed. The primary data from USArray are in the form of broadband and short-period seismic recordings and magnetotelluric measurements. Complementing the data from USArray are the short-period borehole seismic data and the borehole and laser strain data from PBO. The DMC also archives the high-resolution seismic data from instruments in the SAFOD main and pilot drill holes. The SAFOD seismic data are available in two forms: lower-rate monitoring channels sampled at 250 hertz, and full-resolution channels varying between 1 and 4 kilohertz. Beyond data collection and archive management, the DMC performs value-added functions. All data arriving at the DMC as real-time data streams are processed by QUACK, an automated Quality Control (QC) system. All the measurements made by this system are stored in a database and made available to data contributors and users via a web interface, including customized report generation. In addition to the automated QC measurements, quality control is performed on USArray data at the DMC by a team of analysts. The primary functions of the analysts are to routinely report data quality assessments to the respective network operators and to log serious, unfixable data issues for reference by data users. All of these data are managed in a unified SEED format archive and are seamlessly available to data users via the DMC's standard data access methods, along with all the other data managed by the DMC. The only exception is the high-resolution, special-case SAFOD seismic data, which are retained in their original SEG-2 format as an assembled data set. A data user can choose between a handful of data access methods ranging from simple email requests to technologically advanced CORBA-based access, streamlining the "information into application" philosophy. Currently totaling over 8.5 terabytes and growing, the combined EarthScope data at the DMC provide an unparalleled, multi-measurement record of geophysical information ideal for determining Earth structure and processes in the United States and beyond. A website is maintained to provide current information regarding EarthScope data at the DMC: http://www.iris.edu/earthscope/.
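As a present-day illustration of DMC data access (via the FDSN web services that have since superseded the email and CORBA routes named above, so this is not one of the access methods the abstract itself describes), a short ObsPy sketch; the station and channel codes are examples only:

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")  # IRIS DMC web services
t0 = UTCDateTime("2007-10-01T00:00:00")

# One hour of broadband vertical-component data from a USArray
# Transportable Array station (network TA; codes are examples only).
st = client.get_waveforms(network="TA", station="109C", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600)
print(st)
```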
Present status and future directions of the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Morin, Richard L.; Forbes, Glenn S.; Gehring, Dale G.; Salutz, James R.; Pavlicek, William
1991-07-01
This joint project began in 1988 and was motivated by the need to develop an alternative to the archival process in place at that time (magnetic tape) for magnetic resonance imaging and neurological computed tomography. In addition, this project was felt to be an important step in gaining the necessary clinical experience for the future implementation of various aspects of electronic imaging. The initial phase of the project was conceived and developed to prove the concept, test the fundamental components, and produce performance measurements for future work. The key functions of this phase centered on attachment of imaging equipment (GE Signa) and archival processes using a non-dedicated (institutionally supplied) local area network (LAN). Attachment of imaging equipment to the LAN was performed using commercially available devices (Ethernet, PS/2, Token Ring). Image data were converted to ACR/NEMA format with retention of the vendor-specific header information. Performance measurements were encouraging and led to the design of subsequent projects. The second phase has recently been concluded. The major features of this phase have been to greatly expand the network, put the network into clinical use, establish an efficient and useful viewing station, include diagnostic reports in the archive data, provide wide area network (WAN) capability via ISDN, and establish two-way real-time video between remote sites. This phase has heightened both departmental and institutional thought regarding various issues raised by electronic imaging, and much discussion regarding both present and future archival processes has occurred. The use of institutional LAN resources has proven to be adequate for the archival functions examined thus far, but experiments to date have shown that use of dedicated resources will be necessary for retrieval activities at even a basic level. This report presents an overview of the background, present status, and future directions of the project.
GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.
2012-12-01
Data stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both the current and the long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, there is additional information from the product generation and science teams needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long-term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation, preparing for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to assure future researchers that the scientific results are accurate, understandable, and usable. Our experience suggests data centers know what to preserve in most cases: the processing algorithms, along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products, will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe the application of information gathering with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and the Science Investigator-led Processing System to the NSIDC Distributed Active Archive Center. This presentation will also cover how we envision user support through the years of the Long-Term Archive.
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.
2004-01-01
In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.
2004-01-01
In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.
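As a sketch of the kind of processing a user might apply to the archived SEG-Y traces (the reports themselves supply Seismic Unix scripts; this Python/ObsPy version is an illustrative alternative with an assumed filename, filter band, and uniform trace length):

```python
import numpy as np
from obspy import read

# Read one archived profile (filename hypothetical) into an ObsPy Stream.
st = read("profile_01.sgy", format="SEGY")
st.filter("bandpass", freqmin=500.0, freqmax=3000.0)  # assumed boomer band

# Time-squared gain so weak, late reflectors become visible, roughly
# mimicking the filtered-and-gained GIF images supplied in the report.
data = np.array([tr.data for tr in st], dtype=float)
nt = data.shape[1]
data *= (np.arange(1, nt + 1) / nt) ** 2
data /= np.abs(data).max()
```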
Archiving InSight Lander Science Data Using PDS4 Standards
NASA Astrophysics Data System (ADS)
Stein, T.; Guinness, E. A.; Slavney, S.
2017-12-01
The InSight Mars Lander is scheduled for launch in 2018, and science data from the mission will be archived in the NASA Planetary Data System (PDS) using the new PDS4 standards. InSight is a geophysical lander with a science payload that includes a seismometer, a probe to measure subsurface temperatures and heat flow, a suite of meteorology instruments, a magnetometer, an experiment using radio tracking, and a robotic arm that will provide soil physical property information based on interactions with the surface. InSight is not the first science mission to archive its data using PDS4. However, PDS4 archives do not currently contain examples of the kinds of data that several of the InSight instruments will produce. Whereas the existing common PDS4 standards were sufficient for most of the archiving requirements of InSight, the data generated by a few instruments required development of several extensions to the PDS4 information model. For example, the seismometer will deliver a version of its data in SEED format, which is standard for the terrestrial seismology community; this format required the design of a new product type in the PDS4 information model. A local data dictionary has also been developed for InSight that contains attributes that are not part of the common PDS4 dictionary. The local dictionary provides metadata relevant to all InSight data sets, plus attributes specific to several of the instruments. Additional classes and attributes were designed for the existing PDS4 geometry dictionary to capture metadata for the lander position and orientation, along with camera models for stereo image processing. Much of the InSight archive planning and design work has been done by a Data Archiving Working Group (DAWG), which has members from the InSight project and the PDS. The group coordinates archive design, schedules, and peer review of the archive documentation and test products. The InSight DAWG archiving effort for PDS is being led by the PDS Geosciences Node, with several other nodes working one-on-one with the instruments relevant to their disciplines. Once the InSight mission begins operations, the DAWG will continue to provide oversight on the release of InSight data to PDS. Lessons learned from the InSight archive work will also feed forward to planning the archives for the Mars 2020 rover.
Data Management Considerations for the International Polar Year
NASA Astrophysics Data System (ADS)
Parsons, M. A.; Weaver, R. L.; Duerr, R.; Barry, R. G.
2004-12-01
The legacy of the International Geophysical Year and past International Polar Years is in the scientific data collected. The upcoming IPY will result in an unprecedented collection of geophysical and social science data from the polar regions. To realize the full scientific and interdisciplinary utility of these data, it is essential to consider the design of data management systems early in the experimental planning process. This paper will present an array of high-level data management considerations for the IPY, including cross-disciplinary data access, essential documentation, system guidance, and long-term data archiving. Specific recommendations from relevant international organizations such as the Joint Committee on Antarctic Data Management and the WCRP Climate and Cryosphere Programme will be considered. The potential role of the Electronic Geophysical Year and other International Years will also be discussed.
Flare forecasting at the Met Office Space Weather Operations Centre
NASA Astrophysics Data System (ADS)
Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.
2017-04-01
The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.
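One verification measure commonly used in operational weather prediction, and applicable to probabilistic flare forecasts of this kind, is the Brier score with its associated skill score; a minimal sketch with made-up forecast values (the paper does not publish its verification code):

```python
import numpy as np

def brier_score(p_forecast, occurred):
    """Mean squared difference between forecast probability and the
    observed binary outcome (0 = no flare, 1 = flare)."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(occurred, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(p_forecast, occurred):
    """Skill relative to a climatological reference; > 0 means the
    forecasts beat always predicting the event's base rate."""
    o = np.asarray(occurred, dtype=float)
    reference = np.full_like(o, o.mean())
    return 1.0 - brier_score(p_forecast, o) / brier_score(reference, o)

# Made-up 24 h flare probabilities versus observed outcomes:
p = [0.10, 0.40, 0.75, 0.05, 0.60]
occurred = [0, 1, 1, 0, 0]
print(brier_score(p, occurred), brier_skill_score(p, occurred))
```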
ARES - A New Airborne Reflective Emissive Spectrometer
2005-10-01
Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a
Seismic Data from Roosevelt Hot Springs, Utah FORGE Study Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, John
This set of data contains raw and processed 2D and 3D seismic data from the Utah FORGE study area near Roosevelt Hot Springs. The zipped archives numbered from 1-100 to 1001-1122 contain SEG-Y files of uncorrelated 3D seismic shot gathers. The zipped archives numbered from 1-100C to 1001-1122C contain SEG-Y files of correlated 3D seismic shot gathers. Other data have intuitive names.
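Correlating the uncorrelated shot gathers amounts to cross-correlating each trace with the vibroseis pilot sweep; a hedged NumPy/SciPy sketch, with the sweep parameters, sample rate, and record length assumed rather than taken from the Utah FORGE acquisition:

```python
import numpy as np
from scipy.signal import chirp, correlate

fs = 1000.0                        # sample rate in Hz (assumed)
t = np.arange(0, 12.0, 1.0 / fs)   # 12 s linear pilot sweep (assumed)
sweep = chirp(t, f0=8.0, t1=t[-1], f1=96.0, method="linear")

def correlate_gather(uncorrelated):
    """Cross-correlate every trace of an uncorrelated shot gather with
    the pilot sweep, compressing the long sweep into an impulsive
    wavelet; rows are traces, columns are time samples."""
    return np.array([correlate(tr, sweep, mode="valid")
                     for tr in uncorrelated])

# Synthetic example: 24 traces of 16 s uncorrelated records.
gather = np.random.randn(24, int(16 * fs))
print(correlate_gather(gather).shape)  # (24, n_record - n_sweep + 1)
```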
The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond
NASA Astrophysics Data System (ADS)
MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.
2005-12-01
The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade has helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enables end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provide users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.
NLSI Focus Group on Missing ALSEP Data Recovery: Progress and Plans
NASA Technical Reports Server (NTRS)
Lewis, L. R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Grayzeck, E. J.
2011-01-01
On the six Apollo landed missions, the astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations, which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's scientists are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems trying to use the ALSEP data. In 2007, archivists from the NASA Goddard Space Flight Center (GSFC) National Space Science Data Center (NSSDC) estimated that only about 50 percent of the processed ALSEP lunar surface data of interest to current lunar science investigators were in the NSSDC archives. The current-day lunar science investigators found most of the ALSEP data then in the NSSDC archives extremely difficult to use: the data were in forms often not well described in the published reports, and re-recording anomalies existed in the data which could only be resolved by tape experts. To resolve this problem, the PDS Lunar Data Node was established in 2008 at NSSDC and is in the process of successfully making the existing archived ALSEP data available to current-day investigators in easily usable forms. In July of 2010 the NASA Lunar Science Institute (NLSI) at Ames Research Center established the Recovery of Missing ALSEP Data Focus Group in recognition of the importance of the current activities to find the raw and processed ALSEP data missing from the NSSDC archives.
Johnston, A. E.
2018-01-01
Summary: Long-term field experiments that test a range of treatments and are intended to assess the sustainability of crop production, and thus food security, must be managed actively to identify any treatment that is failing to maintain or increase yields. Once identified, carefully considered changes can be made to the treatment or management, and if they are successful yields will change. If suitable changes cannot be made to an experiment to ensure its continued relevance to sustainable crop production, then it should be stopped. Long-term experiments have many other uses. They provide a field resource and samples for research on plant and soil processes and properties, especially those properties where change occurs slowly and affects soil fertility. Archived samples of all inputs and outputs are an invaluable source of material for future research, and data from current and archived samples can be used to develop models to describe soil and plant processes. Such changes and uses in the Rothamsted experiments are described, and demonstrate that with the appropriate crop, soil and management, acceptable yields can be maintained for many years, with either organic manure or inorganic fertilizers. Highlights: Long-term experiments demonstrate sustainability and increases in crop yield when managed to optimize soil fertility. Shifting individual response curves into coincidence increases understanding of the factors involved. Changes in inorganic and organic pollutants in archived crop and soil samples are related to inputs over time. Models describing soil processes are developed from current and archived soil data. PMID:29527119
NASA Astrophysics Data System (ADS)
Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.
2017-12-01
The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE-sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) a focus on implementation of data standards in collaboration with the community; (4) support for community-built tools for data search, interpretation, analysis, and visualization; (5) a data fusion database to support search of the data extracted from submitted packages and of data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility, with replicas at DataONE nodes.
Electronic Journal Market Overview 1997: Part II--The Aggregators.
ERIC Educational Resources Information Center
Machovec, George S.
1997-01-01
Reviews the electronic journals and online services marketplace. Discusses fees; types of materials that are accessible; search engines and compatibility with Web browsers; information currency; types and number of sources available; archives; indexing, abstracting and full-text titles; electronic delivery; technological development;…
Electronic signatures for long-lasting storage purposes in electronic archives.
Pharow, Peter; Blobel, Bernd
2005-03-01
Communication and co-operation in healthcare and welfare require a certain set of trusted third party (TTP) services describing both the status and relations of communicating principals as well as their corresponding keys and attributes. Additional TTP services are needed to provide trustworthy information about dynamic aspects of communication and co-operation such as the time and location of processes, workflow relations, and system behaviour. Legal and ethical requirements demand securely stored patient information and well-defined access rights. Among other mechanisms, electronic signatures based on asymmetric cryptography are important means for securing the integrity of a message or file as well as for accountability purposes, including non-repudiation of both origin and receipt. Electronic signatures, along with certified time stamps or time signatures, are especially important for electronic archives in general, electronic health records (EHR) in particular, and especially for typical long-lasting storage purposes. Apart from technical storage problems (e.g. the lifetime of storage devices and the interoperability of retrieval and presentation software), this paper identifies mechanisms such as re-signing and re-stamping of data items, files, messages, sets of archived items or documents, archive structures, and even whole archives.
Studies of Global Solar Magnetic Field Patterns Using a Newly Digitized Archive
NASA Astrophysics Data System (ADS)
Hewins, I.; Webb, D. F.; Gibson, S. E.; McFadden, R.; Emery, B. A.; Malanushenko, A. V.
2017-12-01
The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly Hα, He 10830 Å and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced polarity inversion lines (PILs), filaments, sunspots and plage and, later, coronal holes, yielding a unique 45-year record of features associated with the large-scale organization of the solar magnetic field. We discuss our efforts to preserve and digitize this archive: the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed, and a website and an archival repository at NOAA's National Centers for Environmental Information (NCEI) have been created. The archive is complete for solar cycle (SC) 23 and partially complete for SCs 21 and 22. In this paper we show examples of how the database can be utilized for scientific applications. We compare the evolution of the areas and boundaries of coronal holes with other recent results, and we use the maps to track the global, solar-cycle evolution of filaments, large-scale positive and negative polarity regions, PILs and sunspots.
Ensuring long-term reliability of the data storage on optical disc
NASA Astrophysics Data System (ADS)
Chen, Ken; Pan, Longfa; Xu, Bin; Liu, Wei
2008-12-01
"Quality requirements and handling regulation of archival optical disc for electronic records filing" is released by The State Archives Administration of the People's Republic of China (SAAC) on its network in March 2007. This document established a complete operative managing process for optical disc data storage in archives departments. The quality requirements of the optical disc used in archives departments are stipulated. Quality check of the recorded disc before filing is considered to be necessary and the threshold of the parameter of the qualified filing disc is set down. The handling regulations for the staffs in the archives departments are described. Recommended environment conditions of the disc preservation, recording, accessing and testing are presented. The block error rate of the disc is selected as main monitoring parameter of the lifetime of the filing disc and three classes pre-alarm lines are created for marking of different quality check intervals. The strategy of monitoring the variation of the error rate curve of the filing discs and moving the data to a new disc or a new media when the error rate of the disc reaches the third class pre-alarm line will effectively guarantee the data migration before permanent loss. Only when every step of the procedure is strictly implemented, it is believed that long-term reliability of the data storage on optical disc for archives departments can be effectively ensured.
DeWitt, Nancy T.; Flocks, James G.; Pendleton, Elizabeth A.; Hansen, Mark E.; Reynolds, B.J.; Kelso, Kyle W.; Wiese, Dana S.; Worley, Charles R.
2012-01-01
See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets are stored digitally at the USGS St. Petersburg Coastal and Marine Science Center and processed systematically using Novatel's GrafNav version 7.6, SANDS version 3.7, SEA SWATHplus version 3.06.04.03, CARIS HIPS AND SIPS version 3.6, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during these surveys and are archived separately.
Shared Governance and Regional Accreditation: Institutional Processes and Perceptions
ERIC Educational Resources Information Center
McGrane, Wendy L.
2013-01-01
This qualitative single-case research study was conducted to gain deeper understanding of the institutional processes to address shared governance accreditation criteria and to determine whether institutional processes altered stakeholder perceptions of shared governance. The data collection strategies were archival records and personal…
Restoration of the Apollo 15 Heat Flow Experiment Data from 1975 to 1977
NASA Technical Reports Server (NTRS)
Nagihara, S.; Nakamura, Y.; Taylor, P. T.; Williams, D. R.; Kiefer, W. S.
2017-01-01
The Apollo 15 Heat Flow Experiment (HFE) was conducted from July 1971 through January 1977. Two heat flow probes were deployed roughly 8.5 meters apart; Probe 1 and Probe 2 penetrated to depths of 1.4 meters and 1 meter into the lunar regolith, respectively. Temperatures at different depths and at the surface were logged at 7.25-minute intervals and transmitted to Earth. At the conclusion of the experiment, only the data obtained from July 1971 through December 1974 had been processed and archived at the National Space Science Data Center (NSSDC) by the principal investigator of the experiment, Marcus Langseth of Columbia University. Langseth died in 1997, and it is not known what happened to the HFE data tapes he used. Current researchers have strong interests in re-examining the HFE data for the full duration of the experiment. We have recovered and processed large portions of the Apollo 15 HFE data from 1975 through 1977 by assembling data and metadata from various sources.
Improved discovery of NEON data and samples though vocabularies, workflows, and web tools
NASA Astrophysics Data System (ADS)
Laney, C. M.; Elmendorf, S.; Flagg, C.; Harris, T.; Lunch, C. K.; Gulbransen, T.
2017-12-01
The National Ecological Observatory Network (NEON) is a continental-scale ecological observation facility sponsored by the National Science Foundation and operated by Battelle. NEON supports research on the impacts of invasive species, land use change, and environmental change on natural resources and ecosystems by gathering and disseminating a full suite of observational, instrumented, and airborne datasets from field sites across the U.S. NEON also collects thousands of samples from soil, water, and organisms every year, and partners with numerous institutions to analyze and archive samples. We have developed numerous new technologies to support processing and discovery of this highly diverse collection of data. These technologies include applications for data collection and sample management, processing pipelines specific to each collection system (field observations, installed sensors, and airborne instruments), and publication pipelines. NEON data and metadata are discoverable and downloadable via both a public API and data portal. We solicit continued engagement and advice from the informatics and environmental research communities, particularly in the areas of data versioning, usability, and visualization.
Agile based "Semi-"Automated Data ingest process : ORNL DAAC example
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.
2015-12-01
The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated, to improve efficiency and reduce redundancy; (3) update the legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
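A minimal sketch of the stage tracking that goal (1) implies, with stage names invented for illustration rather than taken from the ORNL DAAC workflow system:

```python
from enum import Enum, auto

class IngestStage(Enum):
    """Stages from acceptance to publication; names are illustrative,
    not the ORNL DAAC's actual schema."""
    ACCEPTED = auto()
    FORMAT_CHECK = auto()
    METADATA_REVIEW = auto()
    DOI_ASSIGNMENT = auto()
    PUBLISHED = auto()

class DataSet:
    def __init__(self, name):
        self.name = name
        self.history = [IngestStage.ACCEPTED]  # auditable trail

    @property
    def stage(self):
        return self.history[-1]

    def advance(self):
        """Move the data set to the next stage of the workflow."""
        stages = list(IngestStage)
        i = stages.index(self.stage)
        if i < len(stages) - 1:
            self.history.append(stages[i + 1])

ds = DataSet("example_biogeochemistry_set")
ds.advance(); ds.advance()
print(ds.name, ds.stage.name, [s.name for s in ds.history])
```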
Recovery of Missing Apollo Lunar ALSEP Data
NASA Astrophysics Data System (ADS)
Taylor, P. T.; Nagihara, S.; Nakamura, Y.; Williams, D. R.; Kiefer, W. S.
2016-12-01
Apollo astronauts on missions 12, 14, 15, 16, and 17 installed instruments on the lunar surface, the Apollo Lunar Surface Experiment Package (ALSEP). The last astronauts departed from the Moon in December 1972; however, ALSEP instruments continued to send data until 1977. These long-term in-situ data, along with data from orbital satellites launched from the Command Module, are some of the best information on the Moon's environment, surface and interior. Most of these data were archived at what is now the NASA Space Science Data Coordinated Archive (NSSDCA) in the 1970s and 1980s, but some were never submitted; this is particularly true of the ALSEP data returned autonomously after the last Apollo astronauts departed. The data that were archived were generally on microfilm, microfiche, or magnetic tape in now obsolete formats, making them difficult to use, and some of the documentation and metadata are insufficient for current use. The Lunar Data Node at Goddard Space Flight Center, under the auspices of the Planetary Data System (PDS) Geosciences Node, is attempting to collect and restore the original data that were never archived, in addition to much of the archived data that were on outmoded media and in outmoded formats. A total of 440 original data archival tapes for the ALSEP experiments were found at the Washington National Records Center, and we have recently completed extraction of binary files from these tapes, filling a number of gaps in the current ALSEP data collection at NSSDCA. Some of these experiments include: Solar Wind Spectrometer (Apollo 12, 15); Cold Cathode Ion Gage (14, 15); Heat Flow (15, 17); Dust Detector (11, 12, 14, 15); Lunar Ejecta and Meteorites (17); Lunar Atmosphere Composition Experiment (17); Suprathermal Ion Detector (12, 14, 15); Lunar Surface Magnetometer (12, 15, 16). The purpose of the Lunar Data Project is to take data collections already archived at the NSSDCA and prepare them for archiving through PDS, and to locate lunar data that were never archived at NSSDCA and then archive them through PDS. In addition, recent re-analyses of some of these data with advanced data-processing algorithms have yielded more detailed interpretations (e.g., of seismicity data), and we expect that more such techniques will be developed in the future.
Fault-tolerant back-up archive using an ASP model for disaster recovery
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Huang, H. K.; Cao, Fei; Documet, Luis; Sarti, Dennis A.
2002-05-01
A single point of failure in PACS during a disaster scenario is the main archive storage and server: when a major disaster occurs, it is possible to lose an entire hospital's PACS data. Few current PACS archives feature disaster recovery, and where they do, the design is limited at best. The drawbacks include the frequency with which the backup is physically removed to an offsite facility, the operational costs of maintaining the backup, the ease of performing the backup consistently and efficiently, and the ease of performing the PACS image data recovery. This paper describes a novel approach towards a fault-tolerant solution for disaster recovery of short-term PACS image data using an Application Service Provider (ASP) model for service. The ASP backup archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost. A backup archive server and RAID storage device were implemented offsite from the main PACS archive location. In the example of this particular hospital, it was determined that at least two months' worth of PACS image exams were needed for backup. Clinical data from the hospital PACS are sent to this ASP storage server in parallel with the exams being archived in the main server. A disaster scenario was simulated and the PACS exams were sent from the offsite ASP storage server back to the hospital PACS. Initially, connectivity between the main archive and the ASP storage server was established via a T-1 connection; in the future, other more cost-effective means of connectivity, such as Internet2, will be researched. A disaster scenario was initiated, and the disaster recovery process using the ASP backup archive server was successful in repopulating the clinical PACS within a short period of time. The ASP backup archive was able to recover two months of PACS image data for comparison studies with no complex operational procedures; furthermore, no image data loss was encountered during the recovery.
Implementation of an ASP model offsite backup archive for clinical images utilizing Internet 2
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Chao, Sander S.; Documet, Jorge; Lee, Jasper; Lee, Michael; Topic, Ian; Williams, Lanita
2005-04-01
With the development of PACS technology and an increasing demand by medical facilities to become filmless, there is a need for a fast and efficient method of providing data backup for disaster recovery and downtime scenarios. At the Image Processing Informatics Lab (IPI), an ASP Backup Archive was developed using a fault-tolerant server with a T1 connection to serve the PACS at the St. John's Health Center (SJHC) Santa Monica, California. The ASP archive server has been in clinical operation for more than 18 months, and its performance was presented at this SPIE Conference last year. This paper extends the ASP Backup Archive to serve the PACS at the USC Healthcare Consultation Center II (HCC2) utilizing an Internet2 connection. HCC2 is a new outpatient facility that recently opened in April 2004. The Internet2 connectivity between USC's HCC2 and IPI has been established for over one year. There are two novelties of the current ASP model: 1) Use of Internet2 for daily clinical operation, and 2) Modifying the existing backup archive to handle two sites in the ASP model. This paper presents the evaluation of the ASP Backup Archive based on the following two criteria: 1) Reliability and performance of the Internet2 connection between HCC2 and IPI using DICOM image transfer in a clinical environment, and 2) Ability of the ASP Fault-Tolerant backup archive to support two separate clinical PACS sites simultaneously. The performances of using T1 and Internet2 at the two different sites are also compared.
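A hedged sketch of the DICOM C-STORE transfer that such an evaluation exercises, using the pynetdicom library; the hostname, port, AE title, SOP class, and file name are placeholders, not the configuration used at IPI:

```python
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

# Placeholders: a real deployment negotiates the modality's SOP classes.
ae = AE(ae_title="PACS_GATEWAY")
ae.add_requested_context(CTImageStorage)

ds = dcmread("exam_001.dcm")                  # hypothetical exam file
assoc = ae.associate("backup-archive.example.org", 11112)
if assoc.is_established:
    status = assoc.send_c_store(ds)           # returns a status Dataset
    print("C-STORE status: 0x{:04X}".format(status.Status))
    assoc.release()
else:
    print("Association with the backup archive failed")
```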
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wandtke, John; Robinson, Arvin E.
1999-07-01
The selection criteria for the archive were based on the objectives of the Medical Information, Communication and Archive System (MICAS), a multi-vendor incremental approach to PACS. These objectives include interoperability between all components, seamless integration of the Radiology Information System (RIS) with MICAS and eventually other hospital databases, demonstrated DICOM compliance of all components prior to acceptance, and automated workflow that can be programmed to meet changes in the healthcare environment. The long-term multi-modality archive is being implemented in three or more phases, with the first phase designed to provide a 12- to 18-month storage solution. This decision was made because the cost per GB of storage is rapidly decreasing and the speed at which data can be retrieved is increasing with time. The open solution selected allows incorporation of leading-edge, 'best of breed' hardware and software and provides maximum flexibility of workflow both within and outside of radiology. The selected solution is media independent, supports multiple jukeboxes, provides expandable storage capacity, and will provide redundancy and fault tolerance at minimal cost. Required attributes of the archive include a scalable archive strategy, a virtual image database with global query, and an object-oriented database. The selection process took approximately 10 months, with Cemax-Icon the vendor selected. Prior to signing a purchase order, Cemax-Icon performed a site survey, agreed upon the acceptance test protocol, and provided a written guarantee of connectivity between their archive and the imaging modalities and other MICAS components.
NASA Astrophysics Data System (ADS)
Grussenmeyer, P.; Khalil, O. Al
2017-08-01
The paper presents photogrammetric archives from Aleppo (Syria), collected between 1999 and 2002 by the Committee for maintenance and restoration of the Great Mosque in partnership with the Engineering Unit of the University of Aleppo. During that period, terrestrial photogrammetric data and geodetic surveys of the Great Omayyad Mosque were recorded for documentation purposes and geotechnical studies. During the recent war in Syria, the Mosque has unfortunately been seriously damaged and its minaret has been completely destroyed. The paper presents a summary of the documentation available from the past projects, as well as 3D reconstruction solutions based on processing the photogrammetric archives with the latest 3D image-based techniques.
Characterizing Space Environments with Long-Term Space Plasma Archive Resources
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.
2009-01-01
A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as in the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on the environmental variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analysis of large data sets is important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to the environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.
NASA Astrophysics Data System (ADS)
Young, John; Peacock, Sheila
2016-04-01
The year 1996 has particular significance for forensic seismologists. This was the year when the Comprehensive Test Ban Treaty (CTBT) was signed in September at the United Nations, setting an international norm against nuclear testing. Blacknest, as a long-standing seismic centre for research into detecting and identifying underground explosions using seismology, provided significant technical advice during the CTBT negotiations. Since 1962, seismic recordings of both presumed nuclear explosions and earthquakes from the four seismometer arrays at Eskdalemuir, Scotland (EKA), Yellowknife, Canada (YKA), Gauribidanur, India (GBA), and Warramunga, Australia (WRA) have been copied, digitised, and saved. There was a possibility this archive would be lost, so it was decided to process the records and catalogue them for distribution to other groups and institutions. This work continues at Blacknest, but the archive is no longer under threat. In addition, much of the archive of analogue tape recordings has been re-digitised with modern equipment, allowing sampling rates of 100 rather than 20 Hz.
NASA Technical Reports Server (NTRS)
Lloyd, J. F., Sr.
1987-01-01
Industrial radiography is a well-established, reliable means of providing nondestructive structural integrity information. The majority of industrial radiographs are interpreted by trained human eyes using transmitted light and various visual aids. Hundreds of miles of radiographic information are evaluated, documented and archived annually. In many instances, there are serious concerns in terms of interpreter fatigue, subjectivity and limited archival space, and quite often it is difficult to quickly retrieve radiographic information for further analysis or investigation. Methods of improving the quality and efficiency of the radiographic process are being explored, developed and incorporated whenever feasible. High-resolution cameras, digital image processing, and mass digital data storage offer interesting possibilities for improving the industrial radiographic process. A review is presented of computer-aided radiographic interpretation technology in terms of how it could be used to enhance the radiographic interpretation process in evaluating radiographs of aluminum welds.
The global Landsat archive: Status, consolidation, and direction
Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.
2016-01-01
New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as by limitations across the range of required computing, networking, and storage capabilities. Rather than the robust on-satellite storage and high-bandwidth downlink to a centralized storage and distribution facility used by Landsat-8, a network of receiving stations was utilized: one operated by the U.S. government, the others operated by a community of International Cooperators (ICs). ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data were held outside the archive of the United States Geological Survey (USGS) than inside it, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015 there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate, as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.
ARM Operations and Engineering Procedure Mobile Facility Site Startup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyles, Jimmy W
2015-05-01
This procedure exists to define the key milestones, necessary steps, and process rules required to commission and operate an Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF), with a specific focus toward on-time product delivery to the ARM Data Archive. The overall objective is to have the physical infrastructure, networking and communications, and instrument calibration, grooming, and alignment (CG&A) completed with data products available from the ARM Data Archive by the Operational Start Date milestone.
Security of patient data when decommissioning ultrasound systems.
Moggridge, James
2017-02-01
Although ultrasound systems generally archive to Picture Archiving and Communication Systems (PACS), their archiving workflow typically involves storage to an internal hard disk before data are transferred onwards. Deleting records from the local system removes entries from the database and from the file allocation table or equivalent but, as with a PC, the files can be recovered. Great care is taken with the disposal of media from a healthcare organisation to prevent data breaches, yet ultrasound systems are routinely returned to lease companies, sold on, or donated to third parties without such controls. In this project, five methods of hard disk erasure were tested on nine ultrasound systems being decommissioned: the system's own delete function; full reinstallation of the system software; the manufacturer's own disk-wiping service; and open-source disk-wiping software, used for both full-disk erasure and erasure of blank space only. Attempts were then made to recover data using open-source recovery tools. All methods deleted patient data as viewable from the ultrasound system and when browsing the disk from a PC. However, patient-identifiable data (PID) could be recovered after the system's own deletion and after reinstallation. No PID could be recovered after using the manufacturer's wiping service or the open-source wiping software. The typical practice of reinstalling an ultrasound system's software may therefore not prevent PID from being recovered. When transferring ownership, care should be taken that an ultrasound system's hard disk has been wiped to a sufficient level, particularly if the scanner is to be returned with approved parts and in a fully working state.
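The recovery test described above can be approximated with a short script that scans a raw disk image for known patient-identifiable byte strings after erasure. This is only a sketch of the idea, not the study's tooling; the image path and search terms are hypothetical placeholders.

```python
# Sketch: scan a raw disk image for residual patient-identifiable strings.
# "ultrasound_disk.img" and the search terms are hypothetical placeholders.
import mmap

def find_residual_pid(image_path, terms):
    """Yield (offset, term) for every occurrence of a search term."""
    with open(image_path, "rb") as f, \
         mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        for term in terms:
            pos = mm.find(term)
            while pos != -1:
                yield pos, term
                pos = mm.find(term, pos + 1)

for offset, term in find_residual_pid("ultrasound_disk.img",
                                      [b"DOE^JANE", b"19570312"]):
    print(f"possible PID {term!r} at byte offset {offset}")
```

A wipe is only as good as its verification; any hit from a scan like this means recoverable PID remains on the disk.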
Exploiting Satellite Archives to Estimate Global Glacier Volume Changes
NASA Astrophysics Data System (ADS)
McNabb, R. W.; Nuth, C.; Kääb, A.; Girod, L.
2017-12-01
In the past decade, the availability of, and the ability to process, remote sensing data over glaciers have expanded tremendously. Newly opened satellite image archives, combined with new processing techniques and increased computing power and storage capacity, have given the glaciological community the ability to observe and investigate glaciological processes and changes on a truly global scale. In particular, the opening of the ASTER archives provides further opportunities to estimate and monitor glacier elevation and volume changes globally, potentially on sub-annual timescales. With this explosion of data availability, however, comes the challenge of seeing the forest instead of the trees. The high volume of data available means that automated detection and proper handling of errors and biases in the data become critical in order to properly study the processes we wish to see. These include holes and blunders in digital elevation models (DEMs) derived from optical data and biases caused by radar-signal penetration in DEMs derived from radar data, among other sources. Here, we highlight new advances in the ability to sift through high-volume datasets, and apply these techniques to estimate recent glacier volume changes in the Caucasus Mountains, Scandinavia, Africa, and South America. By properly estimating and correcting for these biases, we additionally provide a detailed accounting of the uncertainties in these estimates of volume change, leading to more reliable results with applicability beyond the glaciological community.
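At its core, such an estimate is DEM differencing with outlier screening. The sketch below is a minimal illustration under assumed values (30 m pixels, a ±150 m blunder threshold, synthetic arrays), not the study's actual processing chain.

```python
# Minimal sketch of geodetic volume change from two co-registered DEMs with
# a crude blunder filter; pixel size, threshold, and data are assumptions.
import numpy as np

def volume_change(dem_early, dem_late, pixel_size=30.0, max_abs_dh=150.0):
    dh = dem_late - dem_early                 # elevation change (m)
    dh[np.abs(dh) > max_abs_dh] = np.nan      # screen blunders and holes
    mean_dh = np.nanmean(dh)                  # fill gaps with the mean change
    return np.where(np.isnan(dh), mean_dh, dh).sum() * pixel_size**2  # m^3

rng = np.random.default_rng(0)
dem0 = rng.normal(3000.0, 200.0, (500, 500))
# Uniform 2 m lowering over 500x500 pixels of 900 m^2 each: about -4.5e8 m^3
print(f"dV = {volume_change(dem0, dem0 - 2.0):.3e} m^3")
```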
Picture archiving and communication in radiology.
Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale
2003-01-01
After more than 80 years of exclusive archiving of radiologic films, digital archiving is now increasingly gaining ground in radiology. Digital archiving allows a considerable reduction in costs and a saving of space but, most importantly, makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS is, in this case, the starting point of the electronic archiving process, which is however the task of the PACS. The latter can be used as a legally valid radiologic archive provided that it conforms to certain specifications, such as the use of long-term optical storage media or an electronic audit trail of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved for consultation or remote consultation by the reporting radiologist, who requires the images and reports of previous radiologic examinations, or by the referring physician on the ward. Thanks to their web servers, modern PACS allow greatly simplified remote access to images and data while still ensuring the required regulatory compliance and access protections. Since a PACS simplifies data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should minimize the risk of accidental data destruction and should prevent unauthorized access to the archive, with security measures appropriate to current knowledge and technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. The technology can readily solve problems that were extremely complex until a few years ago, such as the connection between equipment and the archiving system, owing in part to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize the problems of remote transmission of data and images within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as that of digital data security profiles and of the different systems which should ensure them; among these, electronic signature algorithms deserve mention. In Italy they are validated by law and can therefore be used in legally compliant digital archives.
Geoinformation web-system for processing and visualization of large archives of geo-referenced data
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.
2010-12-01
A working model of an information-computational system aimed at scientific research on climate change is presented. The system allows the processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Experience accumulated while developing information-computational web systems for the computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al., 2007; Okladnikov et al., 2008; Titov et al., 2009). The functional capabilities of the system comprise a set of procedures for the mathematical and statistical analysis, processing, and visualization of data. At present, five data archives are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES XX Century Global Reanalysis Version I. To provide data-processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. A set of computational modules for the climate change indices approved by the WMO is currently available, as sketched below. A special module providing visualization of results and output to Encapsulated PostScript, GeoTIFF, and ESRI shapefiles was also developed. As the technological basis for presenting cartographic information on the Internet, the GeoServer software, which conforms to OpenGIS standards, is used. GIS functionality has been integrated with web-portal software to provide a basis for developing the web portal as part of the geoinformation web system. Such a geoinformation web system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system should be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.
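As an illustration of the kind of computational module mentioned above, the sketch below computes one WMO-approved climate change index, FD (frost days: the annual count of days with daily minimum temperature below 0 °C). The grid shape and random test values are assumptions, not one of the listed reanalyses.

```python
# Sketch of a WMO-style climate index kernel module: FD (frost days).
import numpy as np

def frost_days(tmin_daily):
    """tmin_daily: (days, lat, lon) daily minimum temperature in deg C.
    Returns a (lat, lon) grid of frost-day counts."""
    return (tmin_daily < 0.0).sum(axis=0)

# Synthetic one-year, 2.5-degree global grid (shape is an assumption)
tmin = np.random.default_rng(1).normal(5.0, 10.0, (365, 73, 144))
print(frost_days(tmin).shape)   # (73, 144)
```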
Maximal Processing, or, Archivist on a Pale Horse
ERIC Educational Resources Information Center
Cox, Robert S.
2010-01-01
With the promise of greater economy in handling an ever-increasing volume of material, minimal processing has quickly become a new orthodoxy for the archival profession despite a raft of unintended consequences for service and discovery. Taking a long-term view of the costs and benefits entailed in the process of processing, the three-stage…
Ensuring Safety of Navigation: A Three-Tiered Approach
NASA Astrophysics Data System (ADS)
Johnson, S. D.; Thompson, M.; Brazier, D.
2014-12-01
The primary responsibility of the Hydrographic Department at the Naval Oceanographic Office (NAVOCEANO) is to support US Navy surface and sub-surface Safety of Navigation (SoN) requirements. These requirements are interpreted, surveys are conducted, and accurate products are compiled and archived for future exploitation. For a number of years NAVOCEANO has employed a two-tiered database structure to support SoN. The first tier (the Data Warehouse, or DWH) provides access to the full-resolution sonar and lidar data; the DWH preserves the original data such that a product at any scale can be built. The second tier (the Digital Bathymetric Database - Variable resolution, or DBDB-V) serves as the final archive for SoN chart-scale gridded products compiled from source bathymetry. DBDB-V has been incorporated into numerous DoD tactical decision aids and serves as the foundation bathymetry for ocean modeling. With the evolution of higher-density survey systems and the addition of requirements for high-resolution gridded bathymetry products, the two-tiered model no longer provided an efficient solution for SoN, since it required scientists to exploit the full-resolution data in order to build any higher-resolution product. A new perspective on the archival and exploitation of source data was required. This new perspective has taken the form of a third tier, the Navigation Surface Database (NSDB). NSDB is an SQLite relational database populated with International Hydrographic Organization (IHO) S-102 compliant Bathymetric Attributed Grids (BAGs). BAGs archived within NSDB are developed at the highest resolution the collection sensor system can support and contain nodal estimates for depth, uncertainty, and separation values, along with metadata. Gridded surface analysis efforts culminate in the generation of the source-resolution BAG files and their storage within NSDB. Exploitation of these resources eliminates the time and effort needed to re-grid and re-analyze native source file formats.
NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2015-01-01
NASA's Earth Science Data Systems (ESDS) Program has evolved over the last two decades, and currently has several core and community components. Core components provide the basic operational capabilities to process, archive, manage and distribute data from NASA missions. Community components provide a path for peer-reviewed research in Earth Science Informatics to feed into the evolution of the core components. The Earth Observing System Data and Information System (EOSDIS) is a core component consisting of twelve Distributed Active Archive Centers (DAACs) and eight Science Investigator-led Processing Systems spread across the U.S. The presentation covers how the ESDS Program continues to evolve and benefits from as well as contributes to advances in Earth Science Informatics.
The TESS science processing operations center
NASA Astrophysics Data System (ADS)
Jenkins, Jon M.; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd; Smith, Jeffrey C.; Caldwell, Douglas A.; Chacon, A. D.; Henze, Christopher; Heiges, Cory; Latham, David W.; Morgan, Edward; Swade, Daryl; Rinehart, Stephen; Vanderspek, Roland
2016-08-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in early 2018 and is expected to discover 1,000 small planets with Rp < 4 R⊕ and measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed at NASA Ames Research Center based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NASA Advanced Supercomputing Division's Pleiades supercomputer. The SPOC will also search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes (MAST).
The NOAA-NASA CZCS Reanalysis Effort
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; Conkright, Margarita E.; OReilly, John E.; Patt, Frederick S.; Wang, Meng-Hua; Yoder, James; Casey-McCabe, Nancy; Koblinsky, Chester J. (Technical Monitor)
2001-01-01
Satellite observations of global ocean chlorophyll span over two decades. However, incompatibilities between processing algorithms prevent us from quantifying natural variability. We applied a comprehensive reanalysis to the Coastal Zone Color Scanner (CZCS) archive, called the NOAA-NASA CZCS Reanalysis (NCR) Effort. NCR consisted of 1) algorithm improvement (AI), where CZCS processing algorithms were improved using modernized atmospheric correction and bio-optical algorithms, and 2) blending, where in situ data were incorporated into the CZCS AI to minimize residual errors. The results indicated major improvement over the previously available CZCS archive. Global spatial and seasonal patterns of NCR chlorophyll indicated remarkable correspondence with modern sensors, suggesting compatibility. The NCR permits quantitative analyses of interannual and interdecadal trends in global ocean chlorophyll.
Video Analytics for Indexing, Summarization and Searching of Video Archives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trease, Harold E.; Trease, Lynn L.
This paper will be submitted to the proceedings of the Eleventh IASTED International Conference on Signal and Image Processing. Given a video or video archive, how does one effectively and quickly summarize, classify, and search the information contained within the data? This paper addresses these issues by describing a process for the automated generation of a table of contents and of keyword, topic-based index tables that can be used to catalogue, summarize, and search large amounts of video data. The ability to index and search the information contained within the videos, beyond just metadata tags, provides a mechanism to extract and identify "useful" content from image and video data.
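A first step in such a pipeline is segmenting the video into shots that can anchor table-of-contents entries. The sketch below does this with simple frame differencing in OpenCV; the file name and threshold are assumptions, and the system described may well use different features.

```python
# Sketch of shot-boundary detection by frame differencing (OpenCV).
# "lecture.mp4" and the threshold are placeholders, not the paper's system.
import cv2

def shot_boundaries(path, threshold=30.0):
    cap = cv2.VideoCapture(path)
    prev, cuts, idx = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None and cv2.absdiff(gray, prev).mean() > threshold:
            cuts.append(idx)          # large mean change: likely a cut
        prev, idx = gray, idx + 1
    cap.release()
    return cuts

print(shot_boundaries("lecture.mp4"))
```

Each detected shot can then be represented by a keyframe and fed to keyword or topic extraction to build the index tables.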
The COSPAR Capacity Building Programme
NASA Astrophysics Data System (ADS)
Gabriel, C.
2016-08-01
The provision of scientific data archives and analysis tools by diverse institutions around the world represents a unique opportunity for the development of scientific activities. An example is the European Space Agency's space observatory XMM-Newton, with its Science Operations Centre at the European Space Astronomy Centre near Madrid, Spain. Through its science archive and web pages it provides not only the raw and processed data from the mission, but also analysis tools and full documentation, greatly helping their dissemination and use. These data and tools, freely accessible to anyone in the world, are the practical elements around which the COSPAR (COmmittee on SPAce Research) Capacity Building Workshops have been conceived, developed, and held for a decade and a half in developing countries. The Programme started with X-ray workshops but has since been broadened to the most diverse space science areas. The workshops help to develop science at the highest level in those countries, in a lasting and sustainable way, with a minimal investment (a computer plus a moderate Internet connection). In this paper we discuss the basis, concepts, and achievements of the Capacity Building Programme. Two instances of the Programme have already taken place in Argentina, one devoted to X-ray astronomy and another to infrared astronomy. Several others have been organised for the Latin American region (Brazil, Uruguay and Mexico) with a large participation of young investigators from Argentina.
Anatomy of an Extensible Open Source PACS.
Valente, Frederico; Silva, Luís A Bastião; Godinho, Tiago Marques; Costa, Carlos
2016-06-01
The conception and deployment of cost effective Picture Archiving and Communication Systems (PACS) is a concern for small to medium medical imaging facilities, research environments, and developing countries' healthcare institutions. Financial constraints and the specificity of these scenarios contribute to a low adoption rate of PACS in those environments. Furthermore, with the advent of ubiquitous computing and new initiatives to improve healthcare information technologies and data sharing, such as IHE and XDS-i, a PACS must adapt quickly to changes. This paper describes Dicoogle, a software framework that enables developers and researchers to quickly prototype and deploy new functionality taking advantage of the embedded Digital Imaging and Communications in Medicine (DICOM) services. This full-fledged implementation of a PACS archive is very amenable to extension due to its plugin-based architecture and out-of-the-box functionality, which enables the exploration of large DICOM datasets and associated metadata. These characteristics make the proposed solution very interesting for prototyping, experimentation, and bridging functionality with deployed applications. Besides being an advanced mechanism for data discovery and retrieval based on DICOM object indexing, it enables the detection of inconsistencies in an institution's data and processes. Several use cases have benefited from this approach such as radiation dosage monitoring, Content-Based Image Retrieval (CBIR), and the use of the framework as support for classes targeting software engineering for clinical contexts.
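The plugin-based extension idea can be illustrated with a toy registry in which the archive core delegates indexing and querying to registered plugins. This is a generic Python sketch of the architectural pattern only, not Dicoogle's actual (Java) plugin API.

```python
# Toy illustration of a plugin-based archive core (not Dicoogle's real API).
from typing import Callable, Dict, List

class ArchiveCore:
    def __init__(self):
        self.indexers: Dict[str, Callable[[dict], None]] = {}
        self.queriers: Dict[str, Callable[[str], List[dict]]] = {}

    def register(self, name, indexer, querier):
        self.indexers[name], self.queriers[name] = indexer, querier

    def index(self, metadata: dict):
        for indexer in self.indexers.values():   # fan out to every plugin
            indexer(metadata)

    def query(self, plugin: str, expression: str) -> List[dict]:
        return self.queriers[plugin](expression)

store: List[dict] = []
core = ArchiveCore()
core.register("toy-index",
              indexer=store.append,
              querier=lambda q: [m for m in store if q in m.get("Modality", "")])
core.index({"SOPInstanceUID": "1.2.3", "Modality": "CT"})
print(core.query("toy-index", "CT"))
```

The design point is that the core never hard-codes an index or query engine; new functionality arrives as plugins registered against stable interfaces.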
NASA Astrophysics Data System (ADS)
Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.
2014-12-01
The increasing use of mobile phones equipped with digital cameras, together with the ability to post images and information to the Internet in real time, has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information to a weather forecaster issuing severe weather warnings than a simple text or phone-relayed report, and it allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm-report photographs to NWS weather forecast office social media pages using a mobile phone application has generated positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, a processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) processing and distribution software and hardware, and iii) AWIPS-II data, exchange, and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time-stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network-bandwidth-manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times, and directions by hour, similar to surface observations. Hovering on an individual PSR would reveal a photo thumbnail, and clicking on it would display the full-resolution photograph. Here, we present initial NWS forecaster feedback received from PSRs posted to social media, motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.
Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.
2006-01-01
In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
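Archived SEG-Y files of this kind can be inspected with nothing more than the Python standard library: the 400-byte binary file header follows the 3200-byte EBCDIC textual header, with the sample interval and samples-per-trace at fixed offsets (SEG-Y rev 0 layout). The file name below is a placeholder, not one of the archived lines.

```python
# Standard-library sketch of reading a SEG-Y binary file header.
import struct

def segy_binary_header(path):
    with open(path, "rb") as f:
        f.seek(3200)           # skip the 3200-byte EBCDIC textual header
        hdr = f.read(400)      # 400-byte binary file header
    dt_us, = struct.unpack(">h", hdr[16:18])  # bytes 3217-3218: sample interval (us)
    nsamp, = struct.unpack(">h", hdr[20:22])  # bytes 3221-3222: samples per trace
    return dt_us, nsamp

dt, ns = segy_binary_header("line01.sgy")    # placeholder file name
print(f"sample interval {dt} us, {ns} samples per trace")
```

Full processing would proceed with packages such as Seismic Unix, as the report suggests; the header check above is just a quick sanity test before committing to a processing flow.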
NASA Technical Reports Server (NTRS)
Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve
2008-01-01
NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.
NASA Astrophysics Data System (ADS)
Tapete, Deodato; Cigna, Francesca; Donoghue, Daniel N. M.; Philip, Graham
2015-05-01
With spaceborne radar science at a turning point following the recent launch of Sentinel-1A, we investigate how to better exploit the opportunities offered by large C-band SAR archives and growing datasets of HR to VHR X-band data to map changes and damage in urban and rural areas affected by conflicts. We implement a dual approach coupling multi-interferogram processing and amplitude change detection to assess the impact of the recent civil war on the city of Homs, Western Syria, and the surrounding semi-arid landscape. More than 280,000 coherent pixels are retrieved from Small BAseline Subset (SBAS) processing of the 8-year-long ENVISAT ASAR IS2 archive to quantify land subsidence due to pre-war water abstraction in rural areas. Damage in Homs is detected by analysing changes in SAR backscatter (σ0), comparing 3 m resolution StripMap TerraSAR-X pairs from 2009 to 2014. Pre-war alteration is differentiated from war-related damage via operator-driven interpretation of the σ0 patterns.
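Amplitude change detection of this sort is commonly built on the log-ratio operator between co-registered backscatter images. The sketch below illustrates the principle on synthetic data; the 3 dB threshold is an arbitrary assumption, not a value from the study.

```python
# Sketch of SAR amplitude change detection via the log-ratio operator.
import numpy as np

def log_ratio_change(sigma0_pre, sigma0_post, threshold_db=3.0):
    ratio_db = 10.0 * np.log10(sigma0_post / sigma0_pre)
    return np.abs(ratio_db) > threshold_db   # True where change exceeds 3 dB

rng = np.random.default_rng(2)
pre = rng.gamma(4.0, 0.05, (256, 256))       # synthetic pre-event backscatter
post = pre.copy()
post[100:120, 100:140] *= 0.2                # simulated collapse (darkening)
print(f"{log_ratio_change(pre, post).sum()} changed pixels")
```

In practice the ratio map would be despeckled and the threshold tuned per scene; the operator-driven interpretation the authors describe replaces a fixed global threshold.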
Applying Service-Oriented Architecture to Archiving Data in Control and Monitoring Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nogiec, J. M.; Trombly-Freytag, K.
Current trends in the architecture of software systems focus our attention on building systems from a set of loosely coupled components, each providing a specific functionality known as a service. It is not much different in control and monitoring systems, where functionally distinct sub-systems can be identified and independently designed, implemented, deployed, and maintained. One functionality that lends itself perfectly to becoming a service is archiving the history of the system state. The design of such a service and our experience of using it are the topic of this article. The service is built with responsibility segregation in mind; it therefore provides for reduced data processing on the data-viewer side and for the separation of data access and modification operations. The service architecture and the details of its data store design are discussed. An implementation of a service client capable of archiving EPICS process variables (PVs) and LabVIEW shared variables is presented. Data access tools, including a browser-based data viewer and a mobile viewer, are also presented.
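On the client side, archiving EPICS process variables typically reduces to subscribing monitor callbacks that hand timestamped samples to the service. A minimal sketch using the real pyepics bindings follows; the PV names are placeholders, and the print call stands in for the actual write to the archiving service, whose interface the article describes.

```python
# Sketch of an EPICS archiving client using pyepics; PV names are
# placeholders and print() stands in for posting to the archive service.
import time
from epics import PV

def archive_sample(pvname=None, value=None, timestamp=None, **kw):
    # A real client would queue this sample for the archiving service.
    print(f"{pvname} @ {timestamp}: {value}")

pvs = [PV(name, callback=archive_sample)
       for name in ("DEMO:BEAM:CURRENT", "DEMO:MAGNET:TEMP1")]

time.sleep(60)   # let monitors run; the callback fires on every PV update
```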
NASA Astrophysics Data System (ADS)
Roederer, Ian U.; Karakas, Amanda I.; Pignatari, Marco; Herwig, Falk
2016-04-01
We present a detailed analysis of the composition and nucleosynthetic origins of the heavy elements in the metal-poor ([Fe/H] = -1.62 ± 0.09) star HD 94028. Previous studies revealed that this star is mildly enhanced in elements produced by the slow neutron-capture process (s process; e.g., [Pb/Fe] = +0.79 ± 0.32) and rapid neutron-capture process (r process; e.g., [Eu/Fe] = +0.22 ± 0.12), including unusually large molybdenum ([Mo/Fe] = +0.97 ± 0.16) and ruthenium ([Ru/Fe] = +0.69 ± 0.17) enhancements. However, this star is not enhanced in carbon ([C/Fe] = -0.06 ± 0.19). We analyze an archival near-ultraviolet spectrum of HD 94028, collected using the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, and other archival optical spectra collected from ground-based telescopes. We report abundances or upper limits derived from 64 species of 56 elements. We compare these observations with s-process yields from low-metallicity AGB evolution and nucleosynthesis models. No combination of s- and r-process patterns can adequately reproduce the observed abundances, including the super-solar [As/Ge] ratio (+0.99 ± 0.23) and the enhanced [Mo/Fe] and [Ru/Fe] ratios. We can fit these features when including an additional contribution from the intermediate neutron-capture process (I process), which perhaps operated through the ingestion of H in He-burning convective regions in massive stars, super-AGB stars, or low-mass AGB stars. Currently, only the I process appears capable of consistently producing the super-solar [As/Ge] ratios and ratios among neighboring heavy elements found in HD 94028. Other metal-poor stars also show enhanced [As/Ge] ratios, hinting that operation of the I process may have been common in the early Galaxy. These data are associated with Program 072.B-0585(A), PI. Silva. Some data presented in this paper were obtained from the Barbara A. Mikulski Archive for Space Telescopes (MAST). The Space Telescope Science Institute is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555. These data are associated with Programs GO-7402 and GO-8197. This work is based on data obtained from the European Southern Observatory (ESO) Science Archive Facility. These data are associated with Program 072.B-0585(A). This paper includes data taken at The McDonald Observatory of The University of Texas at Austin.
Applying modern imaging techniques to old HST data: a summary of the ALICE program.
NASA Astrophysics Data System (ADS)
Choquet, Elodie; Soummer, Remi; Perrin, Marshall; Pueyo, Laurent; Hagan, James Brendan; Zimmerman, Neil; Debes, John Henry; Schneider, Glenn; Ren, Bin; Milli, Julien; Wolff, Schuyler; Stark, Chris; Mawet, Dimitri; Golimowski, David A.; Hines, Dean C.; Roberge, Aki; Serabyn, Eugene
2018-01-01
Direct imaging of extrasolar systems is a powerful technique for studying the physical properties of exoplanetary systems and understanding their formation and evolution mechanisms. The detection and characterization of these objects are challenged by their high contrast with their host star. Several observing strategies and post-processing algorithms have been developed for ground-based high-contrast imaging instruments, enabling the discovery of directly imaged and spectrally characterized exoplanets. The Hubble Space Telescope (HST), a pioneer in directly imaging extrasolar systems, has often been limited to the detection of bright debris disk systems, with sensitivity limited by the difficulty of implementing an optimal PSF subtraction strategy of the kind readily available on ground-based telescopes in pupil-tracking mode. The Archival Legacy Investigations of Circumstellar Environments (ALICE) program is a consistent re-analysis of the decade-old coronagraphic archive of HST's NICMOS infrared imager. Using post-processing methods developed for ground-based observations, we used the whole archive to calibrate PSF temporal variations and improve NICMOS's detection limits. We have now delivered ALICE-reprocessed science products for the entire NICMOS archive back to the community. These science products, as well as the ALICE pipeline, were used to prototype the JWST coronagraphic data and reduction pipeline. The ALICE program has enabled the detection of 10 faint debris disk systems never before imaged in the near-infrared and of several substellar companion candidates, all of which we are in the process of characterizing through follow-up observations with both ground-based facilities and HST-STIS coronagraphy. In this publication, we provide a summary of the results of the ALICE program, advertise its science products, and discuss the prospects of the program.
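The ground-based post-processing methods referred to are typically PCA-based PSF subtraction (KLIP-like): project the science frame onto the leading principal components of a reference-PSF library and subtract the projection. The numpy sketch below shows the core operation on synthetic frames; it illustrates the general technique, not the exact ALICE implementation.

```python
# Minimal sketch of KLIP-like PCA PSF subtraction on synthetic frames.
import numpy as np

def pca_psf_subtract(science, references, k=5):
    """science: (ny, nx) frame; references: (nref, ny, nx) PSF library."""
    nref, ny, nx = references.shape
    R = references.reshape(nref, -1)
    mean = R.mean(axis=0)
    _, _, Vt = np.linalg.svd(R - mean, full_matrices=False)
    Z = Vt[:k]                          # top-k eigen-PSFs
    s = science.ravel() - mean
    model = Z.T @ (Z @ s)               # projection onto the PSF modes
    return (s - model).reshape(ny, nx)  # residual: star removed, planet kept

rng = np.random.default_rng(3)
refs = rng.normal(size=(20, 64, 64))
sci = refs[0] + 0.1 * rng.normal(size=(64, 64))
print(pca_psf_subtract(sci, refs, k=5).std())
```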
NASA Astrophysics Data System (ADS)
Stevens, S. E.; Nelson, B. R.; Langston, C.; Qi, Y.
2012-12-01
The National Mosaic and Multisensor QPE (NMQ/Q2) software suite, developed at NOAA's National Severe Storms Laboratory (NSSL) in Norman, OK, addresses a large deficiency in the resolution of currently archived precipitation datasets. Current standards, both radar- and satellite-based, provide nationwide precipitation data with a spatial resolution of at best 4-5 km and a temporal resolution as fine as one hour. Efforts are ongoing to process archived NEXRAD data for the period of record (1996-present), producing a continuous dataset with a spatial resolution of 1 km on a timescale of only five minutes. In addition, radar-derived precipitation data are adjusted hourly using a wide variety of automated gauge networks spanning the United States. Applications for such a product range widely, from emergency management and flash flood guidance to hydrological studies and drought monitoring. Results are presented from a subset of the NEXRAD dataset, providing basic statistics on the distribution of rain rates, the relative frequency of precipitation types, and several other variables that demonstrate the variety of output provided by the software. Precipitation data from select case studies are also presented to highlight the increased resolution provided by this reanalysis and the possibilities that arise from the availability of data on such fine scales. A previously completed pilot project and steps toward a nationwide implementation are presented along with proposed strategies for managing and processing such a large dataset. Reprocessing efforts span several institutions in both North Carolina and Oklahoma, and data and software coordination are key to producing a homogeneous record of precipitation to be archived alongside NOAA's other Climate Data Records. Methods are presented for utilizing supercomputing capability to expedite processing and allow for the iterative nature of a reanalysis effort.
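Hourly gauge adjustment of radar QPE is often done with a mean-field bias factor. The sketch below illustrates that simple ratio estimator on synthetic values; the form of the correction is an assumption for illustration, not the NMQ/Q2 algorithm itself.

```python
# Sketch of a mean-field bias adjustment of radar QPE against rain gauges.
import numpy as np

def mean_field_bias(radar_at_gauges, gauge_accum, min_pairs=5):
    valid = (radar_at_gauges > 0) & (gauge_accum > 0)
    if valid.sum() < min_pairs:
        return 1.0                      # too few pairs: leave field unadjusted
    return gauge_accum[valid].sum() / radar_at_gauges[valid].sum()

radar = np.array([2.1, 0.0, 5.4, 3.3, 1.2, 7.8])  # radar QPE at gauges (mm)
gauge = np.array([2.5, 0.0, 6.0, 3.9, 1.5, 8.6])  # gauge accumulations (mm)
bias = mean_field_bias(radar, gauge)
adjusted_field = bias * radar                     # apply to the full grid
print(f"bias factor {bias:.3f}")
```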
Characterization and Validation of Transiting Planets in the TESS SPOC Pipeline
NASA Astrophysics Data System (ADS)
Twicken, Joseph D.; Caldwell, Douglas A.; Davies, Misty; Jenkins, Jon Michael; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Wohler, Bill
2018-06-01
Light curves for Transiting Exoplanet Survey Satellite (TESS) target stars will be extracted and searched for transiting planet signatures in the Science Processing Operations Center (SPOC) Science Pipeline at NASA Ames Research Center. Targets for which the transiting planet detection threshold is exceeded will be processed in the Data Validation (DV) component of the Pipeline. The primary functions of DV are to (1) characterize planets identified in the transiting planet search, (2) search for additional transiting planet signatures in light curves after modeled transit signatures have been removed, and (3) perform a comprehensive suite of diagnostic tests to aid in discrimination between true transiting planets and false positive detections. DV data products include extensive reports by target, one-page summaries by planet candidate, and tabulated transit model fit and diagnostic test results. DV products may be employed by humans and automated systems to vet planet candidates identified in the Pipeline. TESS will launch in 2018 and survey the full sky for transiting exoplanets over a period of two years. The SPOC pipeline was ported from the Kepler Science Operations Center (SOC) codebase and extended for TESS after the mission was selected for flight in the NASA Astrophysics Explorer program. We describe the Data Validation component of the SPOC Pipeline. The diagnostic tests exploit the flux (i.e., light curve) and pixel time series associated with each target to support the determination of the origin of each purported transiting planet signature. We also highlight the differences between the DV components for Kepler and TESS. Candidate planet detections and data products will be delivered to the Mikulski Archive for Space Telescopes (MAST); the MAST URL is archive.stsci.edu/tess. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.
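The transiting planet search underlying such a pipeline can be illustrated with astropy's BoxLeastSquares, a real class, though not necessarily what the SPOC uses internally. The light curve below is synthetic, with an injected 3.5-day transit.

```python
# Sketch of a box-least-squares transit search on a synthetic light curve.
import numpy as np
from astropy.timeseries import BoxLeastSquares

t = np.linspace(0, 27, 2000)                       # days: roughly one sector
flux = 1.0 + 1e-4 * np.random.default_rng(4).normal(size=t.size)
flux[(t % 3.5) < 0.1] -= 0.002                     # injected 3.5 d transits

bls = BoxLeastSquares(t, flux)
periods = np.linspace(0.5, 15.0, 5000)
result = bls.power(periods, 0.1)                   # 0.1 d trial duration
best = periods[np.argmax(result.power)]
print(f"strongest periodic transit signature near {best:.3f} d")
```

A threshold-crossing event like this would then flow into DV for model fitting and the diagnostic tests the abstract describes.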
Application of Remote Sensing for the Analysis of Environmental Changes in Albania
NASA Astrophysics Data System (ADS)
Frasheri, N.; Beqiraj, G.; Bushati, S.; Frasheri, A.
2016-08-01
This paper presents a review of remote sensing studies carried out to investigate environmental changes in Albania. Using often simple methodologies and general-purpose image processing software, and exploiting free Internet archives of satellite imagery, significant results were obtained for hot spots of environmental change. Such areas include sea coasts experiencing marine transgression, temporal variations of vegetation and aerosols, lakes, landslides, and regional tectonics. The Internet archives of the European Space Agency (ESA) and the U.S. Geological Survey (USGS) are used.
Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys
NASA Astrophysics Data System (ADS)
Hoffmann, Samantha L.; Avila, Roberto J.
2017-06-01
The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic-ray rejection with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains the list of relevant parameters that control this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.
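For reference, this is roughly how a MAST-style drizzling step is driven from Python with the real drizzlepac package, letting the MDRIZTAB reference table supply the parameter values; the input pattern and output name below are placeholders, not actual archive file names.

```python
# Sketch of invoking AstroDrizzle with MDRIZTAB-supplied parameters.
from drizzlepac import astrodrizzle

astrodrizzle.AstroDrizzle(
    input="*_flc.fits",     # ACS/WFC calibrated frames (placeholder pattern)
    output="example_drz",   # placeholder output root name
    mdriztab=True,          # read AstroDrizzle parameters from the MDRIZTAB table
)
```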
Extending the role of a healthcare digital library environment to support orthopaedic research.
Miles-Board, Timothy; Carr, Leslie; Wills, Gary; Power, Guillermo; Bailey, Christopher; Hall, Wendy; Stenning, Matthew; Grange, Simon
2006-06-01
A digital archive, together with its users and its contents, does not exist in isolation; there is a cycle of activities which provides the context for the archive's existence. In arguing for broadening the traditional view of digital libraries as mere collections towards the processes of collecting and deploying, we have developed an extended digital library environment for orthopaedic surgeons which bridges the gap between the undertaking of experimental work and the dissemination of its results through electronic publication.
2007-04-01
the underlying parameters are available. Standard data format. Battelle, SAGEM Avionics, and Austin Digital, Inc. agreed upon a standard data format...data was initiated at four airlines by SAGEM Avionics beginning January 1, 2006. Transfer was initiated at one airline by Austin Digital, Inc...internal issues have been resolved. As of April 0, 2006, more than 124,000 flights have been transferred to local archive servers by SAGEM and over
Kyiv UkrVO glass archives: new life
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Golovnya, V.; Andruk, V.; Shatokhina, S.; Yizhakevych, O.; Kazantseva, L.; Lukianchuk, V.
Within the framework of the UkrVO national project, new methods of digital processing of plate images have been developed. The photographic material of the UkrVO Joint Digital Archive (JDA) is used to solve the classic astrometric problems: positional and photometric determination of the objects registered on the plates. The results of the tested methods show that the positional rms errors are better than ±150 mas in both coordinates, and the photometric ones are better than ±0.20 mag, with the Tycho-2 catalogue as reference.
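The positional step of such a reduction classically fits linear plate constants by least squares, mapping measured (x, y) positions of Tycho-2 reference stars to their standard coordinates. The sketch below uses synthetic data and an assumed six-constant (affine) model, which may differ from the full model the project applies.

```python
# Sketch of a six-constant linear plate solution by least squares.
import numpy as np

def fit_plate_constants(x, y, xi, eta):
    """Solve xi = a*x + b*y + c and eta = d*x + e*y + f."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef_xi, *_ = np.linalg.lstsq(A, xi, rcond=None)
    coef_eta, *_ = np.linalg.lstsq(A, eta, rcond=None)
    return coef_xi, coef_eta

def apply_plate_constants(x, y, coef_xi, coef_eta):
    A = np.column_stack([x, y, np.ones_like(x)])
    return A @ coef_xi, A @ coef_eta

rng = np.random.default_rng(5)
x, y = rng.uniform(0, 10000, 50), rng.uniform(0, 10000, 50)  # measured pixels
xi = 1.7e-4 * x + 2e-6 * y + 0.3 + rng.normal(0, 4e-5, 50)   # reference coords
eta = -2e-6 * x + 1.7e-4 * y - 0.1 + rng.normal(0, 4e-5, 50)
cx, ce = fit_plate_constants(x, y, xi, eta)
print(apply_plate_constants(np.array([5000.0]), np.array([5000.0]), cx, ce))
```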
Characterizing the LANDSAT Global Long-Term Data Record
NASA Technical Reports Server (NTRS)
Arvidson, T.; Goward, S. N.; Williams, D. L.
2006-01-01
The effects of global climate change are fast becoming politically, sociologically, and personally important: increasing storm frequency and intensity, lengthening cycles of drought and flood, expanding desertification and soil salinization. A vital asset in the analysis of climate change on a global basis is the 34-year record of Landsat imagery. In recognition of its increasing importance, a detailed analysis of the Landsat observation coverage within the US archive was commissioned. Results to date indicate some unexpected gaps in the US-held archive. Fortunately, throughout the Landsat program, data have been downlinked routinely to International Cooperator (IC) ground stations for archival, processing, and distribution. These IC data could be combined with the current US holdings to build a nearly global, annual observation record over this 34-year period. Today, we have inadequate information as to which scenes are available from which IC archives. Our best estimate is that there are over four million digital scenes in the IC archives, compared with the nearly two million scenes held in the US archive. This vast pool of Landsat observations needs to be accurately documented, via metadata, to determine the existence of complementary scenes and to characterize the potential scope of the global Landsat observation record. Of course, knowing the extent and completeness of the data record is but the first step. It will be necessary to assure that the data record is easy to use, internally consistent in terms of calibration and data format, and fully accessible in order to fully realize its potential.
Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case
NASA Astrophysics Data System (ADS)
Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.
2013-10-01
The Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling, and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must follow robust logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations through to final calibrated ones, open to the overriding of different sources, formats, management fields, and storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files) but can be easily updated through a planned Archiving System Configuration Interface (ASCI).
Edward Bermingham and the Archives of Clinical Surgery: America's first surgical journal.
Rutkow, Ira
2015-04-01
To explore the life of Edward J. Bermingham (1853-1922) and his founding, in 1876, of the Archives of Clinical Surgery, the nation's first surgical journal. Beginning in the 1870s, American medicine found itself in the middle of a revolution marked by fundamental economic, scientific, and social transformations. For those physicians who wanted to be regarded as surgeons, the push toward specialization was foremost among these changes. The rise of surgery as a specialty was accomplished through various new initiatives; among them was the development of dedicated literature in the form of specialty journals to disseminate news of surgical research and technical innovations in a timely fashion. An analysis of the published medical and lay literature and unpublished documents relating to Edward J. Bermingham and the Archives of Clinical Surgery. At a time when surgery was not considered a separate branch of medicine but a mere technical mode of treatment, Bermingham's publication of the Archives of Clinical Surgery was a milestone event in the ensuing rise of surgery as a specialty within the whole of medicine. The long forgotten Archives of Clinical Surgery provides a unique window into the world of surgery, as it existed when the medical revolution and the process of specialization were just beginning. For this reason, the Archives is among the more important primary resources with which to gain an understanding of prescientific surgery as it reached its endpoint in America.
NASA Astrophysics Data System (ADS)
Helly, M.; Massell Symons, C.; Reining, J.; Staudigel, H.; Koppers, A.; Helly, J.; Miller, S.
2005-12-01
The Enduring Resources for Earth Science Education (ERESE) project has now held two professional development workshops to teach and apply the five-stage inquiry lesson model for teaching plate tectonics. This development is based on a collaborative effort between earth scientists, educators, librarians, and data archive managers, and works toward a classroom practice that focuses on transferring ownership of a classroom inquiry to the learner. The ERESE inquiry model features a modular, five-stage approach: (1) a thoughtful orientation to create an environment of physical and intellectual safety for the learner; (2) a carefully chosen provocative phenomenon used to allow the learner to develop a wide range of scientific questions; (3) a debriefing that reviews and honors the learners' questions, along with the development of a testable hypothesis; (4) consultation of the ERESE resource matrices and the Internet to obtain data and other information to test the hypothesis; and (5) presentation of the results by the learners. The process of ERESE inquiry lessons is guided by a master template and involves a detailed teacher's log for documentation of all activities. All products of the process are archived. The master template and teacher's log are designed in a modular fashion that will ultimately accommodate a wide range of inquiry lesson styles and the variety of resources available to support the process. Key ERESE modules include: (1) a master template that provides a framework for lesson development, (2) provocative phenomena for question generation and hypothesis development by the learner, (3) the ERESE resource matrix (which archives text, images, and data by expert level for a wide range of scientific questions), and (4) a reflective essay that monitors the transfer of ownership to the learner. The modular design of ERESE products allows for the archiving of specific types of materials that can be independently accessed and applied to different inquiry styles. The broad appeal is an important step toward a more general product for inquiry-based teaching.
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1994-01-01
Work on this project was focused on information management techniques for Marshall Space Flight Center's EOSDIS Version 0 Distributed Active Archive Center (DAAC). The centerpiece of this effort has been participation in EOSDIS catalog interoperability research, the result of which is a distributed Information Management System (IMS) allowing the user to query the inventories of all the DAAC's from a single user interface. UAH has provided the MSFC DAAC database server for the distributed IMS, and has contributed to definition and development of the browse image display capabilities in the system's user interface. Another important area of research has been in generating value-based metadata through data mining. In addition, information management applications for local inventory and archive management, and for tracking data orders were provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prather, J. C.; Smith, S. K.; Watson, C. R.
The National Radiobiology Archives is a comprehensive effort to gather, organize, and catalog original data, representative specimens, and supporting materials related to significant radiobiology studies. This provides researchers with information for analyses which compare or combine results of these and other studies, and with materials for analysis by advanced molecular biology techniques. This Programmer's Guide describes the database access software, NRADEMO, and the subset loading script NRADEMO/MAINT/MAINTAIN, which comprise the National Laboratory Archives Distributed Access Package. The guide is intended for use by an experienced database management specialist. It contains information about the physical and logical organization of the software and data files, as well as printouts of all the scripts and associated batch processing files. It is part of a suite of documents published by the National Radiobiology Archives.
Restoration of Apollo Data by the NSSDC and the PDS Lunar Data Node
NASA Technical Reports Server (NTRS)
Williams, David R.; Hills, H. Kent; Lowman, Paul D.; Taylor, Patrick T.; Guinness, Edward A.
2011-01-01
The Lunar Data Node (LDN), under the auspices of the Geosciences Node of the Planetary Data System (PDS), is restoring Apollo data archived at the National Space Science Data Center. The Apollo data were archived on older media (7-track tapes, microfilm, microfiche) and in obsolete digital formats, which limits use of the data. The LDN is making these data accessible by restoring them to standard formats and archiving them through PDS. The restoration involves reading the older media, collecting supporting data (metadata), deciphering and understanding the data, and organizing them into a data set. The data undergo a peer review before archiving at PDS. We will give an update on last year's work. We have scanned notebooks from Otto Berg, P.I. for the Lunar Ejecta and Meteorites Experiment. These notebooks contain information on the data and calibration coefficients which we hope to be able to use to restore the raw data into a usable archive. We have scanned Apollo 14 and 15 Dust Detector data from microfilm and are in the process of archiving the scans with PDS. We are also restoring raw dust detector data from magnetic tape supplied by Yosio Nakamura (UT Austin). Seiichi Nagihara (Texas Tech Univ.) and others, in cooperation with NSSDC, are recovering ARCSAV tapes (tapes containing raw data streams from all the ALSEP instruments). We will be preparing these data for archive with PDS. We are also in the process of recovering and archiving data not previously archived from the Apollo 16 Gamma Ray Spectrometer and the Apollo 17 Infrared Spectrometer.
A Waveform Archiving System for the GE Solar 8000i Bedside Monitor.
Fanelli, Andrea; Jaishankar, Rohan; Filippidis, Aristotelis; Holsapple, James; Heldt, Thomas
2018-01-01
Our objective was to develop, deploy, and test a data-acquisition system for the reliable and robust archiving of high-resolution physiological waveform data from a variety of bedside monitoring devices, including the GE Solar 8000i patient monitor, and for the logging of ancillary clinical and demographic information. The data-acquisition system consists of a computer-based archiving unit and a GE Tram Rac 4A that connects to the GE Solar 8000i monitor. Standard physiological front-end sensors connect directly to the Tram Rac, which serves as a port replicator for the GE monitor and provides access to the waveform signals through an analog data interface. Together with the GE monitoring data streams, we simultaneously collect the cerebral blood flow velocity envelope from a transcranial Doppler ultrasound system and a non-invasive arterial blood pressure waveform along a common time axis. All waveform signals are digitized and archived through a LabVIEW-controlled interface that also allows for the logging of relevant metadata such as clinical and patient demographic information. The acquisition system was certified for hospital use by the clinical engineering team at Boston Medical Center, Boston, MA, USA. Over a 12-month period, we collected 57 datasets from 11 neuro-ICU patients. The system provided reliable and failure-free waveform archiving. We measured an average temporal drift between waveforms from different monitoring devices of 1 ms per 66 min of recorded data. The waveform acquisition system allows for robust real-time data acquisition, processing, and archiving of waveforms, and the temporal drift between waveforms archived from different devices is entirely negligible, even for long-term recording.
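A temporal drift of this kind can be estimated by cross-correlating a waveform recorded through both acquisition paths and tracking the lag of the correlation peak over time. The sketch below demonstrates the idea on synthetic signals with an assumed 1 kHz sampling rate; it illustrates the measurement principle, not the authors' exact procedure.

```python
# Sketch of inter-device offset estimation via cross-correlation.
import numpy as np

def delay_samples(a, b):
    """Samples by which b lags a (positive: b is later)."""
    xc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
    return (len(b) - 1) - int(np.argmax(xc))

fs = 1000                                  # Hz sampling rate (assumed)
t = np.arange(10 * fs) / fs
ref = np.sin(2 * np.pi * 1.2 * t)          # waveform seen by device A
drifted = np.roll(ref, 7)                  # device B copy, 7 samples later
print(f"estimated offset: {delay_samples(ref, drifted) / fs * 1e3:.1f} ms")
```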
NASA Technical Reports Server (NTRS)
Green, James L.
1989-01-01
The National Space Science Data Center (NSSDC), established in 1966, is the largest archive for processed data from NASA's space and Earth science missions. The NSSDC manages over 120,000 data tapes holding over 4,000 data sets. The digital archive is approximately 6,000 gigabytes in size, with all of the data in its original uncompressed form. By 1995 the NSSDC digital archive is expected to more than quadruple in size, reaching over 28,000 gigabytes. The NSSDC is beginning several thrusts that will allow it to better serve the scientific community and keep up with managing ever-increasing volumes of data. These thrusts involve managing larger and larger amounts of information and data online, employing mass storage techniques, and using low-rate communications networks to move requested data to remote sites in the United States, Europe, and Canada. The success of these thrusts, combined with the tremendous volume of data expected to be archived at the NSSDC, clearly indicates that innovative storage and data management solutions must be sought and implemented. Although not presently used, data compression techniques may become a very important tool for managing a large fraction or all of the NSSDC archive. Future applications would include compressing online data in order to have more data readily available, compressing requested data that must be moved over low-rate ground networks, and compressing all the digital data in the NSSDC archive for a cost-effective backup that would be used only in the event of a disaster.
Life Sciences Data Archives (LSDA) in the Post-Shuttle Era
NASA Technical Reports Server (NTRS)
Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre
2010-01-01
Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management, the systematic use and reuse of information, experience, and expertise to achieve a specific goal, are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types, and geographical locations. These data were largely unknown or unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival, and distribution of data for HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and to be responsive to inquiries from the science communities. Information about experiments and data, as well as non-attributable human data and data from other species, is available on our public Web site http://lsda.jsc.nasa.gov. The Web site also includes a repository for biospecimens and a utilization process. NASA has undertaken an initiative to develop a Shuttle Data Archive repository. The Shuttle program is nearing its end in 2010, and it is critical that the medical and research data related to the Shuttle program be captured, retained, and usable for research, lessons learned, and future mission planning. Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly. LSDA works with the HRP community of practice to ensure that we are preserving the relevant research and data they need in the LSDA repository. An evidence-based approach to risk management is required in space life sciences, and evidence changes over time. LSDA has a pilot project with Collexis, a new type of Web-based search engine. Collexis differentiates itself from full-text search engines by making use of thesauri for information retrieval. The high-quality search is based on semantics defined in a life sciences ontology. Additionally, Collexis' matching technology is unique, allowing discovery of partially matching documents. Users do not have to construct a complicated (Boolean) search query, but can simply enter a free-text search without the risk of getting "no results". Collexis may address these issues by virtue of its retrieval and discovery capabilities across multiple repositories.
Representation of viruses in the remediated PDB archive
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawson, Catherine L., E-mail: cathy.lawson@rutgers.edu; Dutta, Shuchismita; Westbrook, John D.
2008-08-01
A new scheme has been devised to represent viruses and other biological assemblies with regular noncrystallographic symmetry in the Protein Data Bank (PDB). The scheme describes existing and anticipated PDB entries of this type using generalized descriptions of deposited and experimental coordinate frames, symmetry and frame transformations. A simplified notation has been adopted to express the symmetry generation of assemblies from deposited coordinates and the matrix operations describing the required point, helical or crystallographic symmetry. Complete, correct information for building full assemblies, subassemblies and crystal asymmetric units of all virus entries is now available in the remediated PDB archive.
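The symmetry generation described here amounts to applying a list of matrix operators to the deposited coordinates. A minimal numpy sketch of that idea follows; the two-fold operator and the coordinates are hypothetical, and the real PDB scheme carries frame transformations and metadata not shown.

```python
import numpy as np

def expand_assembly(coords, operators):
    """Apply a list of 4x4 homogeneous symmetry operators to Nx3 coordinates."""
    homogeneous = np.hstack([coords, np.ones((len(coords), 1))])
    copies = [homogeneous @ op.T for op in operators]
    return np.vstack(copies)[:, :3]

# Hypothetical two-fold axis along z: identity plus a 180-degree rotation.
identity = np.eye(4)
two_fold = np.diag([-1.0, -1.0, 1.0, 1.0])

deposited = np.array([[10.0, 2.5, -3.0], [11.2, 3.1, -2.4]])
assembly = expand_assembly(deposited, [identity, two_fold])
print(assembly.shape)  # (4, 3): two symmetry copies of the deposited atoms
```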
Evolution of filmless PACS in Korea
NASA Astrophysics Data System (ADS)
Choi, Hyung-Sik
2002-05-01
The growth of the PACS (Picture Archiving and Communications System) market in Korea over the past 10 years has been a brilliant development. These achievements rest on the efforts of the Korean Society of PACS, government support for the information technology industry, and the market-expansion efforts of PACS companies, all of which served as vital nourishment in the sowing season. By the end of 2001, 21% of all Korean hospitals were in clinical operation using filmless full PACS, believed to be a world first. The purpose of this paper is to look back on the growth of filmless PACS in Korea and to analyze the causes of this tremendous growth. I believe that the Korean PACS experience will be helpful to the many PACS experts who hope for wider PACS adoption.
Xiao, Yongli; Sheng, Zong-Mei; Taubenberger, Jeffery K.
2015-01-01
The vast majority of surgical biopsy and post-mortem tissue samples are formalin-fixed and paraffin-embedded (FFPE), but this process leads to RNA degradation that limits gene expression analysis. As an example, the viral RNA genome of the 1918 pandemic influenza A virus was previously determined in a 9-year effort by overlapping RT-PCR from post-mortem samples. Using the protocols described here, the full genome of the 1918 virus at high coverage was determined in one high-throughput sequencing run of a cDNA library derived from total RNA of a 1918 FFPE sample after duplex-specific nuclease treatments. This basic methodological approach should assist in the analysis of FFPE tissue samples isolated over the past century from a variety of infectious diseases. PMID:26344216
NASA Technical Reports Server (NTRS)
Price, Robert D.; Pedelty, Kathleen S.; Ardanuy, Philip E.; Hobish, Mitchell K.
1993-01-01
In order to manage the global data sets required to understand the earth as a system, the EOS Data and Information System (EOSDIS) will collect and store satellite, aircraft, and in situ measurements and their resultant data products, and will distribute the data conveniently. EOSDIS will also provide product generation and science computing facilities to support the development, processing, and validation of standard EOS science data products. The overall architecture of EOSDIS, and how the Distributed Active Archive Centers fit into that structure, are shown. EOSDIS will enable users to query data bases nationally, make use of keywords and other mnemonic identifiers, and see graphic images of subsets of available data prior to ordering full (or selected pieces of) data sets for use in their 'home' environment.
GEOSPATIAL INFORMATION TECHNOLOGY AND INFORMATION MANAGEMENT QUALITY ASSURANCE
Most of the geospatial data in use originate electronically. As a result, these data are acquired, stored, transformed, processed, presented, and archived electronically. The organized system of computer hardware and software used in these processes is called an Informatio...
NASA Technical Reports Server (NTRS)
Holley, Daniel C.; Haight, Kyle G.; Lindstrom, Ted
1997-01-01
The purpose of this study was to expose a range of naive individuals to the NASA Data Archive and to obtain feedback from them, with the goal of learning how useful people with varied backgrounds would find the Archive for research and other purposes. We processed 36 subjects in four experimental categories, designated in this report as C+R+, C+R-, C-R+ and C-R-, for computer-experienced researchers, computer-experienced non-researchers, non-computer-experienced researchers, and non-computer-experienced non-researchers, respectively. This report includes an assessment of general patterns of subject responses to the various aspects of the NASA Data Archive. Some of the aspects examined were interface-oriented, addressing such issues as whether the subject was able to locate information, figure out how to perform desired information retrieval tasks, etc. Other aspects were content-related. In doing these assessments, answers given to different questions were sometimes combined. This practice reflects the tendency of the subjects to provide answers expressing their experiences across question boundaries. Patterns of response are cross-examined by subject category in order to bring out a deeper understanding of why subjects reacted the way they did to the archive. After the general assessment, there is a more extensive summary of the replies received from the test subjects.
Structural biology data archiving - where we are and what lies ahead.
Kleywegt, Gerard J; Velankar, Sameer; Patwardhan, Ardan
2018-05-10
For almost 50 years, structural biology has endeavoured to conserve and share its experimental data and their interpretations (usually, atomistic models) through global public archives such as the Protein Data Bank, Electron Microscopy Data Bank and Biological Magnetic Resonance Data Bank (BMRB). These archives are treasure troves of freely accessible data that document our quest for molecular or atomic understanding of biological function and processes in health and disease. They have prepared the field to tackle new archiving challenges as more and more (combinations of) techniques are being utilized to elucidate structure at ever increasing length scales. Furthermore, the field has made substantial efforts to develop validation methods that help users to assess the reliability of structures and to identify the most appropriate data for their needs. In this Review, we present an overview of public data archives in structural biology and discuss the importance of validation for users and producers of structural data. Finally, we sketch our efforts to integrate structural data with bioimaging data and with other sources of biological data. This will make relevant structural information available and more easily discoverable for a wide range of scientists.
Natural and Cultural Preservation - Complementary Endeavors through Soil Archive Research
NASA Astrophysics Data System (ADS)
Ackermann, Oren; Frumin, Suembikya; Kolska Horwitz, Liora; Maeir, Aren M.; Weiss, Ehud; Zhevelev, Helena M.
2016-04-01
Soil is an excellent archive for the history of landscape components such as ancient topography, erosion and accumulation processes, and vegetation characterization. In special cases, the soil archive even preserves botanical, faunal and mollusc assemblages, allowing for the production of an archive of organic species as well. Long-term human activities in the past have left their imprints on certain landscape systems, leading to the formation of landscapes composed of both cultural and natural assets. The aim of this presentation is to suggest a conceptual model, based on the soil archive, which enables the preservation and sustainability of such environments. The area considered (eastern Mediterranean) underwent cycles of ancient site establishment and abandonment. When areas were occupied, the natural vegetation around settlements experienced human interventions such as woodcutting, grazing and horticulture. During site abandonment, these interventions ceased, resulting in vegetation regeneration, a reduction in biodiversity, increased fire hazard, etc. This ultimately led to the deterioration of the landscape system as well as the destruction of cultural assets such as ancient buildings and/or remnants. In order to preserve and restore these sites, a conceptual model that combines both modern natural conservation strategies and restoration of traditional land-use techniques is proposed. This model provides a complementary approach to existing natural and cultural preservation efforts.
Hłyń, M
2000-01-01
Interesting archival materials collected by Prof. Wiktor Dega are held in the Department of the History of Medical Sciences at Karol Marcinkowski University. They are mainly personal documents, including a military booklet, a passport and various identity cards. There is also a diary from 1913. Noteworthy are the notebooks from his student period, diaries full of reflections from his scientific journeys abroad, and a chrestomathy of the professional literature. The archival material on Prof. Dega's pre-war activity in Poznań as an organiser of cost-free gymnastic courses for children with posture defects should also be mentioned. After the Second World War, Prof. Dega worked on the Committee of Rehabilitation and Adaptation of Human Beings and organised the Polish Branch of the International College of Surgeons, and materials from that time are also available. Also important are documents associated with the Order of the Smile awarded to Prof. Dega by the St. Maria Magdalena secondary school in Poznań. His letters are extremely valuable, and the interesting press articles, photos and diplomas are also noteworthy.
Landsat: building a strong future
Loveland, Thomas R.; Dwyer, John L.
2012-01-01
Conceived in the 1960s, the Landsat program has experienced six successful missions that have contributed to an unprecedented 39-year record of Earth observations that capture global land conditions and dynamics. Incremental improvements in imaging capabilities continue to improve the quality of Landsat science data, while ensuring continuity over the full instrument record. Landsats 5 and 7 are still collecting imagery. The planned launch of the Landsat Data Continuity Mission in December 2012 potentially extends the Landsat record to nearly 50 years. The U.S. Geological Survey (USGS) Landsat archive contains nearly three million Landsat images. All USGS Landsat data are available at no cost via the Internet. The USGS is committed to improving the content of the historical Landsat archive through the consolidation of Landsat data held in international archives. In addition, the USGS is working on a strategy to develop higher-level Landsat geophysical and biophysical datasets. Finally, Federal efforts are underway to transition Landsat into a sustained operational program within the Department of the Interior and to authorize the development of the next two satellites: Landsats 9 and 10.
NASA Astrophysics Data System (ADS)
2010-02-01
The site's homepage calls it "a repository of ideas and perspectives regarding the science, engineering and philosophy of complexity", and it pretty much does what it says on the tin. Part blog, part links archive, part library of modelling tips and tricks, the site is chock full of information that comes under the general heading of "complexity".
Science Enabling Applications of Gridded Radiances and Products
NASA Astrophysics Data System (ADS)
Goldberg, M.; Wolf, W.; Zhou, L.
2005-12-01
New generations of hyperspectral sounders and imagers are not only providing vastly improved information to monitor, assess and predict the Earth's environment, they also provide tremendous volumes of data to manage. Key management challenges include data processing, distribution, archive and utilization. At the NOAA/NESDIS Office of Research and Applications, we have started to address the challenge of utilizing high-volume satellite data by thinning observations and developing gridded datasets from the observations made by the NASA AIRS, AMSU and MODIS instruments. We have developed techniques for intelligent thinning of AIRS data for numerical weather prediction, by selecting the clearest AIRS 14 km field of view within a 3 x 3 array. The selection uses high spatial resolution 1 km MODIS data which are spatially convolved to the AIRS field of view. The MODIS cloud masks and AIRS cloud tests are used to select the clearest footprint. During real-time processing the data are thinned and gridded to support monitoring, validation and scientific studies. Products from AIRS, which include profiles of temperature, water vapor and ozone and cloud-corrected infrared radiances for more than 2000 channels, are derived from a single AIRS/AMSU field of regard, which is a 3 x 3 array of AIRS footprints (each with a 14 km spatial resolution) collocated with a single AMSU footprint (42 km). One of our key gridded datasets is a daily 3 x 3 latitude/longitude projection which contains the nearest AIRS/AMSU field of regard with respect to the center of the 3 x 3 lat/lon grid. This gridded dataset is 1/40 the size of the full-resolution data. It is the type of product that can be used to support algorithm validation and improvements. It also provides a very economical approach for reprocessing, testing and improving algorithms for climate studies without having to reprocess the full-resolution data stored at the DAAC. For example, on a single-CPU workstation, all the AIRS derived products can be derived from a single year of gridded data in 5 days. This relatively short turnaround time, which can be reduced considerably to 3 hours by using a cluster of 40 G5 processors, allows for repeated reprocessing at the PI's home institution before substantial investments are made to reprocess the full-resolution data sets archived at the DAAC. In other words, do not reprocess the full-resolution data until the science community has tested and selected the optimal algorithm on the gridded data. Development and applications of gridded radiances and products will be discussed. The applications can be provided as part of a web-based service.
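The intelligent-thinning step, selecting the clearest 14 km field of view within each 3 x 3 array, reduces to a block-wise argmin over a cloudiness score. Below is a minimal numpy sketch under that reading; the cloud-fraction array standing in for the convolved MODIS cloud mask is hypothetical.

```python
import numpy as np

def thin_clearest(radiance, cloud_fraction):
    """Keep, for each non-overlapping 3x3 block of footprints, the one
    with the lowest cloud fraction (a stand-in for the clear test)."""
    ny, nx = cloud_fraction.shape
    ny, nx = ny - ny % 3, nx - nx % 3   # trim to whole 3x3 blocks
    blocks = cloud_fraction[:ny, :nx].reshape(ny // 3, 3, nx // 3, 3)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(ny // 3, nx // 3, 9)
    flat_idx = blocks.argmin(axis=2)                # clearest of the 9
    iy, ix = np.divmod(flat_idx, 3)                 # offset within block
    by, bx = np.meshgrid(np.arange(ny // 3) * 3, np.arange(nx // 3) * 3,
                         indexing="ij")             # block origins
    return radiance[by + iy, bx + ix]

rad = np.random.rand(90, 90)            # hypothetical footprint radiances
cf = np.random.rand(90, 90)             # hypothetical cloud fractions
print(thin_clearest(rad, cf).shape)     # (30, 30): one footprint per 3x3
```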
Implementing DOIs for Oceanographic Satellite Data at PO.DAAC
NASA Astrophysics Data System (ADS)
Hausman, J.; Tauer, E.; Chung, N.; Chen, C.; Moroni, D. F.
2013-12-01
The Physical Oceanography Distributed Active Archive Center (PO.DAAC) is NASA's archive for physical oceanographic satellite data. It distributes over 500 datasets from gravity, ocean wind, sea surface topography, sea ice, ocean current, salinity, and sea surface temperature satellite missions. A dataset is a collection of granules/files that share the same mission/project, versioning, processing level, and spatial and temporal characteristics. The large number of datasets is partially due to the number of satellite missions, but mostly because a single satellite mission typically has multiple versions, or even multiple temporal and spatial resolutions, of data. As a result, a user might mistake one dataset for a different dataset from the same satellite mission. Due to PO.DAAC's vast variety and volume of data and growing requirements to report dataset usage, it has begun implementing DOIs for the datasets it archives and distributes. However, this was not as simple as registering a name for a DOI and providing a URL. Before implementing DOIs, multiple questions needed to be answered. What are the sponsor and end-user expectations regarding DOIs? At what level does a DOI get assigned (dataset, file/granule)? Do all data get a DOI, or only selected data? How do we create a DOI? How do we create landing pages and manage them? What changes need to be made to the data archive, life cycle policy and web portal to accommodate DOIs? What if the data also exist at another archive and a DOI already exists? How is a DOI included if the data were obtained via a subsetting tool? How does a researcher or author provide a unique, definitive reference (standard citation) for a given dataset? This presentation will discuss how these questions were answered through changes in policy, process, and system design. Implementing DOIs is not a trivial undertaking, but as DOIs are rapidly becoming the de facto approach, it is worth the effort. Researchers have historically referenced the source satellite and data center (or archive), but scientific writings do not typically provide enough detail to point to a singular, uniquely identifiable dataset. DOIs provide the means to help researchers be precise in their data citations and provide needed clarity, standardization and permanence.
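For the standard-citation question, a minimal sketch of composing a definitive dataset citation from DOI metadata is shown below; the field names and the example record are hypothetical and do not reflect PO.DAAC's actual metadata schema.

```python
def dataset_citation(meta: dict) -> str:
    """Compose a standard dataset citation from minimal DOI metadata."""
    return ("{creator} ({year}). {title}, Ver. {version}. {publisher}. "
            "Dataset accessed {accessed}. https://doi.org/{doi}").format(**meta)

example = {  # hypothetical record, for illustration only
    "creator": "PO.DAAC Project",
    "year": 2013,
    "title": "Sea Surface Temperature Level 3 Product",
    "version": "4.1",
    "publisher": "PO.DAAC, NASA JPL",
    "accessed": "2013-10-01",
    "doi": "10.5067/EXAMPLE-DOI",
}
print(dataset_citation(example))
```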
The Hubble Legacy Archive: Data Processing in the Era of AstroDrizzle
NASA Astrophysics Data System (ADS)
Strolger, Louis-Gregory; Hubble Legacy Archive Team, The Hubble Source Catalog Team
2015-01-01
The Hubble Legacy Archive (HLA) expands the utility of Hubble Space Telescope wide-field imaging data by providing high-level composite images and source lists, perusable and immediately available online. The latest HLA data release (DR8.0) marks a fundamental change in how these image combinations are produced, using DrizzlePac tools and AstroDrizzle to reduce geometric distortion and provide improved source catalogs for all publicly available data. We detail the HLA data processing and source list schemas, which products are newly updated and available for WFC3 and ACS, and how these data products are further utilized in the production of the Hubble Source Catalog. We also discuss plans for future development, including updates to WFPC2 products and field mosaics.
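For readers unfamiliar with the tooling, the DrizzlePac entry point looks roughly like the sketch below; the input pattern, output name and parameter choices are illustrative only, and production HLA processing sets many more parameters than shown here.

```python
# Minimal sketch of the DrizzlePac entry point the HLA pipeline is built
# around; the file pattern and output name are hypothetical.
from drizzlepac import astrodrizzle

astrodrizzle.AstroDrizzle(
    "j*_flt.fits",        # hypothetical set of calibrated ACS exposures
    output="hla_combined",
    build=True,           # write a single multi-extension output file
)
```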
Development of the SOFIA Image Processing Tool
NASA Technical Reports Server (NTRS)
Adams, Alexander N.
2011-01-01
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible-light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data containing information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.
The MSFC Systems Engineering Guide: An Overview and Plan
NASA Technical Reports Server (NTRS)
Shelby, Jerry; Thomas, L. Dale
2007-01-01
This paper describes the guiding vision, progress to date and the plan forward for development of the Marshall Space Flight Center (MSFC) Systems Engineering Guide (SEG), a virtual systems engineering handbook and archive that describes the systems engineering processes used by MSFC in the development of ongoing complex space systems, such as the Ares launch vehicle, and of forthcoming ones as well. The intent of this website is to be a "One Stop Shop" for MSFC systems engineers, providing tutorial information, an overview of processes and procedures, links to guidance and references, and an archive of relevant systems engineering artifacts produced by the many NASA projects developed and managed by MSFC over the years.
NASA Technical Reports Server (NTRS)
Touch, Joseph D.
1994-01-01
Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise refinement of the intelligent database management (IDM) architecture of the distributed active archive center (DAAC, one of seven regionally located EOSDIS archive sites), to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that the network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.
The TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd;
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in early 2018 and is expected to discover approximately 1,000 small planets with R(sub p) less than 4 Earth radii and to measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed at NASA Ames Research Center based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NASA Advanced Supercomputing Division's Pleiades supercomputer. The SPOC will also search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes (MAST).
Nakazato, Takeru; Ohta, Tazro; Bono, Hidemasa
2013-01-01
High-throughput sequencing technology, also called next-generation sequencing (NGS), has the potential to revolutionize the whole process of genome sequencing, transcriptomics, and epigenetics. Sequencing data are captured in a public primary data archive, the Sequence Read Archive (SRA). As of January 2013, data from more than 14,000 projects have been submitted to SRA, double the number of the previous year. Researchers can download raw sequence data from the SRA website to perform further analyses and to compare with their own data. However, it is extremely difficult to search entries and download raw sequences of interest with SRA, because the data structure is complicated and the experimental conditions along with raw sequences are partly described in natural language. Additionally, some sequences are of inconsistent quality, because anyone can submit sequencing data to SRA with no quality check. Therefore, as a criterion of data quality, we focused on SRA entries that were cited in journal articles. We extracted SRA IDs and PubMed IDs (PMIDs) from SRA and from full-text versions of journal articles, and retrieved 2748 SRA ID-PMID pairs. We constructed a publication list referring to SRA entries. Since one of the main themes of -omics analyses is the clarification of disease mechanisms, we also characterized SRA entries by disease keywords, according to the Medical Subject Headings (MeSH) extracted from the articles assigned to each SRA entry. We obtained 989 SRA ID-MeSH disease term pairs, and constructed a disease list referring to SRA data. We previously developed feature profiles of diseases in a system called "Gendoo". We generated hyperlinks between diseases extracted from SRA and these feature profiles. The project, publication and disease lists resulting from this study are available at our web service, called "DBCLS SRA" (http://sra.dbcls.jp/). This service will improve accessibility to high-quality data from SRA. PMID:24167589
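The ID-pairing step is essentially pattern extraction over full text. A minimal sketch is below, assuming the common SRA accession prefixes; the sample sentence is invented, and the authors' actual text-mining pipeline is not described at this level of detail.

```python
import re

# SRA-style accessions: [DES]RA (submissions), [DES]RP (projects),
# [DES]RX (experiments), [DES]RR (runs), followed by digits.
SRA_ID = re.compile(r"\b[DES]R[APXR]\d{6,}\b")

def extract_pairs(pmid: str, fulltext: str) -> set[tuple[str, str]]:
    """Return (SRA ID, PMID) pairs found in one article's full text."""
    return {(acc, pmid) for acc in SRA_ID.findall(fulltext)}

sample = "Reads were deposited in the Sequence Read Archive under SRP012345."
print(extract_pairs("24167589", sample))  # {('SRP012345', '24167589')}
```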
NOAA's Scientific Data Stewardship Program
NASA Astrophysics Data System (ADS)
Bates, J. J.
2004-12-01
The NOAA mission is to understand and predict changes in the Earth's environment and conserve and manage coastal and marine resources to meet the Nation's economic, social and environmental needs. NOAA has responsibility for long-term archiving of the United States' environmental data and has recently integrated several data management functions into a concept called Scientific Data Stewardship. Scientific Data Stewardship is a new paradigm in data management consisting of an integrated suite of functions to preserve and exploit the full scientific value of NOAA's, and the world's, environmental data. These functions include careful monitoring of observing system performance for long-term applications, the generation of authoritative long-term climate records from multiple observing platforms, and the proper archival of and timely access to data and metadata. NOAA has developed a conceptual framework to implement the functions of scientific data stewardship. This framework has five objectives: 1) develop real-time monitoring of all satellite observing systems for climate applications; 2) process large volumes of satellite data extending up to decades in length to account for systematic errors and to eliminate artifacts in the raw data (producing fundamental climate data records, FCDRs); 3) generate retrieved geophysical parameters from the FCDRs (referred to as thematic climate data records, TCDRs), including combining observations from all sources; 4) conduct monitoring and research by analyzing data sets to uncover climate trends and to provide evaluation and feedback for steps 2) and 3); and 5) provide archives of metadata, FCDRs, and TCDRs, and facilitate distribution of these data to the user community. The term 'climate data record' and related terms, such as climate data set, have been used for some time, but the climate community has yet to settle on a consensus definition. A recent United States National Academy of Sciences report recommends the following definition: a climate data record (CDR) is a time series of measurements of sufficient length, consistency, and continuity to determine climate variability and change.
Lugina, K. M. [Department of Geography, St. Petersburg State University, St. Petersburg, Russia]; Groisman, P. Ya. [National Climatic Data Center, Asheville, North Carolina, USA]; Vinnikov, K. Ya. [Department of Atmospheric Sciences, University of Maryland, College Park, Maryland, USA]; Koknaeva, V. V. [State Hydrological Institute, St. Petersburg, Russia]; Speranskaya, N. A. [State Hydrological Institute, St. Petersburg, Russia]
2006-01-01
The mean monthly and annual values of surface air temperature compiled by Lugina et al. have been taken mainly from the World Weather Records, Monthly Climatic Data for the World, and Meteorological Data for Individual Years over the Northern Hemisphere Excluding the USSR. These published records were supplemented with information from different national publications. In the original archive, after removal of station records believed to be nonhomogeneous or biased, 301 and 265 stations were used to determine the mean temperature for the Northern and Southern hemispheres, respectively. The new version of the station temperature archive (used for evaluation of the zonally-averaged temperatures) was created in 1995. The change to the archive was required because data from some stations became unavailable for analyses in the 1990s. During this process, special care was taken to secure homogeneity of zonally averaged time series. When a station (or a group of stations) stopped reporting, a "new" station (or group of stations) was selected in the same region, and its data for the past 50 years were collected and added to the archive. The processing (area-averaging) was organized in such a way that each time series from a new station spans the reference period (1951-1975) and the years thereafter. It was determined that the addition of the new stations had essentially no effect on the zonally-averaged values for the pre-1990 period.
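The homogeneity trick, requiring every new station to span the 1951-1975 reference period so that its anomalies can be averaged in without shifting the band mean, can be sketched as follows; the station matrix is synthetic, and real processing also applies area weighting across latitude bands.

```python
import numpy as np

def zonal_anomaly(series, years, ref=(1951, 1975)):
    """Average station anomalies (relative to a common reference period)
    within one latitude band; stations x years, NaN where not reporting."""
    in_ref = (years >= ref[0]) & (years <= ref[1])
    baseline = np.nanmean(series[:, in_ref], axis=1, keepdims=True)
    anomalies = series - baseline          # each station vs. its own normal
    return np.nanmean(anomalies, axis=0)   # band mean, ignoring gaps

years = np.arange(1881, 2001)
stations = 15.0 + np.random.randn(10, years.size)  # hypothetical band data
print(zonal_anomaly(stations, years).shape)        # (120,)
```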
Exploring cloud and big data components for SAR archiving and analysis
NASA Astrophysics Data System (ADS)
Baker, S.; Crosby, C. J.; Meertens, C.; Phillips, D.
2017-12-01
Under the Geodesy Advancing Geoscience and EarthScope (GAGE) NSF Cooperative Agreement, UNAVCO has seen the volume of the SAR Data Archive grow at a substantial rate, from 2 TB in Y1 and 5 TB in Y2 to 41 TB in Y3, primarily due to WInSAR PI proposal management of ALOS-2/JAXA (Japan Aerospace Exploration Agency) data and, to a lesser extent, Supersites and other data collections. JAXA provides a fixed number of scenes per year for each PI, and some data files are 50-60 GB each, which accounts for the large volume of data. In total, over 100 TB of SAR data are in the WInSAR/UNAVCO archive, and a large portion of these are available unrestricted to WInSAR members. In addition to the existing data, newer data streams from the Sentinel-1 and NISAR missions will require efficient processing pipelines and easily scalable infrastructure to handle processed results. With these growing data sizes and space concerns, SAR archive operations were migrated to the Texas Advanced Computing Center (TACC) via an NSF XSEDE proposal in spring 2017. Data are stored on an HPC system while data operations run on Jetstream virtual machines within the same datacenter. In addition to the production data operations, testing was done in early 2017 with container-based InSAR processing analysis using JupyterHub and Docker images deployed on a VM cluster on Jetstream. The JupyterHub environment is well suited for short courses and other training opportunities for the community, such as labs for university courses on InSAR. UNAVCO is also exploring new processing methodologies using DC/OS (the datacenter operating system) for batch and stream processing workflows and time series analysis with big data open source components like the Spark, Mesos, Akka, Cassandra, Kafka (SMACK) stack. The comparison of the different methodologies will provide insight into the pros and cons of each and help the SAR community with decisions about infrastructure and software requirements to meet their research goals.
Enabling Velocity-Resolved Science with Advanced Processing of Herschel/HIFI Observations
NASA Astrophysics Data System (ADS)
Morris, Patrick
The Herschel/HIFI instrument was a heterodyne spectrometer, with technology-demonstration and flight components built by NASA/JPL, that acquired over 9000 astronomical observations at velocity resolutions of better than 1 km/s between 480 and 1910 GHz (157-612 microns). Its performance, designed around the scientific goals of exploring the cyclical interrelation of stars and the ISM in diverse environments unified by copious amounts of molecular and atomic gas and dust, has resulted in over 350 refereed scientific publications, providing a successful foundation and inspiration for current and future science with terahertz instrumentation above the Earth's atmosphere. Nonetheless, almost 60% of the valid observations in the Herschel Science Archive (HSA) are unpublished. This is in large part due to the limitations of the automated pipeline and the complexities of interactively treating the data to bring them to science-ready quality. New users of the archive lacking knowledge of the nuances of heterodyne instrumentation, or experience with the data processing system, are particularly challenged to optimize the data around their science interests or goals with ultra-high-resolution spectra. Similarly, the effort to remove quality-degrading instrument artifacts and apply noise-performance enhancements is a challenge at this stage even for more experienced users and original program observers who have not yet exploited their observations, in part or in full; many published observations may also be further harvested for new science results. Recognizing that this situation will likely not improve over time, the HIFI instrument team put substantial effort during the funded post-cryo phase into interactively creating Highly Processed Data Products (HPDPs) from a set of observations in need of corrections and enhancements, in order to promote user accessibility and HIFI's scientific legacy. A set of HPDPs was created from 350 spectral mapping observations, in an effort led at the NASA Herschel Science Center, and delivered in November 2016 to the NASA InfraRed Science Archive (IRSA) and the HSA, where they are available to the community. Due to limited resources, this effort could not cover the full list of observations in need of interactive treatment. We propose to cover that final set of observations (spectral maps and a selection of spectral scans and point observations) in a project spread over 2 years with 0.5 FTE funding, for a guaranteed set of phased deliverables produced with optimized quality at high efficiency, using expert processing and delivery procedures already in place. This effort will tackle the quality-degrading artifacts which could not be corrected in the automatic pipeline, and which are becoming more and more remote for potential users to correct on their own, even with scripted guidance. The expectation is that the huge investments by the funding agencies, and the successful operations of the observatory, which met and often exceeded performance requirements, can be returned to the maximum scientific extent possible. We can guarantee some of that scientific return in a study of fundamental carbon chemistry in energetic star-forming regions, using the proposed HPDPs from unpublished and partially unexploited HIFI data to probe UV- and shock-driven chemistries, to explain an unexpected deficiency of C+ in the Orion KL eruptive outflow.
We will test a hypothesis that C+ is depleted by production of CO rather than CH+, through a chain of reactions involving intermediate products suited to the molecular environment.
Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys
NASA Astrophysics Data System (ADS)
Giordano, S.; Le Bris, A.; Mallet, C.
2018-05-01
Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time-series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is the fine georeferencing step. No fully automatic method has been proposed yet, and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It relies only on a recent orthoimage+DSM, used as the master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references, and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.
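Step (ii)'s automatic identification of dense ground references, by comparing the coarse orthoimage against the recent reference orthoimage, resembles classical feature matching. A minimal OpenCV sketch of that idea follows; the file names are hypothetical, and the authors' actual photogrammetric pipeline is not this code.

```python
import cv2
import numpy as np

# Match the coarse archival orthoimage against the recent reference
# orthoimage to obtain dense tie points usable as ground references.
archival = cv2.imread("coarse_ortho_1950.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("reference_ortho.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
k1, d1 = sift.detectAndCompute(archival, None)
k2, d2 = sift.detectAndCompute(reference, None)

matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # ratio test

src = np.float32([k1[m.queryIdx].pt for m in good])
dst = np.float32([k2[m.trainIdx].pt for m in good])
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
print(f"{int(inliers.sum())} candidate ground references")
```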
NASA Astrophysics Data System (ADS)
Linick, J. P.; Pieri, D. C.; Sanchez, R. M.
2014-12-01
The physical and temporal systematics of the world's volcanic activity is a compelling and productive arena for the exercise of orbital remote sensing techniques, informing studies ranging from basic volcanology to societal risk. Comprising over 160,000 frames and spanning 15 years of the Terra platform mission, the ASTER Volcano Archive (AVA: http://ava.jpl.nasa.gov) is the world's largest (100+ TB) high-spatial-resolution (15-30-90 m/pixel), multispectral (visible-SWIR-TIR), downloadable (KML-enabled) dedicated archive of volcano imagery. We will discuss the development of the AVA and describe its growing capability to provide easy public access to ASTER global volcano remote sensing data. The AVA system architecture is designed to facilitate parameter-based data mining and the implementation of archive-wide data analysis algorithms. Such search and analysis capabilities exploit AVA's unprecedented time-series data compilations for over 1,550 volcanoes worldwide (Smithsonian Holocene catalog). Results include thermal anomaly detection and mapping, as well as detection of SO2 plumes from explosive eruptions and passive SO2 emissions confined to the troposphere. We are also implementing retrospective ASTER image retrievals based on volcanic activity reports from Volcanic Ash Advisory Centers (VAACs) and the US Air Force Weather Agency (AFWA). A major expansion of the AVA is currently underway, with the ingest of the full 1972-present Landsat and NASA EO-1 volcano imagery for comparison and integration with ASTER data. Work described here is carried out under contract to NASA at the Jet Propulsion Laboratory as part of the California Institute of Technology.
ACE: A distributed system to manage large data archives
NASA Technical Reports Server (NTRS)
Daily, Mike I.; Allen, Frank W.
1993-01-01
Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must comprehend the significant numbers of large, complex data objects which the industry generates. The life cycle of the data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. In order to implement a cost-effective system which provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements in all four phases. A set of technical issues which may differ in each phase must be addressed to ensure an overall successful development strategy. The technical issues include standardized data formats and media for data acquisition, data management during processing, plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution which can give Mobil a competitive edge as well as a viable technical solution.
The McIntosh Archive: A solar feature database spanning four solar cycles
NASA Astrophysics Data System (ADS)
Gibson, S. E.; Malanushenko, A. V.; Hewins, I.; McFadden, R.; Emery, B.; Webb, D. F.; Denig, W. F.
2016-12-01
The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly H-alpha, He I 10830 Å and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced coronal holes, polarity inversion lines, filaments, sunspots and plage, yielding a unique 45-year record of the features associated with the large-scale solar magnetic field. We will present the results of recent efforts to preserve and digitize this archive. Most of the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed and streamlined, and an archival repository at NOAA's National Centers for Environmental Information (NCEI) has been created. We will demonstrate how Solar Cycle 23 data may now be accessed and how it may be utilized for scientific applications. In addition, we will discuss how this database of human-recognized features, which overlaps with the onset of high-resolution, continuous modern solar data, may act as a training set for computer feature recognition algorithms.
The worldwide Protein Data Bank (wwPDB): ensuring a single, uniform archive of PDB data
Berman, Helen; Henrick, Kim; Nakamura, Haruki; Markley, John L.
2007-01-01
The worldwide Protein Data Bank (wwPDB) is the international collaboration that manages the deposition, processing and distribution of the PDB archive. The online PDB archive is a repository for the coordinates and related information for more than 38,000 structures, including proteins, nucleic acids and large macromolecular complexes that have been determined using X-ray crystallography, NMR and electron microscopy techniques. The founding members of the wwPDB are RCSB PDB (USA), MSD-EBI (Europe) and PDBj (Japan) [H.M. Berman, K. Henrick and H. Nakamura (2003) Nature Struct. Biol., 10, 980]. The BMRB group (USA) joined the wwPDB in 2006. The mission of the wwPDB is to maintain a single archive of macromolecular structural data that are freely and publicly available to the global community. Additionally, the wwPDB provides a variety of services to a broad community of users. The wwPDB website provides information about services provided by the individual member organizations and about projects undertaken by the wwPDB. PMID:17142228
Viking Seismometer PDS Archive Dataset
NASA Astrophysics Data System (ADS)
Lorenz, R. D.
2016-12-01
The Viking Lander 2 seismometer operated successfully for over 500 Sols on the Martian surface, recording at least one likely candidate Marsquake. The Viking mission, flown in an era when data handling hardware (both on board and on the ground) was limited in capability, predated modern planetary data archiving; the ad hoc repositories of the data, and the very low-level record at NSSDC, were neither convenient to process nor well known. In an effort supported by the NASA Mars Data Analysis Program, we have converted the bulk of the Viking dataset (namely the 49,000 and 270,000 records made in High and Event modes at 20 and 1 Hz, respectively) into a simple ASCII table format. Additionally, since wind-generated lander motion is a major component of the signal, contemporaneous meteorological data are included in summary records to facilitate correlation. These datasets are being archived at the PDS Geosciences Node. In addition to brief instrument and dataset descriptions, the archive includes code snippets in the freely available language 'R' to demonstrate plotting and analysis. Further, we present examples of lander-generated noise associated with the sampler arm, instrument dumps and other mechanical operations.
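The archive's demonstration snippets are in R; an equivalent sketch in Python for reading one of the ASCII tables alongside the meteorological summary is given below, with hypothetical column names standing in for the actual PDS labels.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Column names are hypothetical stand-ins for the PDS table's actual labels.
seis = pd.read_csv("event_mode.tab", sep=r"\s+",
                   names=["sol", "seconds", "amplitude"])
met = pd.read_csv("met_summary.tab", sep=r"\s+",
                  names=["sol", "seconds", "wind_speed"])

# Plot seismic amplitude above wind speed to eyeball wind-driven noise.
fig, (ax1, ax2) = plt.subplots(2, sharex=True)
ax1.plot(seis["seconds"], seis["amplitude"])
ax1.set_ylabel("seismic amplitude (DN)")
ax2.plot(met["seconds"], met["wind_speed"])
ax2.set_ylabel("wind speed (m/s)")
ax2.set_xlabel("seconds of Sol")
plt.show()
```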
NASA Astrophysics Data System (ADS)
Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart
1998-07-01
The developments in information technologies (computer hardware, networking and storage media) have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements to replace cine film. One of the major drawbacks of cine film is single access in time and location. For the four catheter laboratories in our institutions, we have designed a complementary concept combining the CD-R, also called CD-Medical, as a single-patient storage and exchange medium, with a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents relating to a patient, are part of an electronic patient record. The access, processing and display of documents are supported by an integrated medical application.
Security of patient data when decommissioning ultrasound systems
2017-01-01
Background Although ultrasound systems generally archive to Picture Archiving and Communication Systems (PACS), their archiving workflow typically involves storage to an internal hard disk before data are transferred onwards. Deleting records from the local system will delete entries in the database and in the file allocation table or equivalent but, as with a PC, the files can be recovered. Great care is taken with the disposal of media from a healthcare organisation to prevent data breaches, but ultrasound systems are routinely returned to lease companies, sold on or donated to third parties without such controls. Methods In this project, five methods of hard disk erasure were tested on nine ultrasound systems being decommissioned: the system's own delete function; full reinstallation of system software; the manufacturer's own disk-wiping service; and open-source disk-wiping software, used both for full-disk erasure and for erasure of blank space only. Attempts were then made to recover data using open-source recovery tools. Results All methods deleted patient data as viewable from the ultrasound system and from browsing the disk from a PC. However, patient identifiable data (PID) could be recovered following the system's own deletion and the reinstallation methods. No PID could be recovered after using the manufacturer's wiping service or the open-source wiping software. Conclusion The typical method of reinstalling an ultrasound system's software may not prevent PID from being recovered. When transferring ownership, care should be taken that an ultrasound system's hard disk has been wiped to a sufficient level, particularly if the scanner is to be returned with approved parts and in a fully working state. PMID:28228821
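As an illustration of the blank-space-erasure idea tested here (overwriting the unallocated sectors that deleted records still occupy), a minimal sketch follows; it is a toy stand-in, not one of the unnamed tools evaluated in the study, and it does not address filesystem metadata or wear-levelled media.

```python
import os

def wipe_blank_space(mount_point: str, chunk: int = 1 << 20) -> None:
    """Fill a volume's free space with zeros so that previously deleted
    files cannot be recovered from unallocated sectors."""
    filler = os.path.join(mount_point, "_wipe.tmp")
    zeros = b"\0" * chunk
    try:
        with open(filler, "wb") as f:
            while True:
                f.write(zeros)       # raises OSError once the disk is full
    except OSError:
        pass                         # free space exhausted: wipe complete
    finally:
        if os.path.exists(filler):
            os.remove(filler)        # release the space again

# wipe_blank_space("/archive")      # hypothetical data partition
```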
Development and Operations of the Astrophysics Data System
NASA Technical Reports Server (NTRS)
Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)
2003-01-01
The ADS was represented at the AAS meeting with 3 poster papers and a demonstration booth. We have set up a mirror site of the VizieR database system of the CDS; this will replace the functionality of the ADC at Goddard. Preparations for the APS and LPSC meetings in March started and continued; we will have demonstrations at both meetings. The ADS was represented with a poster at the joint AGU/EGU meeting in Nice, France. Discussions about the ongoing collaboration between the ADS and the CDS in Strasbourg, France were held in Strasbourg. The ADS was invited to organize a session about the ADS and its mirror sites at the next United Nations Workshop on Basic Space Sciences in the Developing World. Efforts are under way to enter the tables of contents of all conference proceedings in the SAO library into the ADS; this requires copying the tables of contents from all volumes in the library and having them typed in, and will greatly enhance the coverage of the literature in the ADS. We started the development of a search system for the full text of all scanned material in the ADS, which will eventually allow our users search capabilities that so far do not exist in any form. In order to enable the full-text searching, we have purchased OCR software and are in the process of OCRing the scanned pages in the ADS. Efforts are in progress to handle the inclusion of data set identifiers in article manuscripts; the ADS will be the central system that will allow the journals to verify data set identifiers, and the "master verifier" has been implemented in prototype form at the ADS. We started to include more journals in Geosciences/Geophysics in the ADS. The Royal Astronomical Society has decided to archive their on-line journals in the ADS three years after publication, and we have started to process these older on-line articles in order to archive them in the ADS. Our mirror site in Korea now has a full article mirror. We developed XML output capability in the ADS, which will make it easier to exchange data with other data systems. We started the development of new indexing software that will eventually reduce the indexing time for a database from days to hours or less. The ADS was represented at the IAU General Assembly with a poster, and discussions with the IAU management were held about extending the ADS IAU collaborations.
SeaWiFS Science Algorithm Flow Chart
NASA Technical Reports Server (NTRS)
Darzi, Michael
1998-01-01
This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases, however, the chart may deviate from the details of the software implementation so as to simplify the presentation.
Demiris, A M; Meinzer, H P
1997-01-01
Whether or not a computerized system enhances the conditions of work in the application domain depends very much on the user interface. Graphical user interfaces seem to attract the interest of users but mostly ignore some basic rules of visual information processing, leading to systems which are difficult to use, lowering productivity and increasing working stress (cognitive and work load). In this work we present some fundamental ergonomic considerations and their application to the medical image processing and archiving domain. We introduce the extensions to an existing concept needed to control and guide the development of GUIs with respect to domain-specific ergonomics. The suggested concept, called Model-View-Controller Constraints (MVCC), can be used to programmatically implement ergonomic constraints, and thus has some advantages over written style guides. We conclude with a presentation of existing norms and methods for evaluating user interfaces.
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean
2004-04-01
An Application Service Provider (ASP) archive model for disaster recovery of Saint John's Health Center (SJHC) clinical PACS data has been implemented using a fault-tolerant archive server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to present clinical experiences with the implementation of an ASP-model backup archive in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily over a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test network connectivity and data integrity on a regular basis. In a disaster scenario where the local PACS archive has been destroyed and the patients need to be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be utilized to route images to the second hospital site with a PACS, to be reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment at all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments, as well as the wireless performance, are discussed. A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to query PACS images utilizing a wireless handheld device from the ASP backup archive at IPIL and route the PACS images directly to a second clinical site at UCLA, where they and the patients were located at that time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows radiologists to use a wireless handheld device to control image workflow and to review PACS images during a major disaster event where patients must be moved to a secondary site.
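In DICOM terms, the PDA-driven routing is a query/retrieve followed by a C-MOVE to the destination application entity. A minimal sketch using the open-source pynetdicom library (not named in the paper) is shown below; the AE titles, host, port and study UID are hypothetical.

```python
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

ae = AE(ae_title="PDA_CLIENT")
ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

# Ask the backup archive to push one study to the second hospital's AE.
query = Dataset()
query.QueryRetrieveLevel = "STUDY"
query.StudyInstanceUID = "1.2.840.0.0.0"  # hypothetical study UID

assoc = ae.associate("backup.archive.example", 11112)  # ASP archive server
if assoc.is_established:
    responses = assoc.send_c_move(query, "UCLA_PACS",
                                  StudyRootQueryRetrieveInformationModelMove)
    for status, _ in responses:
        if status:
            print(f"C-MOVE status: 0x{status.Status:04X}")
    assoc.release()
```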
ERIC Educational Resources Information Center
Haapaniemi, Peter
1990-01-01
Describes imaging technology, which allows huge numbers of words and illustrations to be reduced to a tiny fraction of the space required by the originals, and discusses current applications. Highlights include the image processing system at the National Archives; use by banks for high-speed check processing; engineering document management systems (EDMS); folder…
The NSO FTS database program and archive (FTSDBM)
NASA Technical Reports Server (NTRS)
Lytle, D. M.
1992-01-01
Data from the NSO Fourier transform spectrometer is being re-archived from half inch tape onto write-once compact disk. In the process, information about each spectrum and a low resolution copy of each spectrum is being saved into an on-line database. FTSDBM is a simple database management program in the NSO external package for IRAF. A command language allows the FTSDBM user to add entries to the database, delete entries, select subsets from the database based on keyword values including ranges of values, create new database files based on these subsets, make keyword lists, examine low resolution spectra graphically, and make disk number/file number lists. Once the archive is complete, FTSDBM will allow the database to be efficiently searched for data of interest to the user and the compact disk format will allow random access to that data.
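FTSDBM's keyword-range selection maps naturally onto a tabular query. The sketch below reproduces the idea with pandas over a hypothetical slice of the archive metadata; it is an illustration of the concept, not the IRAF implementation.

```python
import pandas as pd

# Hypothetical slice of the FTS archive metadata table.
catalog = pd.DataFrame({
    "disk": [3, 3, 7], "file": [12, 40, 5],
    "wavenumber_lo": [1800.0, 450.0, 900.0],
    "wavenumber_hi": [9000.0, 1200.0, 5000.0],
    "observer": ["Smith", "Smith", "Jones"],   # invented names
})

# Select entries whose coverage includes 1000-1100 cm^-1, as FTSDBM's
# keyword-range selection would, then emit a disk/file retrieval list.
subset = catalog[(catalog.wavenumber_lo <= 1000.0)
                 & (catalog.wavenumber_hi >= 1100.0)]
print(subset[["disk", "file"]].to_string(index=False))
```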
Data grid: a distributed solution to PACS
NASA Astrophysics Data System (ADS)
Zhang, Xiaoyan; Zhang, Jianguo
2004-04-01
In a hospital, various kinds of medical images acquired from different modalities are in general use and are stored in different departments, and each modality usually has several attached workstations to display or process images. To make better diagnoses, radiologists or physicians often need to retrieve other kinds of images for reference. The traditional image storage solution is to build up a large-scale PACS archive server. However, the disadvantages of purely centralized management of a PACS archive server are obvious: besides high costs, any failure of the PACS archive server would cripple the entire PACS operation. Here we present a new approach to developing a storage grid in PACS, which can provide more reliable image storage and more efficient query/retrieval for hospital-wide applications. In this paper, we also give a performance evaluation comparing three popular technologies: mirror, cluster and grid.
From The Horse's Mouth: Engaging With Geoscientists On Science
NASA Astrophysics Data System (ADS)
Katzenberger, J.; Morrow, C. A.; Arnott, J. C.
2011-12-01
"From the Horse's Mouth" is a project of the Aspen Global Change Institute (AGCI) that utilizes selected short video clips of scientists presenting and discussing their research in an interdisciplinary setting at AGCI as the core of an online interactive set of learning modules in the geosciences for grades 9-12 and 1st and 2nd year undergraduate students. The video archive and associated material as is has limited utility, but here we illustrate how it can be leveraged for educational purposes by a systematic mining of the resource integrated with a variety of supplemental user experiences. The project furthers several broad goals to: (a) improve the quality of formal and informal geoscience education with an emphasis on 9-12 and early undergraduate, (b) encourage and facilitate the engagement of geoscientists to strengthen STEM education by leveraging AGCI's interdisciplinary science program for educational purposes, (c) explore science as a human endeavor by providing a unique view of how scientists communicate in a research setting, potentially stimulating students to consider traditional and non-traditional geoscience careers, (d) promote student understanding of scientific methodology and inquiry, and (e) further student appreciation of the role of science in society, particularly related to understanding Earth system science and global change. The resource material at the core of this project is a videotape record of presentation and discussion among leading scientists from 35 countries participating in interdisciplinary workshops at AGCI on a broad array of geoscience topics over a period of 22 years. The unique archive represents approximately 1200 hours of video footage obtained over the course of 43 scientific workshops and 62 hours of public talks. The full spectrum of material represents scientists active on all continents with a diverse set of backgrounds and academic expertise in both natural and social sciences. We report on the video database resource, our data acquisition protocols, conceptual design for the learning modules, excerpts from the video archive illustrating both geoscience content utilized in educational module development and examples of video clips that explore the process of science and its nature as a human endeavor. A prototype of the user interface featuring a navigational strategy, a discussion of both content and process goals represented in the pilot material and its use in both formal and informal settings are presented.
Particular geoscientific perspectives on stable isotope analysis in the arboreal system
NASA Astrophysics Data System (ADS)
Helle, Gerhard; Balting, Daniel; Pauly, Maren; Slotta, Franziska
2017-04-01
In the geosciences, stable isotopes of carbon, oxygen and hydrogen from the tree ring archive have been used for several decades to trace the course of past environmental and climatological fluctuations. In contrast to ice cores, the tree ring archive is of a biological nature (like many other terrestrial archives), but it provides the opportunity to establish site networks with very high resolution in space and time. Many of the basic physical mechanisms of isotope shifts are known, but biologically mediated processes may lead to isotope effects that are poorly understood. This implies that the many processes within the arboreal system leading to archived isotope ratios in wood material are governed by a multitude of environmental variables, tied not only to the isotopic composition of atmospheric source values (precipitation, CO2) but also to seasonally changing metabolic flux rates and pool sizes of photosynthates within the trees. Consequently, the extraction of climate and environmental information is particularly challenging, and reconstructions are still of a rather qualitative nature. Over the last 10 years or so, monitoring studies have been implemented to investigate stable isotope, climate and environmental signal transfer within the arboreal system, with the aim of developing transfer or response functions that can translate the isotope values extracted from tree rings into climate or other environmental variables. To what extent have these efforts led to a better understanding that helps improve the meaningfulness of tree ring isotope signals? For example, do monitoring studies help decipher the causes of the age-related trends in tree ring stable isotope sequences that are published in a growing number of papers? Do existing monitoring studies go into enough detail, or is the effort already too great for the outcome? Based on what we already know, particularly in mesic habitats, tree ring stable isotopes are much better climate proxies than other tree ring parameters. However, millennial or multi-millennial high-quality reconstructions from tree ring isotopes are still rare, because of i) methodological constraints related to mass spectrometric analyses and ii) the nature of tree-ring chronologies, which are assembled from many trees of various individual ages. In view of this: What is the state of the art in high-throughput tree ring stable isotope analyses? Is it necessary to advance existing methodologies further to preserve the annual time resolution provided by the tree-ring archive? Other terrestrial archives, like lake sediments and speleothems, rarely provide annually resolved stable isotope data. Furthermore, certain tree species from tropical or sub-tropical regions cannot be dated properly by dendrochronology and hence demand specific stable isotope measuring strategies. Although the points raised here apply specifically to the tree ring archive, some of them are important for all proxy archives of organic origin.
The Hopkins Ultraviolet Telescope: The Final Archive
NASA Astrophysics Data System (ADS)
Dixon, William V.; Blair, William P.; Kruk, Jeffrey W.; Romelfanger, Mary L.
2013-04-01
The Hopkins Ultraviolet Telescope (HUT) was a 0.9 m telescope and moderate-resolution (Δλ = 3 Å) far-ultraviolet (820-1850 Å) spectrograph that flew twice on the space shuttle, in 1990 December (Astro-1, STS-35) and 1995 March (Astro-2, STS-67). The resulting spectra were originally archived in a nonstandard format that lacked important descriptive metadata. To increase their utility, we have modified the original data-reduction software to produce a new and more user-friendly data product, a time-tagged photon list similar in format to the Intermediate Data Files (IDFs) produced by the Far Ultraviolet Spectroscopic Explorer calibration pipeline. We have transferred all relevant pointing and instrument-status information from locally-archived science and engineering databases into new FITS header keywords for each data set. Using this new pipeline, we have reprocessed the entire HUT archive from both missions, producing a new set of calibrated spectral products in a modern FITS format that is fully compliant with Virtual Observatory requirements. For each exposure, we have generated quick-look plots of the fully-calibrated spectrum and associated pointing history information. Finally, we have retrieved from our archives HUT TV guider images, which provide information on aperture positioning relative to guide stars, and converted them into FITS-format image files. All of these new data products are available in the new HUT section of the Mikulski Archive for Space Telescopes (MAST), along with historical and reference documents from both missions. In this article, we document the improved data-processing steps applied to the data and show examples of the new data products.
Analytical test results for archived core composite samples from tanks 241-TY-101 and 241-TY-103
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, M.A.
1993-07-16
This report describes the analytical tests performed on archived core composite samples from a 1985 sampling of the 241-TY-101 (101-TY) and 241-TY-103 (103-TY) single-shell waste tanks. Both tanks are suspected of containing quantities of ferrocyanide compounds as a result of process activities in the late 1950s. Although limited quantities of the composite samples remained, attempts were made to obtain as much analytical information as possible, especially regarding the chemical and thermal properties of the material.
Knowledge-driven information mining in remote-sensing image archives
NASA Astrophysics Data System (ADS)
Datcu, M.; Seidel, K.; D'Elia, S.; Marchetti, P. G.
2002-05-01
Users in all domains require information or information-related services that are focused, concise, reliable, low cost and timely, and which are provided in forms and formats compatible with the user's own activities. In the current Earth Observation (EO) scenario, the archiving centres generally offer only data, images and other "low level" products. Users' needs are only partially satisfied by a number of usually small value-adding companies, which apply time-consuming (mostly manual) and expensive processes that rely on expert knowledge to extract information from those data or images.
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
Mission operations update for the restructured Earth Observing System (EOS) mission
NASA Technical Reports Server (NTRS)
Kelly, Angelita Castro; Chang, Edward S.
1993-01-01
The National Aeronautics and Space Administration's (NASA) Earth Observing System (EOS) will provide a comprehensive long term set of observations of the Earth to the Earth science research community. The data will aid in determining global changes of both natural and human origin. Understanding humanity's impact on the global environment will allow sound policy decisions to be made to protect our future. EOS is a major component of the Mission to Planet Earth program, which is NASA's contribution to the U.S. Global Change Research Program. EOS consists of numerous instruments on multiple spacecraft and a distributed ground system. The EOS Data and Information System (EOSDIS) is the major ground system developed to support EOS. The EOSDIS will provide EOS spacecraft command and control, data processing, product generation, and data archival and distribution services for EOS spacecraft. Data from EOS instruments on other Earth science missions (e.g., the Tropical Rainfall Measuring Mission (TRMM)) will also be processed, distributed, and archived in EOSDIS. The U.S. and various International Partners (IP) (e.g., the European Space Agency (ESA), the Ministry of International Trade and Industry (MITI) of Japan, and the Canadian Space Agency (CSA)) participate in and contribute to the international EOS program. The EOSDIS will also archive processed data from other designated NASA Earth science missions (e.g., UARS) that are under the broad umbrella of Mission to Planet Earth.
AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users
NASA Astrophysics Data System (ADS)
Maiersperger, T.
2017-12-01
The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.
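AppEEARS resolves its subsetting as a service, so a request is just a small structured document. The abstract does not give the request schema; the sketch below only illustrates the kind of space-time-variable subset definition it describes, and the endpoint URL, field names, and product identifiers are assumptions rather than the documented API.

```python
# Illustrative only: a space-time-variable subset request of the kind
# AppEEARS resolves at-archive. Endpoint and payload schema are assumed.
import json
import urllib.request

task = {
    "task_name": "ndvi_point_subset",
    "start_date": "2017-01-01",
    "end_date": "2017-12-31",
    "layers": [{"product": "MOD13Q1", "layer": "NDVI"}],       # hypothetical ids
    "coordinates": [{"latitude": 36.2, "longitude": -112.1}],  # subset location
}

req = urllib.request.Request(
    "https://example.usgs.gov/appeears/api/task",  # placeholder endpoint
    data=json.dumps(task).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # actual submission omitted in this sketch
print(json.dumps(task, indent=2))
```

The data-volume reduction claimed in the abstract comes from the server resolving such a request against the full archive and returning only the harmonized, analysis-ready slice.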
Physical Oceanography Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Benada, J. Robert
1997-01-01
This new TOPEX/POSEIDON product provides the full mission data set in a new format, with many data quality improvements brought about by the work of many scientists during the mission. It is a revised form of the MGDR-A CD-ROM set, which covered data from the beginning of the mission, September 22, 1992, to April 23, 1996.
Nosocomial Outbreak of Parechovirus 3 Infection among Newborns, Austria, 2014.
Strenger, Volker; Diedrich, Sabine; Boettcher, Sindy; Richter, Susanne; Maritschnegg, Peter; Gangl, Dietmar; Fuchs, Simone; Grangl, Gernot; Resch, Bernhard; Urlesberger, Berndt
2016-09-01
In 2014, sepsis-like illness affected 9 full-term newborns in 1 hospital in Austria. Although results of initial microbiological testing were negative, electron microscopy identified picornavirus. Archived serum samples and feces obtained after discharge were positive by PCR for human parechovirus 3. This infection should be included in differential diagnoses of sepsis-like illness in newborns.
The Relationship between Retention and College Counseling for High-risk Students
ERIC Educational Resources Information Center
Bishop, Kyle K.
2016-01-01
The author used an archival study to explore the relationship between college counseling and retention. The cohort for this study was a college's 2006 class of full-time, 1st-year students (N = 429). The results of chi-square analyses and regression analyses indicated (a) a significant difference in retention between high-risk and low-risk…
21 CFR 892.2050 - Picture archiving and communications system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... processing of medical images. Its hardware components may include workstations, digitizers, communications... hardcopy devices. The software components may provide functions for performing operations related to image...
15 CFR 960.11 - Conditions for operation.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... During such limitations, the licensee shall, on request, provide unenhanced restricted images on a..., processing, archiving and dissemination. (i) If the operating license restricts the distribution of certain...
Lu, David; Graf, Ryon P.; Harvey, Melissa; Madan, Ravi A.; Heery, Christopher; Marte, Jennifer; Beasley, Sharon; Tsang, Kwong Y.; Krupa, Rachel; Louw, Jessica; Wahl, Justin; Bales, Natalee; Landers, Mark; Marrinucci, Dena; Schlom, Jeffrey; Gulley, James L.; Dittamore, Ryan
2015-01-01
Retrospective analysis of patient tumour samples is a cornerstone of clinical research. CTC biomarker characterization offers a non-invasive method to analyse patient samples. However, current CTC technologies require prospective blood collection, thereby reducing the ability to utilize archived clinical cohorts with long-term outcome data. We sought to investigate CTC recovery from frozen, archived patient PBMC pellets. Matched samples from both mCRPC patients and mock samples, which were prepared by spiking healthy donor blood with cultured prostate cancer cell line cells, were processed "fresh" via the Epic CTC Platform or from "frozen" PBMC pellets. Samples were analysed for CTC enumeration and biomarker characterization via immunofluorescent (IF) biomarkers, fluorescence in-situ hybridization (FISH) and CTC morphology. In the frozen patient PBMC samples, the median CTC recovery was 18%, compared to the freshly processed blood. However, abundance and localization of cytokeratin (CK) and androgen receptor (AR) protein, as measured by IF, were largely concordant between the fresh and frozen CTCs. Furthermore, a FISH analysis of PTEN loss showed high concordance in fresh vs. frozen samples. The observed data indicate that CTC biomarker characterization from frozen archival samples is feasible and representative of prospectively collected samples. PMID:28936240
A PDS Archive for Observations of Mercury's Na Exosphere
NASA Astrophysics Data System (ADS)
Backes, C.; Cassidy, T.; Merkel, A. W.; Killen, R. M.; Potter, A. E.
2016-12-01
We present a data product consisting of ground-based observations of Mercury's sodium exosphere. We have amassed a sizeable dataset of several thousand spectral observations of Mercury's exosphere from the McMath-Pierce solar telescope. Over the last year, a data reduction pipeline has been developed and refined to process and reconstruct these spectral images into low resolution images of sodium D2 emission. This dataset, which extends over two decades, will provide an unprecedented opportunity to analyze the dynamics of Mercury's mid to high-latitude exospheric emissions, which have long been attributed to solar wind ion bombardment. This large archive of observations will be of great use to the Mercury science community in studying the effects of space weather on Mercury's tenuous exosphere. When completely processed, images in this dataset will show the observed spatial distribution of Na D2 in the Mercurian exosphere, have measurements of this sodium emission per pixel in units of kilorayleighs, and be available through NASA's Planetary Data System. The overall goal of the presentation will be to provide the Planetary Science community with a clear picture of what information and data this archival product will make available.
National Aeronautics and Space Administration Biological Specimen Repository
NASA Technical Reports Server (NTRS)
McMonigal, Kathleen A.; Pietrzyk, Robert A.; Johnson, Mary Anne
2008-01-01
The National Aeronautics and Space Administration Biological Specimen Repository (Repository) is a storage bank that is used to maintain biological specimens over extended periods of time and under well-controlled conditions. Samples from the International Space Station (ISS), including blood and urine, will be collected, processed and archived during the preflight, inflight and postflight phases of ISS missions. This investigation has been developed to archive biosamples for use as a resource for future space flight related research. The International Space Station (ISS) provides a platform to investigate the effects of microgravity on human physiology prior to lunar and exploration class missions. The storage of crewmember samples from many different ISS flights in a single repository will be a valuable resource with which researchers can study space flight related changes and investigate physiological markers. The development of the National Aeronautics and Space Administration Biological Specimen Repository will allow for the collection, processing, storage, maintenance, and ethical distribution of biosamples to meet goals of scientific and programmatic relevance to the space program. Archiving of the biosamples will provide future research opportunities including investigating patterns of physiological changes, analysis of components unknown at this time or analyses performed by new methodologies.
SCOPE - Stellar Classification Online Public Exploration
NASA Astrophysics Data System (ADS)
Harenberg, Steven
2010-01-01
The Astronomical Photographic Data Archive (APDA) has been established to be the primary North American archive for collections of astronomical photographic plates. Located at the Pisgah Astronomical Research Institute (PARI) in Rosman, NC, the archive contains hundreds of thousands of stellar spectra, many of which have never before been classified. To help classify the vast number of stars, the public is invited to participate in a distributed computing online environment called Stellar Classification Online - Public Exploration (SCOPE). Through a website, the participants will have a tutorial on stellar spectra and practice classifying. After practice, the participants classify spectra on photographic plates uploaded online from APDA. These classifications will be recorded in a database where the results from many users will be statistically analyzed. Stars with known spectral types will be included to test the reliability of classifications. The process of building the database of stars from APDA, which the citizen scientist will be able to classify, includes: scanning the photographic plates, orienting the plate to correct for the change in right ascension/declination using Aladin, stellar HD catalog identification using Simbad, marking the boundaries for each spectrum, and setting up the image for use on the website. We will describe the details of this process.
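The abstract notes that classifications recorded from many users will be statistically analyzed; one plausible minimal form of that analysis is a per-spectrum majority vote with an agreement score, sketched here with invented data.

```python
# Sketch: aggregate independent classifications of each plate spectrum into
# a consensus spectral type plus an agreement fraction. Data are invented.
from collections import Counter

votes = {
    "HD000001": ["A0", "A0", "A1", "A0"],
    "HD000002": ["K5", "K5", "M0"],
}

for star, classes in votes.items():
    (consensus, n), = Counter(classes).most_common(1)
    print(f"{star}: {consensus} (agreement {n / len(classes):.0%})")
```

Spectra of known type inserted into the stream, as the project describes, could be scored the same way to estimate each participant's reliability.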
Computer Code Gives Astrophysicists First Full Simulation of Star's Final Hours
Applin, Bradford; Almgren, Ann S.; Nonaka, Andy
2018-05-11
The precise conditions inside a white dwarf star in the hours leading up to its explosive end as a Type Ia supernova are one of the mysteries confronting astrophysicists studying these massive stellar explosions. But now, a team of researchers, composed of three applied mathematicians at the U.S. Department of Energy's (DOE) Lawrence Berkeley National Laboratory and two astrophysicists, has created the first full-star simulation of the hours preceding the largest thermonuclear explosions in the universe. http://www.lbl.gov/cs/Archive/news091509.html
Processes and Dynamics behind Whole-School Reform: Nine-Year Journeys of Four Primary Schools
ERIC Educational Resources Information Center
Li, Yuk Yung
2017-01-01
Despite decades of research, little is known about the dynamics of sustaining change in school reform and how the process of change unfolds. By tracing the nine-year reform journeys of four primary schools in Hong Kong (using multiyear interview, observational, and archival data), this study uncovers the micro-processes the schools experienced…
NASA Astrophysics Data System (ADS)
Walter, R. J.; Protack, S. P.; Harris, C. J.; Caruthers, C.; Kusterer, J. M.
2008-12-01
NASA's Atmospheric Science Data Center at the NASA Langley Research Center performs all of the science data processing for the Multi-angle Imaging SpectroRadiometer (MISR) instrument. MISR is one of the five remote sensing instruments flying aboard NASA's Terra spacecraft. From the time of Terra launch in December 1999 until February 2008, all MISR science data processing was performed on a Silicon Graphics, Inc. (SGI) platform. However, dramatic improvements in commodity computing technology coupled with steadily declining project budgets during that period eventually made transitioning MISR processing to a commodity computing environment both feasible and necessary. The Atmospheric Science Data Center has successfully ported the MISR science data processing environment from the SGI platform to a Linux cluster environment. There were a multitude of technical challenges associated with this transition. Even though the core architecture of the production system did not change, the manner in which it interacted with underlying hardware was fundamentally different. In addition, there are more potential throughput bottlenecks in a cluster environment than there are in a symmetric multiprocessor environment like the SGI platform and each of these had to be addressed. Once all the technical issues associated with the transition were resolved, the Atmospheric Science Data Center had a MISR science data processing system with significantly higher throughput than the SGI platform at a fraction of the cost. In addition to the commodity hardware, free and open source software such as S4PM, Sun Grid Engine, PostgreSQL and Ganglia play a significant role in the new system. Details of the technical challenges and resolutions, software systems, performance improvements, and cost savings associated with the transition will be discussed. The Atmospheric Science Data Center in Langley's Science Directorate leads NASA's program for the processing, archival and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The Data Center was established in 1991 to support NASA's Earth Observing System and the U.S. Global Change Research Program. It is unique among NASA data centers in the size of its archive, cutting edge computing technology, and full range of data services. For more information regarding ASDC data holdings, documentation, tools and services, visit http://eosweb.larc.nasa.gov
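Among the free and open source components named above is Sun Grid Engine. Purely as a sketch of how per-granule processing work might be handed to such a scheduler, the fragment below submits one job per granule with qsub; the queue name, driver-script path, and granule identifiers are hypothetical.

```python
# Sketch: submitting per-granule processing jobs to a Sun Grid Engine queue.
# Requires an SGE installation; queue, script path, and ids are hypothetical.
import subprocess

GRANULES = ["MISR_GRANULE_0001", "MISR_GRANULE_0002"]  # invented identifiers

for granule in GRANULES:
    subprocess.run(
        ["qsub",
         "-q", "misr.q",                   # hypothetical queue name
         "-N", f"proc_{granule}",          # job name shown by the scheduler
         "/opt/misr/process_granule.sh",   # hypothetical driver script
         granule],
        check=True,
    )
```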
ESA's Planetary Science Archive: International collaborations towards transparent data access
NASA Astrophysics Data System (ADS)
Heather, David
The European Space Agency's (ESA) Planetary Science Archive (PSA) is the central repository for science data returned by all ESA planetary missions. Current holdings include data from Giotto, SMART-1, Cassini-Huygens, Mars Express, Venus Express, and Rosetta. In addition to the basic management and distribution of these data to the community through our own interfaces, ESA has been working very closely with international partners to globalize the archiving standards used and the access to our data. Part of this ongoing effort is channelled through our participation in the International Planetary Data Alliance (IPDA), whose focus is on allowing transparent and interoperable access to data holdings from participating Agencies around the globe. One major focus of this work has been the development of the Planetary Data Access Protocol (PDAP) that will allow for the interoperability of archives and sharing of data. This is already used for transparent access to data from Venus Express, and ESA are currently working with ISRO and NASA to provide interoperable access to ISRO's Chandrayaan-1 data through our systems using this protocol. Close interactions are ongoing with NASA's Planetary Data System as the standards used for planetary data archiving evolve, and two of our upcoming missions are to be the first to implement the new 'PDS4' standards in ESA: BepiColombo and ExoMars. Projects have been established within the IPDA framework to guide these implementations to try and ensure interoperability and maximise the usability of the data by the community. BepiColombo and ExoMars are both international missions, in collaboration with JAXA and IKI respectively, and a strong focus has been placed on close interaction and collaboration throughout the development of each archive. For both of these missions there is a requirement to share data between the Agencies prior to public access, as well as providing complete open access globally once the proprietary periods have elapsed. This introduces a number of additional challenges in terms of managing different access rights to data throughout the mission lifetime. Both of these missions will have data pipelines running internally to our Science Ground Segment, in order to release the instrument teams to work more on science analyses. We have followed the IPDA recommendations of trying to start work on archiving with these missions very early in the life-cycle (especially on BepiColombo and now starting on JUICE), and endeavour to make sure that archiving requirements are clearly stated in official mission documentation at the time of selection. This has helped to ensure that adequate resources are available internally and within the instrument teams to support archive development. This year will also see major milestones for two of our operational missions. Venus Express will start an aerobraking phase in late spring / early summer, and will wind down science operations this year, while Rosetta will encounter the comet Churyumov-Gerasimenko, deploy the lander and start its main science phase. While these missions are at opposite ends of their science phases, many of the challenges from the archiving side are similar. Venus Express will have a full mission archive review this year and data pipelines will start to be updated / corrected where necessary in order to ensure long-term usability and interoperable access to the data.
Rosetta will start to deliver science data in earnest towards the end of the year, and the focus will be on ensuring that data pipelines are ready and robust enough to maintain deliveries throughout the main science phase. For both missions, we aim to use the lessons learned and technologies developed through our international collaborations to maximise the availability and usability of the data delivered. In 2013, ESA established a Planetary Science Archive User Group (PSA-UG) to provide independent advice on ways to improve our services and our provision of data to the community. The PSA-UG will be a key link to the international planetary science community, providing requirements and recommendations that will allow us to better meet their needs, and promoting the use of the PSA and its data holdings. This presentation will outline the many international collaborations currently in place for the PSA, both for missions in operations and for those under development. There is a strong desire to provide full transparent science data access and improved services to the planetary science community around the world, and our continuing work with our international partners brings us ever closer to achieving this goal. Many challenges still remain, and these will be outlined in the presentation.
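PDAP is, at bottom, an HTTP query protocol with key-value parameters, which is what makes the transparent cross-agency access described above possible. The sketch below composes one such query URL; the endpoint is a placeholder, and the parameter names, while in the general style of the protocol, should be checked against the IPDA PDAP specification.

```python
# Sketch of a PDAP-style metadata query: an HTTP GET with key-value
# parameters. Endpoint and parameter names are illustrative and should be
# verified against the IPDA PDAP specification.
from urllib.parse import urlencode

params = {
    "RESOURCE_CLASS": "DATA_SET",  # assumed parameter name
    "TARGET_NAME": "VENUS",        # assumed parameter name
    "RETURN_TYPE": "VOTABLE",      # assumed parameter name
}
base = "https://example.esa.int/pdap/metadata"  # placeholder endpoint
print(f"{base}?{urlencode(params)}")
```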
NASA Astrophysics Data System (ADS)
Robbins, William L.; Conklin, James J.
1995-10-01
Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.
Synergy Between Archives, VO, and the Grid at ESAC
NASA Astrophysics Data System (ADS)
Arviset, C.; Alvarez, R.; Gabriel, C.; Osuna, P.; Ott, S.
2011-07-01
Over the years, in support of the Science Operations Centers at ESAC, we have set up two Grid infrastructures. These have been built: 1) to facilitate daily research for scientists at ESAC, 2) to provide high computing capabilities for project data processing pipelines (e.g., Herschel), 3) to support science operations activities (e.g., calibration monitoring). Furthermore, closer collaboration between the science archives, the Virtual Observatory (VO) and data processing activities has led to another Grid use case: the Remote Interface to XMM-Newton SAS Analysis (RISA). This web service-based system allows users to launch SAS tasks transparently on the Grid, save results on HTTP-based storage and visualize them through VO tools. This paper presents real, operational use cases of the Grid in these contexts.
EOSDIS Terra Data Sampler #1: Western US Wildfires 2000. 1.1
NASA Technical Reports Server (NTRS)
Perkins, Dorothy C. (Technical Monitor)
2000-01-01
This CD-ROM contains sample data in HDF-EOS format from the instruments on board the Earth Observing System (EOS) Terra satellite: (1) Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER); (2) Clouds and the Earth's Radiant Energy System (CERES); (3) Multi-angle Imaging Spectroradiometer (MISR); and (4) Moderate Resolution Imaging Spectroradiometer (MODIS). Data from the Measurements of Pollution in the Troposphere (MOPITT) instrument were not available for distribution (as of October 17, 2000). The remotely sensed, coincident data for the Western US wildfires were acquired August 30, 2000. This CD-ROM provides information about the Terra mission, instruments, data, and viewing tools. It also provides the Collage tool for viewing data, and links to Web sites containing other digital data processing software. Full granules of the data on this CD-ROM and other EOS Data and Information System (EOSDIS) data products are available from the NASA Distributed Active Archive Centers (DAACs).
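For readers who want to open HDF-EOS granules like those on this CD-ROM programmatically, a minimal sketch using the pyhdf package (our choice of HDF4-capable library, not one named by the disc) might look like the following; the file name is a placeholder.

```python
# Sketch: listing and reading scientific datasets from an HDF-EOS (HDF4)
# granule with pyhdf. The file name is a placeholder.
from pyhdf.SD import SD, SDC

f = SD("terra_granule.hdf", SDC.READ)        # placeholder file name
for name, info in f.datasets().items():
    print(name, info[1])                     # dataset name and its shape

first = list(f.datasets())[0]
band = f.select(first).get()                 # first SDS as a numpy array
print(band.shape, band.dtype)
f.end()
```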
Fast image interpolation for motion estimation using graphics hardware
NASA Astrophysics Data System (ADS)
Kelly, Francis; Kokaram, Anil
2004-05-01
Motion estimation and compensation is the key to high quality video coding. Block matching motion estimation is used in most video codecs, including MPEG-2, MPEG-4, H.263 and H.26L. Motion estimation is also a key component in the digital restoration of archived video and for post-production and special effects in the movie industry. Sub-pixel accurate motion vectors can improve the quality of the vector field and lead to more efficient video coding. However, sub-pixel accuracy requires interpolation of the image data. Image interpolation is a key requirement of many image processing algorithms. Often interpolation can be a bottleneck in these applications, especially in motion estimation, due to the large number of pixels involved. In this paper we propose using commodity computer graphics hardware for fast image interpolation. We use the full search block matching algorithm to illustrate the problems and limitations of using graphics hardware in this way.
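To make the computational load concrete, here is a small NumPy sketch, our illustration rather than the paper's GPU code, of full-search block matching with half-pel refinement. Every sub-pixel candidate requires a freshly interpolated block, which is exactly the per-pixel cost that motivates moving interpolation onto graphics hardware.

```python
# Sketch (NumPy, not the paper's GPU implementation): full-search block
# matching at integer precision, then half-pel refinement via bilinear
# interpolation of the reference frame.
import numpy as np

def bilinear(img, y, x):
    """Sample img at a fractional (y, x) position by bilinear interpolation."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    dy, dx = y - y0, x - x0
    p = img[y0:y0 + 2, x0:x0 + 2]
    w = np.array([[(1 - dy) * (1 - dx), (1 - dy) * dx],
                  [dy * (1 - dx), dy * dx]])
    return float((p * w).sum())

def full_search(cur, ref, by, bx, B=8, R=4):
    """Full search over a (2R+1)^2 window, then half-pel refinement."""
    block = cur[by:by + B, bx:bx + B].astype(float)
    best, best_sad = (0.0, 0.0), np.inf
    for vy in range(-R, R + 1):                      # integer-pel pass
        for vx in range(-R, R + 1):
            cand = ref[by + vy:by + vy + B, bx + vx:bx + vx + B]
            sad = np.abs(block - cand).sum()
            if sad < best_sad:
                best, best_sad = (float(vy), float(vx)), sad
    iy, ix = best
    for hy in (-0.5, 0.0, 0.5):                      # half-pel refinement:
        for hx in (-0.5, 0.0, 0.5):                  # each candidate needs an
            cand = np.array([[bilinear(ref, by + iy + hy + r,   # interpolated
                                       bx + ix + hx + c)        # block
                              for c in range(B)] for r in range(B)])
            sad = np.abs(block - cand).sum()
            if sad < best_sad:
                best, best_sad = (iy + hy, ix + hx), sad
    return best

rng = np.random.default_rng(0)
ref = rng.random((64, 64))
cur = np.roll(ref, (1, 2), axis=(0, 1))   # current frame: ref shifted by (1, 2)
print(full_search(cur, ref, 16, 16))      # expect approximately (-1.0, -2.0)
```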
Building a cloud based distributed active archive data center
NASA Astrophysics Data System (ADS)
Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin
2017-04-01
NASA's Earth Science Data System (ESDS) Program serves as a central cog in facilitating the implementation of NASA's Earth Science strategic plan. Since 1994, the ESDS Program has been committed to the full and open sharing of Earth science data obtained from NASA instruments with all users. One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data. An independent review conducted in 2015 examined EOSDIS holistically in order to identify gaps. The review recommended investigating two areas: first, whether commercial cloud providers offer potential for storage, processing, and operational efficiencies, and second, the potential development of new data access and analysis paradigms. In response, ESDS has initiated several prototypes investigating the advantages and risks of leveraging cloud computing. This poster will provide an overview of one such prototyping activity, "Cumulus". Cumulus is being designed and developed as a "native" cloud-based data ingest, archive and management system that can be used for all future NASA Earth science data streams. The long-term vision for Cumulus, its requirements, overall architecture, and implementation details, as well as lessons learned from the completion of the first phase of this prototype, will be covered. We envision that Cumulus will foster the design of new analysis/visualization tools that leverage collocated data from all of the distributed DAACs as well as elastic cloud computing resources, opening new research opportunities.
Interactively Browsing NASA's EOS Imagery in Full Resolution
NASA Astrophysics Data System (ADS)
Boller, R. A.; Joshi, T.; Schmaltz, J. E.; Ilavajhala, S.; Davies, D.; Murphy, K. J.
2012-12-01
Worldview is a new tool designed to interactively browse full-resolution imagery from NASA's fleet of Earth Observing System (EOS) satellites. It is web-based and developed using open standards (JavaScript, CSS, HTML) for cross-platform compatibility. It addresses growing user demands for access to full-resolution imagery by providing a responsive, interactive interface with global coverage, no artificial boundaries, and views in geographic and polar projections. Currently tailored to the near real-time community, Worldview enables the rapid evaluation and comparison of imagery related to such application areas as fires, floods, and air quality. It is supported by the Global Imagery Browse Services (GIBS), a system that continuously ingests, mosaics, and serves approximately 21GB of imagery daily. This imagery spans over 50 data products that are available within three hours of observation from instruments aboard Terra, Aqua, and Aura. The GIBS image archive began in May 2012 and will have published approximately 4.4TB of imagery as of December 2012. Worldview facilitates rapid access to this archive and is supplemented by socioeconomic data layers from the Socioeconomic Data and Applications Center (SEDAC), including products such as population density and economic risk from cyclones. Future plans include the accessibility of additional products that cover the entire Terra/MODIS and Aqua/MODIS missions (>150TB) and the ability to download the underlying science data of the onscreen imagery.
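GIBS serves this imagery as pre-generated map tiles, so a client like Worldview simply composes tile URLs and fetches them. The sketch below follows the REST tile pattern of a WMTS service such as GIBS; treat the layer identifier, tile matrix set, and endpoint details as assumptions to verify against the GIBS documentation.

```python
# Sketch: composing a WMTS tile URL of the kind a GIBS client requests.
# Layer id, tile matrix set, and URL template are assumptions.
LAYER = "MODIS_Terra_CorrectedReflectance_TrueColor"  # assumed layer id
DATE = "2012-08-15"                                   # observation date
TEMPLATE = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
            "{layer}/default/{date}/{matrix_set}/{z}/{y}/{x}.jpg")

print(TEMPLATE.format(layer=LAYER, date=DATE, matrix_set="250m", z=3, y=2, x=1))
```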
Integration Of An MR Image Network Into A Clinical PACS
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.
1988-06-01
A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network combines together the MR scanner, a remote viewing station and a central archiving station. The pediatric PACS directly connects to the archiving unit through an Ethernet TCP-IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and the patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database through the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware was moved to the clinical pediatric VAX for full PACS integration. In parallel to the direct transmission of digital images to the Pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to directly monitor patient positioning and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.
Problem of data quality and the limitations of the infrastructure approach
NASA Astrophysics Data System (ADS)
Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong
1998-07-01
The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data is immediately used for the completion of clinical tasks. If the digital data is used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises, but does not negate, the value of the Infrastructure Approach. The approach is best employed only to a limited extent: any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.
NASA Astrophysics Data System (ADS)
Mitchell, A. E.; Lowe, D. R.; Murphy, K. J.; Ramapriyan, H. K.
2011-12-01
Initiated in 1990, NASA's Earth Observing System Data and Information System (EOSDIS) is currently a petabyte-scale archive of data designed to receive, process, distribute and archive several terabytes of science data per day from NASA's Earth science missions. Comprised of 12 discipline specific data centers collocated with centers of science discipline expertise, EOSDIS manages over 6800 data products from many science disciplines and sources. NASA supports global climate change research by providing scalable open application layers to the EOSDIS distributed information framework. This allows many other value-added services to access NASA's vast Earth Science Collection and allows EOSDIS to interoperate with data archives from other domestic and international organizations. EOSDIS is committed to NASA's Data Policy of full and open sharing of Earth science data. As metadata is used in all aspects of NASA's Earth science data lifecycle, EOSDIS provides a spatial and temporal metadata registry and order broker called the EOS Clearing House (ECHO) that allows efficient search and access of cross domain data and services through the Reverb Client and Application Programmer Interfaces (APIs). Another core metadata component of EOSDIS is NASA's Global Change Master Directory (GCMD) which represents more than 25,000 Earth science data set and service descriptions from all over the world, covering subject areas within the Earth and environmental sciences. With inputs from the ECHO, GCMD and Soil Moisture Active Passive (SMAP) mission metadata models, EOSDIS is developing a NASA ISO 19115 Best Practices Convention. Adoption of an international metadata standard enables a far greater level of interoperability among national and international data products. NASA recently concluded a 'Metadata Harmony Study' of EOSDIS metadata capabilities/processes of ECHO and NASA's Global Change Master Directory (GCMD), to evaluate opportunities for improved data access and use, reduce efforts by data providers and improve metadata integrity. The result was a recommendation for EOSDIS to develop a 'Common Metadata Repository (CMR)' to manage the evolution of NASA Earth Science metadata in a unified and consistent way by providing a central storage and access capability that streamlines current workflows while increasing overall data quality and anticipating future capabilities. For applications users interested in monitoring and analyzing a wide variety of natural and man-made phenomena, EOSDIS provides access to near real-time products from the MODIS, OMI, AIRS, and MLS instruments in less than 3 hours from observation. To enable interactive exploration of NASA's Earth imagery, EOSDIS is developing a set of standard services to deliver global, full-resolution satellite imagery in a highly responsive manner. EOSDIS is also playing a lead role in the development of the CEOS WGISS Integrated Catalog (CWIC), which provides search and access to holdings of participating international data providers. EOSDIS provides a platform to expose and share information on NASA Earth science tools and data via Earthdata.nasa.gov while offering a coherent and interoperable system for the NASA Earth Science Data System (ESDS) Program.
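A registry like ECHO (or the proposed CMR) answers spatial-temporal granule queries on behalf of clients such as Reverb. Purely as an illustration of the shape of such a query, the sketch below builds one as a URL; the endpoint and parameter names are placeholders in the style of such services, not the documented ECHO or CMR interface.

```python
# Illustrative sketch of a spatial-temporal granule search against a
# metadata registry like ECHO/CMR. Endpoint and parameter names are
# placeholders, not the documented interface.
from urllib.parse import urlencode

query = {
    "short_name": "MOD021KM",          # assumed collection identifier
    "temporal": "2011-08-01T00:00:00Z,2011-08-02T00:00:00Z",
    "bounding_box": "-120,30,-100,45", # lon/lat: west,south,east,north
}
print("https://example.nasa.gov/search/granules.json?" + urlencode(query))
```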
NASA Astrophysics Data System (ADS)
Walker, D. A.; Breen, A. L.; Broderson, D.; Epstein, H. E.; Fisher, W.; Grunblatt, J.; Heinrichs, T.; Raynolds, M. K.; Walker, M. D.; Wirth, L.
2013-12-01
Abundant ground-based information will be needed to inform remote-sensing and modeling studies of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE). A large body of plot and map data collected by the Alaska Geobotany Center (AGC) and collaborators from the Arctic regions of Alaska and the circumpolar Arctic over the past several decades is being archived and made accessible to scientists and the public via the Geographic Information Network of Alaska's (GINA's) 'Catalog' display and portal system. We are building two main types of data archives: Vegetation Plot Archive: For the plot information we use a Turboveg database to construct the Alaska portion of the international Arctic Vegetation Archive (AVA) http://www.geobotany.uaf.edu/ava/. High quality plot data and non-digital legacy datasets in danger of being lost have highest priority for entry into the archive. A key aspect of the database is the PanArctic Species List (PASL-1), developed specifically for the AVA to provide a standard of species nomenclature for the entire Arctic biome. A wide variety of reports, documents, and ancillary data are linked to each plot's geographic location. Geoecological Map Archive: This database includes maps and remote sensing products and links to other relevant data associated with the maps, mainly those produced by the Alaska Geobotany Center. Map data include GIS shape files of vegetation, land-cover, soils, landforms and other categorical variables and digital raster data of elevation, multispectral satellite-derived data, and data products and metadata associated with these. The map archive will contain all the information that is currently in the hierarchical Toolik-Arctic Geobotanical Atlas (T-AGA) in Alaska http://www.arcticatlas.org, plus several additions that are in the process of development and will be combined with GINA's already substantial holdings of spatial data from northern Alaska. The Geoecological Atlas Portal uses GINA's Catalog tool to develop a web interface to view and access the plot and map data. The mapping portal allows visualization of GIS data, sample-point locations and imagery and access to the map data. Catalog facilitates the discovery and dissemination of science-based information products in support of analysis and decision-making concerned with development and climate change and is currently used by GINA in several similar archive/distribution portals.
Back to the Future: Long-Term Seismic Archives Revisited
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2007-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the location of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and even in the Earth sciences in general. Yet, these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation based methodologies for high-resolution earthquake location - as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scaleable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may thus become a routine procedure to continuously improve resolution in existing catalogs. We demonstrate the role of seismic archives in obtaining the precise location of new events in real-time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.
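The resolution gain described rests on replacing hand-picked arrival times with differential times measured by waveform cross-correlation. A toy NumPy sketch of that measurement, ours rather than the authors' code, is given below; real processing adds windowing, filtering, and subsample interpolation of the correlation peak.

```python
# Toy sketch: measure the differential arrival time between two similar
# waveforms by cross-correlation, the quantity double-differencing uses.
import numpy as np

rng = np.random.default_rng(1)
n, shift = 256, 7
pulse = np.exp(-0.5 * ((np.arange(n) - 100) / 5.0) ** 2)   # synthetic arrival
a = pulse + 0.05 * rng.standard_normal(n)                  # station record 1
b = np.roll(pulse, shift) + 0.05 * rng.standard_normal(n)  # record 2, delayed

xc = np.correlate(b, a, mode="full")   # correlation over lags -(n-1)..(n-1)
lag = int(np.argmax(xc)) - (n - 1)
print(lag)  # expect 7: b arrives 7 samples later than a
```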
NASA Astrophysics Data System (ADS)
Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin
2014-07-01
The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers a modern web interface that acts as a single point of access to their data, and rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets, and to enhance WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive that has built-in frameworks including: (1) Collections that allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or to execute data processing tasks (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5 capable web browser, and overlaid standard catalog and Source Extractor-generated source markers (3) Workflow framework which enables rapid integration of data processing pipelines on an associated compute cluster and users to request such pipelines to be executed on their data via custom user interfaces. ODI-PPA is made up of several light-weight services connected by a message bus; the web portal built using Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, and backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.
A Refreshable, On-line Cache for HST Data Retrieval
NASA Astrophysics Data System (ADS)
Fraquelli, Dorothy A.; Ellis, Tracy A.; Ridgaway, Michael; DPAS Team
2016-01-01
We discuss upgrades to the HST Data Processing System, with an emphasis on the changes Hubble Space Telescope (HST) Archive users will experience. In particular, data are now held on-line (in a cache) removing the need to reprocess the data every time they are requested from the Archive. OTFR (on the fly reprocessing) has been replaced by a reprocessing system, which runs in the background. Data in the cache are automatically placed in the reprocessing queue when updated calibration reference files are received or when an improved calibration algorithm is installed. Data in the on-line cache are expected to be the most up to date version. These changes were phased in throughout 2015 for all active instruments.The on-line cache was populated instrument by instrument over the course of 2015. As data were placed in the cache, the flag that triggers OTFR was reset so that OTFR no longer runs on these data. "Hybrid" requests to the Archive are handled transparently, with data not yet in the cache provided via OTFR and the remaining data provided from the cache. Users do not need to make separate requests.Users of the MAST Portal will be able to download data from the cache immediately. For data not in the cache, the Portal will send the user to the standard "Retrieval Options Page," allowing the user to direct the Archive to process and deliver the data.The classic MAST Search and Retrieval interface has the same look and feel as previously. Minor changes, unrelated to the cache, have been made to the format of the Retrieval Options Page.
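The retrieval policy described, serve from the cache when a current product exists, fall back to on-demand processing otherwise, and re-enqueue cached data when calibrations change, fits in a few lines. The sketch below is our illustration with invented names, not STScI's system.

```python
# Sketch of the cache-plus-background-reprocessing policy described above.
# Dataset names and product labels are invented.
from collections import deque

cache = {"obs_001": "calibrated_v2"}    # dataset -> current cached product
reprocess_queue: deque[str] = deque()   # background reprocessing queue

def retrieve(dataset: str) -> str:
    if dataset in cache:
        return cache[dataset]           # most up-to-date version, no rework
    reprocess_queue.append(dataset)     # fall back to on-demand processing
    return "processing requested"

def on_new_reference_files(affected: list[str]) -> None:
    """New calibration arrives: queue every affected cached dataset."""
    reprocess_queue.extend(d for d in affected if d in cache)

print(retrieve("obs_001"))              # served from the cache
print(retrieve("obs_999"))              # queued for processing
on_new_reference_files(["obs_001"])
print(list(reprocess_queue))            # ['obs_999', 'obs_001']
```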
Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view
NASA Astrophysics Data System (ADS)
Locati, Mario; Rovida, Andrea; Albini, Paola
2014-05-01
Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to do their daily work with general-purpose tools and/or code of their own written to address specific tasks. The lack of out-of-the-box tools expressly conceived for historical data leads to a huge amount of time lost in tedious tasks: searching for data and manually reformatting it to move from one tool to another, sometimes losing the original data in the process. This reality is common to all activities related to the study of earthquakes of past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their sources, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (the Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above-mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the side tools developed so far are presented.
Valorisation of Como Historical Cadastral Maps Through Modern Web Geoservices
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Minghini, M.; Zamboni, G.
2012-07-01
Cartographic cultural heritage preserved in archives worldwide is often stored in the original paper version only, restricting both the chances of utilization and the range of possible users. The Web C.A.R.T.E. system addressed this issue for the precious cadastral maps preserved at the State Archive of Como. The aim of the project was to improve the visibility and accessibility of this heritage using the latest free and open source tools for processing, cataloguing and web publishing the maps. The resulting architecture should therefore assist the State Archive of Como in managing its cartographic contents. After pre-processing, consisting of digitization and georeferencing steps, the maps were provided with metadata, compiled according to the current Italian standards and managed through an ad hoc version of the GeoNetwork Opensource geocatalog software. A dedicated MapFish-based webGIS client, with a version also optimized for mobile platforms, was built for map publication and 2D navigation. A module for 3D visualization of the cadastral maps was then developed using the NASA World Wind Virtual Globe; thanks to a temporal slidebar, time was also included in the system, producing a 4D Graphical User Interface. The overall architecture was built entirely with free and open source software and allows a direct and intuitive consultation of the historical maps. Besides the notable advantage of keeping the original paper maps intact, the system greatly simplifies the work of the State Archive of Como's regular users and, thanks to the modernization of the map consultation tools, widens the range of possible users.
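Publishing through standard geoservices means clients can fetch rendered map sheets with plain HTTP. The sketch below is illustrative only: the endpoint, layer name, and bounding box are invented, and the parameters shown are generic OGC WMS GetMap parameters rather than the project's actual configuration.

```python
# Illustrative OGC WMS GetMap request; URL, layer and bounding box are invented.
import urllib.parse
import urllib.request

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "como:cadastre_1722",   # hypothetical historical-map layer
    "SRS": "EPSG:4326",
    "BBOX": "9.05,45.80,9.10,45.83",  # lon/lat window over Como (illustrative)
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "https://example.org/geoserver/wms?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp, open("como_map.png", "wb") as out:
    out.write(resp.read())  # save the rendered map tile
```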
Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC
NASA Astrophysics Data System (ADS)
Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.
2017-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, the ORNL DAAC has continuously built itself into a trustworthy environmental science data repository, not only by ensuring the quality and usability of its data holdings but also by optimizing its data publication and management processes. This paper describes the lessons learned from the ORNL DAAC's effort toward this goal. The ORNL DAAC has proactively implemented international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions across its life cycle. The ORNL DAAC's data citation policy assures data producers receive appropriate recognition for the use of their products. Web service standards, such as OpenSearch and those of the Open Geospatial Consortium (OGC), promote the discovery, visualization, distribution, and integration of the ORNL DAAC's data holdings. Recently, the ORNL DAAC began efforts to optimize and standardize its data archival and publication workflows, improving the efficiency and transparency of its data management processes.
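As an illustration of the discovery standards mentioned above, an OpenSearch-style query is just a templated HTTP request. The endpoint and parameter names below are hypothetical, not the ORNL DAAC's actual interface.

```python
# Hypothetical OpenSearch-style keyword query; endpoint and parameters are
# invented for illustration. A real client would first read the service's
# OpenSearch description document to learn the URL template.
import urllib.parse
import urllib.request

query = urllib.parse.urlencode({"q": "leaf area index", "format": "atom"})
url = f"https://daac.example.org/opensearch?{query}"
with urllib.request.urlopen(url) as resp:
    print(resp.read()[:300])  # Atom feed listing matching datasets (with DOIs)
```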
36 CFR 1253.10 - Notification process for changes in hours.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operation. (c) The notification process must proceed as follows: (1) Post a notice on http://www.archives.gov. (2) Post notices in areas visible to the public in their research room, exhibit areas or museum... the public of events at their location. (4) These notices will provide written determination...
NASA Technical Reports Server (NTRS)
Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.
1991-01-01
The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.
Fermilab History and Archives Project | Norman F. Ramsey
Scientific Astronomical School by Professor Volodymyr P. Tsesevich on the Physics of Variable Stars
NASA Astrophysics Data System (ADS)
Vavilova, I. B.
This paper is dedicated to Prof. Volodymyr Platonovych Tsesevich (1907-1983), an outstanding scientist and legendary personality of the XX century. We briefly describe the Kyiv period of his life and activity, drawing on his Personal Dossier from the Archive of the Presidium of the NAS of Ukraine. Particular attention is paid to the role of V.P. Tsesevich in the development of astrophysical research at the Main Astronomical Observatory of the Academy of Sciences of UkrSSR, where he served as Director (19.11.1948-03.05.1951), and to the fruitful cooperation between Kyiv and Odesa astronomers. We briefly present a "tree" of the scientific astronomical school of Prof. V.P. Tsesevich on the physics of stars. The data were obtained from different archives (the Astronomical Observatory of the I.I. Mechnikov National University of Odesa, the Main Astronomical Observatory of the NAS of Ukraine, the Archive of the Vernadsky National Library, the Archive of the Russian AS, and other institutions). The full database contains brief information on about 100 representatives of this school: name, title and year of thesis defense, and past/present affiliation. The school has developed from the 1950s until now, with its greatest continuation in the work of such astronomers as N.S. Komarov, V.G. Karetnikov, Yu.S. Romanov, and I.L. Andronov (a branch of this school after V.P. Tsesevich), as well as S.M. Andrievsky as the follower of V.G. Karetnikov, and T.V. Mishenina, V.F. Gopka, and V.V. Kovtykh as the followers of N.S. Komarov. The information given on the school of V.P. Tsesevich is not complete; for example, 1) there are no data on theses defended under his supervision before 1948; 2) information on the branch of the school developed by A.M. Stafeev and some other scientists is very sparse; 3) some inaccuracies may be present. We will be grateful for all additions and corrections that update the tree of this scientific school, which played and still plays a prominent role in the development of our knowledge of the physics of stars.
Scanning Apollo Flight Films and Reconstructing CSM Trajectories
NASA Astrophysics Data System (ADS)
Speyerer, E.; Robinson, M. S.; Grunsfeld, J. M.; Locke, S. D.; White, M.
2006-12-01
Over thirty years ago, the astronauts of the Apollo program made the journey from the Earth to the Moon and back. To record their historic voyages and collect scientific observations, many thousands of photographs were acquired with handheld and automated cameras. After returning to Earth, these films were developed and stored at the film archive at Johnson Space Center (JSC), where they still reside. Due to the historical significance of the original flight films, typically only duplicate (2nd or 3rd generation) film products are studied and used to make prints. To allow full access to the original flight films for both researchers and the general public, JSC and Arizona State University are scanning the films and creating an online digital archive. A Leica photogrammetric scanner is being used to ensure geometric and radiometric fidelity, and the scanning resolution will preserve the grain of the film. Color frames are being scanned and archived as 48-bit pixels to ensure capture of the full dynamic range of the film (16-bit for B&W). The raw scans will consist of 70 terabytes of data (approximately 10,000 B&W Hasselblad, 10,000 color Hasselblad, 10,000 metric, 4,500 panoramic, and 620 35mm frames; counts are estimates). All the scanned films will be made available for download through a searchable database, and special tools are being developed to locate images based on various search parameters. To geolocate metric and panoramic frames acquired during Apollos 15-17, prototype SPICE kernels are being generated from existing photographic support data by entering state vectors and timestamps from multiple points throughout each orbit into the NAIF toolkit to create a type 9 Spacecraft and Planet Ephemeris Kernel (SPK), a nadir-pointing C-matrix Kernel (CK), and a Spacecraft Clock Kernel (SCLK). These SPICE kernels, along with the Instrument Kernel (IK) and Frames Kernel (FK) that are also under development, will be archived with the scanned images. From the generated kernels, several IDL programs have been designed to display orbital tracks, produce footprint plots, and create image projections. Using the output from these SPICE-based programs enables accurate geolocation of SIM bay photography and provides potential data for lunar gravitational studies.
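Once such kernels exist, geolocation reduces to standard NAIF toolkit calls. A minimal sketch using SpiceyPy (the Python wrapper for the toolkit) follows; the kernel file names and the spacecraft NAIF ID are placeholders, not the project's actual products.

```python
# Minimal SpiceyPy sketch; kernel names and the NAIF ID "-915" are placeholders.
import spiceypy as spice

for kernel in ["apollo15.bsp",   # type 9 SPK reconstructed from support data
               "apollo15.bc",    # nadir-pointing CK
               "apollo15.tsc",   # SCLK
               "naif0012.tls"]:  # leapseconds
    spice.furnsh(kernel)

et = spice.str2et("1971 JUL 30 12:00:00")
# CSM position relative to the Moon in the J2000 frame; with the frames and
# instrument kernels loaded as well, camera footprints can be projected.
pos, lt = spice.spkpos("-915", et, "J2000", "NONE", "MOON")
print(pos)  # x, y, z in km
```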
NASA Technical Reports Server (NTRS)
Nagihara, S.; Nakamura, Y.; Williams, D. R.; Taylor, P. T.; Kiefer, W. S.; Hager, M. A.; Hills, H. K.
2016-01-01
In 2010, 440 original data archival tapes for the Apollo Lunar Surface Experiments Package (ALSEP) experiments were found at the Washington National Records Center. These tapes hold raw instrument data received from the Moon for all the ALSEP instruments for the period of April through June 1975. We have recently completed extracting binary files from these tapes and have delivered them to the NASA Space Science Data Coordinated Archive (NSSDCA). We are currently processing the raw data into higher-order data products in file formats more readily usable by contemporary researchers. These data products will fill a number of gaps in the current ALSEP data collection at NSSDCA. In addition, we have established a digital, searchable archive of ALSEP documents and metadata as part of the web portal of the Lunar and Planetary Institute. It currently holds approximately 700 documents totaling approximately 40,000 pages.
The MATISSE analysis of large spectral datasets from the ESO Archive
NASA Astrophysics Data System (ADS)
Worley, C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Vernisse, Y.; Ordenovic, C.; Bijaoui, A.
2010-12-01
The automated stellar classification algorithm MATISSE has been developed at the Observatoire de la Côte d'Azur (OCA) to determine stellar temperatures, gravities and chemical abundances for large datasets of stellar spectra. The Gaia Data Processing and Analysis Consortium (DPAC) has selected MATISSE as one of the key programmes to be used in the analysis of the Gaia Radial Velocity Spectrometer (RVS) spectra. MATISSE is currently being used to analyse large datasets of spectra from the ESO archive, with the primary goal of producing advanced data products to be made available in the ESO database via the Virtual Observatory. This is also an invaluable opportunity to identify and address issues that can be encountered in the analysis of large samples of real spectra prior to the launch of Gaia in 2012. The analysis of the archived spectra of the FEROS spectrograph is currently underway and preliminary results are presented.
Archive Management of NASA Earth Observation Data to Support Cloud Analysis
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.
2017-01-01
NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Storing these data in cloud-based web object storage (WOS) can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture, virtualizing data products by computing on demand, and reorganizing data to be more analysis-friendly.
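Subsetting against web object storage typically means ranged HTTP reads rather than file seeks. The sketch below is illustrative, not EOSDIS code: the bucket and key names are invented, and it assumes an S3-compatible store reachable through boto3.

```python
# Illustrative sketch (bucket and key invented): instead of seeking within a
# local file, a data service subsets an archived granule in web object storage
# with an HTTP range request, fetching only the bytes it needs.
import boto3

s3 = boto3.client("s3")
resp = s3.get_object(
    Bucket="eo-archive",                   # hypothetical bucket
    Key="granules/MOD021KM.A2017001.hdf",  # hypothetical granule key
    Range="bytes=0-65535",                 # e.g., just the file header/index
)
header = resp["Body"].read()
print(f"fetched {len(header)} bytes without downloading the whole granule")
```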
The Cambridge Structural Database: a quarter of a million crystal structures and rising.
Allen, Frank H
2002-06-01
The Cambridge Structural Database (CSD) now contains data for more than a quarter of a million small-molecule crystal structures. The information content of the CSD, together with methods for data acquisition, processing and validation, are summarized, with particular emphasis on the chemical information added by CSD editors. Nearly 80% of new structural data arrives electronically, mostly in CIF format, and the CCDC acts as the official crystal structure data depository for 51 major journals. The CCDC now maintains both a CIF archive (more than 73,000 CIFs dating from 1996), as well as the distributed binary CSD archive; the availability of data in both archives is discussed. A statistical survey of the CSD is also presented and projections concerning future accession rates indicate that the CSD will contain at least 500,000 crystal structures by the year 2010.
Hierarchical storage of large volumes of multidetector CT data using distributed servers
NASA Astrophysics Data System (ADS)
Ratib, Osman; Rosset, Antoine; Heuberger, Joris; Bandon, David
2006-03-01
Multidetector scanners and hybrid multimodality scanners can generate large numbers of high-resolution images, resulting in very large data sets. In most cases, these datasets are generated for the sole purpose of producing secondary processed images and 3D-rendered images, as well as oblique and curved multiplanar reformatted images. It is therefore not essential to archive the original images after they have been processed. We have developed an architecture of distributed archive servers for temporary storage of large image datasets for 3D rendering and image processing, without the need for long-term storage in the PACS archive. With the relatively low cost of storage devices, it is possible to configure these servers to hold several months or even years of data, long enough to allow subsequent re-processing if required by specific clinical situations. We tested the latest generation of RAID servers provided by Apple with a capacity of 5 TBytes. We implemented peer-to-peer data access software based on our open-source image management software OsiriX, allowing remote workstations to directly access DICOM image files located on the server through a technology called Bonjour. This architecture offers seamless integration of multiple servers and workstations without the need for a central database or complex workflow management tools. It allows efficient access to image data from multiple workstations for image analysis and visualization without the need for image data transfer. It provides a convenient alternative to a centralized PACS architecture while avoiding complex and time-consuming data transfer and storage.
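The Bonjour-based discovery step can be sketched with the python-zeroconf library. This is illustrative only: the "_dicom._tcp" service type is an assumption, not necessarily what OsiriX actually advertises.

```python
# Illustrative mDNS/Bonjour discovery sketch using python-zeroconf; the
# "_dicom._tcp" service type is an assumption for illustration.
import time
from zeroconf import Zeroconf, ServiceBrowser, ServiceListener

class ImageServerListener(ServiceListener):
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            print(f"found {name} at {info.parsed_addresses()}:{info.port}")
    def remove_service(self, zc, type_, name):
        pass
    def update_service(self, zc, type_, name):
        pass

zc = Zeroconf()
browser = ServiceBrowser(zc, "_dicom._tcp.local.", ImageServerListener())
time.sleep(5)  # let discovery run briefly, then shut down
zc.close()
```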
Simple re-instantiation of small databases using cloud computing.
Tan, Tin Wee; Xie, Chao; De Silva, Mark; Lim, Kuan Siong; Patro, C Pawan K; Lim, Shen Jean; Govindarajan, Kunde Ramamoorthy; Tong, Joo Chuan; Choo, Khar Heng; Ranganathan, Shoba; Khan, Asif M
2013-01-01
Small bioinformatics databases, unlike institutionally funded large databases, are vulnerable to discontinuation, and many reported in publications are no longer accessible. This leads to irreproducible scientific work and redundant effort, impeding the pace of scientific progress. We describe a Web-accessible system, available online at http://biodb100.apbionet.org, for archival and future on-demand re-instantiation of small databases within minutes. Depositors can rebuild their databases by downloading a Linux live operating system (http://www.bioslax.com), preinstalled with bioinformatics and UNIX tools. The database and its dependencies can be compressed into an ".lzm" file for deposition. End-users can search for archived databases and activate them on dynamically re-instantiated BioSlax instances, run as virtual machines over the two popular full-virtualization cloud-computing platforms, Xen Hypervisor or vSphere. The system is adaptable to increasing demand for disk storage or computational load and allows database developers to use the re-instantiated databases for integration and development of new databases. Herein, we demonstrate that a relatively inexpensive solution can be implemented for the archival of bioinformatics databases and their rapid re-instantiation should the live databases disappear.
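The deposition step bundles the database with its dependencies into a single compressed module. Building an actual ".lzm" uses SLAX-specific tooling, so the sketch below substitutes Python's tarfile module as a generic stand-in for that packaging idea; all paths are hypothetical.

```python
# Generic stand-in for the deposition packaging step: bundle database files
# and their dependencies into one compressed archive. The real system builds
# a SLAX ".lzm" module instead; all paths here are hypothetical.
import tarfile

with tarfile.open("mydb_deposit.tar.gz", "w:gz") as tar:
    tar.add("mydb/data", arcname="data")        # database files
    tar.add("mydb/scripts", arcname="scripts")  # CGI / loading scripts
    tar.add("mydb/deps.txt", arcname="deps.txt")  # recorded dependencies
```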
Comparison of property between two Viking Seismic tapes
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Yamada, R.
2016-12-01
The restoration work on the data from the seismometer onboard the Viking Lander 2 is still continuing. Originally, the data were processed and archived separately at MIT and UTIG, and each dataset is accessible via the Internet today. The file formats used to store the data differ, but both are currently readable thanks to continuous investigation. However, although most of the two datasets are highly consistent, there is some inconsistency between them. Understanding the differences requires knowledge of spacecraft data archiving and off-line processing, because the differences were caused by the off-line processing. The data processing of spacecraft often requires merge and sort processing of the raw data: merge processing is normally performed to eliminate duplicated data, and sort processing is performed to fix the data order. UTIG does not seem to have performed this merge and sort processing, so the UTIG-processed data retain duplicated records. The MIT-processed data did undergo merge and sort processing, but the raw data sometimes include wrong time tags, which cannot be strictly corrected by sorting. Also, the MIT-processed data have sufficient documentation to understand the metadata, while the UTIG data have only a brief description. Therefore, the MIT and UTIG data are treated as complementary, and a better data set can be established using both of them. In this presentation, we show the method used to build a better data set of the Viking Lander 2 seismic data.
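The merge and sort processing described above amounts to de-duplication followed by time-ordering. A minimal sketch (with an invented record format) makes the two steps, and their limitation in the face of corrupted time tags, concrete.

```python
# Minimal sketch (record format invented) of merge-and-sort processing:
# drop exact duplicates, then order by time tag. Records with corrupted
# time tags, as in the raw Viking telemetry, cannot be placed reliably
# by sorting alone.
def merge_and_sort(records):
    seen = set()
    merged = []
    for rec in records:            # rec = (time_tag, payload)
        if rec not in seen:        # merge: eliminate duplicated records
            seen.add(rec)
            merged.append(rec)
    merged.sort(key=lambda rec: rec[0])  # sort: fix the data order
    return merged

print(merge_and_sort([(2, "b"), (1, "a"), (2, "b"), (0, "c")]))
# -> [(0, 'c'), (1, 'a'), (2, 'b')]
```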
Near-line Archive Data Mining at the Goddard Distributed Active Archive Center
NASA Astrophysics Data System (ADS)
Pham, L.; Mack, R.; Eng, E.; Lynnes, C.
2002-12-01
NASA's Earth Observing System (EOS) is generating immense volumes of data, in some cases too much to deliver to users with data-intensive needs. As an alternative to moving the data to the user and his/her research algorithms, we provide a means to move the algorithms to the data. The Near-line Archive Data Mining (NADM) system is the Goddard Earth Sciences Distributed Active Archive Center's (GES DAAC) web data mining portal to the EOS Data and Information System (EOSDIS) data pool, a 50-TB online disk cache. The NADM web portal enables registered users to submit and execute data mining algorithm codes on the data in the EOSDIS data pool. A web interface allows the user to access the NADM system. Users first develop personalized data mining code on their home platform and then upload it to the NADM system. The C, FORTRAN and IDL languages are currently supported. The user-developed code is automatically audited for any potential security problems before it is installed within the NADM system and made available to the user. Once the code has been installed, the user is provided a test environment where he/she can test the execution of the software against data sets of his/her choosing. When the user is satisfied with the results, he/she can promote the code to the "operational" environment, from which the code can be run interactively on the data available in the EOSDIS data pool. The user can also set up a processing subscription, which will automatically process new data as they become available in the EOSDIS data pool. The generated mined data products are then made available for FTP pickup. The NADM system uses the GES DAAC-developed Simple Scalable Script-based Science Processor (S4P) to automate tasks and perform the actual data processing. Users also have the option of selecting a DAAC-provided data mining algorithm and using it to process the data of their choice.
Simple, Scalable, Script-based, Science Processor for Measurements - Data Mining Edition (S4PM-DME)
NASA Astrophysics Data System (ADS)
Pham, L. B.; Eng, E. K.; Lynnes, C. S.; Berrick, S. W.; Vollmer, B. E.
2005-12-01
The S4PM-DME is the Goddard Earth Sciences Distributed Active Archive Center's (GES DAAC) web-based data mining environment. It replaces the Near-line Archive Data Mining (NADM) system with a better web environment and a richer set of production rules. S4PM-DME enables registered users to submit and execute custom data mining algorithms. The S4PM-DME system uses the GES DAAC-developed Simple Scalable Script-based Science Processor for Measurements (S4PM) to automate tasks and perform the actual data processing. A web interface allows the user to access the S4PM-DME system. The user first develops a personalized data mining algorithm on his/her home platform and then uploads it to the S4PM-DME system. Algorithms in the C and FORTRAN languages are currently supported. The user-developed algorithm is automatically audited for any potential security problems before it is installed within the S4PM-DME system and made available to the user. Once the algorithm has been installed, the user can promote it to the "operational" environment. From here the user can search and order the data available in the GES DAAC archive for his/her science algorithm. The user can also set up a processing subscription, which will automatically process new data as they become available in the GES DAAC archive. The generated mined data products are then made available for FTP pickup. The benefits of using S4PM-DME are 1) shorter turnaround, since users no longer need to transfer GES DAAC data to their own systems, which also off-loads heavy network traffic; 2) a lighter processing load on users' own systems; and 3) ready access to the rich and abundant ocean and atmosphere data from the MODIS and AIRS instruments available at the GES DAAC.
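The subscription mechanism shared by NADM and S4PM-DME can be sketched as a simple polling loop. Everything below (archive, algorithm, the method names) is hypothetical shorthand for the behavior described, not GES DAAC code.

```python
# Hypothetical sketch of a processing subscription: poll for newly archived
# granules and push each one through the user's mining algorithm on arrival,
# staging the results for FTP pickup. All names are invented.
import time

def run_subscription(archive, algorithm, poll_seconds=300):
    processed = set()
    while True:
        for granule in archive.list_new_granules():
            if granule.id not in processed:
                product = algorithm.run(granule)  # user-supplied mining code
                archive.stage_for_ftp(product)    # make result available
                processed.add(granule.id)
        time.sleep(poll_seconds)
```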
The LCOGT Observation Portal, Data Pipeline and Science Archive
NASA Astrophysics Data System (ADS)
Lister, Tim; LCOGT Science Archive Team
2014-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable LCOGT's diverse community of scientific and educational users to request observations on the LCOGT Network, follow their progress, and get access to their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to data products from the data pipelines in near-realtime, and to long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline, and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC), and show some of the new features the Science Archive provides.
jade: An End-To-End Data Transfer and Catalog Tool
NASA Astrophysics Data System (ADS)
Meade, P.
2017-10-01
The IceCube Neutrino Observatory is a cubic kilometer neutrino telescope located at the Geographic South Pole. IceCube collects 1 TB of data every day. An online filtering farm processes this data in real time and selects 10% to be sent via satellite to the main data center at the University of Wisconsin-Madison. IceCube has two year-round on-site operators. New operators are hired every year, due to the hard conditions of wintering at the South Pole. These operators are tasked with the daily operations of running a complex detector in serious isolation conditions. One of the systems they operate is the data archiving and transfer system. Due to these challenging operational conditions, the data archive and transfer system must above all be simple and robust. It must also share the limited resource of satellite bandwidth, and collect and preserve useful metadata. The original data archive and transfer software for IceCube was written in 2005. After running in production for several years, the decision was taken to fully rewrite it, in order to address a number of structural drawbacks. The new data archive and transfer software (JADE2) has been in production for several months providing improved performance and resiliency. One of the main goals for JADE2 is to provide a unified system that handles the IceCube data end-to-end: from collection at the South Pole, all the way to long-term archive and preservation in dedicated repositories at the North. In this contribution, we describe our experiences and lessons learned from developing and operating the data archive and transfer software for a particle physics experiment in extreme operational conditions like IceCube.
The Archives of the Department of Terrestrial Magnetism: Documenting 100 Years of Carnegie Science
NASA Astrophysics Data System (ADS)
Hardy, S. J.
2005-12-01
The archives of the Department of Terrestrial Magnetism (DTM) of the Carnegie Institution of Washington document more than a century of geophysical and astronomical investigations. Primary source materials available for historical research include field and laboratory notebooks, equipment designs, plans for observatories and research vessels, scientists' correspondence, and thousands of expedition and instrument photographs. Yet despite its history, DTM long lacked a systematic approach to managing its documentary heritage. A preliminary records survey conducted in 2001 identified more than 1,000 linear feet of historically-valuable records languishing in dusty, poorly-accessible storerooms. Intellectual control at that time was minimal. With support from the National Historical Publications and Records Commission, the "Carnegie Legacy Project" was initiated in 2003 to preserve, organize, and facilitate access to DTM's archival records, as well as those of the Carnegie Institution's administrative headquarters and Geophysical Laboratory. Professional archivists were hired to process the 100-year backlog of records. Policies and procedures were established to ensure that all work conformed to national archival standards. Records were appraised, organized, and rehoused in acid-free containers, and finding aids were created for the project web site. Standardized descriptions of each collection were contributed to the WorldCat bibliographic database and the AIP International Catalog of Sources for History of Physics. Historic photographs and documents were digitized for online exhibitions to raise awareness of the archives among researchers and the general public. The success of the Legacy Project depended on collaboration between archivists, librarians, historians, data specialists, and scientists. This presentation will discuss key aspects (funding, staffing, preservation, access, outreach) of the Legacy Project and is aimed at personnel in observatories, research institutes, and other organizations interested in establishing their own archival programs.
DETECTION OF ELEMENTS AT ALL THREE r-PROCESS PEAKS IN THE METAL-POOR STAR HD 160617
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roederer, Ian U.; Lawler, James E., E-mail: iur@obs.carnegiescience.edu, E-mail: jelawler@wisc.edu
2012-05-01
We report the first detection of elements at all three r-process peaks in the metal-poor halo star HD 160617. These elements include arsenic and selenium, which have not been detected previously in halo stars, and the elements tellurium, osmium, iridium, and platinum, which have been detected previously. Absorption lines of these elements are found in archive observations made with the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope. We present up-to-date absolute atomic transition probabilities and complete line component patterns for these elements. Additional archival spectra of this star from several ground-based instruments allow us to derive abundances or upper limits of 45 elements in HD 160617, including 27 elements produced by neutron-capture reactions. The average abundances of the elements at the three r-process peaks are similar to the predicted solar system r-process residuals when scaled to the abundances in the rare earth element domain. This result for arsenic and selenium may be surprising in light of predictions that the production of the lightest r-process elements generally should be decoupled from the heavier r-process elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sisterson, DL
2008-09-30
Individual raw data streams from instrumentation at the Atmospheric Radiation Measurement (ARM) Program Climate Research Facility (ACRF) fixed and mobile sites are collected and sent to the Data Management Facility (DMF) at Pacific Northwest National Laboratory (PNNL) for processing in near real-time. Raw and processed data are then sent daily to the ACRF Archive, where they are made available to users. For each instrument, we calculate the ratio of the actual number of data records received daily at the Archive to the expected number of data records. The results are tabulated by (1) individual data stream, site, and month for the current year and (2) site and fiscal year (FY) dating back to 1998. The U.S. Department of Energy (DOE) requires national user facilities to report time-based operating data. The requirements concern the actual hours of operation (ACTUAL); the estimated maximum operation or uptime goal (OPSMAX), which accounts for planned downtime; and the VARIANCE [1 - (ACTUAL/OPSMAX)], which accounts for unplanned downtime. The OPSMAX time for the fourth quarter of FY 2008 for the Southern Great Plains (SGP) site is 2,097.60 hours (0.95 x 2,208 hours this quarter). The OPSMAX for the North Slope Alaska (NSA) locale is 1,987.20 hours (0.90 x 2,208), and for the Tropical Western Pacific (TWP) locale is 1,876.80 hours (0.85 x 2,208). The OPSMAX time for the ARM Mobile Facility (AMF) is not reported this quarter because the data have not yet been released from China to the DMF for processing. The differences in OPSMAX performance reflect the complexity of local logistics and the frequency of extreme weather events. It is impractical to measure OPSMAX for each instrument or data stream. Data availability reported here refers to the average of the individual, continuous data streams that have been received by the Archive. Data not at the Archive are caused by downtime (scheduled or unplanned) of the individual instruments. Therefore, data availability is directly related to individual instrument uptime. Thus, the average percentage of data in the Archive represents the average percentage of the time (24 hours per day, 92 days for this quarter) the instruments were operating this quarter.
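The reported quantities follow directly from the definitions above. A small worked example reproduces the SGP figures; the ACTUAL value is invented for illustration, since the report quotes only OPSMAX.

```python
# Worked example of the reported quantities. OPSMAX is the uptime goal as a
# fraction of the total hours in the quarter; VARIANCE measures unplanned
# downtime against it. The ACTUAL figure below is hypothetical.
hours_in_quarter = 24 * 92             # 2,208 hours this quarter
opsmax_sgp = 0.95 * hours_in_quarter   # 2,097.60 hours (matches the text)
actual = 2000.0                        # hypothetical hours of actual operation
variance = 1 - actual / opsmax_sgp     # VARIANCE = 1 - (ACTUAL / OPSMAX)
print(opsmax_sgp, round(variance, 3))
```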
Data Processing, Visualization and Distribution for Support of Science Programs in the Arctic Ocean
NASA Astrophysics Data System (ADS)
Johnson, P. D.; Edwards, M. H.; Wright, D.
2006-12-01
For the past two years the Hawaii Mapping Research Group (HMRG) and Oregon State University researchers have been building an on-line archive of geophysical data for the Arctic Basin, known as AAGRUUK - the Arctic Archive for Geophysical Research: Unlocking Undersea Knowledge (http://www.soest.hawaii.edu/hmrg/Aagruuk). It contains a wide variety of data, including bathymetry, sidescan and subbottom data collected by: 1) U.S. Navy nuclear-powered submarines during the Science Ice Exercises (SCICEX), 2) icebreakers such as the USCGC Healy, R/V Nathaniel B. Palmer, and CCGS Amundsen, and 3) historical depth soundings from the T3 ice camp and pre-1990 nuclear submarine missions. Instead of simply soliciting data, reformatting it, and serving it to the community, we have focused our efforts on producing and serving an integrated dataset. We pursued this path after experimenting with dataset integration and discovering a multitude of problems, including navigational inconsistencies and systemic offsets produced by acquiring data in an ice-covered ocean. Our goal in addressing these problems, integrating the processed datasets and producing a data compilation was to spare the many researchers interested in these datasets, some of whom have less experience processing geophysical data than HMRG personnel, from having to repeat the same data processing efforts. For investigators interested in pursuing their own data processing approaches, AAGRUUK also serves most of the raw data included in the data compilation, as well as processed versions of the individual datasets. The archive also provides downloadable static chart sets for users who want derived products for inclusion in reports, planning documents, etc. We are currently testing a prototype mapserver that allows maps of the cleaned datasets to be accessed interactively and provides access to the edited files that make up the datasets. Previously we documented, in a general way, the types of problems encountered. Over the past year we have integrated two terabytes of data, which allows us to comment on system performance in a much broader context. In this presentation we will show the types of errors for each data acquisition system and for different operating conditions (e.g., ice cover, time of year, etc.). Our error analysis both illuminates our approach to data processing and serves as a guide for, when possible, choosing the type of instruments and the optimal time to conduct these types of surveys in ice-covered oceans.
The Ecology of Collaborative Work. Workscape 21: The Ecology of New Ways of Working.
ERIC Educational Resources Information Center
Becker, Franklin; Quinn, Kristen L.; Tennessen, Carolyn M.
A study examined Chiat/Day inc. Advertising's team-based virtual office in which work could occur at any location inside or outside the office at any time. Three sites used three workplace strategies: full virtual (FV), modified virtual (MV), and conventional (C). Interviews, observations, and archival data were used to assess project teams doing…