Radiologic image communication and archive service: a secure, scalable, shared approach
NASA Astrophysics Data System (ADS)
Fellingham, Linda L.; Kohli, Jagdish C.
1995-11-01
The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities that support direct digital output, and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today, with full assurance that as the volume of image data he wishes to send to the archive increases, the capacity will be there to accept it. Providing this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service, providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.
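The ingest path described here is the standard DICOM store operation. As a minimal modern illustration of that flow (RICA itself predates these tools; the hostname, port, and AE titles below are placeholders, not values from the paper), a modality can push one image to an archive using the pynetdicom library:

```python
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

# Hypothetical archive endpoint; RICA's real addresses are not given in the abstract.
ARCHIVE_HOST, ARCHIVE_PORT = "archive.example.org", 11112

ae = AE(ae_title="MODALITY")
ae.add_requested_context(CTImageStorage)

assoc = ae.associate(ARCHIVE_HOST, ARCHIVE_PORT, ae_title="RICA")
if assoc.is_established:
    ds = dcmread("slice001.dcm")      # one acquired CT slice, assumed to exist locally
    status = assoc.send_c_store(ds)   # DICOM C-STORE of the image to the archive
    print("store status:", status.Status if status else "failed")
    assoc.release()
```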
Service-Based Extensions to an OAIS Archive for Science Data Management
NASA Astrophysics Data System (ADS)
Flathers, E.; Seamon, E.; Gessler, P. E.
2014-12-01
With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. In 2002, the Consultative Committee for Space Data Systems (CCSDS) released the first version of its Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.
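To make the SOA pattern concrete, the sketch below models one hypothetical extension, a fixity-checking service addressing the 2012 revision's focus on verifying authenticity, behind a uniform service contract. The interface and names are my own illustration, not the authors' design:

```python
import hashlib
from abc import ABC, abstractmethod

class ArchiveService(ABC):
    """Illustrative contract for one service-based extension to an OAIS-type archive."""
    @abstractmethod
    def handle(self, package: dict) -> dict: ...

class FixityService(ArchiveService):
    """Verifies the authenticity of an information package via a checksum."""
    def handle(self, package: dict) -> dict:
        digest = hashlib.sha256(package["payload"]).hexdigest()
        return {"service": "fixity-check", "ok": digest == package["expected_sha256"]}

# Toy usage with an invented package structure
payload = b"archival information package bytes"
pkg = {"payload": payload, "expected_sha256": hashlib.sha256(payload).hexdigest()}
print(FixityService().handle(pkg))   # {'service': 'fixity-check', 'ok': True}
```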
Expanding the PACS archive to support clinical review, research, and education missions
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Frost, Meryll M.; Drane, Walter E.
1999-07-01
Designing an image archive and retrieval system that supports multiple users with many different requirements and patterns of use, without compromising the performance and functionality required by diagnostic radiology, is an intellectual and technical challenge. A diagnostic archive, optimized for performance when retrieving diagnostic images for radiologists, needed to be expanded to support a growing clinical review network, the University of Florida Brain Institute's demands for neuro-imaging, Biomedical Engineering's imaging sciences, and an electronic teaching file. Each of the groups presented a different set of problems for the designers of the system. In addition, the radiologists did not want to see any loss of performance as new users were added.
Integration Of An MR Image Network Into A Clinical PACS
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Mankovich, Nicholas J.; Taira, Ricky K.; Cho, Paul S.; Huang, H. K.
1988-06-01
A direct link between a clinical pediatric PACS module and a FONAR MRI image network was implemented. The original MR network combines the MR scanner, a remote viewing station, and a central archiving station. The pediatric PACS directly connects to the archiving unit through an Ethernet TCP/IP network adhering to FONAR's protocol. The PACS communication software developed supports the transfer of patient studies and patient information directly from the MR archive database to the pediatric PACS. In the first phase of our project we developed a package to transfer data between a VAX-11/750 and the IBM PC/AT-based MR archive database through the Ethernet network. This system served as a model for PACS-to-modality network communication. Once testing was complete on this research network, the software and network hardware were moved to the clinical pediatric VAX for full PACS integration. In parallel to the direct transmission of digital images to the pediatric PACS, a broadband communication system in video format was developed for real-time broadcasting of images originating from the MR console to 8 remote viewing stations distributed in the radiology department. These analog viewing stations allow the radiologists to directly monitor patient positioning and to select the scan levels during a patient examination from remote locations in the radiology department. This paper reports (1) the technical details of this implementation, (2) the merits of this network development scheme, and (3) the performance statistics of the network-to-PACS interface.
Yucca Mountain licensing support network archive assistant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunlavy, Daniel M.; Bauer, Travis L.; Verzi, Stephen J.
2008-03-01
This report describes the Licensing Support Network (LSN) Assistant--a set of tools for categorizing e-mail messages and documents, and for investigating and correcting existing archives of categorized e-mail messages and documents. The two main tools in the LSN Assistant are the LSN Archive Assistant (LSNAA) tool, for recategorizing manually labeled e-mail messages and documents, and the LSN Realtime Assistant (LSNRA) tool, for categorizing new e-mail messages and documents. This report focuses on the LSNAA tool. There are two main components of the LSNAA tool. The first is the Sandia Categorization Framework, which is responsible for providing categorizations for documents in an archive and storing them in an appropriate Categorization Database. The second is the actual user interface, which primarily interacts with the Categorization Database, providing a way to find and correct categorization errors in the database. A procedure for applying the LSNAA tool and an example use case of the LSNAA tool applied to a set of e-mail messages are provided. Performance results of the categorization model designed for this example use case are presented.
Information Assurance Tasks Supporting the Processing of Electronic Records Archives
2007-03-01
This study evaluated the operation of necessary security features and compared network performance under OpenVPN (openvpn.net) with network performance under no VPN operation (non-VPN) in a gigabit network environment. The OpenVPN product was selected based on the previous findings of Khanvilkar.
Ocean Networks Canada's "Big Data" Initiative
NASA Astrophysics Data System (ADS)
Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.
2013-12-01
Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system, which provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: the study of the impact of climate change on the ocean; the exploration and understanding of the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and experience of a diverse user community, each requiring tailored data visualization and integrated products. By doing this we aim to design tools that maximize exploitation of the data.
Water level ingest, archive and processing system - an integral part of NOAA's tsunami database
NASA Astrophysics Data System (ADS)
McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing, and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation, and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive, and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing, and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.
PRECISION OF ATMOSPHERIC DRY DEPOSITION DATA FROM THE CLEAN AIR STATUS AND TRENDS NETWORK (CASTNET)
A collocated, dry deposition sampling program was begun in January 1987 by the US Environmental Protection Agency to provide ongoing estimates of the overall precision of dry deposition and supporting data entering the Clean Air Status and Trends Network (CASTNet) archives Duplic...
Using Network Analysis to Characterize Biogeographic Data in a Community Archive
NASA Astrophysics Data System (ADS)
Wellman, T. P.; Bristol, S.
2017-12-01
Informative measures are needed to evaluate and compare data from multiple providers in a community-driven data archive. This study explores insights from network theory and other descriptive and inferential statistics to examine data content and application across an assemblage of publicly available biogeographic data sets. The data are archived in ScienceBase, a collaborative catalog of scientific data supported by the U.S. Geological Survey to enhance scientific inquiry and acuity. In gaining understanding through this investigation and other scientific venues, our goal is to improve scientific insight and data use across a spectrum of scientific applications. Network analysis is a tool to reveal patterns of non-trivial topological features in data that exhibit neither complete regularity nor complete randomness. In this work, network analyses are used to explore shared events and dependencies between measures of data content and application derived from metadata and catalog information and measures relevant to biogeographic study. Descriptive statistical tools are used to explore relations between network analysis properties, while inferential statistics are used to evaluate the degree of confidence in these assessments. Network analyses have been used successfully in related fields to examine social awareness of scientific issues, taxonomic structures of biological organisms, and ecosystem resilience to environmental change. Use of network analysis also shows promising potential to identify relationships in biogeographic data that inform programmatic goals and scientific interests.
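As a toy illustration of this kind of analysis (the records and measures below are invented, not drawn from ScienceBase), datasets and their metadata keywords can be cast as a graph whose shared-keyword structure is then summarized with standard network statistics, e.g. using networkx:

```python
import networkx as nx

# Hypothetical (dataset_id, keyword) pairs extracted from catalog metadata
records = [
    ("ds1", "birds"), ("ds1", "alaska"),
    ("ds2", "birds"), ("ds2", "wetlands"),
    ("ds3", "alaska"), ("ds3", "wetlands"),
]

G = nx.Graph()
for dataset, keyword in records:
    G.add_node(dataset, kind="dataset")   # one node type per dataset
    G.add_node(keyword, kind="keyword")   # one node type per keyword
    G.add_edge(dataset, keyword)          # dataset uses this keyword

# Simple descriptive measures of shared content across providers
print(nx.degree_centrality(G))  # which datasets/keywords are most connected
print(nx.density(G))            # overall connectedness of the catalog sample
```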
Analysis, Design and Implementation of a Networking Proof-of-Concept Prototype to Support Maritime Visit, Board, Search and Seizure Teams
Stewart, Van E.
2014-03-01
NASA Astrophysics Data System (ADS)
Oswald, Helmut; Mueller-Jones, Kay; Builtjes, Jan; Fleck, Eckart
1998-07-01
Developments in information technologies -- computer hardware, networking, and storage media -- have led to expectations that these advances make it possible to replace 35 mm film completely with digital techniques in the catheter laboratory. Besides its role as an archival medium, cine film is used as the major image review and exchange medium in cardiology. None of today's technologies can completely fulfill the requirements to replace cine film. One of the major drawbacks of cine film is single access in time and location. For the four catheter laboratories in our institutions we have designed a complementary concept combining the CD-R, also called CD-Medical, as a single-patient storage and exchange medium, and a digital archive for on-line access and image review of selected frames or short sequences on adequate medical workstations. The image data from various modalities, as well as all digital documents relating to a patient, are part of an electronic patient record. The access, processing, and display of documents are supported by an integrated medical application.
Tools to manage the enterprise-wide picture archiving and communications system environment.
Lannum, L M; Gumpf, S; Piraino, D
2001-06-01
The presentation will focus on the implementation and utilization of a central picture archiving and communications system (PACS) network-monitoring tool that allows for enterprise-wide operations management and support of the image distribution network. The MagicWatch (Siemens, Iselin, NJ) PACS/radiology information system (RIS) monitoring station has allowed our organization to create a service support structure that gives us proactive control of our environment and allows us to meet the service-level performance expectations of users. The Radiology Help Desk has used the MagicWatch PACS monitoring station as an applications support tool that allows the group to monitor network activity and individual system performance at each node. Fast and timely recognition of the effects of single events within the PACS/RIS environment has allowed the group to proactively recognize possible performance issues and resolve problems. The PACS/operations group performs network management control, image storage management, and software distribution management from a single, central point in the enterprise. The MagicWatch station allows for complete automation of the software distribution, installation, and configuration process across all the nodes in the system. The tool has allowed for the standardization of the workstations and provides central configuration control for the establishment and maintenance of system standards. This report will describe PACS management and operation prior to the implementation of the MagicWatch PACS monitoring station and will highlight the operational benefits of a centralized network- and system-monitoring tool.
Enterprise-class Digital Imaging and Communications in Medicine (DICOM) image infrastructure.
York, G; Wortmann, J; Atanasiu, R
2001-06-01
Most current picture archiving and communication systems (PACS) are designed for a single department or a single modality. Few PACS installations have been deployed that support the needs of the hospital or the entire Integrated Delivery Network (IDN). The authors propose a new image management architecture that can support a large, distributed enterprise.
System Security Authorization Agreement (SSAA) for the WIRE Archive and Research Facility
NASA Technical Reports Server (NTRS)
2002-01-01
The Wide-Field Infrared Explorer (WIRE) Archive and Research Facility (WARF) is operated and maintained by the Department of Physics, USAF Academy. The lab is located in Fairchild Hall, 2354 Fairchild Dr., Suite 2A103, USAF Academy, CO 80840. The WARF will be used for research and education in support of the NASA Wide-Field Infrared Explorer (WIRE) satellite, and for related high-precision photometry missions and activities. The WARF will also contain the WIRE preliminary and final archives prior to their delivery to the National Space Science Data Center (NSSDC). The WARF consists of a suite of equipment purchased under several NASA grants in support of WIRE research. The core system consists of a Red Hat Linux workstation with twin 933 MHz PIII processors, 1 GB of RAM, 133 GB of hard disk space, and DAT and DLT tape drives. The WARF is also supported by several additional networked Linux workstations. Only one of these (an older 450 MHz PIII computer running Red Hat Linux) is currently running, but the addition of several more is expected over the next year. In addition, a printer will soon be added. The WARF will serve as the primary research facility for the analysis and archiving of data from the WIRE satellite, together with limited quantities of other high-precision astronomical photometry data from both ground- and space-based facilities. However, the archive to be created here will not be the final archive; rather, the archive will be duplicated at the NSSDC and public access to the data will generally take place through that site.
Buckets: Aggregative, Intelligent Agents for Publishing
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt; Shen, Stewart N. T.; Zubair, Mohammad
1998-01-01
Buckets are an aggregative, intelligent construct for publishing in digital libraries. The goal of research projects is to produce information. This information is often instantiated in several forms, differentiated by semantic types (report, software, video, datasets, etc.). A given semantic type can be further differentiated by syntactic representations as well (PostScript version, PDF version, Word version, etc.). Although the information was created together and subtle relationships can exist between them, different semantic instantiations are generally segregated along currently obsolete media boundaries. Reports are placed in report archives, software might go into a software archive, but most of the data and supporting materials are likely to be kept in informal personal archives or discarded altogether. Buckets provide an archive-independent container construct in which all related semantic and syntactic data types and objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services.
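As a minimal sketch of the bucket construct (the class and method names below are invented for illustration, not taken from the buckets implementation), a single container groups each semantic type together with its syntactic representations, independent of any particular archive:

```python
class Bucket:
    """Sketch: an archive-independent container that logically groups all
    semantic and syntactic forms of one research product."""

    def __init__(self, bucket_id):
        self.bucket_id = bucket_id
        # semantic type -> {syntactic format: payload location}
        self.elements = {}

    def add(self, semantic_type, fmt, uri):
        self.elements.setdefault(semantic_type, {})[fmt] = uri

    def list_contents(self):
        return {sem: sorted(fmts) for sem, fmts in self.elements.items()}

# Hypothetical usage: one project's report, software, and data in one object
b = Bucket("report-1998-27")
b.add("report", "pdf", "file:///archive/report.pdf")
b.add("report", "ps", "file:///archive/report.ps")
b.add("software", "tar.gz", "file:///archive/code.tar.gz")
print(b.list_contents())
# {'report': ['pdf', 'ps'], 'software': ['tar.gz']}
```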
ERIC Educational Resources Information Center
Cachola, Ellen-Rae Cabebe
2014-01-01
This dissertation describes the International Women's Network Against Militarism's (IWNAM) political epistemology of security from an archival perspective, and how they create community archives to evidence this epistemology. This research examines records created by Women for Genuine Security (WGS) and Women's Voices Women Speak (WVWS), U.S. and…
40 CFR 58.16 - Data submittal and archiving requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
AMBIENT AIR QUALITY SURVEILLANCE, Monitoring Network, § 58.16 Data submittal and archiving requirements: ... via AQS, all ambient air quality data and associated quality assurance data for SO2; CO; O3; NO2; NO... Other Federal agencies may request access to filters for purposes of supporting air quality management...
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1992-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information. In the search for extraterrestrial intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. Topics covered include tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; network upgrade and sustaining; network operations and operations support; and TDA program management and analysis.
The Comet Halley archive: Summary volume
NASA Technical Reports Server (NTRS)
Sekanina, Zdenek (Editor); Fry, Lori (Editor)
1991-01-01
The contents are as follows: The Organizational History of the International Halley Watch; Operations of the International Halley Watch from a Lead Center Perspective; The Steering Group; Astrometry Network; Infrared Studies Network; Large-Scale Phenomena Network; Meteor Studies Network; Near-Nucleus Studies Network; Photometry and Polarimetry Network; Radio Science Network; Spectroscopy and Spectrophotometry Network; Amateur Observation Network; Use of the CD-ROM Archive; The 1986 Passage of Comet Halley; and Recent Observations of Comet Halley.
Exploring Digisonde Ionogram Data with SAO-X and DIDBase
NASA Astrophysics Data System (ADS)
Khmyrov, Grigori M.; Galkin, Ivan A.; Kozlov, Alexander V.; Reinisch, Bodo W.; McElroy, Jonathan; Dozois, Claude
2008-02-01
A comprehensive suite of software tools for ionogram data analysis and archiving has been developed at UMLCAR to support the exploration of raw and processed data from the worldwide network of digisondes in a low-latency, user-friendly environment. Paired with the remotely accessible Digital Ionogram Data Base (DIDBase), the SAO Explorer software serves as an example of how an academic institution conscientiously manages its resident data archive while local experts continue to work on design of new and improved data products, all in the name of free public access to the full roster of acquired ionospheric sounding data.
COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.
Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas
2014-12-14
With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data, or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
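The container layout described above lends itself to a short sketch. Assuming a local SBML file model.xml, and following the published OMEX convention of a ZIP container with a root manifest.xml whose format URIs use identifiers.org (details here are illustrative, not a validated archive), Python's zipfile module is enough to assemble one:

```python
import zipfile

# Manifest listing the archive itself, the manifest, and the model file
MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./manifest.xml"
           format="http://identifiers.org/combine.specifications/omex-manifest"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
</omexManifest>
"""

# A COMBINE Archive is a ZIP container with the manifest at its root.
with zipfile.ZipFile("experiment.omex", "w") as omex:
    omex.writestr("manifest.xml", MANIFEST)
    omex.write("model.xml")  # the SBML model; assumed to exist locally
```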
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; Ramapriyan, H. K. "Rama"
2016-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since August 1994, serving a diverse user community around the world with Earth science data from satellites, aircraft, field campaigns, and research investigations. The ESDIS Project, responsible for EOSDIS, is a Network Member of the International Council for Science (ICSU) World Data System (WDS). Nine of the 12 Distributed Active Archive Centers (DAACs), which are part of EOSDIS, are Regular Members of the ICSU WDS. This poster presents the EOSDIS mission objectives, key characteristics of the DAACs that make them world-class Earth science data centers, and the successes, challenges, and best practices of EOSDIS focusing on the years 2014-2016, and illustrates some highlights of EOSDIS accomplishments. The highlights include: high customer satisfaction, growing archive and distribution volumes, exponential growth in the number of products distributed to users around the world, a unified metadata model and common metadata repository, flexibility provided to users by supporting data transformations to suit their applications, near-real-time capabilities to support various operational and research applications, and full-resolution image browse capabilities to help users select data of interest. The poster also illustrates how the ESDIS Project is actively involved in several US and international data system organizations.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1990-01-01
Archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL) Office of Telecommunications and Data Acquisition (TDA) are given. Space communications, radio navigation, radio science, and ground-based radio and radar astronomy, activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations are reported. Also included is TDA-funded activity at JPL on data and information systems and reimbursable Deep Space Network (DSN) work performed for other space agencies through NASA.
77 FR 32141 - Privacy Act of 1974, as Amended; System of Records Notices
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-31
... records titled "Internal Collaboration Network". SUMMARY: The National Archives and Records Administration... 43, the Internal Collaboration Network, which contains files with information on National Archives... SUPPLEMENTARY INFORMATION: The Internal Collaboration Network is a web-based platform that allows users to...
Workshop: Western hemisphere network of bird banding programs
Celis-Murillo, A.
2007-01-01
Purpose: To promote collaboration among banding programs in the Americas. Introduction: Bird banding and marking provide indispensable tools for ornithological research, management, and conservation of migratory birds on migratory routes, breeding and non-breeding grounds. Many countries and organizations in Latin America and the Caribbean are in the process of developing or have expressed interest in developing national banding schemes and databases to support their research and management programs. Coordination of developing and existing banding programs is essential for effective data management, reporting, archiving, and security, and most importantly, for gaining a fuller understanding of migratory bird conservation issues and how the banding data can help. Currently, there is a well established bird-banding program in the U.S.A. and Canada, and programs in other countries are being developed as well. Ornithologists in many Latin American countries and the Caribbean are interested in using banding and marking in their research programs. Many in the ornithological community are interested in establishing banding schemes, and some countries have recently initiated independent banding programs. With the number of long-term collaborative and international initiatives increasing, the time is ripe to discuss and explore opportunities for international collaboration, coordination, and administration of bird banding programs in the Western Hemisphere. We propose the second "Western Hemisphere Network of Bird Banding Programs" workshop, in association with the SCSCB, to be an essential step in the effort to strengthen international partnerships and support migratory bird conservation in the Americas and beyond. This will be the second multi-national meeting to promote collaboration among banding programs in the Americas (the first meeting was held October 8-9, 2006 in La Mancha, Veracruz, Mexico). The second "Western Hemisphere Network of Bird Banding Programs" workshop will continue addressing issues surrounding the coordination of an "Americas" approach to bird banding and will review in detail the advances made at the first workshop, such as coordination of bands and markers, coordination in recovery reporting, permit issues, data management, data sharing and archiving, data security, training, etc. Workshop Goals: Build on accomplishments of the network's first workshop (Oct 8-9, 2006). Identify and explore new opportunities for data sharing, data archiving, data access, training, etc. Initiate strategies to support international collaboration and coordination amongst bird banding programs in the Western Hemisphere. Workshop structure: One-day workshop of guided discussions. Participants: Representatives of government agencies, program managers, and NGOs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoopman, J. D.
This report documents Livermore Computing (LC) activities in support of ASC L2 milestone 5589: Modernization and Expansion of LLNL Archive Disk Cache, due March 31, 2016. The full text of the milestone is included in Attachment 1. The description of the milestone is: Description: Configuration of archival disk cache systems will be modernized to reduce fragmentation, and new, higher capacity disk subsystems will be deployed. This will enhance archival disk cache capability for ASC archive users, enabling files written to the archives to remain resident on disk for many (6-12) months, regardless of file size. The milestone was completed in three phases. On August 26, 2015, subsystems with 6 PB of disk cache were deployed for production use in LLNL's unclassified HPSS environment. Following that, on September 23, 2015, subsystems with 9 PB of disk cache were deployed for production use in LLNL's classified HPSS environment. On January 31, 2016, the milestone was fully satisfied when the legacy Data Direct Networks (DDN) archive disk cache subsystems were fully retired from production use in both LLNL's unclassified and classified HPSS environments, and only the newly deployed systems were in use.
Rich Support for Heterogeneous Polar Data in RAMADDA
NASA Astrophysics Data System (ADS)
McWhirter, J.; Crosby, C. J.; Griffith, P. C.; Khalsa, S.; Lazzara, M. A.; Weber, W. J.
2013-12-01
Difficult-to-navigate environments, tenuous logistics, strange forms, deeply rooted cultures - these are all experiences shared by Polar scientists in the field as well as by the developers of the underlying data management systems back in the office. Among the key data management challenges that Polar investigations present are the heterogeneity and complexity of the data that are generated. Polar regions are intensely studied across many science domains through a variety of techniques - satellite and aircraft remote sensing, in-situ observation networks, modeling, sociological investigations, and extensive PI-driven field project data collection. While many data management efforts focus on large homogeneous collections of data targeting specific science domains (e.g., satellite, GPS, modeling), multi-disciplinary efforts that focus on Polar data need to be able to address a wide range of data formats, science domains, and user communities. There is growing use of the RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data) system to manage and provide services for Polar data. RAMADDA is a freely available, extensible data repository framework that supports a wide range of data types and services to allow the creation, management, discovery, and use of data and metadata. The broad range of capabilities provided by RAMADDA and its extensibility make it well suited as an archive solution for Polar data. RAMADDA can run in a number of diverse contexts - as a centralized archive, at local institutions, and even on an investigator's laptop in the field, providing in-situ metadata and data management services. We are actively developing archives and support for a number of Polar initiatives: - NASA Arctic Boreal Vulnerability Experiment (ABoVE): ABoVE is a long-term, multi-instrument field campaign that will make use of a wide range of data. We have developed an extensive ontology of program, project, and site metadata in RAMADDA, in support of the ABoVE Science Definition Team and Project Office. See: http://above.nasa.gov - UNAVCO Terrestrial Laser Scanning (TLS): UNAVCO's Polar program provides support for terrestrial laser scanning field projects. We are using RAMADDA to archive these field projects, with over 40 projects ingested to date. - NASA IceBridge: As part of the NASA LiDAR Access System (NLAS) project, RAMADDA supports numerous airborne and satellite LiDAR data sets - GLAS, LVIS, ATM, Paris, McORDS, etc. - Antarctic Meteorological Research Center (AMRC): Satellite and surface observation network - Support for numerous other data from AON-ACADIS, Greenland GC-Net, NOAA-GMD, AmeriFlux, etc. In this talk we will discuss some of the challenges that Polar data brings to geoinformatics and describe the approaches we have taken to address these challenges in RAMADDA.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1983-01-01
Archival reports on developments in programs managed by JPL's office of Telecommunications and Data Acquisition (TDA) are presented. In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations.
Optical Fiber Transmission In A Picture Archiving And Communication System For Medical Applications
NASA Astrophysics Data System (ADS)
Aaron, Gilles; Bonnard, Rene
1984-03-01
In a hospital, the need for an electronic communication network is increasing along with the digitization of pictures. This local area network is intended to link picture sources such as digital radiography, computed tomography, nuclear magnetic resonance, ultrasound, etc., with an archiving system. Interactive displays can be used in examination rooms, physicians' offices, and clinics. In such a system, three major requirements must be considered: bit rate, cable length, and number of devices. - The bit rate is very important because a maximum response time of a few seconds must be guaranteed for pictures of several megabits. - The distance between nodes may be a few kilometers in some large hospitals. - The number of devices connected to the network is never greater than a few tens, because picture sources and computers represent substantial hardware, and simple displays can be concentrated. All these conditions are fulfilled by optical fiber transmission. Depending on the topology and the access protocol, two solutions are to be considered: - Active ring - Active or passive star. Finally, Thomson-CSF developments of optical transmission devices for large TV-distribution networks bring us technological support and mass production that will cut down hardware costs.
Ionospheric characteristics for archiving at the World Data Centers. Technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamache, R.R.; Reinisch, B.W.
1990-12-01
A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop on Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by World Data Center A for solar-terrestrial physics for archiving characteristics. Here the database has been slightly refined for the application, and programs have been written to generate these database files using as input Digisonde 256 ARTIST data post-processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program as well as supplemental programs developed for this task are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high-latitude Digisonde network, the AWS DISS and international Digisonde networks, and other ionospheric sounding networks.
An Extensible Sensing and Control Platform for Building Energy Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rowe, Anthony; Berges, Mario; Martin, Christopher
2016-04-03
The goal of this project is to develop Mortar.io, an open-source building automation system (BAS) platform designed to simplify data collection, archiving, event scheduling, and coordination of cross-system interactions. Mortar.io is optimized for (1) robustness to network outages, (2) ease of installation using plug-and-play, and (3) scalable support for small to large buildings and campuses.
Development of multi-mission satellite data systems at the German Remote Sensing Data Centre
NASA Astrophysics Data System (ADS)
Lotz-Iwen, H. J.; Markwitz, W.; Schreier, G.
1998-11-01
This paper focuses on conceptual aspects of access to multi-mission remote sensing data through online catalogue and information systems. The ISIS system of the German Remote Sensing Data Centre is described as an example of a user interface to earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online access to the database via public networks. It provides catalogue retrieval, visualisation, and transfer of image data, and is integrated into international activities dedicated to catalogue and archive interoperability. Finally, an outlook is given on international projects dealing with access to remote sensing data in distributed archives.
Community archiving of imaging studies
NASA Astrophysics Data System (ADS)
Fritz, Steven L.; Roys, Steven R.; Munjal, Sunita
1996-05-01
The quantity of image data created in a large radiology practice has long been a challenge for available archiving technology. Traditional methods of archiving the large quantity of films generated in radiology have relied on warehousing in remote sites, with courier delivery of film files for historical comparisons. A digital community archive, accessible via a wide area network, represents a feasible solution to the problem of archiving digital images from a busy practice. In addition, it affords a physician caring for a patient access to imaging studies performed at a variety of healthcare institutions without the need to repeat studies. Security problems include both network security issues in the WAN environment and access control for patient, physician, and imaging center. The key obstacle to developing a community archive is currently political. Reluctance to participate in a community archive can be reduced by appropriate design of the access mechanisms.
Investigation of Magnetic Field Measurements.
1983-02-28
Keywords: AFGL Magnetometer Network; fluxgate magnetometer; induction coil magnetometer; temperature dependency of fluxgate; automation of testing. Abstract (fragment): ...data collection platforms. Support was also provided to AFGL to process the fluxgate magnetometer archive tapes in order to make the data available to...
Digital echocardiography 2002: now is the time
NASA Technical Reports Server (NTRS)
Thomas, James D.; Greenberg, Neil L.; Garcia, Mario J.
2002-01-01
The ability to acquire echocardiographic images digitally, store and transfer these data using the DICOM standard, and routinely analyze examinations exists today and allows the implementation of a digital echocardiography laboratory. The purpose of this review article is to outline the critical components of a digital echocardiography laboratory, discuss general strategies for implementation, and put forth some of the pitfalls that we have encountered in our own implementation. The major components of the digital laboratory include (1) digital echocardiography machines with network output, (2) a switched high-speed network, (3) a high throughput server with abundant local storage, (4) a reliable low-cost archive, (5) software to manage information, and (6) support mechanisms for software and hardware. Implementation strategies can vary from a complete vendor solution providing all components (hardware, software, support), to a strategy similar to our own where standard computer and networking hardware are used with specialized software for management of image and measurement information.
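As an illustration of the query side of such a digital laboratory (the archive endpoint, AE title, and patient ID below are hypothetical placeholders), a reading workstation can search the archive's study list with a standard DICOM C-FIND request, e.g. via the pynetdicom library:

```python
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

ae = AE(ae_title="ECHO_WS")
ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)

query = Dataset()
query.QueryRetrieveLevel = "STUDY"
query.PatientID = "12345"   # hypothetical patient identifier
query.StudyDate = ""        # empty value asks the archive to return this field

assoc = ae.associate("pacs.example.org", 104)  # placeholder archive endpoint
if assoc.is_established:
    for status, identifier in assoc.send_c_find(
            query, StudyRootQueryRetrieveInformationModelFind):
        # 0xFF00/0xFF01 are the DICOM "pending" statuses carrying one match each
        if status and status.Status in (0xFF00, 0xFF01) and identifier:
            print(identifier.PatientID, identifier.StudyDate)
    assoc.release()
```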
ERIC Educational Resources Information Center
McCarthy, Gavan; Evans, Joanne
2007-01-01
This article examines the evolution of a national register of the archives of science and technology in Australia and the related development of an archival informatics focused initially on people and their relationships to archival materials. The register was created in 1985 as an in-house tool for the Australian Science Archives Project of the…
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The papers included in this document cover satellite tracking and ground-based navigation, spacecraft-ground communications, and optical communication systems for the Deep Space Network.
NASA Technical Reports Server (NTRS)
Edberg, Stephen J. (Editor)
1996-01-01
The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Comet Halley (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).
A Framework for Integration of Heterogeneous Medical Imaging Networks
Viana-Ferreira, Carlos; Ribeiro, Luís S; Costa, Carlos
2014-01-01
Medical imaging is increasing its importance in matters of medical diagnosis and treatment support. Much is due to computers, which have revolutionized medical imaging not only in the acquisition process but also in the way images are visualized, stored, exchanged, and managed. Picture Archiving and Communication Systems (PACS) are an example of how medical imaging takes advantage of computers. To solve problems of interoperability between PACS and medical imaging equipment, the Digital Imaging and Communications in Medicine (DICOM) standard was defined and widely implemented in current solutions. More recently, the need to exchange medical data between distinct institutions resulted in the Integrating the Healthcare Enterprise (IHE) initiative, which contains a content profile especially conceived for medical imaging exchange: Cross-Enterprise Document Sharing for Imaging (XDS-I). Moreover, due to application requirements, many solutions developed private networks to support their services. For instance, some applications support enhanced query and retrieve over DICOM object metadata. This paper proposes an integration framework for medical imaging networks that provides protocol interoperability and data federation services. It is an extensible plugin system that supports standard approaches (DICOM and XDS-I) but is also capable of supporting private protocols. The framework is being used in the Dicoogle Open Source PACS. PMID: 25279021
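To make the plugin idea concrete, here is a schematic sketch of a protocol-plugin registry with a federation layer that fans a query out to every registered protocol. This illustrates the general pattern only; it is not the actual Dicoogle SDK, and all names are invented:

```python
class ProtocolPlugin:
    """Illustrative plugin contract: one class per supported protocol."""
    scheme = None
    def query(self, request):
        raise NotImplementedError

class DicomPlugin(ProtocolPlugin):
    scheme = "dicom"
    def query(self, request):
        return [f"dicom result for {request}"]   # stand-in for a real C-FIND

class XdsiPlugin(ProtocolPlugin):
    scheme = "xds-i"
    def query(self, request):
        return [f"xds-i result for {request}"]   # stand-in for an XDS-I query

class Federator:
    """Data-federation layer: dispatch one query to every registered plugin."""
    def __init__(self):
        self.plugins = []
    def register(self, plugin):
        self.plugins.append(plugin)
    def query_all(self, request):
        return {p.scheme: p.query(request) for p in self.plugins}

fed = Federator()
fed.register(DicomPlugin())
fed.register(XdsiPlugin())
print(fed.query_all("PatientID=12345"))
```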
Preserving the Pyramid of STI Using Buckets
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt
2004-01-01
The product of research projects is information. Through the life cycle of a project, information comes from many sources and takes many forms. Traditionally, this body of information is summarized in a formal publication, typically a journal article. While formal publications enjoy the benefits of peer review and technical editing, they are also often compromises in media format and length. As such, we consider a formal publication to represent an abstract to a larger body of work: a pyramid of scientific and technical information (STI). While this abstract may be sufficient for some applications, an in-depth use or analysis is likely to require the supporting layers from the pyramid. We have developed buckets to preserve this pyramid of STI. Buckets provide an archive- and protocol-independent container construct in which all related information objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services. Buckets are an implementation of the Smart Object, Dumb Archive (SODA) DL model. In SODA, data objects are more important than the archives that hold them. Much of the functionality traditionally associated with archives is pushed down into the objects, such as enforcing terms and conditions, negotiating display, and content maintenance. In this paper, we discuss the motivation, design, and implication of bucket use in DLs with respect to grey literature.
NASA Technical Reports Server (NTRS)
Touch, Joseph D.
1994-01-01
Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise refinement of the intelligent database management (IDM) architecture of the distributed active archive center (DAAC, one of seven regionally located EOSDIS archive sites), to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that the network design can accommodate a flexible data-ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.
Closet to Cloud: The online archiving of tape-based continuous NCSN seismic data from 1993-2005
NASA Astrophysics Data System (ADS)
Neuhauser, D. S.; Aranha, M. A.; Kohler, W. M.; Oppenheimer, D.
2016-12-01
As earthquake monitoring systems in the 1980s moved from analog to digital recording, most seismic networks archived digital waveforms only from detected events, due to the lack of affordable online digital storage for continuous high-rate (100 sps) data. The Northern California Earthquake Data Center (NCEDC), established in 1991 by UC Berkeley and the USGS Menlo Park, archived 20 sps continuous data and triggered high-rate data from the sparse Berkeley seismic network, but could not afford the online storage for continuous high-rate data from the 300+ stations of the USGS Northern California Seismic Network (NCSN). The discovery of non-volcanic tremor and the use of continuous waveform correlation techniques for detecting repeating earthquakes, combined with the increase in disk capacity and the significant reduction in disk costs, led the NCEDC to begin archiving continuous high-rate waveforms in 2004-2005. The USGS Menlo Park had backup tapes of continuous high-rate NCSN waveform data since 1993 on the shelf, and the USGS and NCEDC embarked on a project to restore and archive all continuous NCSN data from 1993 through 2005. We will discuss the procedures and problems encountered when reading and transcribing the tapes, converting data formats, assigning SEED channel names, and archiving the 1993-2005 continuous NCSN waveforms. We will also illustrate new science enabled by these data. These and other northern California seismic and geophysical data are available via web services at http://service.ncedc.org
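The web services mentioned above follow FDSN conventions, so restored waveforms can be fetched with a standard client. A minimal sketch using ObsPy is below; the station and channel codes are illustrative placeholders, not a tested query:

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# NCEDC operates FDSN web services (see http://service.ncedc.org).
client = Client("NCEDC")

# Ten minutes of continuous data from the restored era; codes are placeholders.
t0 = UTCDateTime("1995-01-01T00:00:00")
st = client.get_waveforms(network="NC", station="CMB", location="*",
                          channel="EHZ", starttime=t0, endtime=t0 + 600)
print(st)  # an ObsPy Stream of the requested traces
```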
Mass-storage management for distributed image/video archives
NASA Astrophysics Data System (ADS)
Franchi, Santina; Guarda, Roberto; Prampolini, Franco
1993-04-01
The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue was addressed in the project for the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog the image/video coding technique with its related parameters, and the description of image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they catalog devices and modify device status and device network location. The medium level manages image/video files on a physical basis, handling file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move, and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
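The medium-level migration function described above reduces to a simple policy: demote files whose last access time falls outside a retention window. A minimal sketch of that policy (the tier paths and retention window are invented for illustration):

```python
import os
import shutil
import time

FAST_TIER = "/cache/fast"      # low-access-time media (illustrative path)
SLOW_TIER = "/cache/archive"   # high-capacity media (illustrative path)
MAX_AGE_DAYS = 180             # assumed retention window on the fast tier

def migrate_cold_files(max_age_days=MAX_AGE_DAYS):
    """Move files not accessed within the window from fast to slow media."""
    cutoff = time.time() - max_age_days * 86400
    for name in os.listdir(FAST_TIER):
        path = os.path.join(FAST_TIER, name)
        # Demote only plain files whose last access predates the cutoff
        if os.path.isfile(path) and os.stat(path).st_atime < cutoff:
            shutil.move(path, os.path.join(SLOW_TIER, name))

migrate_cold_files()
```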
NASA Technical Reports Server (NTRS)
Saeed, M.; Lieu, C.; Raber, G.; Mark, R. G.
2002-01-01
Development and evaluation of Intensive Care Unit (ICU) decision-support systems would be greatly facilitated by the availability of a large-scale ICU patient database. Following our previous efforts with the MIMIC (Multi-parameter Intelligent Monitoring for Intensive Care) Database, we have leveraged advances in networking and storage technologies to develop a far more massive temporal database, MIMIC II. MIMIC II is an ongoing effort: data are continuously and prospectively archived from all ICU patients in our hospital. MIMIC II now consists of over 800 ICU patient records, including over 120 gigabytes of data, and is growing. A customized archiving system was used to continuously store up to four waveforms and 30 different parameters from ICU patient monitors. An integrated, user-friendly relational database was developed for browsing patients' clinical information (lab results, fluid balance, medications, nurses' progress notes). Based upon its unprecedented size and scope, MIMIC II will prove to be an important resource for intelligent patient monitoring research, and will support efforts in medical data mining and knowledge discovery.
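A relational layout like the one described invites straightforward temporal queries. The sketch below uses an invented miniature schema (the real MIMIC II schema differs) purely to show the browsing pattern:

```python
import sqlite3

# Illustrative schema and data only; not the actual MIMIC II tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE labevents (
    patient_id INTEGER,
    charttime  TEXT,
    test       TEXT,
    value      REAL
);
INSERT INTO labevents VALUES (42, '2002-01-01 08:00', 'lactate', 3.1);
INSERT INTO labevents VALUES (42, '2002-01-01 14:00', 'lactate', 2.4);
""")

# Time-ordered trend of one lab test for one (hypothetical) patient
rows = conn.execute(
    "SELECT charttime, value FROM labevents "
    "WHERE patient_id = ? AND test = 'lactate' ORDER BY charttime",
    (42,),
).fetchall()
print(rows)  # [('2002-01-01 08:00', 3.1), ('2002-01-01 14:00', 2.4)]
```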
NASA Technical Reports Server (NTRS)
Edberg, Stephen J. (Editor)
1996-01-01
The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).
The challenge of a data storage hierarchy
NASA Technical Reports Server (NTRS)
Ruderman, Michael
1992-01-01
A discussion of Mesa Archival Systems' data archiving system is presented. This data archiving system is strictly a software system: it is implemented on a mainframe and manages the migration of data into permanent file storage. Emphasis is placed on the fact that any kind of client system on the network can be connected through the Unix interface of the data archiving system.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Bari
SoundVision held a post-workshop teleconference for our 2011 graduates (as we have done for all participants) to consolidate what they'd learned during the workshop. To maximize the Science Literacy Project's impact after it ends, we strengthened and reinforced our alumni's vibrant networking infrastructure so they can continue to connect and support each other, and updated our archive system to ensure all of our science and science journalism resources and presentations will be easy to access and use over time.
Archive interoperability in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise
2003-02-01
The main goals of Virtual Observatory projects are to build interoperability between astronomical on-line services, observatory archives, databases and results published in journals, and to develop tools permitting the best scientific use of the very large data sets stored in observatory archives and produced by large surveys. The different Virtual Observatory projects collaborate to define common exchange standards, which are the key to a truly International Virtual Observatory: for instance, their first common milestone has been a standard allowing the exchange of tabular data, called VOTable. The Interoperability Work Area of the European Astrophysical Virtual Observatory project aims at networking European archives, by building a prototype using the CDS VizieR and Aladin tools, and at defining basic rules to help archive providers with interoperability implementation. The prototype is accessible for scientific usage, to get user feedback (and science results!) at an early stage of the project. The ISO archive participates very actively in this endeavour, and more generally in information networking. The on-going inclusion of the ISO log in SIMBAD will allow higher-level links for users.
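As a small illustration of the VOTable standard in practice (a sketch; "catalog.xml" is a placeholder file name), astropy's VOTable parser reads such tabular exchange files:

```python
# Minimal sketch of consuming a VOTable document with astropy.
# 'catalog.xml' is an assumed local file, not a specific archive product.
from astropy.io.votable import parse_single_table

table = parse_single_table("catalog.xml")   # parse the single TABLE element
data = table.to_table()                     # convert to an astropy Table
print(data.colnames)                        # column names from the VOTable
print(data[:5])                             # first few rows
```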
Moving towards persistent identification in the seismological community
NASA Astrophysics Data System (ADS)
Quinteros, Javier; Evans, Peter; Strollo, Angelo; Ulbricht, Damian; Elger, Kirsten; Bertelmann, Roland
2016-04-01
The GEOFON data centre and others in the seismological community have been archiving seismic waveforms for many years. The amount of seismic data available continuously increases due to the use of higher sampling rates and the growing number of stations. In recent years, there has been a trend towards standardization of the protocols and formats to improve and homogenise access to these data [FDSN, 2013]. The seismological community has begun assigning a particular persistent identifier (PID), the Digital Object Identifier (DOI), to seismic networks as a first step towards properly and consistently attributing the use of data from seismic networks in scientific articles [Evans et al., 2015]. This was codified in a recommendation by the International Federation of Digital Seismograph Networks [FDSN, 2014]; DOIs for networks now appear in community web pages. However, our community, in common with other fields of science, still struggles with issues such as supporting the reproducibility of results, providing proper attribution (data citation) for data sets, and measuring the impact of those data sets by tracking their use. Seismological data sets used for research are frequently created "on the fly" based on particular user requirements such as location or time period; users prepare requests to select subsets of the data held in seismic networks, and the data actually provided may even be held at many different data centres [EIDA, 2016]. These subsets also require careful citation. For persistency, a request must receive exactly the same data when repeated at a later time. However, if data are curated between requests, the data set delivered may differ, severely complicating the ability to reproduce a result. Transmission problems or configuration problems may also inadvertently modify the response to a request. With this in mind, our next step is the assignment of additional EPIC PIDs to daily data files (currently over 28 million in the GEOFON archive) for use within the data centre. These will be used for replication and versioning of the data. This will support reproducible, fine-grained citation of seismic waveform data in a consistent fashion. Moreover, we also plan to create PIDs for collections of PIDs, in order to support the citation of a set of many data files with a single identifier. The technical information describing the instruments used to acquire the data and their location will most probably also be identified with a PID (pointing to a StationXML record) and referenced from the metadata of the waveform PID. StationXML will also include the DOI of the network for citation purposes. With all these elements, progress is made towards reproducibility and better attribution. References - EIDA (2016): European Integrated Data Archive (EIDA). http://www.orfeus-eu.org/eida/eida.html - Evans, P., Strollo, A., Clark, A., Ahern, T., Newman, R., Clinton, J. F., Pedersen, H., Pequegnat, C. (2015): Why Seismic Networks Need Digital Object Identifiers. Eos, Transactions American Geophysical Union, 96. http://doi.org/10.1029/2015EO036971 - International Federation of Digital Seismograph Networks (FDSN) (2013): FDSN Web Service Specifications, Version 1.1b, 2013/10/25. http://www.fdsn.org/webservices/FDSN-WS-Specifications-1.1.pdf - International Federation of Digital Seismograph Networks (FDSN) (2014): FDSN recommendations for seismic network DOIs and related FDSN services [WG3 recommendation]. http://doi.org/10.7914/D11596.
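As a minimal illustration of PID resolution (a sketch using only the requests library), the FDSN recommendation DOI cited in the references above can be dereferenced through the standard doi.org resolver:

```python
# Sketch: resolving a persistent identifier via the doi.org resolver.
# The DOI below is the FDSN recommendation cited in the reference list.
import requests

doi = "10.7914/D11596"
resp = requests.get(f"https://doi.org/{doi}", allow_redirects=True,
                    timeout=30)
print(resp.status_code)
print(resp.url)   # landing page the persistent identifier resolves to
```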
Ocean Wireless Networking and Real Time Data Management
NASA Astrophysics Data System (ADS)
Berger, J.; Orcutt, J. A.; Vernon, F. L.; Braun, H. W.; Rajasekar, A.
2001-12-01
Recent advances in technology have enabled the exploitation of satellite communications for high-speed (>64 kbps) duplex communications with oceanographic ships at sea. Furthermore, decreasing costs for high-speed communications have made possible continuous connectivity to the global Internet for delivery of data ashore and communications with scientists and engineers on the ship. Through support from the Office of Naval Research, we have planned a series of tests using the R/V Revelle for real-time delivery of large quantities of underway data (e.g. continuous multibeam profiling) to shore for quality control, archiving, and real-time data availability. The Cecil H. and Ida M. Green Institute of Geophysics and Planetary Physics (IGPP) and the San Diego Supercomputer Center (SDSC) were funded by the NSF Information Technology Research (ITR) Program, the California Institute for Telecommunications and Information Technology [Cal-(IT)2], and the Scripps Institution of Oceanography for research entitled "Exploring the Environment in Time: Wireless Networks & Real-Time Management." We will describe the technology to be used for the real-time seagoing experiment and the planned expansion of the project through support from the ITR grant. The short-term goal is to exercise the communications system aboard ship in various weather conditions and sea states while testing and developing the real-time data quality control and archiving methodology. The long-term goal is to enable continuous observations in the ocean, specifically supporting the goals of the DEOS (Dynamics of Earth and Ocean Systems) observatory program supported through an NSF Major Research Equipment (MRE) program - a permanent presence in the oceans. The impact on scientific work aboard ships, however, is likely to be fundamental. It will be possible to go to sea in the future with limited engineering capability for scientific operations by allowing shore-based quality control of data collected and videoconferencing for problem resolution. Costs for shipboard measurements will be reduced significantly while, at the same time, the quality of data collected will increase and ex-post-facto data archiving will no longer be necessary.
On-line access to remote sensing data with the satellite-data information system (ISIS)
NASA Astrophysics Data System (ADS)
Strunz, G.; Lotz-Iwen, H.-J.
1994-08-01
The German Remote Sensing Data Center (DFD) is developing the satellite-data information system ISIS as a central interface for users to access Earth observation data. ISIS has been designed to support international scientific research as well as operational applications by offering online database access via public networks, and is integrated in the international activities dedicated to catalogue and archive interoperability. A prototype of ISIS is already in use within the German Processing and Archiving Facility for ERS-1 for the storage and retrieval of digital SAR quicklook products and for the Radarmap of Germany. Operational status of the system is envisaged for the launch of ERS-2. This paper describes the underlying concepts of ISIS and the current state of its realization. It explains the overall structure of the system and the functionality of each of its components. Emphasis is placed on the description of the advisory system, the catalogue retrieval, and the online access and transfer of image data. Finally, the integration into a future global environmental data network is outlined.
The challenges of archiving network-based multimedia performances (Performance cryogenics)
NASA Astrophysics Data System (ADS)
Cohen, Elizabeth; Cooperstock, Jeremy; Kyriakakis, Chris
2002-11-01
Music archives and libraries have cultural preservation at the core of their charters. New forms of art often race ahead of the preservation infrastructure. The ability to stream multiple synchronized, ultra-low-latency streams of audio and video across a continent for a distributed interactive performance, such as music and dance with high-definition video and multichannel audio, raises a series of challenges for the architects of digital libraries and those responsible for cultural preservation. The archiving of such performances presents numerous challenges that go beyond simply recording each stream. Case studies of storage and subsequent retrieval issues for Internet2 collaborative performances are discussed. The development of shared-reality and immersive environments raises questions such as: What constitutes an archived performance that occurs across a network (in multiple spaces over time)? What are the families of metadata necessary to reconstruct this virtual world in another venue or era? For example, if the network exhibited changes in latency, the performers most likely adapted; in a future recreation, the latency will most likely be completely different. We discuss the parameters of immersive-environment acquisition and rendering, network architectures, software architecture, musical/choreographic scores, and environmental acoustics that must be considered to address this problem.
NASA Technical Reports Server (NTRS)
Noll, Carey E.
2010-01-01
Since 1982, the Crustal Dynamics Data Information System (CDDIS) has supported the archive and distribution of geodetic data products acquired by the National Aeronautics and Space Administration (NASA) as well as national and international programs. The CDDIS provides easy, timely, and reliable access to a variety of data sets, products, and information about these data. These measurements, obtained from a global network of nearly 650 instruments at more than 400 distinct sites, include DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite), GNSS (Global Navigation Satellite System), SLR and LLR (Satellite and Lunar Laser Ranging), and VLBI (Very Long Baseline Interferometry). The CDDIS data system and its archive have become increasingly important to many national and international science communities, particularly several of the operational services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). Investigations resulting from the data and products available through the CDDIS support research in many aspects of Earth system science and global change. Each month, the CDDIS archives more than one million data and derived product files totaling over 90 Gbytes in volume. In turn, the global user community downloads nearly 1.2 TBytes (over 10.5 million files) of data and products from the CDDIS each month. The requirements of analysts have evolved since the start of the CDDIS; the specialized nature of the system accommodates the enhancements required to support diverse data sets and user needs. This paper discusses the CDDIS, including background information about the system and its user communities, archive contents, available metadata, and future plans.
NASA Astrophysics Data System (ADS)
Benson, R. B.; Ahern, T. K.; Trabant, C.
2006-12-01
The IRIS Data Management System has long supported international collaboration in seismology by both deploying a global network of seismometers and creating and maintaining an open and accessible archive in Seattle, WA, known as the Data Management Center (DMC). With sensors distributed on a global scale and spanning more than 30 years of digital data, the DMC provides a rich repository of observations across broad time and space domains. Primary seismological data sources include strong-motion and broadband seismometers, conventional and superconducting gravimeters, tilt and creep meters, and GPS measurements, along with other similar sensors that record accurate and calibrated ground motion. What may not be as well understood is the volume of environmental data that accompanies typical seismological data these days. This poster will review the types of time-series data that are currently being collected, how they are collected, and how they are made freely available for download at the IRIS DMC. Environmental sensors that are often co-located with geophysical sensors include temperature, barometric pressure, wind direction and speed, humidity, insolation, and rain gauges, and sometimes hydrological measurements such as water current, level, temperature, and depth. As the primary archival institution of the International Federation of Digital Seismograph Networks (FDSN), the IRIS DMC collects approximately 13,600 channels of real-time data from 69 different networks and close to 1600 individual stations, currently averaging 10 Tb per year in total. A major contribution to the IRIS archive currently is the EarthScope project data, a ten-year science undertaking that is collecting data from a high-resolution, multi-variate sensor network. Data types include magnetotellurics, high-sample-rate seismic data from a borehole drilled into the San Andreas fault (SAFOD), and various types of strain data from the Plate Boundary Observatory (PBO). In addition to the DMC, data centers located in other countries are networked seamlessly and are providing researchers access to data from national networks around the world using the IRIS-developed Data Handling Interface (DHI) system. This poster will highlight some of the DHI-enabled clients that allow geophysical information to be transferred directly to the clients. This ability allows one to construct a virtual network of data centers, providing the illusion of a single virtual observatory. Furthermore, some of the features that will be shown include direct connections to MATLAB and the ability to access globally distributed sensor data in real time. We encourage discussion and participation from network operators who would like to leverage existing technology, as well as enabling collaboration.
DAVE: A plug and play model for distributed multimedia application development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mines, R.F.; Friesen, J.A.; Yang, C.L.
1994-07-01
This paper presents a model being used for the development of distributed multimedia applications. The Distributed Audio Video Environment (DAVE) was designed to support the development of a wide range of distributed applications. The implementation of this model is described. DAVE is unique in that it combines a simple "plug and play" programming interface, supports both centralized and fully distributed applications, provides device and media extensibility, promotes object reusability, and supports interoperability and network independence. This model enables application developers to easily develop distributed multimedia applications and create reusable multimedia toolkits. DAVE was designed for developing applications such as video conferencing, media archival, remote process control, and distance learning.
Problems and Mitigation Strategies for Developing and Validating Statistical Cyber Defenses
2014-04-01
Data sources include Netflow data for TCP connections, e-mail data from SMTP logs, chat data from XMPP logs, and microtext data from Twitter message archives; summary data from Bro and Netflow captured on the BBN network over a period of one month, plus simulated attacks and WHOIS domain-name records, serve as training datasets for clustering and Support Vector Machine (SVM) classification.
European distributed seismological data archives infrastructure: EIDA
NASA Astrophysics Data System (ADS)
Clinton, John; Hanka, Winfried; Mazza, Salvatore; Pederson, Helle; Sleeman, Reinoud; Stammler, Klaus; Strollo, Angelo
2014-05-01
The European Integrated waveform Data Archive (EIDA) is a distributed data center system within ORFEUS that (a) securely archives seismic waveform data and related metadata gathered by European research infrastructures, and (b) provides transparent access to the archives for the geosciences research communities. EIDA was founded in 2013 by the ORFEUS Data Center, GFZ, RESIF, ETH, INGV and BGR to ensure the sustainability of a distributed archive system, the implementation of standards (e.g. FDSN StationXML, FDSN web services), and the coordination of new developments. Under the mandate of the ORFEUS Board of Directors and Executive Committee, the founding group is responsible for steering and maintaining the technical developments and organization of the European distributed seismic waveform data archive and its integration within broader multidisciplinary frameworks like EPOS. EIDA currently offers uniform access to unrestricted data from 8 European archives (www.orfeus-eu.org/eida), linked by the Arclink protocol, hosting data from 75 permanent networks (1800+ stations) and 33 temporary networks (1200+ stations). Moreover, each archive may also provide unique, restricted datasets. A web interface, developed at GFZ, offers interactive access to different catalogues (EMSC, GFZ, USGS) and EIDA waveform data. Clients and toolboxes like arclink_fetch and ObsPy can connect directly to any EIDA node to collect data. Current developments are directed at the implementation of quality parameters and strong-motion parameters.
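A minimal sketch of the uniform access described above, assuming the open-source ObsPy FDSN client and using the GFZ node as an example (the network selection below is illustrative):

```python
# Sketch: querying one EIDA node through the FDSN web services that EIDA
# has standardized on, via ObsPy. "GFZ" is one of the archives named above.
from obspy.clients.fdsn import Client

client = Client("GFZ")                        # one EIDA node
inv = client.get_stations(network="GE", level="station")
print(inv)                                    # station inventory summary

# The same client also retrieves waveforms with client.get_waveforms(...),
# so toolboxes like ObsPy can connect directly to any node, as noted above.
```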
Security Considerations for Archives: Rare Book, Manuscript, and Other Special Collections.
ERIC Educational Resources Information Center
Cupp, Christian M.
The first of six sections in this guide to security for special collections in archives and libraries discusses the importance of security and the difficulty of preventing theft of archival materials. The second section, which focuses on planning, recommends an inservice training program for staff, a planned communications network between library…
Detailed description of the Mayo/IBM PACS
NASA Astrophysics Data System (ADS)
Gehring, Dale G.; Persons, Kenneth R.; Rothman, Melvyn L.; Salutz, James R.; Morin, Richard L.
1991-07-01
The Mayo Clinic and IBM/Rochester have jointly developed a picture archiving and communication system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. The system was developed to replace the imaging systems' vendor-supplied magnetic tape archiving capability. The system consists of seven MR imagers and nine CT scanners, each interfaced to the PACS via IBM Personal System/2(tm) (PS/2) computers, which act as gateways from the imaging modality to the PACS network. The PACS operates on the token-ring component of Mayo's city-wide local area network. Also on the PACS network are four optical storage subsystems used for image archival, three optical subsystems used for image retrieval, an IBM Application System/400(tm) (AS/400) computer used for database management, and multiple PS/2-based image display systems and their image servers.
A portal for the ocean biogeographic information system
Zhang, Yunqing; Grassle, J. F.
2002-01-01
Since its inception in 1999, the Ocean Biogeographic Information System (OBIS) has developed into an international science program as well as a globally distributed network of biogeographic databases. An OBIS portal at Rutgers University provides the links and functional interoperability among member database systems. Protocols and standards have been established to support effective communication between the portal and these functional units. The portal provides distributed data searching, a taxonomy name service, a GIS with access to relevant environmental data, biological modeling, and education modules for mariners, students, environmental managers, and scientists. The portal will integrate Census of Marine Life field projects, national data archives, and other functional modules, and will provide network-wide analyses and modeling tools.
The World Radiation Monitoring Center of the Baseline Surface Radiation Network: Status 2017
NASA Astrophysics Data System (ADS)
Driemel, Amelie; König-Langlo, Gert; Sieger, Rainer; Long, Charles N.
2017-04-01
The World Radiation Monitoring Center (WRMC) is the central archive of the Baseline Surface Radiation Network (BSRN). The BSRN was initiated by the World Climate Research Programme (WCRP) Working Group on Radiative Fluxes and began operations in 1992. One of its aims is to provide short- and long-wave surface radiation fluxes of the best possible quality to support the research projects of the WCRP and other scientific projects. The high-quality, uniform and consistent measurements of the BSRN network can be used to monitor the short- and long-wave radiative components and their changes with the best methods currently available, to validate and evaluate satellite-based estimates of the surface radiative fluxes, and to verify the results of global climate models. In 1992 the BSRN/WRMC started at ETH Zurich, Switzerland with 9 stations. Since 2007 the archive has been hosted by the Alfred-Wegener-Institut (AWI) in Bremerhaven, Germany (http://www.bsrn.awi.de/) and comprises a network of currently 59 stations in contrasting climatic zones, covering a latitude range from 80°N to 90°S. Of the 59 stations, 23 offer the complete radiation budget (down- and upwelling short- and long-wave data). In addition to the ftp-service access instituted at ETH Zurich, the archive at AWI offers data access via PANGAEA - Data Publisher for Earth & Environmental Science (https://www.pangaea.de). PANGAEA guarantees the long-term availability of its content through a commitment of the operating institutions. Within PANGAEA, the metadata of the stations are freely available. To access the data itself, an account is required. If the scientist agrees to follow the data release guidelines of the archive (http://bsrn.awi.de/data/conditions-of-data-release/), he or she can get an account from amelie.driemel@awi.de. Currently, more than 9,400 station-months (>780 years) are available for interested scientists (see also https://dataportals.pangaea.de/bsrn/?q=LR0100 for an overview of available data). After many years of excellent service as the director of the WRMC, Gert König-Langlo retires in 2017. He is handing over his duties to the current WRMC data curator, Amelie Driemel, who will continue this important task in the years to come.
JNDMS Task Authorization 2 Report
2013-10-01
Uses Barnyard to store alarms from all DREnet Snort sensors in a MySQL database; Barnyard is an open-source tool designed to work with Snort. Glossary fragments from the excerpt: ITI = Information Technology Infrastructure; J2EE = Java 2 Enterprise Edition; JAR = Java Archive, an archive file format defined by Java standards; JDBC = Java Database Connectivity; JDW = JNDMS Data Warehouse; JNDMS = Joint Network Defence and Management System.
NASA Technical Reports Server (NTRS)
Simpson, James J.; Harkins, Daniel N.
1993-01-01
Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, order, and distribution of satellite data based upon the X Window system, high-bandwidth networks, and digital image rendering techniques. SSABLE automatically constructs relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near-real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) it has a bitmapped display (monochrome or greater); 2) it is running the X Window system; and 3) it is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia typically are 20-25 s.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository - the higher the documentation standards, the greater the effort required of the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required of a scientist to upload and archive data. An outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are forbidden in the repository. EDI reduces the burden of data documentation on scientists in two ways. First, EDI provides hands-on assistance and best-practice tooling for generating EML, written in R with Python versions under development. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or set to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data. The outcomes of the quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support for scientists and its comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
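A hedged sketch of what tool-assisted EML generation accomplishes: a deliberately minimal and incomplete EML document built with the Python standard library. Real EDI tooling (in R, with Python versions under development, as noted above) emits far richer metadata; the packageId and names below are placeholders.

```python
# Illustrative only: building a skeletal EML 2.x document by hand to show
# the kind of structure the EDI tools generate for scientists.
import xml.etree.ElementTree as ET

eml = ET.Element("eml:eml", {
    "xmlns:eml": "https://eml.ecoinformatics.org/eml-2.2.0",
    "packageId": "edi.0.1",                       # placeholder identifier
    "system": "https://pasta.edirepository.org",  # the PASTA repository
})
ds = ET.SubElement(eml, "dataset")
ET.SubElement(ds, "title").text = "Example sensor dataset"
creator = ET.SubElement(ds, "creator")
name = ET.SubElement(creator, "individualName")
ET.SubElement(name, "surName").text = "Doe"       # placeholder creator

print(ET.tostring(eml, encoding="unicode"))
```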
Schilling, R B
1993-05-01
Picture archiving and communication systems (PACS) provide image viewing at diagnostic, reporting, consultation, and remote workstations; archival on magnetic or optical media by means of short- or long-term storage devices; communications by means of local or wide area networks or public communication services; and integrated systems with modality interfaces and gateways to health care facilities and departmental information systems. Research indicates three basic needs for image and report management: (a) improved communication and turnaround time between radiologists and other imaging specialists and referring physicians, (b) fast, reliable access to both current and previously obtained images and reports, and (c) space-efficient archival support. Although PACS considerations are much more complex than those associated with single modalities, the same basic purchase criteria apply. These criteria include technical leadership, image quality, throughput, life cost (e.g., initial cost, maintenance, upgrades, and depreciation), and total service. Because a PACS takes much longer to implement than a single modality, the customer and manufacturer must develop a closer working relationship than has been necessary in the past.
Byrnes, Hilary F; Miller, Brenda A
2012-12-01
Neighborhood characteristics have been linked to healthy behavior, including effective parenting behaviors. This may be partially explained through the neighborhood's relation to parents' access to social support from friends and family. The current study examined associations of neighborhood characteristics with parenting behaviors, indirectly through social support. The sample included 614 mothers of 11-12-year-old youths enrolled in a health care system in the San Francisco area. Structural equation modeling shows that neighborhood perceptions were related to parenting behaviors indirectly through social support, while archival census neighborhood indicators were unrelated to social support and parenting. Perceived neighborhood social cohesion and control were related to greater social support, which was related to more effective parenting style, parent-child communication, and monitoring. Perceived neighborhood disorganization was unrelated to social support. Prevention strategies should focus on helping parents build a social support network that can act as a resource in times of need.
NASA Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Carter, David; Wetzel, Scott
2000-01-01
The NASA SLR Operational Center is responsible for: 1) NASA SLR network control, sustaining engineering, and logistics; 2) ILRS mission operations; and 3) ILRS and NASA SLR data operations. NASA SLR network control and sustaining engineering tasks include technical support, daily system performance monitoring, system scheduling, operator training, station status reporting, system relocation, logistics, and support of the ILRS Networks and Engineering Working Group. These activities ensure that the NASA SLR systems are meeting ILRS and NASA mission support requirements. ILRS mission operations tasks include mission planning, mission analysis, mission coordination, development of mission support plans, and support of the ILRS Missions Working Group. These activities ensure that new mission and campaign requirements are coordinated with the ILRS. Global Normal Point (NP) data, NASA SLR Full-Rate (FR) data, and satellite predictions are managed as part of data operations. Part of this operation includes supporting the ILRS Data Formats and Procedures Working Group. Global NP data operations consist of receipt, format and data integrity verification, archiving, and merging. This activity culminates in the daily electronic transmission of NP files to the CDDIS. Currently, all of these functions are automated. However, to ensure the timely and accurate flow of data, regular monitoring and maintenance of the operational software systems, computer systems, and computer networking are performed. Tracking statistics between the stations and the data centers are compared periodically to eliminate lost data. Future activities in this area include sub-daily (i.e., hourly) NP data management, more stringent data integrity tests, and automatic station notification of format and data integrity issues.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1991-01-01
This quarterly publication provides archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL's) Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on the activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data, information systems, and reimbursable DSN work performed for other space agencies through NASA.
NASA Technical Reports Server (NTRS)
Papitashvili, N. E.; Papitashvili, V. O.; Allen, J. H.; Morris, L. D.
1995-01-01
The National Geophysical Data Center has the largest collection of geomagnetic data from the worldwide network of magnetic observatories. A database management system and retrieval/display software have been developed for the archived geomagnetic data (annual means, monthly, daily, hourly, and 1-minute values) and placed on the center's CD-ROMs to provide users with 'user-oriented' and 'user-friendly' support. This system is described in this paper with a brief outline of the options provided.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
The 1990 annual statistics and highlights report
NASA Technical Reports Server (NTRS)
Green, James L.
1991-01-01
The National Space Science Data Center (NSSDC) has archived over 6 terabytes of space and Earth science data accumulated over nearly 25 years. It now expects these holdings to nearly double every two years. The science user community needs rapid access to this archival data and to information about the data, and the NSSDC has been set on course to provide just that. Five years ago the NSSDC came on line, becoming easily reachable for thousands of scientists around the world through electronic networks it managed and other international electronic networks to which it connected. Since that time, the data center has developed and implemented over 15 interactive systems, operational nearly 24 hours per day and reachable through DECnet, TCP/IP, X.25, and BITnet communication protocols. The NSSDC is a clearinghouse where the science user can find needed data through the Master Directory system, whether it is at the NSSDC or deposited in over 50 other archives and data management facilities around the world. Over 13,000 users accessed the NSSDC electronic systems during the past year. Thousands of requests for data have been satisfied, resulting in the NSSDC sending out a volume of data last year that approached a quarter of its holdings. This document reports on the highlights and distribution statistics for most of the basic NSSDC operational services for fiscal year 1990. It is intended to be the first in a series of annual reports on how well the NSSDC is doing in supporting the space and Earth science user communities.
NASA Astrophysics Data System (ADS)
Busby, R. W.; Woodward, R.; Aderhold, K.; Frassetto, A.
2017-12-01
The Alaska Transportable Array deployment is completely installed, totaling 280 stations: 194 new stations and 86 existing stations, 28 of which were upgraded with new sensor emplacement. We briefly summarize the deployment of this seismic network, describe the added meteorological instruments and soil temperature gauges, and review our expectations for operation and demobilization. Curation of data from the contiguous Lower-48 States deployment of the Transportable Array (>1800 stations, 2004-2015) has continued, with the few gaps in real-time data replaced by locally archived files as well as minor adjustments in metadata. We highlight station digests that provide more detail on the components and settings of individual stations, documentation of standard procedures used throughout the deployment, and other resources available online. In cooperation with the IRIS DMC, a copy of the complete TA archive for the Lower-48 period has been transferred to a local disk to experiment with data access and software workflows that utilize most or all of the seismic timeseries, in contrast to event segments. Assembling such large datasets reliably - from field stations to a well-managed data archive to a user's workspace - is complex. Sharing a curated and defined data volume with researchers is a potentially straightforward way to make data-intensive analyses less difficult. We note that data collection within the Lower-48 continues, with 160 stations of the N4 network operating at increased sample rates (100 sps) as part of the CEUSN, as operational support transitions from NSF to USGS.
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.
1998-07-01
The Medical Information, Communication and Archive System (MICAS) is a multi-vendor, incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID), with future 'hooks' to other hospital databases. Even though this approach to PACS is riskier than a single-vendor turnkey approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc., and Digital Equipment Corp. (DEC). The network architecture operates at 100 Mbits per second except between the modalities and the stackable intelligent switch, which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive, and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow, and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT, and a Howtek film digitizer, with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations using ImageLabs Shared Vision viewer software, located in the MRI, CT, Nuclear Medicine, and musculoskeletal reading areas, with two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation, and the archive vendor should have been responsible for image acquisition rather than the RIS vendor. Currently, an archive vendor is being selected who will be responsible for the management of the archive plus the HIS/RIS interface, image acquisition, the modality worklist manager, and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside of the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside of the University of Rochester Medical Center (URMC) campus, and the teaching archive.
LDCM Ground System. Network Lesson Learned
NASA Technical Reports Server (NTRS)
Gal-Edd, Jonathan
2010-01-01
This slide presentation reviews the Landsat Data Continuity Mission (LDCM) and the lessons learned in implementing the network that was assembled to allow for the acquisition, archiving and distribution of the data from the Landsat mission. The objective of the LDCM is to continue the acquisition, archiving, and distribution of moderate-resolution multispectral imagery affording global, synoptic, and repetitive coverage of the earth's land surface at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. It includes a review of the ground network, including a block diagram of the ground network elements (GNE) and a review of the RF design and testing. Also included is a listing of the lessons learned.
Global Change Data Center: Mission, Organization, Major Activities, and 2001 Highlights
NASA Technical Reports Server (NTRS)
Wharton, Stephen W. (Technical Monitor)
2002-01-01
Rapid, efficient access to Earth science data is fundamental to the Nation's efforts to understand the effects of global environmental changes and their implications for public policy. The challenge will grow in the future as data volumes increase further and missions with constellations of satellites start to appear. Demands on data storage, data access, network throughput, processing power, and database and information management are increased by orders of magnitude, while budgets remain constant or even shrink. The Global Change Data Center's (GCDC) mission is to provide systems, data products, and information management services to maximize the availability and utility of NASA's Earth science data. The specific objectives are: (1) support Earth science missions by developing and operating systems to generate, archive, and distribute data products and information; (2) develop innovative information systems for processing, archiving, accessing, visualizing, and communicating Earth science data; and (3) develop value-added products and services to promote broader utilization of NASA Earth Science Enterprise (ESE) data and information. The ultimate product of GCDC activities is access to data and information to support research, education, and public policy.
Experience with PACS in an ATM/Ethernet switched network environment.
Pelikan, E; Ganser, A; Kotter, E; Schrader, U; Timmermann, U
1998-03-01
Legacy local area network (LAN) technologies based on shared-media concepts are not adequate for the growth of a large-scale picture archiving and communication system (PACS) in a client-server architecture. First, an asymmetric network load, due to the requests of a large number of PACS clients for only a few main servers, should be compensated by communication links to the servers with a higher bandwidth compared to the clients. Secondly, as the number of PACS nodes increases, network throughput should not measurably cut into production. These requirements can easily be fulfilled using switching technologies. Here, asynchronous transfer mode (ATM) is clearly one of the hottest topics in networking, because the ATM architecture provides integrated support for a variety of communication services and supports virtual networking. On the other hand, most of the imaging modalities are not yet ready for integration into a native ATM network. For the many nodes already attached to an Ethernet, a cost-effective and pragmatic way to benefit from the switching concept is a combined ATM/Ethernet switching environment. This incorporates an incremental migration strategy with the immediate benefits of high-speed, high-capacity ATM (for servers and highly sophisticated display workstations), while preserving elements of the existing network technologies. In addition, Ethernet switching instead of shared-media Ethernet improves performance considerably. The LAN emulation (LANE) specification by the ATM Forum defines mechanisms that allow ATM networks to coexist with legacy systems using any data networking protocol. This paper points out the suitability of this network architecture in accordance with an appropriate system design.
Archive of observations of periodic comet Crommelin made during its 1983-84 apparition
NASA Technical Reports Server (NTRS)
Sekanina, Z. (Editor); Aronsson, M.
1985-01-01
This is an archive of 680 reduced observations of Periodic Comet Crommelin made during its 1984 apparition. The archive integrates reports by members of the eight networks of the International Halley Watch (IHW) and presents the results of a trial run designed to test the preparedness of the IHW organization for the current apparition of Periodic Comet Halley.
The AmericaView Project - Putting the Earth into Your Hands
,
2005-01-01
The U.S. Geological Survey (USGS) is a leader in collecting, archiving, and distributing geospatial data and information about the Earth. Providing quick, reliable access to remotely sensed images and geospatial data is the driving principle behind the AmericaView Project. A national not-for-profit organization, AmericaView, Inc. was established and is supported by the USGS to coordinate the activities of a national network of university-led consortia with the primary objective of the advancement of the science of remote sensing. Individual consortia members include academic institutions, as well as state, local, and tribal government agencies. AmericaView's focus is to expand the understanding and use of remote sensing through education and outreach efforts and to provide affordable, integrated remote sensing information access and delivery to the American public. USGS's Landsat and NASA's Earth Observing System (EOS) satellite data are downlinked from satellites or transferred from other facilities to the USGS Center for Earth Resources Observation and Science (EROS) ground receiving station in Sioux Falls, South Dakota. The data can then be transferred over high-speed networks to consortium members, where it is archived and made available for public use.
Image acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Reardon, Frank J.; Salutz, James R.
1991-07-01
The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate with the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.
The LCOGT Science Archive and Data Pipeline
NASA Astrophysics Data System (ADS)
Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.
2013-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
NASA Technical Reports Server (NTRS)
Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John
1994-01-01
The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distribution System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few terabits of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which is developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue. The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand-flow manufacturing, and avionics communications scheduling.
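A toy sketch, not the HTRS/Kronos engine: the following illustrates the flavor of priority- and resource-constrained admission that the abstract contrasts with a plain FIFO queue. All task names and resource budgets are invented.

```python
# Toy priority scheduler with a disk/bandwidth budget, for illustration
# only; the real HTRS also models precedence and temporal constraints.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    priority: int                       # lower value = more urgent
    name: str = field(compare=False)
    disk_gb: float = field(compare=False)
    net_mbps: float = field(compare=False)

def schedule(tasks, disk_gb=100.0, net_mbps=50.0):
    """Admit tasks in priority order while resources last; defer the rest."""
    heap = list(tasks)
    heapq.heapify(heap)
    admitted, deferred = [], []
    while heap:
        t = heapq.heappop(heap)
        if t.disk_gb <= disk_gb and t.net_mbps <= net_mbps:
            disk_gb -= t.disk_gb        # consume the resource budget
            net_mbps -= t.net_mbps
            admitted.append(t.name)
        else:
            deferred.append(t.name)     # a FIFO queue would block here

    return admitted, deferred

print(schedule([Task(1, "ingest-A", 40, 20), Task(3, "dist-B", 80, 10),
                Task(2, "ingest-C", 30, 25)]))
```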
NASA Astrophysics Data System (ADS)
Dricker, I. G.; Friberg, P.; Hellman, S.
2001-12-01
Under contract with the CTBTO, Instrumental Software Technologies, Inc. (ISTI) has designed and developed a Standard Station Interface (SSI) - a set of executable programs and application programming interface libraries for acquisition, authentication, archiving, and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow painless extension of the software system within and outside the boundaries of CTBTO station requirements. Currently, an alpha version of SSI is undergoing extensive testing in the lab and on site.
TLALOCNet: A Continuous GPS-Met Array in Mexico for Seismotectonic and Atmospheric Research
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Salazar-Tlaczani, L.; Galetzka, J.; DeMets, C.; Serra, Y. L.; Feaux, K.; Mattioli, G. S.; Miller, M. M.
2015-12-01
TLALOCNet is a network of continuous Global Positioning System (cGPS) and meteorology stations in Mexico for the interrogation of the earthquake cycle, tectonic processes, land subsidence, and atmospheric processes of Mexico. Once completed, TLALOCNet will span all of Mexico and will link existing GPS infrastructure in North America and the Caribbean, aiming towards creating a continuous, federated network of networks in the Americas. Phase 1 (2014-2015), funded by NSF and UNAM, is building and upgrading 30+ cGPS-Met sites to the high standard of the EarthScope Plate Boundary Observatory (PBO). Phase 2 (2016) will add ~25 more cGPS-Met stations, to be funded through CONACyT. TLALOCNet provides open and freely available raw GPS data, GPS-PWV, surface meteorology measurements, time series of daily positions, and a station velocity field to support a broad range of geoscience investigations. This is accomplished through the development of the TLALOCNet data center (http://tlalocnet.udg.mx), which serves as a collection and distribution point. This data center is based on UNAVCO's Dataworks-GSAC software and can work as part of UNAVCO's seamless archive for discovery, sharing, and access to data. The TLALOCNet data center also contains contributed data from several regional networks in Mexico. By using the same protocols and structure as the UNAVCO and other COCONet regional data centers, the geodetic community has the capability of accessing data from a large number of scientific and academically operated Mexican GPS sites. This archive provides a fully queryable and scriptable GPS and meteorological data retrieval point. Additionally, real-time 1 Hz streams from selected TLALOCNet stations are available in BINEX, RTCM 2.3, and RTCM 3.1 formats via the Networked Transport of RTCM via Internet Protocol (NTRIP).
Development of a system for transferring images via a network: supporting a regional liaison.
Mihara, Naoki; Manabe, Shiro; Takeda, Toshihiro; Shinichirou, Kitamura; Junichi, Murakami; Kouji, Kiso; Matsumura, Yasushi
2013-01-01
We developed a system that transfers images via a network and began using it in our hospital's PACS (Picture Archiving and Communication Systems) in 2006. The system has since been redeveloped and is running in support of a future regional liaison. It has become possible to automatically transfer images simply by selecting a destination hospital that has been registered in advance at the relay server. The gateway of this system can send images to a multi-center relay management server, which receives the images and resends them. This system has the potential to be useful for image exchange and to serve as a regional medical liaison.
Squish: Near-Optimal Compression for Archival of Relational Datasets
Gao, Yihan; Parameswaran, Aditya
2017-01-01
Relational datasets are being generated at an alarmingly rapid rate across organizations and industries. Compressing these datasets could significantly reduce storage and archival costs. Traditional compression algorithms, e.g., gzip, are suboptimal for compressing relational datasets since they ignore the table structure and relationships between attributes. We study compression algorithms that leverage the relational structure to compress datasets to a much greater extent. We develop Squish, a system that uses a combination of Bayesian Networks and Arithmetic Coding to capture multiple kinds of dependencies among attributes and achieve near-entropy compression rate. Squish also supports user-defined attributes: users can instantiate new data types by simply implementing five functions for a new class interface. We prove the asymptotic optimality of our compression algorithm and conduct experiments to show the effectiveness of our system: Squish achieves a reduction of over 50% in storage size relative to systems developed in prior work on a variety of real datasets. PMID:28180028
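The five-function extension point can be pictured as a small class interface. The method names below and their tie-in to arithmetic coding are hypothetical, chosen to make the sketch runnable; the actual Squish interface is defined in the paper, not in this abstract.

    from abc import ABC, abstractmethod

    class SquishType(ABC):
        """Hypothetical five-function interface for a user-defined attribute type."""
        @abstractmethod
        def parse(self, text: str): ...            # raw field -> internal value
        @abstractmethod
        def render(self, value) -> str: ...        # internal value -> raw field
        @abstractmethod
        def interval(self, value): ...             # value -> (low, high) probability interval
        @abstractmethod
        def decode(self, point: float): ...        # code point -> value
        @abstractmethod
        def update(self, value) -> None: ...       # adapt the model after each value

    class SmallIntType(SquishType):
        """Uniform model over 0..n-1, just to make the sketch concrete."""
        def __init__(self, n: int):
            self.n = n
        def parse(self, text): return int(text)
        def render(self, value): return str(value)
        def interval(self, value): return (value / self.n, (value + 1) / self.n)
        def decode(self, point): return min(int(point * self.n), self.n - 1)
        def update(self, value): pass  # static model: nothing to adapt

    if __name__ == "__main__":
        t = SmallIntType(10)
        lo, hi = t.interval(t.parse("7"))
        assert t.decode((lo + hi) / 2) == 7  # round-trip through the code space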
Rushinek, Avi; Rushinek, Sara; Lippincott, Christine; Ambrosia, Todd
2014-04-01
The aim of this article is to describe the Repurposing of Classroom Video Surveillance and On-Screen Archives (RCVSOSA) model, which is an innovative, technology-enabled approach to continuing education in nursing. The RCVSOSA model leverages networked Internet Protocol (IP) high-definition surveillance cameras to record videos of classroom lectures that can be automatically uploaded to the Internet or converted to DVD, either in their entirety or as content-specific modules, with the production work embedded in the technology. The proposed model supports health care continuing education through the use of online assessments for focused education modules, access to archived online recordings and DVD training courses, voice-to-text transcripts, and possibly continuing education modules that may be translated into multiple languages. Potential benefits of this model include increased access to educational modules for students, instant authorship, and financial compensation for instructors and their respective organizations.
An Approach to Data Center-Based KDD of Remote Sensing Datasets
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Mack, Robert; Wharton, Stephen W. (Technical Monitor)
2001-01-01
The data explosion in remote sensing is straining the ability of data centers to deliver the data to the user community, yet many large-volume users actually seek a relatively small information component within the data, which they extract at their sites using Knowledge Discovery in Databases (KDD) techniques. To improve the efficiency of this process, the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) has implemented a KDD subsystem that supports execution of the user's KDD algorithm at the data center, dramatically reducing the volume that is sent to the user. The data are extracted from the archive in a planned, organized "campaign"; the algorithms are executed, and the output products sent to the users over the network. The first campaign, now complete, has resulted in overall reductions in shipped volume from 3.3 TB to 0.4 TB.
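The pattern is easy to state in code: run the user's algorithm next to the archive and ship only its output. The Python sketch below is schematic, with invented function and path names; it is not the GES DAAC implementation.

    from pathlib import Path
    from typing import Callable, Iterable

    def run_campaign(granules: Iterable[Path],
                     user_algorithm: Callable[[bytes], bytes],
                     outbox: Path) -> int:
        """Apply user_algorithm to each archived granule; stage outputs for delivery.

        Returns the number of bytes staged for shipment, which is the volume
        that actually crosses the network instead of the full granule set.
        """
        outbox.mkdir(parents=True, exist_ok=True)
        shipped = 0
        for granule in granules:
            product = user_algorithm(granule.read_bytes())  # runs at the data center
            (outbox / (granule.stem + ".out")).write_bytes(product)
            shipped += len(product)
        return shipped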
Allison, Scott A; Sweet, Clifford F; Beall, Douglas P; Lewis, Thomas E; Monroe, Thomas
2005-09-01
The PACS implementation process is complicated, requiring a tremendous amount of time, resources, and planning. The Department of Defense (DOD) has significant experience in developing and refining PACS acceptance testing (AT) protocols that assure contract compliance, clinical safety, and functionality. The DOD's AT experience under the initial Medical Diagnostic Imaging Support System contract led to the current Digital Imaging Network-Picture Archiving and Communications Systems (DIN-PACS) contract AT protocol. To identify the most common system and component deficiencies under the current DIN-PACS AT protocol, 14 tri-service sites were evaluated during 1998-2000. Sixteen system deficiency citations with 154 separate types of limitations were noted, with problems involving the workstation, interfaces, and the Radiology Information System comprising more than 50% of the citations. Larger PACS deployments were associated with a higher number of deficiencies. The most commonly cited system deficiencies were among the most expensive components of the PACS.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites that connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc.), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs. proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system, which would benefit archive users and suppliers alike, is proposed.
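One way to picture the proposed framework is as a small abstract client API covering the four primary services identified above (catalog browsing, dataset retrieval, name resolution, and data analysis). The interface below is an illustrative Python sketch, not the paper's specification.

    from abc import ABC, abstractmethod
    from typing import Any, Sequence

    class ArchiveClient(ABC):
        """Hypothetical standard interface an archive supplier would implement."""

        @abstractmethod
        def query_catalog(self, constraints: dict) -> Sequence[dict]:
            """Catalog browsing: return catalog rows matching the constraints."""

        @abstractmethod
        def retrieve_dataset(self, dataset_id: str) -> bytes:
            """Dataset retrieval: fetch one dataset by its archive identifier."""

        @abstractmethod
        def resolve_name(self, object_name: str) -> tuple[float, float]:
            """Name resolution: map an object name to (RA, Dec) in degrees."""

        @abstractmethod
        def analyze(self, dataset_id: str, operation: str, **kwargs) -> Any:
            """Data analysis: run a server-side operation on a dataset."""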
NASA Astrophysics Data System (ADS)
Evans, Robert D.; Petropavlovskikh, Irina; McClure-Begley, Audra; McConville, Glen; Quincy, Dorothy; Miyagawa, Koji
2017-10-01
The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content (thickness of the ozone layer) of the atmosphere was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States of America's National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Following recent changes in the data-processing software, the entire dataset was re-evaluated. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of the observations at the individual stations, the instruments used for the NOAA network monitoring at the station, the method for reducing zenith-sky observations to total ozone, and the calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to the 14 station records is also provided.
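The record-to-record comparison at the heart of the re-evaluation can be sketched as a simple differencing of daily total-ozone values, in Dobson Units (DU), between the reprocessed record and the record archived at the WOUDC. The data layout below is invented for illustration.

    def compare_records(archived: dict[str, float],
                        reprocessed: dict[str, float]) -> dict[str, float]:
        """Summarize (reprocessed - archived) differences, in DU, on common days."""
        common = sorted(set(archived) & set(reprocessed))
        diffs = [reprocessed[d] - archived[d] for d in common]
        return {
            "n_days": len(diffs),
            "mean_diff": sum(diffs) / len(diffs),
            "max_abs_diff": max(abs(x) for x in diffs),
        }

    if __name__ == "__main__":
        old = {"1995-07-01": 312.0, "1995-07-02": 305.5}  # archived values (made up)
        new = {"1995-07-01": 311.2, "1995-07-02": 306.1}  # reprocessed values (made up)
        print(compare_records(old, new))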
Designing Solar Data Archives: Practical Considerations
NASA Astrophysics Data System (ADS)
Messerotti, M.
The variety of new solar observatories in space and on the ground poses the stringent problem of efficient storage and archiving of huge datasets. We briefly address some typical architectures and consider the key point of data access and distribution through networking.
Update on the activities of the GGOS Bureau of Networks and Observations
NASA Technical Reports Server (NTRS)
Pearlman, Michael R.; Pavlis, Erricos C.; Ma, Chopo; Noll, Carey; Thaller, Daniela; Richter, Bernd; Gross, Richard; Neilan, Ruth; Mueller, Juergen; Barzaghi, Ricardo;
2016-01-01
The recently reorganized GGOS Bureau of Networks and Observations has many elements that are associated with building and sustaining the infrastructure that supports the Global Geodetic Observing System (GGOS) through the development and maintenance of the International Terrestrial and Celestial Reference Frames, improved gravity field models and their incorporation into the reference frame, the production of precision orbits for missions of interest to GGOS, and many other applications. The affiliated Service Networks (IVS, ILRS, IGS, IDS, and now the IGFS and the PSMSL) continue to grow geographically and to improve core and co-location site performance with newer technologies. Efforts are underway to expand GGOS participation and outreach. Several groups are undertaking initiatives and seeking partnerships to update existing sites and expand the networks in geographic areas devoid of coverage. New satellites are being launched by the space agencies in disciplines relevant to GGOS. Working groups now constitute an integral part of the Bureau, providing key service to GGOS. Their activities include: projecting future network capability and examining trade-off options for station deployment and technology upgrades; developing metadata collection and online availability strategies; improving coordination and information exchange with the missions for better ground-based network response and space-segment adequacy for the realization of GGOS goals; and standardizing site-tie measurement, archiving, and analysis procedures. This poster will present the progress in the Bureau's activities and its efforts to expand the networks and make them more effective in supporting GGOS.
NASA Johnson Space Center Life Sciences Data System
NASA Technical Reports Server (NTRS)
Rahman, Hasan; Cardenas, Jeffery
1994-01-01
The Life Sciences Project Division (LSPD) at JSC, which manages human life sciences flight experiments for the NASA Life Sciences Division, augmented its Life Sciences Data System (LSDS) in support of the Spacelab Life Sciences-2 (SLS-2) mission in October 1993. The LSDS is a portable ground system supporting Shuttle-, Spacelab-, and Mir-based life sciences experiments. The LSDS supports acquisition, processing, display, and storage of real-time experiment telemetry in a workstation environment. The system may acquire digital or analog data, storing the data in experiment packet format. Data packets from any acquisition source are archived, and meta-parameters are derived through the application of mathematical and logical operators. Parameters may be displayed in text and/or graphical form, or output to analog devices. Experiment data packets may be retransmitted through the network interface, and database applications may be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control, and the LSDS can be integrated with other workstations to perform a variety of functions. The generic capabilities, adaptability, and ease of use make the LSDS a cost-effective solution to many experiment data processing requirements. The same system is used for experiment systems functional and integration tests, flight crew training sessions, and mission simulations. In addition, the system has provided the infrastructure for the development of the JSC Life Sciences Data Archive System, scheduled for completion in December 1994.
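The derivation of meta-parameters from experiment packets can be sketched in two steps: unpack named parameters from a packet, then apply mathematical and logical operators to them. The packet layout and parameter names below are invented; the LSDS packet formats are not described in this abstract.

    import struct

    def decode_packet(packet: bytes) -> dict[str, float]:
        """Unpack a hypothetical experiment packet holding two float32 channels."""
        heart_rate, skin_temp = struct.unpack("<2f", packet)
        return {"heart_rate": heart_rate, "skin_temp": skin_temp}

    def derive_meta(params: dict[str, float]) -> dict[str, float]:
        """Apply mathematical and logical operators to form meta-parameters."""
        return {
            "hr_elevated": float(params["heart_rate"] > 100.0),   # logical operator
            "comfort_index": params["skin_temp"] - 0.05 * params["heart_rate"],
        }

    if __name__ == "__main__":
        pkt = struct.pack("<2f", 72.0, 33.1)   # simulated telemetry packet
        print(derive_meta(decode_packet(pkt)))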
The Spanish network for Gaia Science Exploitation
NASA Astrophysics Data System (ADS)
Figueras, F.; Jordi, C.; Luri, X.; Torra, J.; REG Executive Committee Team; Gaia UB Team
2017-03-01
The "Red Española de Explotación Científica de Gaia" (REG) continues to intensify its activities in the face of the imminent publication of the first and second Gaia data releases (14 September 2016 and Q4 2017, respectively). The network, supported by MINECO under the contract Acciones de dinamización, Redes de Excelencia (2016-2017), has as its major priority the coordination and support of the collective activities developed by its more than 150 members. At present, REG plays a prominent role in preparing the Spanish community for the use of the Gaia data archive (a task led by the Spanish team), in the work to exploit the Gaia-ESO survey data collected during the last four years, and in supporting the preparation of the science case and survey plan for WEAVE, the new multi-object spectrograph for the WHT at the Canary Islands (commissioning, 2018). These activities are described together with the schedule of future national and international science meetings and the outreach activities being organized for the first and second Data Releases.
Hauken, May Aasebø; Senneseth, Mette; Dyregrov, Atle; Dyregrov, Kari
2015-12-30
Parental cancer can have a significant impact on a family's psychosocial functioning and quality of life, whereby the children's situation is strongly related to parental coping and capacity. Such parents ask for more help in order to increase their care capacity, while their social network is often insecure about how to help and thereby withdraws. Network members ask for guidance and training to be able to support cancer families. Based on this, the Cancer Psycho-Educational Program for the SOcial NEtwork (PEPSONE) study was developed. The objective is to optimize social network support through a psycho-educational program for families living with parental cancer and their network members in order to increase parental capacity and thereby secure the children's safety and quality of life. The study is a randomized controlled trial (RCT) in which families (N=60) living with parental cancer will be randomized to either an intervention group or a control group. The intervention will last for 3 hours and includes (1) introduction, (2) psycho-education (living with cancer in the family and the importance of social network support), and (3) discussion (this family's need for social support). Primary outcomes are social support, mental health, and quality of life; secondary outcomes are resilience and parental capacity. Data will be collected by a set of questionnaires distributed to healthy parents (N=60) living with a partner with cancer, one child in the family between 8-18 years of age (N=60), and network members (N=210) of the intervention families at inclusion, and after 3 and 6 months. Comparing the intervention group (n=30) with the control group (n=30), the power analysis shows that, at P<.05 and a statistical power of .80, effect sizes of clinical interest can be detected. This paper presents the Cancer-PEPSONE study's protocol to provide a broader understanding of the background and content of the program. The study is ongoing until August 2016 and the first results are anticipated by November 2015. To our knowledge, this will be the first RCT study to optimize social network support through a psycho-educational program for families living with parental cancer and their network members, as well as to provide an evidence basis for social network support. The results may provide important knowledge that is useful for clinical practice and further research. The trial is reported according to the CONSORT checklist. International Standard Randomized Controlled Trial Number (ISRCTN): 15982171; http://www.controlled-trials.com/ISRCTN15982171/15982171 (Archived by WebCite at http://www.webcitation.org/6cg9zunS0).
Improved Data Access From the Northern California Earthquake Data Center
NASA Astrophysics Data System (ADS)
Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Klein, F.; Jensen, E.; Gee, L.; Murray, M.; Romanowicz, B.
2002-12-01
The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and the USGS Menlo Park to provide a long-term archive and distribution center for geophysical data for northern California. Most data are available via the Web at http://quake.geo.berkeley.edu and research accounts are available for access to specialized datasets. Current efforts continue to expand the available datasets, enhance distribution methods, and provide rapid access to all datasets. The NCEDC archives continuous and event-based seismic and geophysical time-series data from the BDSN, the USGS NCSN, the UNR Seismic Network, the Parkfield HRSN, and the Calpine/Unocal Geysers network. In collaboration with the USGS, the NCEDC has archived a total of 887 channels from 139 sites of the "USGS low-frequency" geophysical network (UL), including data from strainmeters, creep meters, magnetometers, water well levels, and tiltmeters. There are 336 active continuous data channels that are updated at the NCEDC on a daily basis. Geodetic data from the BARD network of over 40 continuously recording GPS sites are archived at the NCEDC in both raw and RINEX format. The NCEDC is the primary archive for survey-mode GPS and other geodetic data collected in northern California by the USGS, universities, and other agencies. All of the BARD data and GPS data archived from USGS Menlo Park surveys are now available through the GPS Seamless Archive Centers (GSAC), and by FTP directly from the NCEDC. Virtually all time-series data at the NCEDC are now available in SEED with complete instrument responses. Assembling, verifying, and maintaining the response information for these networks is a huge task, and is accomplished through the collaborative efforts of the NCEDC and the contributing agencies. Until recently, the NCSN waveform data were available only through research accounts and special request methods due to incomplete instrument responses. In the last year, the USGS compiled the necessary descriptions for both historic and current NCSN instrumentation. The NCEDC and USGS jointly developed a procedure to create and maintain the hardware attributes and instrument responses at the NCEDC for the 3500 NCSN channels. As a result, the NCSN waveform data can now be distributed in SEED format. The NCEDC provides access to waveform data through Web forms, email requests, and programming interfaces. The SeismiQuery Web interface provides information about data holdings. NetDC allows users to retrieve inventory information, instrument responses, and waveforms in SEED format. STP provides both a Web and programming interface to retrieve data in SEED or other user-friendly formats. Through the newly formed California Integrated Seismic Network, we are working with the SCEDC to provide unified access to California earthquake data.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1987-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. Activities of the Deep Space Network (DSN) in space communications, radio navigation, radio science, and ground-based radio astronomy are reported. Also included are the plans, supporting research and technology, implementation and operations for the Ground Communications Facility (GCF). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the search for extraterrestrial intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum.
Technology and the Transformation of Archival Description
ERIC Educational Resources Information Center
Pitti, Daniel V.
2005-01-01
The emergence of computer and network technologies has presented the archival profession with daunting challenges as well as inspiring opportunities. Archivists have been actively imagining and realizing the application of advanced technologies to their professional functions and activities. Using advanced technologies, archivists have been…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lay, Erin Hoffmann; Wiens, Kyle Cameron; Delapp, Dorothea Marcia
2016-03-11
The World Wide Lightning Location Network (WWLLN) provides continuous global lightning monitoring and detection. At LANL we collect and archive these data on a daily basis. This document describes the WWLLN data, how they are collected and archived, and how to use the data at LANL.
NASA Astrophysics Data System (ADS)
Hastings, M. G.; Kontak, R.; Adams, A. S.; Barnes, R. T.; Fischer, E. V.; Glessmer, M. S.; Holloway, T.; Marin-Spiotta, E.; Rodriguez, C.; Steiner, A. L.; Wiedinmyer, C.; Laursen, S. L.
2013-12-01
The Earth Science Women's Network (ESWN) is an organization of women geoscientists, many in the early stages of their careers. The mission of ESWN is to promote success in scientific careers by facilitating career development, community, informal mentoring and support, and professional collaborations. ESWN currently connects nearly 2000 women across the globe, and includes graduate students, postdoctoral scientists, tenure- and non-tenure-track faculty from diverse colleges and universities, program managers, and government, non-government and industry researchers. In 2009, ESWN received an NSF ADVANCE PAID award, with the primary goals to grow our membership to serve a wider section of the geosciences community, to design and administer career development workshops, to promote professional networking at scientific conferences, and to develop web resources to build connections, collaborations, and peer mentoring for and among women in the Earth Sciences. Now at the end of the grant, ESWN members have reported gains in a number of aspects of their personal and professional lives, including: knowledge about career resources; a greater understanding of the challenges facing women in science and resources to overcome them; a sense of community and less isolation; greater confidence in their own career trajectories; professional collaborations; emotional support on a variety of issues; and greater engagement and retention in scientific careers. The new ESWN web center (www.ESWNonline.org), a major development supported by NSF ADVANCE and AGU, was created to facilitate communication and networking among our members. The web center offers a state-of-the-art social networking platform and features: 1) a public site offering information on ESWN, career resources for all early career scientists, and a 'members' spotlight' highlighting members' scientific and professional achievements; and 2) a password-protected member area where users can personalize profiles, create and respond to discussions, and connect with other members. The new member area's archive of discussions and member database are searchable, providing better tools for targeted networking and collaboration.
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2014 CFR
2014-07-01
National Defense; Information Security Oversight Office, National Archives and Records Administration; Classified... Each agency head shall ensure that classified information electronically accessed...
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2013 CFR
2013-07-01
National Defense; Information Security Oversight Office, National Archives and Records Administration; Classified... Each agency head shall ensure that classified information electronically accessed...
32 CFR 2001.50 - Telecommunications automated information systems and network security.
Code of Federal Regulations, 2012 CFR
2012-07-01
National Defense; Information Security Oversight Office, National Archives and Records Administration; Classified... Each agency head shall ensure that classified information electronically accessed...
Application-driven strategies for efficient transfer of medical images over very high speed networks
NASA Astrophysics Data System (ADS)
Alsafadi, Yasser H.; McNeill, Kevin M.; Martinez, Ralph
1993-09-01
The American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) in 1982 formed the ACR-NEMA committee to develop a standard to enable equipment from different vendors to communicate and participate in a picture archiving and communications system (PACS). The standard focused mostly on interconnectivity issues and the communication needs of PACS. It was patterned after the International Organization for Standardization's Open Systems Interconnection (ISO/OSI) reference model. Three versions of the standard appeared, evolving from a simple point-to-point specification of the connection between two medical devices to a complex standard for a networked environment. However, fast changes in network software and hardware technologies make it difficult for the standard to keep pace. This paper compares two versions of the ACR-NEMA standard and then describes a system that is used at the University of Arizona Intensive Care Unit. In this system, the application specifies the interface to network services and the grade of service required. These provisions are suggested to make the application independent of evolving network technology and to support true open systems.
Byrnes, Hilary F.; Miller, Brenda A.
2011-01-01
Neighborhood characteristics have been linked to healthy behavior, including effective parenting behaviors. This may be partially explained through the neighborhood's relation to parents' access to social support from friends and family. The current study examined associations of neighborhood characteristics with parenting behaviors indirectly through social support. The sample included 614 mothers of 11-12-year-old youths enrolled in a health care system in the San Francisco area. Structural equation modeling shows that neighborhood perceptions were related to parenting behaviors, indirectly through social support, while archival census neighborhood indicators were unrelated to social support and parenting. Perceived neighborhood social cohesion and control were related to greater social support, which was related to more effective parenting style, parent-child communication, and monitoring. Perceived neighborhood disorganization was unrelated to social support. Prevention strategies should focus on helping parents build a social support network that can act as a resource in times of need. PMID:23794774
40 CFR 58.16 - Data submittal and archiving requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
Protection of Environment; Environmental Protection Agency (Continued); Air Programs (Continued); Ambient Air Quality Surveillance; Monitoring Network. § 58.16 Data submittal and...
Back to the Future: Long-Term Seismic Archives Revisited
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2007-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the location of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and even in the Earth sciences in general. Yet these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation-based methodologies for high-resolution earthquake location, as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may thus become a routine procedure to continuously improve resolution in existing catalogs. We demonstrate the role of seismic archives in obtaining the precise location of new events in real time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.
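At the core of the pick-error reduction is a cross-correlation measurement of the relative delay between two event waveforms recorded at a common station. The toy sketch below (NumPy) omits the windowing, band-passing, and subsample interpolation that real double-difference processing requires.

    import numpy as np

    def cc_delay(w1: np.ndarray, w2: np.ndarray, dt: float) -> float:
        """Return the lag (seconds) by which w1 is delayed relative to w2."""
        cc = np.correlate(w1 - w1.mean(), w2 - w2.mean(), mode="full")
        lag_samples = cc.argmax() - (len(w2) - 1)
        return lag_samples * dt

    if __name__ == "__main__":
        t = np.arange(0, 1, 0.01)                     # 100 Hz sampling
        pulse = np.exp(-((t - 0.3) / 0.02) ** 2)      # synthetic arrival
        shifted = np.roll(pulse, 5)                   # same arrival, 0.05 s later
        print(cc_delay(shifted, pulse, 0.01))         # prints ~0.05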
NASA Astrophysics Data System (ADS)
Benson, R. B.
2007-05-01
The IRIS Data Management Center, located in Seattle, WA, is the largest openly accessible geophysical archive in the world, and it has a unique perspective on the data management and operational practices that get the most out of a network. Networks span broad domains in time and space, from finite deployments that monitor bridges and dams to national and international networks such as the GSN and the FDSN that establish a baseline for global monitoring and research; the requirements that go into creating a well-tuned DMC archive treat these the same, building a collaborative network of networks that generations of users rely on and that adds value to the data. Funded by the National Science Foundation through the Division of Earth Sciences, IRIS is operated through member universities and in cooperation with the USGS, and the DMS facility is a bridge between a globally distributed collaboration of seismic networks and an equally distributed network of users that demand a high standard for data quality, completeness, and ease of access. I will describe the role that a perpetual archive has in the life cycle of data, and how hosting real-time data performs a dual role: serving as a hub for continuous data from approximately 59 real-time networks and distributing these (along with other data from the 40-year library of available time-series data) to researchers, while simultaneously providing shared data back to networks in real time, which benefits monitoring activities. I will describe aspects of our quality-assurance framework, both passive and active, applied to 1100 seismic stations generating over 6,000 channels of regularly sampled data arriving daily, which data providers can use as aids in operating their networks and which users can likewise consult when requesting suitable data for research purposes. The goal of the DMC is to eliminate bottlenecks in data discovery and shorten the steps leading to analysis. This involves many challenges: keeping metadata current, providing tools for evaluating and viewing them, and measuring and databasing other performance metrics, where monitoring closer to real time helps reduce operating costs, creates a richer repository, and eliminates problems over generations of duty cycles of data usage. I will describe a new resource, called the Nominal Response Library, which aims to provide accurate and representative examples of sensor and data logger configurations that are hosted at the DMC and constitute a high-graded subset for crafting your own metadata. Finally, I want to encourage all network operators who do not currently submit SEED format data to an archive to consider these benefits, and briefly discuss how robust transfer mechanisms, including Earthworm, LISS, Antelope, NRTS, and SeisComp, to name a few, can assist you in contributing your network data and help create this enabling virtual network of networks. In this era of high-performance Internet capacity, the process that enables others to share your data and allows you to utilize external sources of data is nearly seamless with your current mission of network operation.
Development and Evaluation of a City-Wide Wireless Weather Sensor Network
ERIC Educational Resources Information Center
Chang, Ben; Wang, Hsue-Yie; Peng, Tian-Yin; Hsu, Ying-Shao
2010-01-01
This project analyzed the effectiveness of a city-wide wireless weather sensor network, the Taipei Weather Science Learning Network (TWIN), in facilitating elementary and junior high students' study of weather science. The network, composed of sixty school-based weather sensor nodes and a centralized weather data archive server, provides students…
Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)
NASA Astrophysics Data System (ADS)
Appel, V. L.
2002-12-01
The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network and, since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive include earthquake locations, magnitudes, moment-tensor solutions, and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The TriNet array added continuous recordings from 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real-time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared, and new data sets will be available for download in near real-time following major events. The parametric data from 1981 to present have been loaded into the Oracle 9.2.0.1 database system, and the waveforms for that time period have been converted to miniSEED format and are accessible through the STP interface. The DISC optical-disk system (the "jukebox") that currently serves as mass storage for the SCEDC is in the process of being replaced with a series of inexpensive high-capacity (1.6 Tbyte) magnetic-disk RAIDs. These systems are built with PC-technology components, using 16 120-Gbyte IDE disks, hot-swappable disk trays, two RAID controllers, dual redundant power supplies, and a Linux operating system. The system is configured over a private gigabit network that connects to the two Data Center servers and spans between the Seismological Lab and the USGS. To ensure data integrity, each RAID disk system constantly checks itself against its twin and verifies file integrity using 128-bit MD5 file checksums that are stored separately from the system. The final level of data protection is a Sony AIT-3 tape backup of the files. The primary advantage of the magnetic-disk approach is faster data access, because magnetic disk drives have almost no latency. This means that the SCEDC can provide better "on-demand" interactive delivery of the seismograms in the archive.
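The integrity check described above reduces to recomputing each file's 128-bit MD5 checksum and comparing it against a manifest stored separately from the RAID systems. A minimal Python sketch, with an invented manifest layout, follows.

    import hashlib
    from pathlib import Path

    def md5sum(path: Path, chunk: int = 1 << 20) -> str:
        """Stream a file through MD5 in 1 MB chunks; return the hex digest."""
        h = hashlib.md5()
        with path.open("rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def verify(manifest: dict[str, str], root: Path) -> list[str]:
        """Return files whose current checksum disagrees with the stored manifest."""
        return [name for name, expected in manifest.items()
                if md5sum(root / name) != expected]

    # Usage sketch: verify({"2002.001.seed": "d41d8cd9..."}, Path("/archive"))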
Terreros, D A; Martinez, R
1997-01-01
A multimedia telemedicine network is proposed for a VISN-19 test bed and it will include picture archiving and communication systems (PACS). Initial tests have been performed, and the technical feasibility of the basic plan has been demonstrated.
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Salazar-Tlaczani, L.; Adams, D. K.; Vivoni, E. R.; Grutter, M.; Serra, Y. L.; DeMets, C.; Galetzka, J.; Feaux, K.; Mattioli, G. S.; Miller, M. M.
2017-12-01
TLALOCNet is a network of continuous GPS and meteorology stations in Mexico to study atmospheric and solid earth processes. This recently completed network spans most of Mexico with a strong coverage emphasis on southern and western Mexico. This network, funded by NSF, CONACyT and UNAM, recently built 40 cGPS-Met sites to EarthScope Plate Boundary Observatory standards and upgraded 25 additional GPS stations. TLALOCNet provides open and freely available raw GPS data, high-frequency surface meteorology measurements, and time series of daily positions. This is accomplished through the development of the TLALOCNet data center (http://tlalocnet.udg.mx) that serves as a collection and distribution point. This data center is based on UNAVCO's Dataworks-GSAC software and also works as part of UNAVCO's seamless archive for discovery, sharing, and access to GPS data. The TLALOCNet data center also contains contributed data from several regional GPS networks in Mexico for a total of 100+ stations. By using the same protocols and structure as the UNAVCO and other COCONet regional data centers, the scientific community has the capability of accessing data from the largest Mexican GPS network. This archive provides a fully queryable and scriptable GPS and meteorological data retrieval point. In addition, real-time 1 Hz streams from selected TLALOCNet stations are available in BINEX, RTCM 2.3, and RTCM 3.1 formats via the Networked Transport of RTCM via Internet Protocol (NTRIP) for real-time seismic and weather-forecasting applications. TLALOCNet served as a GPS-Met backbone for the binational Mexico-US North American Monsoon GPS Hydrometeorological Network 2017 campaign experiment. This innovative experiment attempts to address water vapor source regions and land-surface water vapor flux contributions to precipitation (i.e., moisture recycling) during the 2017 North American Monsoon in Baja California, Sonora, Chihuahua, and Arizona. Models suggest that moisture recycling is a large contributor to summer rainfall. This experiment represents a first attempt to quantify the surface water vapor flux contribution to GPS-derived precipitable water vapor. Preliminary results from this campaign are presented.
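Access to the 1 Hz streams follows the NTRIP pattern: an HTTP-style request to a caster mountpoint, answered (under NTRIP 1.0) by "ICY 200 OK" and a raw RTCM byte stream. The host and mountpoint in this Python sketch are placeholders, not published TLALOCNet endpoints, and a robust client would also separate any stream bytes that arrive with the header.

    import base64
    import socket

    def ntrip_stream(host: str, port: int, mountpoint: str, user: str, password: str):
        """Yield raw RTCM chunks from an NTRIP 1.0 caster (bare-bones sketch)."""
        auth = base64.b64encode(f"{user}:{password}".encode()).decode()
        request = (f"GET /{mountpoint} HTTP/1.0\r\n"
                   "User-Agent: NTRIP simple-client/0.1\r\n"
                   f"Authorization: Basic {auth}\r\n\r\n")
        sock = socket.create_connection((host, port))
        sock.sendall(request.encode())
        reply = sock.recv(1024)
        if not reply.startswith(b"ICY 200 OK"):
            raise RuntimeError(f"caster refused request: {reply[:60]!r}")
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break
            yield chunk  # raw RTCM 2.3 / 3.1 frames for a downstream parser

    # Example (all values are placeholders):
    # for chunk in ntrip_stream("caster.example.org", 2101, "STATION1", "user", "pw"):
    #     ...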
Creating Trading Networks of Digital Archives.
ERIC Educational Resources Information Center
Cooper, Brian; Garcia-Molina, Hector
Digital materials are vulnerable to a number of different kinds of failures, including decay of the digital media, loss due to hackers and viruses, accidental deletions, natural disasters, and bankruptcy of the institution holding the collection. Digital archives can best survive failures if they have made several copies of their collections at…
The Gaia scientific exploitation networks
NASA Astrophysics Data System (ADS)
Figueras, F.; Jordi, C.
2015-05-01
In July 2014 the Gaia satellite, placed at L2 since January 2014, finished its commissioning phase and started collecting highly accurate scientific data. New and more realistic estimations of the astrometric, photometric, and spectroscopic accuracy expected after five years of mission operations (2014-2019) have recently been published on the Gaia Science Performance web page. Here we present the coordination efforts and the activities being conducted through the two GREAT (Gaia Research for European Astronomy Training) European networks: the GREAT-ESF, a programme supported by the European Science Foundation (2010-2015), and the GREAT-ITN network, from the European Union's Seventh Framework Programme (2011-2015). The main research theme of these networks is to unravel the origin and history of our home galaxy. Emphasis is placed on the research projects being conducted by Spanish researchers through these networks, well coordinated by the Red Española de Explotación Científica de Gaia (REG network, with more than 140 participants). Members of the REG play an important role in the collection of complementary spectroscopic data from ground-based telescopes, in the development of new tools for an optimal scientific exploitation of Gaia data, and in the preparation tasks to create the Gaia archive.
NASA Astrophysics Data System (ADS)
Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.
2010-12-01
Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN-recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year.
Updated hardware:
● The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks.
New data holdings:
● Waveform data: Beginning Jan 1, 2010, the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at the SCEDC.
● Portable data from the El Mayor-Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP, either as continuous data or associated with events in the SCEDC earthquake catalog. These additional data will help SCSN analysts and researchers improve event locations from the sequence.
● Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock (the project PI), and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are also able to detect the permanent (coseismic) surface deformation.
● Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC, in cooperation with QCN and CSN, is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client.
New archival methods:
● The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP.
XML formats:
● The SCEDC is now distributing earthquake parameter data through web services in QuakeML format.
● The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules for extending the schema, the use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.
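Consuming the QuakeML feed mentioned above amounts to fetching the XML and reading namespaced elements. The service URL in this Python sketch is a placeholder rather than a documented SCEDC endpoint; see www.data.scec.org for the actual services. The namespace shown is the standard QuakeML BED namespace.

    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    QML = "{http://quakeml.org/xmlns/bed/1.2}"  # QuakeML BED namespace

    def fetch_events(url: str) -> list[dict]:
        """Fetch a QuakeML document and extract origin time and magnitude."""
        with urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        events = []
        for ev in root.iter(f"{QML}event"):
            mag = ev.find(f"{QML}magnitude/{QML}mag/{QML}value")
            time = ev.find(f"{QML}origin/{QML}time/{QML}value")
            events.append({
                "time": None if time is None else time.text,
                "magnitude": None if mag is None else float(mag.text),
            })
        return events

    # Example (placeholder URL):
    # fetch_events("http://service.example.org/fdsnws/event/1/query?minmagnitude=4")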
A high-speed network for cardiac image review.
Elion, J. L.; Petrocelli, R. R.
1994-01-01
A high-speed fiber-based network for the transmission and display of digitized full-motion cardiac images has been developed. Based on Asynchronous Transfer Mode (ATM), the network is scalable, meaning that the same software and hardware are used for a small local area network or for a large multi-institutional network. The system can handle uncompressed digital angiographic images, considered to be at the "high end" of the bandwidth requirements. Along with the networking, a general-purpose multi-modality review station has been implemented without specialized hardware. This station can store a full injection sequence in "loop RAM" in a 512 x 512 format, then interpolate to 1024 x 1024 while displaying at 30 frames per second. The network and review stations connect to a central file server that uses a virtual file system to make a large high-speed RAID storage disk and associated off-line storage tapes and cartridges all appear as a single large file system to the software. In addition to supporting archival storage and review, the system can also digitize live video using high-speed Direct Memory Access (DMA) from the frame grabber to present uncompressed data to the network. Fully functional prototypes have provided the proof of concept, with full deployment in the institution planned as the next stage. PMID:7949964
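A back-of-the-envelope calculation shows why uncompressed angiography sits at the "high end" of the bandwidth requirements and what interpolation to 1024 x 1024 implies for the display path. The 8-bit pixel depth used below is an assumption; the abstract does not state the bit depth.

    def mbytes_per_second(width: int, height: int, fps: int,
                          bytes_per_pixel: int = 1) -> float:
        """Sustained throughput of an uncompressed image stream, in MB/s."""
        return width * height * fps * bytes_per_pixel / 1e6

    # Stored loop at 512 x 512, 30 frames/s (assuming 8-bit pixels):
    print(mbytes_per_second(512, 512, 30))      # ~7.9 MB/s
    # After interpolation to 1024 x 1024 for display:
    print(mbytes_per_second(1024, 1024, 30))    # ~31.5 MB/s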
CDDIS: NASA's Archive of Space Geodesy Data and Products Supporting GGOS
NASA Technical Reports Server (NTRS)
Noll, Carey; Michael, Patrick
2016-01-01
The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data and products in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. The CDDIS data system and its archive is a key component in several of the geometric services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the IGS, the International DORIS Service (IDS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). The CDDIS provides on-line access to over 17 Tbytes of data and derived products in support of the IAG services and GGOS. The system's archive continues to grow and improve as new activities are supported and enhancements are implemented. Recently, the CDDIS has established a real-time streaming capability for GNSS data and products. Furthermore, enhancements to metadata describing the contents of the archive have been developed to facilitate data discovery. This poster will provide a review of the improvements in the system infrastructure that CDDIS has made over the past year for the geodetic community and describe future plans for the system.
Present status and future directions of the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Morin, Richard L.; Forbes, Glenn S.; Gehring, Dale G.; Salutz, James R.; Pavlicek, William
1991-07-01
This joint project began in 1988 and was motivated by the need to develop an alternative to the archival process in place at that time (magnetic tape) for magnetic resonance imaging and neurological computed tomography. In addition, the project was felt to be an important step in gaining the necessary clinical experience for the future implementation of various aspects of electronic imaging. The initial phase of the project was conceived and developed to prove the concept, test the fundamental components, and produce performance measurements for future work. The key functions of this phase centered on the attachment of imaging equipment (GE Signa) and archival processes using a non-dedicated (institutionally supplied) local area network (LAN). Attachment of imaging equipment to the LAN was performed using commercially available devices (Ethernet, PS/2, Token Ring). Image data were converted to ACR/NEMA format with retention of the vendor-specific header information. Performance measurements were encouraging and led to the design of the following projects. The second phase has recently been concluded. The major features of this phase have been to greatly expand the network, put the network into clinical use, establish an efficient and useful viewing station, include diagnostic reports in the archive data, provide wide area network (WAN) capability via ISDN, and establish two-way real-time video between remote sites. This phase has heightened both departmental and institutional thought regarding various issues raised by electronic imaging. Much discussion regarding both present and future archival processes has occurred. The use of institutional LAN resources has proven to be adequate for the archival function examined thus far. Experiments to date have shown that the use of dedicated resources will be necessary for retrieval activities at even a basic level. This report presents an overview of the background, present status, and future directions of the project.
Problem of data quality and the limitations of the infrastructure approach
NASA Astrophysics Data System (ADS)
Behlen, Fred M.; Sayre, Richard E.; Rackus, Edward; Ye, Dingzhong
1998-07-01
The 'Infrastructure Approach' is a PACS implementation methodology wherein the archive, network, and information systems interfaces are acquired first, and workstations are installed later. The approach allows building a history of archived image data, so that most prior examinations are available in digital form when workstations are deployed. A limitation of the Infrastructure Approach is that the deferred use of digital image data defeats many data quality management functions that are provided automatically by human mechanisms when data are immediately used for the completion of clinical tasks. If the digital data are used solely for archiving while reports are interpreted from film, the radiologist serves only as a check against lost films, and another person must be designated as responsible for the quality of the digital data. Data from the Radiology Information System and the PACS were analyzed to assess the nature and frequency of system and data quality errors. The error level was found to be acceptable if supported by auditing and error resolution procedures requiring additional staff time, and in any case was better than the loss rate of a hardcopy film archive. It is concluded that the problem of data quality compromises but does not negate the value of the Infrastructure Approach. The Infrastructure Approach is best employed only to a limited extent; any phased PACS implementation should have a substantial complement of workstations dedicated to softcopy interpretation for at least some applications, with full deployment following not long thereafter.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1995-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
Taking digital imaging to the next level: challenges and opportunities.
Hobbs, W Cecyl
2004-01-01
New medical imaging technologies, such as multi-detector computed tomography (CT) scanners and positron emission tomography (PET) scanners, are creating new possibilities for non-invasive diagnosis that are leading providers to invest heavily in them. The volume of data produced by such technology is so large that it cannot be "read" using traditional film-based methods, and once in digital form, it creates a massive data integration and archiving challenge. Despite the benefits of digital imaging and archiving, there are several key challenges that healthcare organizations should consider in planning, selecting, and implementing the information technology (IT) infrastructure to support digital imaging. Decisions about storage and image distribution are essentially questions of "where" and "how fast." When planning the digital archiving infrastructure, organizations should think about where they want to store and distribute their images. This is similar to decisions that organizations have to make in regard to physical film storage and distribution, except the portability of images is even greater in a digital environment. The principle of "network effects" seems like a simple concept, yet the effect is not always considered when implementing a technology plan. To fully realize the benefits of digital imaging, the radiology department must integrate the archiving solutions throughout the department and, ultimately, with applications across other departments and enterprises. Medical institutions can derive a number of benefits from implementing digital imaging and archiving solutions like PACS. Hospitals and imaging centers can use the transition from film-based imaging as a foundational opportunity to reduce costs, increase competitive advantage, attract talent, and improve service to patients. The key factors in achieving these goals include attention to the means of data storage, distribution, and protection.
Research Capacity Building in Education: The Role of Digital Archives
ERIC Educational Resources Information Center
Carmichael, Patrick
2011-01-01
Accounts of how research capacity in education can be developed often make reference to electronic networks and online resources. This paper presents a theoretically driven analysis of the role of one such resource, an online archive of educational research studies that includes not only digitised collections of original documents but also videos…
Dataworks for GNSS: Software for Supporting Data Sharing and Federation of Geodetic Networks
NASA Astrophysics Data System (ADS)
Boler, F. M.; Meertens, C. M.; Miller, M. M.; Wier, S.; Rost, M.; Matykiewicz, J.
2015-12-01
Continuously-operating Global Navigation Satellite System (GNSS) networks are increasingly being installed globally for a wide variety of science and societal applications. GNSS enables Earth science research in areas including tectonic plate interactions, crustal deformation in response to loading by tectonics, magmatism, water and ice, and the dynamics of water - and thereby energy transfer - in the atmosphere at regional scale. The many individual scientists and organizations that set up GNSS stations globally are often open to sharing data, but lack the resources or expertise to deploy systems and software to manage and curate data and metadata and provide user tools that would support data sharing. UNAVCO previously gained experience in facilitating data sharing through the NASA-supported development of the Geodesy Seamless Archive Centers (GSAC) open source software. GSAC provides web interfaces and simple web services for data and metadata discovery and access, supports federation of multiple data centers, and simplifies transfer of data and metadata to long-term archives. The NSF supported the dissemination of GSAC to multiple European data centers forming the European Plate Observing System. To expand upon GSAC to provide end-to-end, instrument-to-distribution capability, UNAVCO developed Dataworks for GNSS with NSF funding to the COCONet project, and deployed this software on systems that are now operating as Regional GNSS Data Centers as part of the NSF-funded TLALOCNet and COCONet projects. Dataworks consists of software modules written in Python and Java for data acquisition, management and sharing. There are modules for GNSS receiver control and data download, a database schema for metadata, tools for metadata handling, ingest software to manage file metadata, data file management scripts, GSAC, scripts for mirroring station data and metadata from partner GSACs, and extensive software and operator documentation. UNAVCO plans to provide a cloud VM image of Dataworks that would allow standing up a Dataworks-enabled GNSS data center without requiring upfront investment in server hardware. By enabling data creators to organize their data and metadata for sharing, Dataworks helps scientists expand their data curation awareness and responsibility, and enhances data access for all.
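As an illustration of the kind of programmatic access GSAC-style services enable, the sketch below queries a data center for station metadata over plain HTTP. This is a hedged sketch only: the base URL, parameter names, and CSV output option are assumptions for illustration, since each GSAC deployment publishes its own endpoints and query vocabulary.

```python
import requests

# Hypothetical GSAC-style endpoint; real deployments define their own
# base URLs and query parameter vocabularies.
BASE = "https://data.example.org/gsacws/gsacapi/site/search"

params = {
    "output": "site.csv",   # assumed CSV output option
    "site.code": "P12*",    # assumed wildcard site-code filter
    "limit": 100,
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()

# Print the site code (assumed first CSV column) of each match.
for line in resp.text.splitlines()[1:]:
    print(line.split(",")[0])
```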
A pathologist-designed imaging system for anatomic pathology signout, teaching, and research.
Schubert, E; Gross, W; Siderits, R H; Deckenbaugh, L; He, F; Becich, M J
1994-11-01
Pathology images are derived from gross surgical specimens, light microscopy, immunofluorescence, electron microscopy, molecular diagnostic gels, flow cytometry, image analysis data, and clinical laboratory data in graphic form. We have implemented a network of desktop personal computers (PCs) that allow us to easily capture, store, and retrieve gross and microscopic, anatomic, and research pathology images. System architecture involves multiple image acquisition and retrieval sites and a central file server for storage. The digitized images are conveyed via a local area network to and from image capture or display stations. Acquisition sites consist of a high-resolution camera connected to a frame grabber card in a 486-type personal computer, equipped with 16 MB RAM, a 1.05-gigabyte hard drive, and a 32-bit ethernet card for access to our anatomic pathology reporting system. We have designed a push-button workstation for acquiring and indexing images that does not significantly interfere with surgical pathology sign-out. Advantages of the system include the following: (1) Improving patient care: the availability of gross images at time of microscopic sign-out, verification of recurrence of malignancy from archived images, monitoring of bone marrow engraftment and immunosuppressive intervention after bone marrow/solid organ transplantation on repeat biopsies, and ability to seek instantaneous consultation with any pathologist on the network; (2) enhancing the teaching environment: building a digital surgical pathology atlas, improving the availability of images for conference support, and sharing cases across the network; (3) enhancing research: case study compilation, metastudy analysis, and availability of digitized images for quantitative analysis and permanent/reusable image records for archival study; and (4) other practical and economic considerations: storing case requisition images and hand-drawn diagrams deters the spread of gross room contaminants and results in considerable cost savings in photographic media for conferences, improved quality assurance by porting control stains across the network, and a multiplicity of other advantages that enhance image and information management in pathology.
HEASARC - The High Energy Astrophysics Science Archive Research Center
NASA Technical Reports Server (NTRS)
Smale, Alan P.
2011-01-01
The High Energy Astrophysics Science Archive Research Center (HEASARC) is NASA's archive for high-energy astrophysics and cosmic microwave background (CMB) data, supporting the broad science goals of NASA's Physics of the Cosmos theme. It provides vital scientific infrastructure to the community by standardizing science data formats and analysis programs, providing open access to NASA resources, and implementing powerful archive interfaces. Over the next five years the HEASARC will ingest observations from up to 12 operating missions, while serving data from these and over 30 archival missions to the community. The HEASARC archive presently contains over 37 TB of data, and will contain over 60 TB by the end of 2014. The HEASARC continues to secure major cost savings for NASA missions, providing a reusable mission-independent framework for reducing, analyzing, and archiving data. This approach was recognized in the NRC Portals to the Universe report (2007) as one of the HEASARC's great strengths. This poster describes the past and current activities of the HEASARC and our anticipated developments in coming years. These include preparations to support upcoming high energy missions (NuSTAR, Astro-H, GEMS) and ground-based and sub-orbital CMB experiments, as well as continued support of missions currently operating (Chandra, Fermi, RXTE, Suzaku, Swift, XMM-Newton and INTEGRAL). In 2012 the HEASARC (which now includes LAMBDA) will support the final nine-year WMAP data release. The HEASARC is also upgrading its archive querying and retrieval software with the new Xamin system in early release - and building on opportunities afforded by the growth of the Virtual Observatory and recent developments in virtual environments and cloud computing.
A Computational framework for telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.
1998-07-01
Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze requirements necessary for a telemedical computing infrastructure and compare them with requirements found in a typical metacomputing environment. We will show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low-level mechanisms to enable a large-scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements of telemedicine.
NASA Astrophysics Data System (ADS)
Chun, F.; Tippets, R.; Dearborn, M.; Gresham, K.; Freckleton, R.; Douglas, M.
2014-09-01
The Falcon Telescope Network (FTN) is a global network of small aperture telescopes developed by the Center for Space Situational Awareness Research in the Department of Physics at the United States Air Force Academy (USAFA). Consisting of commercially available equipment, the FTN is a collaborative effort between USAFA and other educational institutions ranging from two- and four-year colleges to major research universities. USAFA provides the equipment (e.g. telescope, mount, camera, filter wheel, dome, weather station, computers and storage devices) while the educational partners provide the building and infrastructure to support an observatory. The user base includes USAFA along with K-12 and higher education faculty and students. Since the FTN is intended for general use, objects of interest include satellites, astronomical research targets, and STEM support images. The raw imagery, all in the public domain, will be accessible to FTN partners and will be archived at USAFA in the Cadet Space Operations Center. FTN users will be able to submit observational requests via a web interface. The requests will then be prioritized based on the type of user, the object of interest, and a user-defined priority. A network wide schedule will be developed every 24 hours and each FTN site will autonomously execute its portion of the schedule. After an observational request is completed, the FTN user will receive notification of collection and a link to the data. The Falcon Telescope Network is an ambitious endeavor, but demonstrates the cooperation that can be achieved by multiple educational institutions.
On detecting variables using ROTSE-IIId archival data
NASA Astrophysics Data System (ADS)
Yesilyaprak, C.; Yerli, S. K.; Aksaker, N.; Gucsav, B. B.; Kiziloglu, U.; Dikicioglu, E.; Coker, D.; Aydin, E.; Ozeren, F. F.
ROTSE (Robotic Optical Transient Search Experiment) telescopes can also be used for variable star detection. As explained in the system description (Akerlof et al. 2003, PASP, 115, 132), they have good sky coverage and allow fast data acquisition. The optical magnitude range spans roughly 7^m to 19^m. Thirty percent of the telescope time of the north-eastern leg of the network, namely ROTSE-IIId (located at TUBITAK National Observatory, Bakirlitepe, Turkey; http://www.tug.tubitak.gov.tr/), is owned by Turkish researchers. Since its first light (May 2004), a considerably large amount of data (around 2 TB) has been collected from the Turkish time, and roughly one million objects have been identified from the reduced data. A robust pipeline has been constructed to discover new variables, transients, and planetary nebulae from these archival data. In the detection process, different statistical methods were applied to the archive. We have detected thousands of variable stars by applying roughly four different tests to the light curve of each star. In this work a summary of the pipeline is presented. It uses a high performance computing (HPC) algorithm which performs inhomogeneous ensemble photometry of the data on a 36-core cluster. This study is supported by TUBITAK (Scientific and Technological Research Council of Turkey) under grant number TBAG-108T475.
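The abstract does not spell out the four statistical tests, but a weighted reduced chi-squared against a constant-brightness hypothesis is a common first filter in pipelines of this kind. The sketch below is a minimal illustration of that idea, not the ROTSE-IIId implementation; the threshold mentioned in the comment is hypothetical.

```python
import numpy as np

def variability_chi2(mag, err):
    """Reduced chi-squared of a light curve against the hypothesis of
    constant brightness; large values flag candidate variables."""
    w = 1.0 / err**2
    mean = np.sum(w * mag) / np.sum(w)        # inverse-variance weighted mean
    chi2 = np.sum(((mag - mean) / err) ** 2)  # residuals in units of sigma
    return chi2 / (len(mag) - 1)              # divide by degrees of freedom

# A star might be kept as a candidate if, e.g., the statistic exceeds ~3
# over many epochs; a production pipeline combines several such tests.
mag = np.array([15.02, 15.10, 14.95, 15.40, 15.01])
err = np.array([0.03, 0.03, 0.04, 0.03, 0.03])
print(variability_chi2(mag, err))
```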
NASA Astrophysics Data System (ADS)
Walker, D. A.; Breen, A. L.; Broderson, D.; Epstein, H. E.; Fisher, W.; Grunblatt, J.; Heinrichs, T.; Raynolds, M. K.; Walker, M. D.; Wirth, L.
2013-12-01
Abundant ground-based information will be needed to inform remote-sensing and modeling studies of NASA's Arctic-Boreal Vulnerability Experiment (ABoVE). A large body of plot and map data collected by the Alaska Geobotany Center (AGC) and collaborators from the Arctic regions of Alaska and the circumpolar Arctic over the past several decades is being archived and made accessible to scientists and the public via the Geographic Information Network of Alaska's (GINA's) 'Catalog' display and portal system. We are building two main types of data archives: Vegetation Plot Archive: For the plot information we use a Turboveg database to construct the Alaska portion of the international Arctic Vegetation Archive (AVA) http://www.geobotany.uaf.edu/ava/. High quality plot data and non-digital legacy datasets in danger of being lost have highest priority for entry into the archive. A key aspect of the database is the PanArctic Species List (PASL-1), developed specifically for the AVA to provide a standard of species nomenclature for the entire Arctic biome. A wide variety of reports, documents, and ancillary data are linked to each plot's geographic location. Geoecological Map Archive: This database includes maps and remote sensing products and links to other relevant data associated with the maps, mainly those produced by the Alaska Geobotany Center. Map data include GIS shape files of vegetation, land-cover, soils, landforms and other categorical variables and digital raster data of elevation, multispectral satellite-derived data, and data products and metadata associated with these. The map archive will contain all the information that is currently in the hierarchical Toolik-Arctic Geobotanical Atlas (T-AGA) in Alaska http://www.arcticatlas.org, plus several additions that are in the process of development and will be combined with GINA's already substantial holdings of spatial data from northern Alaska. The Geoecological Atlas Portal uses GINA's Catalog tool to develop a web interface to view and access the plot and map data. The mapping portal allows visualization of GIS data, sample-point locations and imagery and access to the map data. Catalog facilitates the discovery and dissemination of science-based information products in support of analysis and decision-making concerned with development and climate change and is currently used by GINA in several similar archive/distribution portals.
The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.
Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred
2013-12-01
The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collections of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer-related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA), an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.
NASA Astrophysics Data System (ADS)
Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.
2015-09-01
The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.
NASA Technical Reports Server (NTRS)
McGlynn, Thomas A.
2008-01-01
We discuss approaches to building archives that support the way most science is done. Today research is done in formal teams and informal groups. However our on-line services are designed to work with a single user. We have begun prototyping a new approach to building archives in which support for collaborative research is built in from the start. We organize the discussion along three elements that we believe to be necessary for effective support: We must enable user presence in the archive environment; users must be able to interact. Users must be able to personalize the environment, adding data and capabilities useful to themselves and their team. These changes must be persistent: subsequent sessions must be able to build upon previous sessions. In building the archive we see the large multi-player interactive games as a paradigm of how this approach can work. These three 'P's are essential in gaming as well and we shall use insights from the gaming world and virtual reality systems like Second Life in our prototype.
How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA's Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use
NASA Astrophysics Data System (ADS)
Peek, Joshua E. G.; Hargis, Jonathan R.; Jones, Craig K.
2018-01-01
Astronomical instruments produce petabytes of images every year, vastly more than can be inspected by a member of the astronomical community in search of a specific population of structures. Fortunately, the sky is mostly black, and source extraction algorithms have been developed to provide searchable catalogs of unconfused sources like stars and galaxies. These tools often fail for studies of more diffuse structures like the interstellar medium and unresolved stellar structures in nearby galaxies, leaving astronomers interested in observations of photodissociation regions, stellar clusters, and diffuse interstellar clouds without the crucial ability to search. In this work we present a new path forward for finding structures in large data sets similar to an input structure using convolutional neural networks, transfer learning, and machine learning clustering techniques. We show applications to archival data in the Mikulski Archive for Space Telescopes (MAST).
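A minimal sketch of the general approach described here (pretrained CNN features plus nearest-neighbor search over archive cutouts), not the authors' actual pipeline. It assumes three-channel inputs, so single-band astronomical cutouts would need channel replication; the backbone choice and neighbor count are illustrative.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.neighbors import NearestNeighbors

# Pretrained backbone used as a fixed feature extractor (transfer learning).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classification head
backbone.eval()

preprocess = T.Compose([T.Resize(224), T.CenterCrop(224), T.ToTensor()])

@torch.no_grad()
def embed(images):
    """Map a list of PIL images (cutouts from survey frames) to features."""
    batch = torch.stack([preprocess(im) for im in images])
    return backbone(batch).numpy()

# Index archive cutouts once, then find structures similar to a query:
# feats = embed(archive_cutouts)
# nn = NearestNeighbors(n_neighbors=10).fit(feats)
# dist, idx = nn.kneighbors(embed([query_cutout]))
```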
7 CFR 1755.901 - Incorporation by Reference.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Digital Systems and Networks, Transmission media characteristics—Optical fibre cables, Characteristics of... Systems and Media, Digital Systems and Networks, Transmission media characteristics—Optical fibre cables... National Archives and Records Administration (NARA). For information on the availability of these materials...
NASA Astrophysics Data System (ADS)
Nybade, A.; Aster, R.; Beck, S.; Ekstrom, G.; Fischer, K.; Lerner-Lam, A.; Meltzer, A.; Sandvol, E.; Willemann, R. J.
2008-12-01
Building a sustainable earthquake monitoring system requires well-informed cooperation between commercial companies that manufacture components or deliver complete systems and the government or other agencies that will be responsible for operating them. Many nations or regions with significant earthquake hazard lack the financial, technical, and human resources to establish and sustain permanent observatory networks required to return the data needed for hazard mitigation. Government agencies may not be well-informed about the short-term and long-term challenges of managing technologically advanced monitoring systems, much less the details of how they are built and operated. On the relatively compressed time scale of disaster recovery efforts, it can be difficult to find a reliable, disinterested source of information, without which government agencies may be dependent on partial information. If system delivery fails to include sufficient development of indigenous expertise, the performance of local and regional networks may decline quickly, and even data collected during an early high-performance period may be degraded or lost. Drawing on unsurpassed educational capabilities of its members working in close cooperation with its facility staff, IRIS is well prepared to contribute to sustainability through a wide variety of training and service activities that further promote standards for network installation, data exchange protocols, and free and open access to data. Members of the Consortium and staff of its Core Programs together could write a guide on decisions about network design, installation and operation. The intended primary audience would be government officials seeking to understand system requirements, the acquisition and installation process, and the expertise needed to operate a system. The guide would cover network design, procurement, set-up, data use and archiving. Chapters could include advice on network data processing, archiving data (including information on the value of standards), installing and servicing stations, building a data processing and management center (including information on evaluating bids), using results from earthquake monitoring, and sustaining an earthquake monitoring system. Appendices might include profiles of well-configured and well-run networks and sample RFPs. Establishing permanent networks could provide a foundation for international research and educational collaborations and critical new data for imaging Earth structure while supporting scientific capacity building and strengthening hazard monitoring around the globe.
A Structure Standard for Archival Context: EAC-CPF Is Here
ERIC Educational Resources Information Center
Dryden, Jean
2010-01-01
The archival community's new descriptive standard, "Encoded Archival Context" for Corporate Bodies, Persons, and Families (EAC-CPF), supports the sharing of descriptions of records creators and is a significant addition to the suite of standards for archival description. EAC-CPF is a data structure standard similar to its older sibling EAD…
NASA Technical Reports Server (NTRS)
Green, James L.
1989-01-01
The National Space Science Data Center (NSSDC), established in 1966, is the largest archive for processed data from NASA's space and Earth science missions. The NSSDC manages over 120,000 data tapes with over 4,000 data sets. The size of the digital archive is approximately 6,000 gigabytes with all of this data in its original uncompressed form. By 1995 the NSSDC digital archive is expected to more than quadruple in size, reaching over 28,000 gigabytes. The NSSDC is beginning several thrusts allowing it to better serve the scientific community and keep up with managing the ever increasing volumes of data. These thrusts involve managing larger and larger amounts of information and data online, employing mass storage techniques, and the use of low rate communications networks to move requested data to remote sites in the United States, Europe and Canada. The success of these thrusts, combined with the tremendous volume of data expected to be archived at the NSSDC, clearly indicates that innovative storage and data management solutions must be sought and implemented. Although not presently used, data compression techniques may be a very important tool for managing a large fraction or all of the NSSDC archive in the future. Some future applications would consist of compressing online data in order to have more data readily available, compressing requested data that must be moved over low rate ground networks, and compressing all the digital data in the NSSDC archive for a cost-effective backup that would be used only in the event of a disaster.
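For an archive weighing compression of online or transferred data, an empirical check of achievable lossless ratios on representative files is straightforward. A minimal sketch using Python's zlib; the compression level and file name are illustrative.

```python
import zlib

def compression_ratio(raw: bytes, level: int = 6) -> float:
    """Ratio of original to compressed size for one block of data."""
    return len(raw) / len(zlib.compress(raw, level))

# Estimate savings on a representative archive file before committing
# the whole holdings to a compression policy.
with open("dataset.bin", "rb") as f:
    print(f"lossless ratio: {compression_ratio(f.read()):.2f}x")
```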
NASA Astrophysics Data System (ADS)
Ribeiro, Luís S.; Costa, Carlos; Oliveira, José Luís
2010-03-01
Diagnostic tools supported by digital medical images have increasingly become an essential aid to medical decisions. However, despite their growing importance, Picture Archiving and Communication Systems (PACS) are typically oriented to support a single healthcare institution, and the sharing of medical data across institutions is still a difficult process. This paper describes a proposal to publish and control Digital Imaging and Communications in Medicine (DICOM) services in a wide domain composed of several healthcare institutions. The system creates virtual bridges between intranets, enabling the exchange, search, and storage of medical data within the wide domain. The service provider publishes the DICOM services following a token-based strategy. The token advertisements are public and known by all system users. However, access to the DICOM service is controlled through a role association between an access key and the service. Furthermore, in medical diagnoses, time is a crucial factor. Therefore, our system is a turnkey solution, capable of exchanging medical data across firewalls and Network Address Translation (NAT), avoiding bureaucratic issues with local network security. Security is also an important concern: in any transmission across different domains, data is encrypted by Transport Layer Security (TLS).
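A minimal sketch of the token-based publication idea described above, under stated assumptions: an in-memory token registry and a standard-library TLS context for the cross-domain transport. The actual system's token format, role model, and NAT-traversal machinery are not specified in the abstract and are not reproduced here.

```python
import secrets
import ssl

# Hypothetical token registry: token -> (institution, published DICOM service)
TOKENS: dict[str, tuple[str, str]] = {}

def publish_service(institution: str, service_name: str) -> str:
    """Publish a DICOM service and return its public access token."""
    token = secrets.token_urlsafe(16)
    TOKENS[token] = (institution, service_name)
    return token

def authorize(token: str, requested_service: str) -> bool:
    """Role association: the key only unlocks the service it was issued for."""
    entry = TOKENS.get(token)
    return entry is not None and entry[1] == requested_service

# All cross-domain traffic is wrapped in TLS, as in the paper; the
# certificate file names are assumed local paths.
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.load_cert_chain("server.crt", "server.key")
```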
Integration experiences and performance studies of A COTS parallel archive systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Hsing-bung; Scott, Cody; Grider, Gary
2010-01-01
Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, such as more caching and less robust semantics. Currently the number of extreme highly scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner, and demonstrated its capability to address requirements of future archival storage systems.
Integration experiments and performance studies of a COTS parallel archive system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Hsing-bung; Scott, Cody; Grider, Gary
2010-06-16
Current and future Archive Storage Systems have been asked to (a) scale to very high bandwidths, (b) scale in metadata performance, (c) support policy-based hierarchical storage management capability, (d) scale in supporting changing needs of very large data sets, (e) support standard interface, and (f) utilize commercial-off-the-shelf (COTS) hardware. Parallel file systems have been asked to do the same thing but at one or more orders of magnitude faster in performance. Archive systems continue to move closer to file systems in their design due to the need for speed and bandwidth, especially metadata searching speeds, such as more caching and less robust semantics. Currently the number of extreme highly scalable parallel archive solutions is very small, especially those that will move a single large striped parallel disk file onto many tapes in parallel. We believe that a hybrid storage approach of using COTS components and innovative software technology can bring new capabilities into a production environment for the HPC community much faster than the approach of creating and maintaining a complete end-to-end unique parallel archive software solution. In this paper, we relay our experience of integrating a global parallel file system and a standard backup/archive product with a very small amount of additional code to provide a scalable, parallel archive. Our solution has a high degree of overlap with current parallel archive products including (a) doing parallel movement to/from tape for a single large parallel file, (b) hierarchical storage management, (c) ILM features, (d) high volume (non-single parallel file) archives for backup/archive/content management, and (e) leveraging all free file movement tools in Linux such as copy, move, ls, tar, etc. We have successfully applied our working COTS Parallel Archive System to the current world's first petaflop/s computing system, LANL's Roadrunner machine, and demonstrated its capability to address requirements of future archival storage systems.
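To illustrate the core idea of moving a single large striped file to many archive targets in parallel, here is a minimal sketch using a thread pool. The stripe size, worker count, and file-per-stripe layout are assumptions for illustration, not the LANL implementation.

```python
import os
from concurrent.futures import ThreadPoolExecutor

STRIPE = 64 * 1024 * 1024  # 64 MB stripe size, an assumed value

def copy_stripe(src, dst_dir, index, offset, length):
    """Copy one stripe of a large parallel file to its own archive
    target, mimicking single-file-to-many-tapes movement."""
    with open(src, "rb") as f:
        f.seek(offset)
        data = f.read(length)
    with open(os.path.join(dst_dir, f"stripe_{index:04d}"), "wb") as out:
        out.write(data)

def parallel_archive(src, dst_dir, workers=8):
    size = os.path.getsize(src)
    tasks = [(i, off, min(STRIPE, size - off))
             for i, off in enumerate(range(0, size, STRIPE))]
    # Each worker streams its stripe independently of the others.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for i, off, length in tasks:
            pool.submit(copy_stripe, src, dst_dir, i, off, length)
```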
Filmless PACS in a multiple facility environment
NASA Astrophysics Data System (ADS)
Wilson, Dennis L.; Glicksman, Robert A.; Prior, Fred W.; Siu, Kai-Yeung; Goldburgh, Mitchell M.
1996-05-01
A Picture Archiving and Communication System centered on a shared image file server can support a filmless hospital. Systems based on this architecture have proven themselves in over four years of clinical operation. Changes in healthcare delivery are causing radiology groups to support multiple facilities for remote clinic support and consolidation of services. There will be a corresponding need for communicating over a standardized wide area network (WAN). Interactive workflow, a natural extension to the single facility case, requires a means to work effectively and seamlessly across moderate to low speed communication networks. Several schemes for supporting a consortium of medical treatment facilities over a WAN are explored. Both centralized and distributed database approaches are evaluated against several WAN scenarios. Likewise, several architectures for distributing image file servers or buffers over a WAN are explored, along with the caching and distribution strategies that support them. An open system implementation is critical to the success of a wide area system. The role of the Digital Imaging and Communications in Medicine (DICOM) standard in supporting multi- facility and multi-vendor open systems is also addressed. An open system can be achieved by using a DICOM server to provide a view of the system-wide distributed database. The DICOM server interface to a local version of the global database lets a local workstation treat the multiple, distributed data servers as though they were one local server for purposes of examination queries. The query will recover information about the examination that will permit retrieval over the network from the server on which the examination resides. For efficiency reasons, the ability to build cross-facility radiologist worklists and clinician-oriented patient folders is essential. The technologies of the World-Wide-Web can be used to generate worklists and patient folders across facilities. A reliable broadcast protocol may be a convenient way to notify many different users and many image servers about new activities in the network of image servers. In addition to ensuring reliability of message delivery and global serialization of each broadcast message in the network, the broadcast protocol should not introduce significant communication overhead.
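A DICOM query of the kind described (a workstation asking a server for examinations matching study-level criteria) can be sketched with the open-source pynetdicom library; the host, port, and matching keys below are illustrative, not from the paper.

```python
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelFind

ae = AE(ae_title="WORKSTATION")
ae.add_requested_context(StudyRootQueryRetrieveInformationModelFind)

# Study-level query identifier: which studies exist for this patient?
ds = Dataset()
ds.QueryRetrieveLevel = "STUDY"
ds.PatientID = "12345"       # illustrative matching key
ds.StudyDate = ""            # empty = return this attribute

assoc = ae.associate("dicom-server.example.org", 104)  # hypothetical host
if assoc.is_established:
    for status, identifier in assoc.send_c_find(
            ds, StudyRootQueryRetrieveInformationModelFind):
        # 0xFF00/0xFF01 are 'pending' statuses carrying one match each
        if status and status.Status in (0xFF00, 0xFF01):
            print(identifier)
    assoc.release()
```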
Tools for Integrating Data Access from the IRIS DMC into Research Workflows
NASA Astrophysics Data System (ADS)
Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.
2012-12-01
Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include: time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) allows Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
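As an example of the ObsPy route mentioned above, the FDSN client (the modern interface to the DMC's web services) pulls waveforms and event metadata directly into a Python workflow; the station and time window below are arbitrary examples.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                  # connect to the IRIS DMC
t0 = UTCDateTime("2012-04-11T08:38:36")  # example start time

# Fetch 10 minutes of broadband vertical data from station IU.ANMO.
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 600)
print(st)

# Earthquake information is fetched through the same client.
catalog = client.get_events(starttime=t0 - 86400, endtime=t0,
                            minmagnitude=6)
print(catalog)
```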
EMODnet Physics: open and free marine physical data for science and for society
NASA Astrophysics Data System (ADS)
Nolan, G.; Novellino, A.; Gorringe, P.; Manzella, G. M. R., Sr.; Schaap, D.; Pouliquen, S.; Richards, L.
2016-02-01
Europe is sustaining a long-term strategy on Blue Growth, looking at seas and oceans as drivers for innovation and growth. A number of weaknesses have been identified, among which are gaps in knowledge and data about the state of our oceans, seabed resources, marine life, and risks to habitats and ecosystems. The European Marine Observation and Data Network (EMODnet) has been created to improve the usefulness, for scientific, regulatory, and commercial purposes, of observations and the resulting marine data collected and held by European public and private bodies. EMODnet Physics provides access to an archived and real-time data catalogue on the physical conditions in Europe's seas and oceans. The overall objectives are to provide access to archived and near real-time data on physical conditions in Europe's seas and oceans by means of a dedicated portal, and to determine how well the data meet the needs of users from industry, public authorities, and science. EMODnet Physics contributes to the broader initiative 'Marine Knowledge 2020', and in particular to the implementation of the European Copernicus programme, an EU-wide programme that aims to support policymakers, business, and citizens with improved environmental information. In the global context, Copernicus is an integral part of the Global Earth Observation System of Systems. Near real-time data and metadata are populated by data owners, organized at the EuroGOOS level according to its regional operational systems (ROOSs) infrastructure and conventions, and made available through the EMODnet Physics user interface. The latest 60 days are freely viewable and downloadable, while access to older data (monthly archives) requires credentials. Archived data series and metadata are organized in collaboration with the NODC network (SeaDataNet). Data and metadata cover measurements of winds at the sea surface, waves, temperature and salinity, water velocities, light attenuation, sea level, and ice coverage. EMODnet Physics has the specific objective of processing physical data into interoperable formats, which includes agreed standards, common baselines or reference conditions, and assessments of their accuracy and precision. The data and metadata are accessible through an ISO, OGC, and INSPIRE compliant portal that is operational 24/7.
Improving the Quality of Backup Process for Publishing Houses and Printing Houses
NASA Astrophysics Data System (ADS)
Proskuriakov, N. E.; Yakovlev, B. S.; Pries, V. V.
2018-04-01
An analysis is made of the main types of data threats faced by print media and of their influence on the vitality and security of information. The influence of archiver settings for preparing archive files and of the choice of file manager on the backup process is analysed. We propose a simple and economical practical implementation of the backup process consisting of four components: the command-line interpreter, the 7z archiver, the Robocopy utility, and network storage. We recommend creating backup sets consisting of three local copies of the data and two network copies.
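A minimal sketch of the proposed four-component scheme, driving the 7z archiver and Robocopy from Python's standard subprocess module instead of a raw command-line script; all paths and retry settings are illustrative.

```python
import subprocess
from datetime import date

SRC = r"C:\prepress\jobs"            # hypothetical working folder
ARCHIVE_DIR = r"D:\backup"           # local backup location
NET_SHARE = r"\\nas01\backup\jobs"   # hypothetical network storage

archive = rf"{ARCHIVE_DIR}\jobs_{date.today():%Y%m%d}.7z"

# 1) Pack the working folder with the 7z archiver (maximum compression).
subprocess.run(["7z", "a", "-t7z", "-mx=9", archive, SRC], check=True)

# 2) Mirror the local backup folder to network storage with Robocopy.
#    Robocopy exit codes below 8 indicate success, so check manually.
rc = subprocess.run(["robocopy", ARCHIVE_DIR, NET_SHARE,
                     "/E", "/R:2", "/W:5"])
if rc.returncode >= 8:
    raise RuntimeError("network copy failed")
```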
NASA Astrophysics Data System (ADS)
Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.
2017-12-01
The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) focus on implementation of data standards in collaboration with the community; (4) support for community built tools for data search, interpretation, analysis, and visualization tools; (5) data fusion database to support search of the data extracted from packages submitted and data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility with replicas at DataONE nodes.
NASA Astrophysics Data System (ADS)
McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzerella, Joseph M.; Ebert, Rick; Pevunova, Olga; Imel, David; Berriman, Graham B.; Teplitz, Harry I.; Groom, Steve L.; Desai, Vandana R.; Landry, Walter
2016-07-01
Since the turn of the millennium, astronomical archives have begun providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA, to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and needs of the community change.
NASA Technical Reports Server (NTRS)
McGlynn, Thomas; Guiseppina, Fabbiano A; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzerella, Joseph M.; Ebert, Rick;
2016-01-01
Since the turn of the millennium, astronomical archives have begun providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA, to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and needs of the community change.
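On the client side, the registry discovery and data access capabilities that the NAVO model standardizes can be exercised with the open-source pyvo package. This is a hedged sketch: the search keyword and sky position are illustrative, and the exact records returned depend on the live VO registry.

```python
import pyvo

# Registry discovery: find VO image (SIA) services whose registry
# entries mention a keyword; 'heasarc' here is just an example.
services = pyvo.regsearch(servicetype="image", keywords=["heasarc"])
for record in services:
    print(record.res_title)

# Query the first matching service for images around a sky position
# (RA, Dec in degrees; size is the search region in degrees).
if len(services) > 0:
    images = services[0].service.search(pos=(83.63, 22.01), size=0.25)
    print(len(images), "matching image products")
```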
The LCOGT Observation Portal, Data Pipeline and Science Archive
NASA Astrophysics Data System (ADS)
Lister, Tim; LCOGT Science Archive Team
2014-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable the diverse LCOGT community of scientific and educational users to request observations on the LCOGT Network, to see their progress, and to get access to their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline, and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC), and show some of the new features the Science Archive provides.
Automated Measurement and Verification and Innovative Occupancy Detection Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Phillip; Nordman, Bruce; Piette, Mary Ann
In support of DOE's sensors and controls research, the goal of this project is to move toward integrated building-to-grid systems by building on previous work to develop and demonstrate a set of load characterization measurement and evaluation tools that are envisioned to be part of a suite of applications for transactive efficient buildings, built upon data-driven load characterization and prediction models. This will include the ability to include occupancy data in the models, plus data collection and archival methods to include different types of occupancy data with existing networks and a taxonomy for naming these data within a Volttron agent platform.
Enterprise-wide worklist management.
Locko, Roberta C; Blume, Hartwig; Goble, John C
2002-01-01
Radiologists in multi-facility health care delivery networks must serve not only their own departments but also departments of associated clinical facilities. We describe our experience with a picture archiving and communication system (PACS) implementation that provides a dynamic view of relevant radiological workload across multiple facilities. We implemented a distributed query system that permits management of enterprise worklists based on modality, body part, exam status, and other criteria that span multiple compatible PACSs. Dynamic worklists, with lesser flexibility, can be constructed if the incompatible PACSs support specific DICOM functionality. Enterprise-wide worklists were implemented across the Generations Plus/Northern Manhattan Health Network, linking radiology departments of three hospitals (Harlem, Lincoln, and Metropolitan) with 1465 beds and 4260 ambulatory patients per day. Enterprise-wide, dynamic worklist management improves utilization of radiologists and enhances the quality of care across large multi-facility health care delivery organizations. Integration of other workflow-related components remains a significant challenge.
Network image data bank prototype: the RSI project (Resume de Sortie Images)
NASA Astrophysics Data System (ADS)
Abouchi, Nacer; Jourlin, Michel; Bohbot, Oriana; Faurie, Catherine; Grisel, Richard
1995-02-01
The Hospital Edouard Herriot in Lyon and the 3M company, associated with the electronics department of the School of Chemistry, Physics and Electronics (CPE) engineering school, decided in 1993 to begin a study on an image network project. This project is composed of many practical applications to be checked one by one. The purpose of this paper is to discuss the context, which is a kind of small picture archiving and communication system (PACS); to explain the methodology that has been used for the hardware and software implementation; and to give examples of the first results obtained. One of the main interests of the results is the possibility to obtain on the same support, a 3M laser imager film, both images from different modalities and an abstract summing up the patient's stay in the hospital. The framework used is built around Omnis7 and the C++ language on a PC.
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing the combined parametric and waveform archives of the ISC, NEIC, and IRIS with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real-time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
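The detection gain quoted above comes from correlating known-event templates against continuous data. A minimal normalized cross-correlation detector can be sketched as follows; the detection threshold is illustrative, and the production system operates on multi-channel data with calibrated significance levels.

```python
import numpy as np

def normalized_cc(template, trace):
    """Sliding normalized cross-correlation of a template against a
    continuous trace; a minimal sketch of a correlation detector."""
    nt = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(trace) - nt + 1)
    for i in range(len(out)):
        w = trace[i:i + nt]
        w = (w - w.mean()) / w.std()
        out[i] = np.dot(t, w) / nt   # 1.0 = perfect waveform match
    return out

# Declare detections wherever correlation exceeds a threshold; 0.7 is
# an illustrative value, not one from the paper.
# cc = normalized_cc(template_waveform, continuous_data)
# detections = np.where(cc > 0.7)[0]
```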
Instantaneous network RTK in Orange County, California
NASA Astrophysics Data System (ADS)
Bock, Y.
2003-04-01
The Orange County Real Time GPS Network (OCRTN) is an upgrade of a sub-network of SCIGN sites in southern California to low latency (1-2 sec), high-rate (1 Hz) data streaming, analysis, and dissemination. The project is a collaborative effort of the California Spatial Reference Center (CSRC) and the Orange County Public Resource and Facilities Division, with partners from the geophysical community, local and state government, and the private sector. Currently, ten sites are streaming 1 Hz raw data (Ashtech binary MBEN format) by means of dedicated, point-to-point radio modems to a network hub that translates the asynchronous serial data to TCP/IP and onto a PC workstation residing on a local area network. Software residing on the PC allows multiple clients to access the raw data simultaneously through TCP/IP. One of the clients is a Geodetics RTD server that receives and archives (1) the raw 1 Hz network data, (2) estimates of instantaneous positions and zenith tropospheric delays for quality control and detection of ground motion, and (3) RINEX data decimated to 30 seconds. Data recovery is typically 99-100%. The server also produces 1 Hz RTCM data (messages 18, 19, 3 and 22) that are available by means of TCP/IP to RTK clients with wireless Internet modems. Coverage is excellent throughout the county. The server supports standard RTK users and is compatible with existing GPS instrumentation. Typical latency is 1-2 s, with initialization times of several seconds to minutes. OCRTN site spacing is 10-15 km. In addition, the server supports "smart clients" who can retrieve data from the closest n sites (typically 3) and obtain an instantaneous network RTK position with 1-2 s latency. This mode currently requires a PDA running the RTD client software, and a wireless card. Since there is no initialization and re-initialization required, this approach is well suited to support high-precision (centimeter-level) dynamic applications such as intelligent transportation and aircraft landing. We will discuss the results of field tests of this system, indicating that instantaneous network RTK can be performed accurately and reliably. If an Internet connection is available, we will present a real-time demonstration.
Clinical experience with a high-performance ATM-connected DICOM archive for cardiology
NASA Astrophysics Data System (ADS)
Solomon, Harry P.
1997-05-01
A system to archive large image sets, such as cardiac cine runs, with near-real-time response must address several functional and performance issues, including efficient use of a high-performance network connection with standard protocols, an architecture which effectively integrates both short- and long-term mass storage devices, and a flexible data management policy which allows optimization of image distribution and retrieval strategies based on modality and site-specific operational use. Clinical experience with such an archive has allowed evaluation of these systems issues and refinement of a traffic model for cardiac angiography.
Fitzgerald, Louise; Harvey, Gill
2015-08-01
International attention has focussed on the variations between research evidence and practice in healthcare. This prompted the creation of formalized translational networks consisting of academic-service partnerships. The English Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) are one example of a translational network. Using longitudinal, archival case study data from one CLAHRC over a 3-year period (2008-11), this article explores the relationship between organizational form and the function(s) of a translational network. The article focuses on the research gaps concerning the effective structures and appropriate governance needed to support a translational network. Data analysis suggested that the policy of setting up translational networks is insufficient in itself to produce positive translational activity. The data indicate that to leverage the benefits of the whole network, attention must be paid to devising a structure which integrates research production and use and facilitates lateral cross-disciplinary and cross-organizational communication. Equally, appropriate governance arrangements are necessary, particularly in large, multi-stakeholder networks, where shared governance may be questionable. Inappropriate network structure and governance inhibit the potential of the translational network. Finally, the case provides insights into the movement of knowledge within and between network organizations. The data demonstrate that knowledge mobilization extends beyond knowledge translation; knowledge mobilization includes the negotiated utilization of knowledge - a balanced-power form of collaboration. Whilst much translational effort is externally focused on the health system, our findings highlight the essential need for the internal negotiation and mobilization of knowledge within academia. Copyright © 2015 Elsevier Ltd. All rights reserved.
[Management and development of the dangerous preparation archive].
Binetti, Roberto; Longo, Marcello; Scimonelli, Luigia; Costamagna, Francesca
2006-01-01
In 2000, an archive of dangerous preparations was created at the National Health Institute (Istituto Superiore di Sanità), following a principle included in Directive 88/379/EEC on dangerous preparations, subsequently modified by Directive 1999/45/EC, concerning the creation of a data bank on dangerous preparations in each European country. The information stored in the archive is useful for consumer and worker health protection and prevention, particularly in cases of acute poisoning. The archive is fully computerized: companies can submit their information via the web, and authorized Poison Centres can retrieve it via the web. Each Member State has different procedures in place to comply with Directive 1999/45/EC; international coordination could therefore be useful to create a European network of national data banks on dangerous preparations.
[A new concept for integration of image databanks into a comprehensive patient documentation].
Schöll, E; Holm, J; Eggli, S
2001-05-01
Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.
A Geospatial Database that Supports Derivation of Climatological Features of Severe Weather
NASA Astrophysics Data System (ADS)
Phillips, M.; Ansari, S.; Del Greco, S.
2007-12-01
The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:
· NEXRAD Level-III point features describing general storm structure, hail, mesocyclone, and tornado signatures
· the National Weather Service Storm Events Database
· National Weather Service Local Storm Reports collected from storm spotters
· National Weather Service Warnings
· lightning strikes from Vaisala's National Lightning Detection Network (NLDN)
SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS), or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF, and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distribution of the frequency of various events. SRAP can also relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.
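A WFS query of the kind mentioned above follows a standard GetFeature pattern; the sketch below uses the requests library, and the endpoint URL and layer name are placeholders, not NCDC's actual service addresses.

    import requests

    # Hypothetical WFS GetFeature request for hail signatures inside a
    # longitude/latitude bounding box.
    params = {
        "service": "WFS", "version": "1.1.0", "request": "GetFeature",
        "typeName": "swdi:hail",
        "bbox": "-90.0,35.0,-80.0,40.0",      # lon/lat box of interest
    }
    r = requests.get("https://example.gov/swdi/wfs", params=params, timeout=30)
    r.raise_for_status()
    print(r.text[:500])                       # GML features, one per signature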
NASA Astrophysics Data System (ADS)
Wilson, Dennis L.; Glicksman, Robert A.
1994-05-01
A Picture Archiving and Communications System (PACS) must be able to support the image rate of the medical treatment facility. In addition, the PACS must provide the required working storage and archive storage capacity. The calculation of the number of images per minute and of the capacity of working and archive storage is discussed. The calculation takes into account the distribution of images over the different sizes of radiological image, the split between inpatients and outpatients, and the distribution between plain-film CR images and images from other modalities. The indirect clinical image load is difficult to estimate and is considered in some detail. The result of the exercise for a particular hospital is an estimate of the average size of the images and exams on the system, the number of gigabytes of working storage, the number of images moved per minute, the size of the archive in gigabytes, and the number of images to be moved by the archive per minute. The types of storage required to support these image rates and capacities are discussed.
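The arithmetic behind such a sizing exercise is simple to sketch; the exam mix, per-exam sizes, images per exam, and retention periods below are invented placeholders, not figures from the paper.

    # Back-of-the-envelope PACS sizing with a hypothetical exam mix.
    EXAM_MIX = {                    # exams/day, MB per exam
        "CR plain film": (400, 40.0),
        "CT":            (60, 150.0),
        "MR":            (40, 90.0),
        "US":            (80, 25.0),
    }

    exams = sum(n for n, _ in EXAM_MIX.values())
    mb_per_day = sum(n * size for n, size in EXAM_MIX.values())
    images_per_min = exams * 20 / (8 * 60)   # ~20 images/exam, 8-hour day

    print(f"{exams} exams/day, {mb_per_day / 1024:.1f} GB/day")
    print(f"peak load ~ {images_per_min:.0f} images/min")
    print(f"90-day working storage: {mb_per_day * 90 / 1024:.0f} GB")
    print(f"7-year archive: {mb_per_day * 365 * 7 / 1024 / 1024:.1f} TB")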
NASA Astrophysics Data System (ADS)
Long, C. N.; Augustine, J. A.; McComiskey, A. C.
2017-12-01
The NOAA Earth System Research Laboratory (ESRL) Global Monitoring Division (GMD) operates a network of seven surface radiation budget sites (SURFRAD) across the continental United States. The SURFRAD network was established in 1993 with the primary objective of supporting climate research with accurate, continuous, long-term measurements of the surface radiation budget over the United States, and it is a major contributor to the WMO international Baseline Surface Radiation Network. Data from the SURFRAD sites have been used in many studies, including trend analyses of surface solar brightening (Long et al., 2009; Augustine and Dutton, 2013; Gan et al., 2015). These studies have focused mostly on long-term aggregate trends. Here we present results of studies that take a closer look, across the years, at the cloud influence on the surface radiation budget components, partitioned by season and time of day, using derived quantities now available from the SURFRAD data archive produced by the Radiative Flux Analysis value-added processing. The results show distinct differences between the sites' surface radiative energy budgets and cloud radiative effects due to their differing climates and latitudinal locations.
Global Change Data Center: Mission, Organization, Major Activities, and 2003 Highlights
NASA Technical Reports Server (NTRS)
2004-01-01
Rapid, efficient access to Earth sciences data from satellites and ground validation stations is fundamental to the nation's efforts to understand the effects of global environmental changes and their implications for public policy. It becomes a bigger challenge in the future when data volumes increase from current levels to terabytes per day. Demands on data storage, data access, network throughput, processing power, and database and information management are increased by orders of magnitude, while budgets remain constant and even shrink. The Global Change Data Center's (GCDC) mission is to develop and operate data systems, generate science products, and provide archival and distribution services for Earth science data in support of the U.S. Global Change Program and NASA's Earth Sciences Enterprise. The ultimate product of the GCDC activities is access to data to support research, education, and public policy.
A comprehensive cost model for NASA data archiving
NASA Technical Reports Server (NTRS)
Green, J. L.; Klenk, K. F.; Treinish, L. A.
1990-01-01
A simple archive cost model has been developed to help predict NASA's archiving costs. The model covers data management activities from the beginning of the mission through launch, acquisition, and support of retrospective users by the long-term archive; it is capable of determining the life cycle costs for archived data depending on how the data need to be managed to meet user requirements. The model, which currently contains 48 equations with a menu-driven user interface, is available for use on an IBM PC or AT.
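The flavor of such a life-cycle model can be sketched in a few lines; the real model's 48 equations are not reproduced here, so the three cost terms and all rates below are invented placeholders.

    # Toy archive life-cycle cost model in the spirit of the one described.
    def archive_life_cycle_cost(gb_per_year, years,
                                ingest_per_gb=1.0,
                                storage_per_gb_year=0.5,
                                user_support_per_year=50_000.0):
        total = held = 0.0
        for _ in range(years):
            held += gb_per_year
            total += gb_per_year * ingest_per_gb      # acquisition/ingest
            total += held * storage_per_gb_year       # cumulative storage
            total += user_support_per_year            # retrospective users
        return total

    print(f"${archive_life_cycle_cost(500, 10):,.0f} over 10 years")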
New Developments At The Science Archives Of The NASA Exoplanet Science Institute
NASA Astrophysics Data System (ADS)
Berriman, G. Bruce
2018-06-01
The NASA Exoplanet Science Institute (NExScI) at Caltech/IPAC is the science center for NASA's Exoplanet Exploration Program and, as such, NExScI operates three scientific archives: the NASA Exoplanet Archive (NEA), the Exoplanet Follow-up Observation Program website (ExoFOP), and the Keck Observatory Archive (KOA). The NASA Exoplanet Archive supports research and mission planning by the exoplanet community by operating a service that provides confirmed and candidate planets, numerous project and contributed data sets, and integrated analysis tools. The ExoFOP provides an environment for exoplanet observers to share and exchange data, observing notes, and information regarding the Kepler, K2, and TESS candidates. KOA serves all raw science and calibration observations acquired by all active and decommissioned instruments at the W. M. Keck Observatory, as well as reduced data sets contributed by Keck observers. In the coming years, the NExScI archives will support a series of major endeavours allowing flexible, interactive analysis of the data available at the archives. These endeavours exploit a common infrastructure based upon modern interfaces such as JupyterLab and Python. The first service will enable reduction and analysis of precision radial velocity data from the Keck HIRES instrument. The Exoplanet Archive is developing a JupyterLab environment based on the HIRES PRV interactive environment. Additionally, KOA is supporting an Observatory initiative to develop modern, Python-based pipelines, and as part of this work it has delivered a NIRSPEC reduction pipeline. The ensemble of pipelines will be accessible through the same environments.
NASA Astrophysics Data System (ADS)
Martinez, Ralph; Nam, Jiseung
1992-07-01
Picture Archiving and Communication Systems (PACS) integrate digital image formation in a hospital, encompassing various imaging equipment, image viewing workstations, image databases, and a high-speed network. The integration requires a standardization of communication protocols to connect devices from different vendors. The American College of Radiology and National Electrical Manufacturers Association (ACR-NEMA) standard Version 2.0 provides a point-to-point hardware interface, a set of software commands, and a consistent set of data formats for PACS. But it is inadequate for PACS networking environments, because of its point-to-point nature and its inflexibility to accommodate other services and protocols in the future. Based on previous experience with PACS development at The University of Arizona, a new communication protocol for PACS networks and an implementation approach were proposed to ACR-NEMA Working Group VI. The defined PACS protocol is intended to facilitate the development of PACS installations capable of interfacing with other hospital information systems. It is also intended to allow the creation of diagnostic information databases which can be interrogated by a variety of distributed devices. A particularly important goal is to support communications in a multivendor environment. The new protocol specifications are defined primarily as a combination of the International Organization for Standardization/Open Systems Interconnection (ISO/OSI) protocols, TCP/IP protocols, and the data format portion of the ACR-NEMA standard. This paper addresses the specification and implementation of the ISO-based protocol in a PACS prototype. The protocol specification, which covers the Presentation, Session, Transport, and Network layers, is summarized briefly. The protocol implementation is discussed based on our implementation efforts in the UNIX operating system environment. Results of a performance comparison between the ISO and TCP/IP implementations are also presented. The performance analysis was carried out by prototyping PACS on available platforms: MicroVAX II, DECstation, and Sun workstations.
Architecture for a PACS primary diagnosis workstation
NASA Astrophysics Data System (ADS)
Shastri, Kaushal; Moran, Byron
1990-08-01
A major factor in determining the overall utility of a medical Picture Archiving and Communications System (PACS) is the functionality of the diagnostic workstation. Meyer-Ebrecht and Wendler [1] have proposed a modular picture computer architecture with high throughput, and Perry et al. [2] have defined performance requirements for radiology workstations. In order to be clinically useful, a primary diagnosis workstation must not only provide the functions of current viewing systems (e.g., mechanical alternators [3,4]), such as acceptable image quality, simultaneous viewing of multiple images, and rapid switching of image banks, but must also provide a diagnostic advantage over the current systems. This includes window-level functions on any image, simultaneous display of multi-modality images, rapid image manipulation, image processing, dynamic image display (cine), electronic image archival, hardcopy generation, image acquisition, network support, and an easy user interface. Implementation of such a workstation requires an underlying hardware architecture which provides high-speed image transfer channels, local storage facilities, and image processing functions. This paper describes the hardware architecture of the Siemens Diagnostic Reporting Console (DRC), which meets these requirements.
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Duerr, R. E.; Webster, D.; Korn, D.
2010-12-01
Data Stewardship is becoming a principal part of a data manager’s work at NSIDC. It is vitally important that our organization makes a commitment to both current and long-term goals of data management and the preservation of our scientific data. Data must be available to researchers not only during active missions, but long after missions end. This includes maintaining accurate documentation, data tools, and a knowledgeable user support staff. NSIDC is preparing for long-term support of the ICESat mission data. Though ICESat has seen its last operational day, the data is still being improved and NSIDC is scheduled to archive the final release, Release 33, starting late in 2010. This release will include the final adjustments to the processing algorithms and will produce the best possible products to date. Along with the higher-level data sets, all supporting documentation will be archived at NSIDC. For the long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are reproducible. The processing algorithms along with the Level 0 and ancillary products used to create the higher-level products will be archived and made available to users. This can enable users to examine production history, to derive revised products and to create their own products. Also contained in the long-term archive will be pre-launch, calibration/validation, and test data. These data are an important part of the provenance which must be preserved. For longevity, we’ll need to archive the data and documentation in formats that will be supported in the years to come.
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Ciardi, D. R.; Good, J. C.; Laity, A. C.; Zhang, A.
2006-07-01
At ADASS XIV, we described how the W. M. Keck Observatory Archive (KOA) re-uses and extends the component-based architecture of the NASA/IPAC Infrared Science Archive (IRSA) to ingest and serve level 0 observations made with HIRES, the High Resolution Echelle Spectrometer. Since August 18, the KOA has ingested 325 GB of data from 135 nights of observations. The architecture exploits a service layer between the mass storage layer and the user interface. This service layer consists of standalone utilities, called through a simple executive, that perform generic query and retrieval functions such as query generation, database table sub-setting, and return page generation. It has been extended to implement proprietary access to data through deployment of query management middleware developed for the National Virtual Observatory. The MSC archives have recently extended this design to query and retrieve complex data sets describing the properties of potential target stars for the Terrestrial Planet Finder (TPF) missions. The archives can now support knowledge-based retrieval as well as data retrieval. This paper describes how extensions to the IRSA architecture, which is applicable across all wavelengths and astronomical datatypes, support the design and development of the MSC NP archives at modest cost.
Suomi Npp and Jpss Pre-Launch Test Data Collection and Archive
NASA Astrophysics Data System (ADS)
Denning, M.; Ullman, R.; Guenther, B.; Kilcoyne, H.; Chandler, C.; Adameck, J.
2012-12-01
During the development of each Suomi National Polar-orbiting Partnership (Suomi NPP) instrument, significant testing was performed, both in ambient and simulated orbital (thermal-vacuum) conditions, at the instrument factory, and again after integration with the spacecraft. The NPOESS Integrated Program Office (IPO), and later the NASA Joint Polar Satellite System (JPSS) Program Office, defined two primary objectives with respect to capturing instrument and spacecraft test data during these test events. The first objective was to disseminate test data and auxiliary documentation to an often distributed network of scientists to permit timely production of independent assessments of instrument performance, calibration, data quality, and test progress. The second goal was to preserve the data and documentation in a catalogued government archive for the life of the mission, to aid in the resolution of anomalies and to facilitate the comparison of on-orbit instrument operating characteristics to those observed prior to launch. In order to meet these objectives, Suomi NPP pre-launch test data collection, distribution, processing, and archive methods included adaptable support infrastructures to quickly and completely transfer test data and documentation from the instrument and spacecraft factories to sensor scientist teams on-site at the factory and around the country. These methods were unique, effective, and low in cost. These efforts supporting pre-launch instrument calibration permitted timely data quality assessments and technical feedback from contributing organizations within government, academia, and industry, and were critical in supporting timely sensor development. Second, in parallel with data distribution to the sensor science teams, pre-launch test data were transferred and ingested into the central Suomi NPP calibration and validation (cal/val) system, known as the Government Resource for Algorithm Verification, Independent Testing, and Evaluation (GRAVITE), where they will reside for the life of the mission. As a result, data and documentation are available for query, analysis, and download by the cal/val community via the command-line GRAVITE Transfer Protocol (GTP) tool or via the NOAA-collaborative website "CasaNOSA". Instrument and spacecraft test data, telemetry, and ground support equipment information were collected and organized with detailed test procedures, logs, analyses, characterizations, and reports. This 45-Terabyte archive facilitates the comparison of on-orbit Suomi NPP operating characteristics with those observed prior to launch, and will serve as a resource to aid in the assessment of pre-launch JPSS-1 sensor performance. In summary, this paper presents the innovative pre-launch test data campaign infrastructures employed for Suomi NPP and planned for JPSS-1.
NASA Technical Reports Server (NTRS)
Singley, P. T.; Bell, J. D.; Daugherty, P. F.; Hubbs, C. A.; Tuggle, J. G.
1993-01-01
This paper will discuss user interface development and the structure and use of metadata for the Atmospheric Radiation Measurement (ARM) Archive. The ARM Archive, located at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, is the data repository for the U.S. Department of Energy's (DOE's) ARM Project. After a short description of the ARM Project and the ARM Archive's role, we will consider the philosophy and goals, constraints, and prototype implementation of the user interface for the archive. We will also describe the metadata that are stored at the archive and support the user interface.
Cardio-PACs: a new opportunity
NASA Astrophysics Data System (ADS)
Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary
2000-05-01
It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications: acquisition at 30 frames/s; replay at 15 frames/s; access to the file server within 5 seconds and to the archive within 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for the catheterization laboratory, file server, and long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.
Software For Monitoring A Computer Network
NASA Technical Reports Server (NTRS)
Lee, Young H.
1992-01-01
SNMAT is a rule-based expert-system computer program designed to assist personnel in monitoring the status of a computer network and in identifying defective computers, workstations, and other components of the network. It also assists in training network operators. The network for which SNMAT was developed is located at the Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. SNMAT is intended to serve as a data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT is expected to be adaptable to other computer networks, for example in the management of repair, maintenance, and security, or in the administration of planning systems, billing systems, or archives.
NASA Astrophysics Data System (ADS)
Nass, A.; D'Amore, M.; Helbert, J.
2018-04-01
Based on recent discussions within information science and management, an archiving structure and a reference level for derived, already-published data significantly support the scientific community through a steady growth of knowledge and understanding.
High Performance Computing and Networking for Science--Background Paper.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal supercomputer initiatives…
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control
NASA Astrophysics Data System (ADS)
He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control are the infrastructure of AstroCloud. Throughout the entire data life cycle, the archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.
Building a COTS archive for satellite data
NASA Technical Reports Server (NTRS)
Singer, Ken; Terril, Dave; Kelly, Jack; Nichols, Cathy
1994-01-01
The goal of the NOAA/NESDIS Active Archive was to provide a method of access to an online archive of satellite data. The archive had to manage and store the data, let users interrogate the archive, and allow users to retrieve data from the archive. Practical issues of the system design, such as implementation time, cost, and operational support, were examined in addition to the technical issues. There was a fixed window of opportunity to create an operational system, along with budget and staffing constraints. Therefore, the technical solution had to be designed and implemented subject to the constraints imposed by the practical issues. The NOAA/NESDIS Active Archive came online in July of 1994, meeting all of its original objectives.
CD-based image archival and management on a hybrid radiology intranet.
Cox, R D; Henri, C J; Bret, P M
1997-08-01
This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY), and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scalable, and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage, and retrieval. Image data are made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT), and ultrasonography.
Turning Archival Tapes into an Online “Cardless” Catalog
Zuckerman, Alan E.; Ewens, Wilma A.; Cannard, Bonnie G.; Broering, Naomi C.
1982-01-01
Georgetown University has created an online card catalog based on machine readable cataloging records (MARC) loaded from archival tapes or online via the OCLC network. The system is programmed in MUMPS and uses the medical subject headings (MeSH) authority file created by the National Library of Medicine. The online catalog may be searched directly by library users and has eliminated the need for manual filing of catalog cards.
Optical Properties of Aerosols from Long Term Ground-Based Aeronet Measurements
NASA Technical Reports Server (NTRS)
Holben, B. N.; Tanre, D.; Smirnov, A.; Eck, T. F.; Slutsker, I.; Dubovik, O.; Lavenu, F.; Abuhassen, N.; Chatenet, B.
1999-01-01
AERONET is an optical ground-based aerosol monitoring network and data archive supported by NASA's Earth Observing System and expanded by federation with many non-NASA institutions, including AEROCAN (AERONET CANada) and PHOTON (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire). The network hardware consists of identical automatic sun-sky scanning spectral radiometers owned by national agencies and universities, purchased for their own monitoring and research objectives. Data are transmitted hourly through the data collection system (DCS) on board the geostationary meteorological satellites GMS, GOES, and METEOSAT and received in a common archive for daily processing utilizing a peer-reviewed series of algorithms, thus imposing standardization and quality control on the product database. Data from this collaboration provide globally distributed near-real-time observations of aerosol spectral optical depths, aerosol size distributions, and precipitable water in diverse aerosol regimes. Access to the AERONET database has shifted from the interactive program 'demonstrat' (reserved for PIs) to the AERONET homepage, allowing faster access and greater development for GIS object-oriented retrievals and analysis with companion geocoded data sets from satellites, LIDAR, and solar flux measurements, for example. We feel that a significant yet underutilized component of the AERONET database is the inversion products made from hourly principal plane and almucantar measurements. The current inversions have been shown to retrieve aerosol volume size distributions. A significant enhancement to the inversion code has been developed and is presented in these proceedings.
NASA GES DISC support of CO2 Data from OCO-2, ACOS, and AIRS
NASA Technical Reports Server (NTRS)
Wei, Jennifer C; Vollmer, Bruce E.; Savtchenko, Andrey K.; Hearty, Thomas J; Albayrak, Rustem Arif; Deshong, Barbara E.
2013-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the data center assigned to archive and distribute current AIRS and ACOS data, along with data from the upcoming OCO-2 mission. The GES DISC archives and supports data containing information on CO2 as well as other atmospheric composition, atmospheric dynamics, modeling, and precipitation data. Along with this data stewardship, an important mission of the GES DISC is to facilitate access to the data, enhance their usability, and broaden the user base. The GES DISC strives to promote awareness of the science content and novelty of the data by working with Science Team members and releasing news articles as appropriate. Analyses of events that are of interest to the general public, and that help in understanding the goals of NASA Earth Observing missions, have been among the most popular practices. Users have unrestricted access to a user-friendly search interface, Mirador, that allows temporal, spatial, keyword, and event searches, as well as an ontology-driven drill-down. Variable subsetting, format conversion, quality screening, and quick browse are among the services available in Mirador. The majority of the GES DISC data are also accessible through OPeNDAP (Open-source Project for a Network Data Access Protocol) and WMS (Web Map Service). These services add more options for specialized subsetting, format conversion, and image viewing, and contribute to data interoperability.
What to do with a Dead Research Code
NASA Astrophysics Data System (ADS)
Nemiroff, Robert J.
2016-01-01
The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states to the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of the following: lost, impossible to compile and run, difficult to decipher, and deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.
MIRAGE: The data acquisition, analysis, and display system
NASA Technical Reports Server (NTRS)
Rosser, Robert S.; Rahman, Hasan H.
1993-01-01
Developed for the NASA Johnson Space Center and Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definitions, experiment parameter formats, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. The generic capabilities, adaptability, and ease of use make MIRAGE a cost-effective solution to many experimental data processing requirements.
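The meta-parameter mechanism can be illustrated with a short sketch: derived values are computed by applying mathematical and logical operators to raw packet parameters, as the abstract describes. The parameter names and formulas below are invented for the example; the real definitions would come from MIRAGE's spreadsheet database.

    import math

    packet = {"heart_rate": 62.0, "resp_rate": 14.0, "temp_c": 36.8}

    # Each meta-parameter is an operator expression over packet fields.
    META = {
        "hr_resp_ratio": lambda p: p["heart_rate"] / p["resp_rate"],
        "fever_flag":    lambda p: p["temp_c"] > 38.0,     # logical operator
        "log_hr":        lambda p: math.log(p["heart_rate"]),
    }

    derived = {name: fn(packet) for name, fn in META.items()}
    print(derived)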
Investigation of air transportation technology at Princeton University, 1991-1992
NASA Technical Reports Server (NTRS)
Stengel, Robert F.
1993-01-01
The Air Transportation Research Program at Princeton University proceeded along six avenues during the past year: (1) intelligent flight control; (2) computer-aided control system design; (3) neural networks for flight control; (4) stochastic robustness of flight control systems; (5) microburst hazards to aircraft; and (6) fundamental dynamics of atmospheric flight. This research has resulted in a number of publications, including archival papers and conference papers. An annotated bibliography of publications that appeared between June 1991 and June 1992 appears at the end of this report. The research that these papers describe was supported in whole or in part by the Joint University Program, including work that was completed prior to the reporting period.
gkmSVM: an R package for gapped-kmer SVM
Ghandi, Mahmoud; Mohammad-Noori, Morteza; Ghareghani, Narges; Lee, Dongwon; Garraway, Levi; Beer, Michael A.
2016-01-01
Summary: We present a new R package for training gapped-kmer SVM classifiers for DNA and protein sequences. We describe an improved algorithm for kernel matrix calculation that speeds run time by about 2- to 5-fold over our original gkmSVM algorithm. This package supports several sequence kernels, including gkmSVM, kmer-SVM, the mismatch kernel, and the wildcard kernel. Availability and Implementation: The gkmSVM package is freely available through the Comprehensive R Archive Network (CRAN), for Linux, Mac OS, and Windows platforms. The C++ implementation is available at www.beerlab.org/gkmsvm Contact: mghandi@gmail.com or mbeer@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153639
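To make the feature space concrete: a gapped k-mer treats a length-l sequence window with k informative positions and l-k wildcards as a feature. The sketch below (written in Python rather than the package's R/C++ interface) counts such features naively; it illustrates the representation only, not the package's optimized kernel computation.

    from itertools import combinations

    def gapped_kmer_features(seq, l=4, k=3):
        # Count every (positions, letters) combination of k informative
        # positions within each length-l window of the sequence.
        counts = {}
        for i in range(len(seq) - l + 1):
            window = seq[i:i + l]
            for keep in combinations(range(l), k):
                key = tuple((p, window[p]) for p in keep)
                counts[key] = counts.get(key, 0) + 1
        return counts

    print(len(gapped_kmer_features("ACGTACGT")), "distinct gapped 3-of-4-mers")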
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1990-01-01
Archival reports are given on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA), including space communications, radio navigation, radio science, ground-based radio and radar astronomy, and the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations. Also included is TDA-funded activity at JPL on data and information systems and reimbursable DSN work performed for other space agencies through NASA. In the search for extraterrestrial intelligence (SETI), implementation and operations for searching the microwave spectrum are reported. Use of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets are discussed.
The Hydrologic Cycle Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Hardin, Danny M.; Goodman, H. Michael
1995-01-01
The Marshall Space Flight Center Distributed Active Archive Center in Huntsville, Alabama supports the acquisition, production, archival, and dissemination of data relevant to the study of the global hydrologic cycle. This paper describes the Hydrologic Cycle DAAC, surveys its principal data holdings, addresses future growth, and gives information for accessing the data sets.
Nuclear Physics Science Network Requirements Workshop, May 2008 - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tierney, Brian L., Ed.; Dart, Eli, Ed.; Carlson, Rich
2008-11-10
The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States of America. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In May 2008, ESnet and the Nuclear Physics (NP) Program Office of the DOE Office of Science organized a workshop to characterize the networking requirements of the science programs funded by the NP Program Office. Most of the key DOE sites for NP-related work will require significant increases in network bandwidth in the 5-year time frame. This includes roughly 40 Gbps for BNL and 20 Gbps for NERSC. Total transatlantic requirements are on the order of 40 Gbps, and transpacific requirements are on the order of 30 Gbps. Other key sites are Vanderbilt University and MIT, which will need on the order of 20 Gbps bandwidth to support data transfers for the CMS Heavy Ion program. In addition to bandwidth requirements, the workshop emphasized several points in regard to science process and collaboration. One key point is the heavy reliance on Grid tools and infrastructure (both PKI and tools such as GridFTP) by the NP community. The reliance on Grid software is expected to increase in the future. Therefore, continued development and support of Grid software is very important to the NP science community. Another key finding is that scientific productivity is greatly enhanced by easy researcher-local access to instrument data. This is driving the creation of distributed repositories for instrument data at collaborating institutions, along with a corresponding increase in demand for network-based data transfers and the tools to manage those transfers effectively. Network reliability is also becoming more important, as there is often a narrow window between data collection and data archiving when transfer and analysis can be done. The instruments do not stop producing data, so extended network outages can result in data loss due to analysis pipeline stalls. Finally, as the scope of collaboration continues to increase, collaboration tools such as audio and video conferencing are becoming ever more critical to the productivity of scientific collaborations.
NASA Technical Reports Server (NTRS)
Zibordi, Giuseppe; Holben, Brent; Slutsker, Ilya; Giles, David; D'Alimonte, Davide; Melin, Frederic; Berthon, Jean-Francois; Vandemark, Doug; Feng, Hui; Schuster, Gregory;
2008-01-01
The Ocean Color component of the Aerosol Robotic Network (AERONET-OC) has been implemented to support long-term satellite ocean color investigations through cross-site consistent and accurate measurements collected by autonomous radiometer systems deployed on offshore fixed platforms. The ultimate purpose of AERONET-OC is the production of standardized measurements performed at different sites with identical measuring systems and protocols, calibrated using a single reference source and method, and processed with the same code. The AERONET-OC primary data product is the normalized water-leaving radiance determined at center wavelengths of interest for satellite ocean color applications, with an uncertainty lower than 5% in the blue-green spectral regions and higher than 8% in the red. Measurements collected at six sites, including the northern Adriatic Sea, the Baltic Proper, the Gulf of Finland, the Persian Gulf, and the northern and southern margins of the Middle Atlantic Bight, have shown the capability of producing quality-assured data over a wide range of bio-optical conditions, including Case-2 yellow-substance- and sediment-dominated waters. This work briefly introduces network elements such as deployment sites, measurement method, instrument calibration, processing scheme, quality assurance, uncertainties, data archive, and product accessibility. Emphasis is given to those elements which underline the network's strengths (i.e., mostly the standardization of every network element) and its weaknesses (i.e., the use of consolidated but old-fashioned technology). The work also addresses the application of AERONET-OC data to the validation of primary satellite radiometric products over a variety of complex coastal waters and finally provides elements for the identification of new deployment sites most suitable to support satellite ocean color missions.
Huang, H K; Wong, A W; Zhu, X
1997-01-01
Asynchronous transfer mode (ATM) technology has emerged as a leading candidate for medical image transmission in both local area network (LAN) and wide area network (WAN) applications. This paper describes the performance of an ATM LAN and WAN network at the University of California, San Francisco. The measurements were obtained using an intensive care unit (ICU) server connected to four image workstations (WS) at four different locations of a hospital-integrated picture archiving and communication system (HI-PACS) in a regular daily clinical environment. Four types of performance were evaluated: magnetic disk-to-disk, disk-to-redundant array of inexpensive disks (RAID), RAID-to-memory, and memory-to-memory. Results demonstrate that the transmission rate between two workstations can reach 5-6 Mbytes/s from RAID to memory, and 8-10 Mbytes/s from memory to memory. When the server has to send images to all four workstations simultaneously, the transmission rate to each WS is about 4 Mbytes/s. Both situations are adequate for radiologic image communication in picture archiving and communication systems (PACS) and teleradiology applications.
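A loopback throughput probe gives a feel for the memory-to-memory measurement described above; the payload size, port, and timing method below are illustrative choices, not the paper's instrumentation.

    import socket, threading, time

    PAYLOAD = b"\0" * (8 * 1024 * 1024)          # 8 MB of in-memory data

    def sink(port=5001):
        # Accept one connection and drain it until the sender closes.
        srv = socket.socket()
        srv.bind(("127.0.0.1", port)); srv.listen(1)
        conn, _ = srv.accept()
        while conn.recv(1 << 16):
            pass
        conn.close(); srv.close()

    threading.Thread(target=sink, daemon=True).start()
    time.sleep(0.2)                              # let the listener come up
    conn = socket.create_connection(("127.0.0.1", 5001))
    t0 = time.time()
    conn.sendall(PAYLOAD)
    conn.close()                                 # flush before timing stops
    print(f"{len(PAYLOAD) / (time.time() - t0) / 1e6:.0f} MB/s on loopback")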
The NASA Astrophysics Data System joins the Revolution
NASA Astrophysics Data System (ADS)
Accomazzi, Alberto; Kurtz, Michael J.; Henneken, Edwin; Grant, Carolyn S.; Thompson, Donna M.; Chyla, Roman; Holachek, Alexandra; Sudilovsky, Vladimir; Elliott, Jonathan; Murray, Stephen S.
2015-08-01
Whether or not scholarly publications are going through an evolution or a revolution, one comforting certainty remains: the NASA Astrophysics Data System (ADS) is here to help the working astronomer and librarian navigate the increasingly complex communication environment we find ourselves in. Born as a bibliographic database, today's ADS is best described as an "aggregator" of scholarly resources relevant to the needs of researchers in astronomy and physics. In addition to indexing content from a variety of publishers and data and software archives, the ADS enriches its records by text-mining and indexing the full-text articles, enriching its metadata through the extraction of citations and acknowledgments, and ingesting bibliographies and data links maintained by astronomy institutions and data archives. In addition, ADS generates and maintains citation and co-readership networks to support discovery and bibliometric analysis. In this talk I will summarize new and ongoing curation activities and technology developments of the ADS in the face of the ever-changing world of scholarly publishing and the trends in information-sharing behavior of astronomers. Recent curation efforts include the indexing of non-standard scholarly content (such as software packages, IVOA documents and standards, and NASA award proposals); the indexing of additional content (full text of articles, acknowledgments, affiliations, ORCID ids); and enhanced support for bibliographic groups and data links. Recent technology developments include a new Application Programming Interface which provides access to a variety of ADS microservices, a new user interface featuring a variety of visualizations and bibliometric analyses, and integration with ORCID services to support paper claiming.
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore, the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.
Lear, J L; Pratt, J P; Trujillo, N
1996-02-01
While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer, and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
Recommendations resulting from the SPDS Community-Wide Workshop
NASA Technical Reports Server (NTRS)
1993-01-01
The Data Systems Panel identified three critical functionalities of a Space Physics Data System (SPDS): the delivery of self-documenting data, the existence of a matrix of translators between various standard formats (IDFS, CDF, netCDF, HDF, TENNIS, UCLA flat file, and FITS), and a network-based capability for browsing and examining inventory records for the system's data holdings. The recommendations resulting from the workshop cover the philosophy, funding, and objectives of an SPDS. Access to quality data is seen as the most important objective by the Policy Panel, with curation and information about the data being integral parts of any accessible data set. The Data Issues Panel concluded that the SPDS can supply encouragement and guidelines, and ultimately provide a mechanism for financial support, for data archiving, restoration, and curation. The Software Panel of the SPDS focused on defining the requirements and priorities for SPDS to support common data analysis and data visualization tools and packages.
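The "matrix of translators" idea can be sketched as a registry of pairwise converters with a hub format as a fallback path; the converter stubs below are placeholders, since a real translator would rewrite the underlying files rather than pass objects through.

    # Registry of direct format-to-format converters (stubs for illustration).
    TRANSLATORS = {
        ("CDF", "netCDF"):  lambda d: d,
        ("netCDF", "HDF"):  lambda d: d,
        ("FITS", "netCDF"): lambda d: d,
    }

    def translate(data, src, dst):
        # Use a direct converter when one is registered; otherwise chain
        # through netCDF as a hub format (an illustrative design choice).
        if src == dst:
            return data
        if (src, dst) in TRANSLATORS:
            return TRANSLATORS[(src, dst)](data)
        return TRANSLATORS[("netCDF", dst)](TRANSLATORS[(src, "netCDF")](data))

    translate({}, "FITS", "HDF")   # resolves as FITS -> netCDF -> HDF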
A PRACTICAL ONTOLOGY FOR THE LARGE-SCALE MODELING OF SCHOLARLY ARTIFACTS AND THEIR USAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
RODRIGUEZ, MARKO A.; BOLLEN, JOHAN; VAN DE SOMPEL, HERBERT
2007-01-30
The large-scale analysis of scholarly artifact usage is constrained primarily by current practices in usage data archiving, privacy issues concerned with the dissemination of usage data, and the lack of a practical ontology for modeling the usage domain. As a remedy to the third constraint, this article presents a scholarly ontology that was engineered to represent those classes for which large-scale bibliographic and usage data exist, that supports usage research, and whose instantiation is scalable to the order of 50 million articles along with their associated artifacts (e.g., authors and journals) and an accompanying 1 billion usage events. The real-world instantiation of the presented abstract ontology is a semantic network model of the scholarly community which lends the scholarly process to statistical analysis and computational support. The authors present the ontology, discuss its instantiation, and provide some example inference rules for calculating various scholarly artifact metrics.
NASA Astrophysics Data System (ADS)
Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean
2004-04-01
An Application Service Provider (ASP) archive model for disaster recovery of Saint John's Health Center (SJHC) clinical PACS data has been implemented using a fault-tolerant archive server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to report clinical experience with an ASP-model backup archive used in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) the University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily over a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test network connectivity and data integrity on a regular basis. In a disaster scenario where the local PACS archive has been destroyed and the patients must be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be used to route images to the second hospital's PACS, where they can be reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment at all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments, as well as the wireless performance, are discussed. A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to query PACS images using a wireless handheld device from the ASP backup archive at IPIL and route the images directly to the second clinical site at UCLA, where they and the patients were located at the time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows radiologists to use a wireless handheld device to control the image workflow and to review PACS images during a major disaster event in which patients must be moved to a secondary site.
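The query-and-route step of this scenario corresponds to a standard DICOM C-MOVE: the client asks the backup archive to push matching studies to a destination PACS identified by AE title. The sketch below uses the pynetdicom library as an assumption (the paper does not name its toolkit), and all hostnames, ports, patient IDs, and AE titles are invented.

    from pydicom.dataset import Dataset
    from pynetdicom import AE
    from pynetdicom.sop_class import PatientRootQueryRetrieveInformationModelMove

    ae = AE(ae_title="PDA_CLIENT")
    ae.add_requested_context(PatientRootQueryRetrieveInformationModelMove)
    assoc = ae.associate("asp-backup.example.org", 11112, ae_title="ASP_ARCHIVE")

    query = Dataset()
    query.QueryRetrieveLevel = "PATIENT"
    query.PatientID = "SJHC-000123"            # placeholder identifier

    if assoc.is_established:
        # C-MOVE instructs the archive to send matches to "UCLA_PACS".
        for status, _ in assoc.send_c_move(
                query, "UCLA_PACS",
                PatientRootQueryRetrieveInformationModelMove):
            print("status:", status)
        assoc.release()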
The NASA Exoplanet Science Institute Archives: KOA and NStED
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Ciardi, D.; Abajian, M.; Barlow, T.; Bryden, G.; von Braun, K.; Good, J.; Kane, S.; Kong, M.; Laity, A.; Lynn, M.; Elroy, D. M.; Plavchan, P.; Ramirez, S.; Schmitz, M.; Stauffer, J.; Wyatt, P.; Zhang, A.; Goodrich, R.; Mader, J.; Tran, H.; Tsubota, M.; Beekley, A.; Berukoff, S.; Chan, B.; Lau, C.; Regelson, M.; Saucedo, M.; Swain, M.
2010-12-01
The NASA Exoplanet Science Institute (NExScI) maintains a series of archival services in support of NASA's planet finding and characterization goals. Two of the larger archival services at NExScI are the Keck Observatory Archive (KOA) and the NASA Star and Exoplanet Database (NStED). KOA, a collaboration between the W. M. Keck Observatory and NExScI, serves raw data from the High Resolution Echelle Spectrograph (HIRES) and extracted spectral browse products. As of June 2009, KOA hosts over 28 million files (4.7 TB) from over 2,000 nights. In Spring 2010, it will begin to serve data from the Near-Infrared Echelle Spectrograph (NIRSPEC). NStED is a general purpose archive with the aim of providing support for NASA's planet finding and characterization goals, and for stellar astrophysics. There are two principal components of NStED: a database of (currently) all known exoplanets and associated images, and an archive dedicated to high precision photometric surveys for transiting exoplanets. NStED is the US portal to the CNES mission CoRoT, the first space mission dedicated to the discovery and characterization of exoplanets. These archives share a common software and hardware architecture with the NASA/IPAC Infrared Science Archive (IRSA). The software architecture consists of standalone utilities that perform generic query and retrieval functions; they are called through program interfaces and plugged together to form applications through a simple executive library.
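The "executive library" design in the last sentence can be pictured as a tiny driver that feeds each standalone utility's output into the next. A minimal sketch with hypothetical stand-in utilities (not NExScI's actual code):

    def query_metadata(params):
        # Stand-in for a generic query utility: return matching file records.
        return [{"file": f"spec_{n}.fits", "night": params["night"]} for n in range(3)]

    def retrieve_files(records):
        # Stand-in for a generic retrieval utility: fetch each listed file.
        return [f"retrieved {r['file']}" for r in records]

    def executive(pipeline, initial):
        """Run utilities in order, feeding each one's output forward."""
        result = initial
        for step in pipeline:
            result = step(result)
        return result

    print(executive([query_metadata, retrieve_files], {"night": "2009-06-01"}))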
The design and implementation of the HY-1B Product Archive System
NASA Astrophysics Data System (ADS)
Liu, Shibin; Liu, Wei; Peng, Hailong
2010-11-01
Product Archive System (PAS), as a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of up-to-date methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, in order to give the system several desirable characteristics, such as ease of maintenance, stability, and minimal complexity. This paper describes seven major components of the PAS (Network Communicator module, File Collector module, File Copy module, Task Collector module, Metadata Extractor module, Product data Archive module, Metadata catalogue import module) and some of the unique features of the system, as well as the technical problems encountered and resolved.
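As a rough illustration of the "flexible configuration files" idea (the section names and keys below are hypothetical, not the actual PAS configuration), module behaviour can be driven by a config file so that paths and polling intervals change without code changes:

    import configparser, io, textwrap

    config_text = textwrap.dedent("""\
        [FileCollector]
        watch_dir = /data/hy1b/incoming
        poll_seconds = 60

        [MetadataExtractor]
        formats = HDF4, HDF5
        """)

    config = configparser.ConfigParser()
    config.read_file(io.StringIO(config_text))

    watch_dir = config["FileCollector"]["watch_dir"]
    poll = config.getint("FileCollector", "poll_seconds")
    formats = [f.strip() for f in config["MetadataExtractor"]["formats"].split(",")]
    print(watch_dir, poll, formats)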
Image dissemination and archiving.
Robertson, Ian
2007-08-01
Images generated as part of the sonographic examination are an integral part of the medical record and must be retained according to local regulations. The standard medical image format, known as DICOM (Digital Imaging and COmmunications in Medicine), makes it possible for images from many different imaging modalities, including ultrasound, to be distributed via a standard internet network to distant viewing workstations and a central archive in an almost seamless fashion. The DICOM standard is a truly universal standard for the dissemination of medical images. When purchasing an ultrasound unit, the consumer should research the unit's capacity to generate images in a DICOM format, especially if one wishes interconnectivity with viewing workstations and an image archive that stores other medical images. PACS, an acronym for Picture Archive and Communication System, refers to the infrastructure that links modalities, workstations, the image archive, and the medical record information system into an integrated system, allowing for efficient electronic distribution and storage of medical images and access to medical record data.
New Archiving Distributed InfrastructuRe (NADIR): Status and Evolution
NASA Astrophysics Data System (ADS)
De Marco, M.; Knapic, C.; Smareglia, R.
2015-09-01
The New Archiving Distributed InfrastructuRe (NADIR) has been developed at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) as an evolution of the previous archiving and distribution system used on several telescopes (LBT, TNG, Asiago, etc.), to improve performance, efficiency and reliability. At present, the NADIR system is running on the LBT telescope and on Vespa (the Italian telescope network for outreach; Ramella et al. 2014), and will be used on the TNG, Asiago, and IRA (Istituto Radio Astronomia) archives of the Medicina, Noto and SRT radio telescopes (Zanichelli et al. 2014) once the data models for radio data are ready. This paper will discuss the progress status, the architectural choices and the solutions adopted during the development and the commissioning phase of the project. Special attention will be given to the LBT case, due to some critical aspects of the data flow and of compliance with the policies and standards adopted by the LBT organization.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; O'Brien, M.; Costa, D.
2013-12-01
Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data type errors occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by its 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and is used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning', or 'error' response; only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deployment into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
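A minimal sketch of the conditional-check pattern described above, in Python rather than Java (the check names and template keys are illustrative, not the Data Manager Library's actual ones): each check returns 'valid', 'warning' or 'error', a template turns checks on or off, results go into an XML report, and any 'error' blocks the upload.

    import xml.etree.ElementTree as ET

    def check_title_present(metadata):
        return "valid" if metadata.get("title") else "error"

    def check_units_defined(metadata):
        return "valid" if metadata.get("units") else "warning"

    checks = {"titlePresent": check_title_present,
              "unitsDefined": check_units_defined}
    template = {"titlePresent": True, "unitsDefined": True}  # on/off switches

    metadata = {"title": "Soil moisture, site 12"}  # no units -> warning

    report = ET.Element("qualityReport")
    blocked = False
    for name, enabled in template.items():
        if not enabled:
            continue
        result = checks[name](metadata)
        ET.SubElement(report, "check", name=name, status=result)
        blocked = blocked or result == "error"

    print(ET.tostring(report, encoding="unicode"), "blocked:", blocked)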
The amino acid's backup bone - storage solutions for proteomics facilities.
Meckel, Hagen; Stephan, Christian; Bunse, Christian; Krafzik, Michael; Reher, Christopher; Kohl, Michael; Meyer, Helmut Erich; Eisenacher, Martin
2014-01-01
Proteomics methods, especially high-throughput mass spectrometry analysis, have been continually developed and improved over the years. The analysis of complex biological samples produces large volumes of raw data. Data storage and recovery management pose substantial challenges to biomedical or proteomics facilities regarding backup and archiving concepts as well as hardware requirements. In this article we describe the differences between the terms backup and archive with regard to manual and automatic approaches. We also introduce different storage concepts and technologies, from transportable media to professional solutions such as redundant array of independent disks (RAID) systems, network attached storage (NAS) and storage area networks (SAN). Moreover, we present a software solution, which we developed for the purpose of long-term preservation of large mass spectrometry raw data files on an object storage device (OSD) archiving system. Finally, advantages, disadvantages, and experiences from routine operation of the presented concepts and technologies are evaluated and discussed. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013. Published by Elsevier B.V.
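One practice that separates an archive from a rolling backup is fixity: record a checksum when a raw file is ingested, then re-verify it during audits to catch silent corruption. A minimal sketch of that idea (our illustration, not the authors' OSD software):

    import hashlib

    def sha256_of(path, chunk_size=1 << 20):
        """Stream a large raw file through SHA-256 in 1 MiB chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # On ingest, store the digest alongside the archived file; on audit,
    # recompute and compare.
    def verify(path, recorded_digest):
        return sha256_of(path) == recorded_digest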
A Report on Education and Training in the International Council on Archives' Africa Programme
ERIC Educational Resources Information Center
Lowry, James
2017-01-01
In 2015, the International Council on Archives launched its Africa Programme (2015-2020) in order to coordinate its support for African archives and archivists. The Programme is focused on two strategic priorities: advocacy and education and training. This article examines the education and training component of the Programme. It begins by…
The Protein Data Bank: unifying the archive
Westbrook, John; Feng, Zukang; Jain, Shri; Bhat, T. N.; Thanki, Narmada; Ravichandran, Veerasamy; Gilliland, Gary L.; Bluhm, Wolfgang F.; Weissig, Helge; Greer, Douglas S.; Bourne, Philip E.; Berman, Helen M.
2002-01-01
The Protein Data Bank (PDB; http://www.pdb.org/) is the single worldwide archive of structural data of biological macromolecules. This paper describes the progress that has been made in validating all data in the PDB archive and in releasing a uniform archive for the community. We have now produced a collection of mmCIF data files for the PDB archive (ftp://beta.rcsb.org/pub/pdb/uniformity/data/mmCIF/). A utility application that converts the mmCIF data files to the PDB format (called CIFTr) has also been released to provide support for existing software. PMID:11752306
ACE: A distributed system to manage large data archives
NASA Technical Reports Server (NTRS)
Daily, Mike I.; Allen, Frank W.
1993-01-01
Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must comprehend the significant numbers of large, complex data objects which the industry generates. The life cycle of the data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. In order to implement a cost-effective system which provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements in all four phases. A set of technical issues which may differ in each phase must be addressed to ensure an overall successful development strategy. The technical issues include standardized data formats and media for data acquisition, data management during processing, plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution which can give Mobil a competitive edge as well as a viable technical solution.
NASA Astrophysics Data System (ADS)
Tarasova, O. A.; Jalkanen, L.
2010-12-01
The WMO Global Atmosphere Watch (GAW) Programme is the only existing long-term international global programme providing an international coordinated framework for observations and analysis of the chemical composition of the atmosphere. GAW is a partnership involving contributors from about 80 countries. It includes a coordinated global network of observing stations along with supporting facilities (Central Facilities) and expert groups (Scientific Advisory Groups, SAGs and Expert Teams, ETs). Currently GAW coordinates activities and data from 27 Global Stations and a substantial number of Regional and Contributing Stations. Station information is available through the GAW Station Information System GAWSIS (http://gaw.empa.ch/gawsis/). There are six key groups of variables which are addressed by the GAW Programme, namely: ozone, reactive gases, greenhouse gases, aerosols, UV radiation and precipitation chemistry. GAW works to implement integrated observations unifying measurements from different platforms (ground based in situ and remote, balloons, aircraft and satellite) supported by modeling activities. GAW provides data for ozone assessments, Greenhouse Gas Bulletins, Ozone Bulletins and precipitation chemistry assessments published on a regular basis and for early warnings of changes in the chemical composition and related physical characteristics of the atmosphere. To ensure that observations can be used for global assessments, the GAW Programme has developed a Quality Assurance system. Five types of Central Facilities dedicated to the six groups of measurement variables are operated by WMO Members and form the basis of quality assurance and data archiving for the GAW global monitoring network. They include Central Calibration Laboratories (CCLs) that host primary standards (PS), Quality Assurance/Science Activity Centres (QA/SACs), World Calibration Centers (WCCs), Regional Calibration Centers (RCCs), and World Data Centers (WDCs) with responsibility for archiving and access to GAW data. Education, training, workshops, comparison campaigns, station audits/visits and twinning are also provided to build capacities in atmospheric sciences in Member countries.
Initial Experience With A Prototype Storage System At The University Of North Carolina
NASA Astrophysics Data System (ADS)
Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.
1986-06-01
A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system, and UNC realizing the need for an archiving system as an essential part of their PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short and long term storage of images and their associated information. The system software includes those procedures necessary to communicate with the network interface elements (NIEs) as well as those procedures necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessments of system performance are awaiting the connection of compatible consoles and acquisition devices to the network.
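A present-day analogue of the header-subset indexing described above (pydicom and SQLite stand in for the 1986 ACR-NEMA parser and relational system; the field choices are illustrative): a few header fields go into a relational table for querying, while the complete file stays on disk with the study.

    import sqlite3
    import pydicom

    db = sqlite3.connect("archive_index.db")
    db.execute("""CREATE TABLE IF NOT EXISTS studies
                  (patient_id TEXT, study_date TEXT, modality TEXT, path TEXT)""")

    def index_image(path):
        # stop_before_pixels: read only the header, not the image data.
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        db.execute("INSERT INTO studies VALUES (?, ?, ?, ?)",
                   (str(ds.get("PatientID")), str(ds.get("StudyDate")),
                    str(ds.get("Modality")), path))
        db.commit()

    # Retrieval for a viewing console is then a simple query, e.g.:
    # db.execute("SELECT path FROM studies WHERE patient_id = ?", ("12345",))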
Transforming War Fighting through the Use of Service Based Architecture (SBA) Technology
2006-05-04
– Disseminates near-real-time video & telemetry to users on the network using standard web-based protocols
– Provides web-based access to archived video files
MTI / Target Tracks Service Capabilities:
– Disseminates near-real-time MTI and Target Tracks to users on the network based on a consumer-specified geographic filter
IBS SIGINT Service Capabilities:
– Disseminates near-real-time IBS SIGINT data to users on the network based on a consumer-specified geographic filter
Optical Disk Technology and Information.
ERIC Educational Resources Information Center
Goldstein, Charles M.
1982-01-01
Provides basic information on videodisks and potential applications, including inexpensive online storage, random access graphics to complement online information systems, hybrid network architectures, office automation systems, and archival storage. (JN)
NASA's Contribution to Global Space Geodesy Networks
NASA Technical Reports Server (NTRS)
Bosworth, John M.
1999-01-01
The NASA Space Geodesy program continues to be a major provider of space geodetic data for the international earth science community. NASA operates high performance Satellite Laser Ranging (SLR), Very Long Baseline Interferometry (VLBI) and Global Positioning System (GPS) ground receivers at well over 30 locations around the world and works in close cooperation with space geodetic observatories around the world. NASA has also always been at the forefront in the quest for technical improvement and innovation in the space geodesy technologies to make them even more productive, accurate and economical. This presentation will highlight the current status of NASA's networks; the plans for partnerships with international groups in the southern hemisphere to improve the geographic distribution of space geodesy sites and the status of the technological improvements in SLR and VLBI that will support the new scientific thrusts proposed by interdisciplinary earth scientists. In addition, the expanding role of the NASA Space geodesy data archive, the CDDIS will be described.
ARM Operations and Engineering Procedure Mobile Facility Site Startup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyles, Jimmy W
2015-05-01
This procedure exists to define the key milestones, necessary steps, and process rules required to commission and operate an Atmospheric Radiation Measurement (ARM) Mobile Facility (AMF), with a specific focus toward on-time product delivery to the ARM Data Archive. The overall objective is to have the physical infrastructure, networking and communications, and instrument calibration, grooming, and alignment (CG&A) completed with data products available from the ARM Data Archive by the Operational Start Date milestone.
Teleradiology network to improve patient care in a peacekeeping military operation
NASA Astrophysics Data System (ADS)
Cleary, Kevin R.; Levine, Betty A.; Norton, Gary S.; Mun, Seong K.; Cramer, Timothy J.; de Treville, Robert E.
1997-05-01
The Imaging Science and Information Systems (ISIS) Center of the Department of Radiology at Georgetown University Medical Center recently collaborated with the US Army in developing an off-the-shelf teleradiology network for Operation Joint Endeavor, the peace-keeping mission in Bosnia-Herzegovina. The network is part of Operation Primetime III, a project to deploy advanced communications and medical equipment to provide state-of-the-art medical care to the 20,000 US troops stationed there. The network encompasses three major sites: the 212th Mobile Army Surgical Hospital (MASH) near Tuzla, Bosnia-Herzegovina; the 67th Combat Support Hospital (CSH) in Taszar, Hungary; and the Landstuhl Regional Medical Center (LRMC) in Landstuhl, Germany. Planning for the project began in January 1996, and all three sites were operational by April 1996. Since the system was deployed, computed radiography (CR) has been used almost exclusively at the MASH and CSH for all general x-ray exams. From mid-May to September 1996, over 2700 CR images were acquired at the MASH and over 1600 at the CSH. Since there was not a radiologist at the MASH, the images were transferred to the CSH for primary diagnosis and archiving. In the same time period, over 550 patient folders were sent from the MASH to the CSH.
Network Modeling Reveals Steps in Angiotensin Peptide Processing
Schwacke, John H.; Spainhour, John Christian G.; Ierardi, Jessalyn L.; Chaves, Jose M.; Arthur, John M.; Janech, Michael G.; Velez, Juan Carlos Q.
2015-01-01
New insights into the intrarenal renin-angiotensin system (RAS) have modified our traditional view of the system. However, many finer details of this network of peptides and associated peptidases remain unclear. We hypothesized that a computational systems biology approach, applied to peptidomic data, could help to unravel the network of enzymatic conversions. We built and refined a Bayesian network model and a dynamic systems model starting from a skeleton created with established elements of the RAS and further developed it with archived MALDI-TOF mass spectra from experiments conducted in mouse podocytes exposed to exogenous angiotensin (Ang) substrates. The model-building process suggested previously unrecognized steps, three of which were confirmed in vitro, including the conversion of Ang(2-10) to Ang(2-7) by neprilysin (NEP), and Ang(1-9) to Ang(2-9) and Ang(1-7) to Ang(2-7) by aminopeptidase A (APA). These data suggest a wider role of NEP and APA in glomerular formation of bioactive Ang peptides and/or shunting their formation. Other steps were also suggested by the model and supporting evidence for those steps was evaluated using model-comparison methods. Our results demonstrate that systems biology methods applied to peptidomic data are effective in identifying novel steps in the Ang peptide processing network, and these findings improve our understanding of the glomerular RAS. PMID:23283355
ERIC Educational Resources Information Center
Chang, Jui-Hung; Chiu, Po-Sheng; Huang, Yueh-Min
2018-01-01
With the advances in mobile network technology, the use of portable devices and mobile networks for learning is not limited by time and space. Such use, in combination with appropriate learning strategies, can achieve a better effect. Despite the effectiveness of mobile learning, students' learning direction, progress, and achievement may differ.…
Data systems and computer science programs: Overview
NASA Technical Reports Server (NTRS)
Smith, Paul H.; Hunter, Paul
1991-01-01
An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.
ERIC Educational Resources Information Center
Kilburn, M. Rebecca, Ed.
2014-01-01
The Promising Practices Network (PPN) on Children, Families and Communities (www.promisingpractices.net) began as a partnership between four state-level organizations that help public and private organizations improve the well-being of children and families. The PPN website, archived in June 2014, featured summaries of programs and practices that…
Keller, Vera; Penman, Leigh T I
2015-03-01
Many historians have traced the accumulation of scientific archives via communication networks. Engines for communication in early modernity have included trade, the extrapolitical Republic of Letters, religious enthusiasm, and the centralization of large emerging information states. The communication between Samuel Hartlib, John Dury, Duke Friedrich III of Gottorf-Holstein, and his key agent in England, Frederick Clodius, points to a less obvious but no less important impetus--the international negotiations of smaller states. Smaller states shaped communication networks in an international (albeit politically and religiously slanted) direction. Their networks of negotiation contributed to the internationalization of emerging science through a political and religious concept of shared interest. While interest has been central to social studies of science, interest itself has not often been historicized within the history of science. This case study demonstrates the co-production of science and society by tracing how period concepts of interest made science international.
The Biomolecular Interaction Network Database and related tools 2005 update
Alfarano, C.; Andrade, C. E.; Anthony, K.; Bahroos, N.; Bajec, M.; Bantoft, K.; Betel, D.; Bobechko, B.; Boutilier, K.; Burgess, E.; Buzadzija, K.; Cavero, R.; D'Abreo, C.; Donaldson, I.; Dorairajoo, D.; Dumontier, M. J.; Dumontier, M. R.; Earles, V.; Farrall, R.; Feldman, H.; Garderman, E.; Gong, Y.; Gonzaga, R.; Grytsan, V.; Gryz, E.; Gu, V.; Haldorsen, E.; Halupa, A.; Haw, R.; Hrvojic, A.; Hurrell, L.; Isserlin, R.; Jack, F.; Juma, F.; Khan, A.; Kon, T.; Konopinsky, S.; Le, V.; Lee, E.; Ling, S.; Magidin, M.; Moniakis, J.; Montojo, J.; Moore, S.; Muskat, B.; Ng, I.; Paraiso, J. P.; Parker, B.; Pintilie, G.; Pirone, R.; Salama, J. J.; Sgro, S.; Shan, T.; Shu, Y.; Siew, J.; Skinner, D.; Snyder, K.; Stasiuk, R.; Strumpf, D.; Tuekam, B.; Tao, S.; Wang, Z.; White, M.; Willis, R.; Wolting, C.; Wong, S.; Wrong, A.; Xin, C.; Yao, R.; Yates, B.; Zhang, S.; Zheng, K.; Pawson, T.; Ouellette, B. F. F.; Hogue, C. W. V.
2005-01-01
The Biomolecular Interaction Network Database (BIND) (http://bind.ca) archives biomolecular interaction, reaction, complex and pathway information. Our aim is to curate the details about molecular interactions that arise from published experimental research and to provide this information, as well as tools to enable data analysis, freely to researchers worldwide. BIND data are curated into a comprehensive machine-readable archive of computable information and provides users with methods to discover interactions and molecular mechanisms. BIND has worked to develop new methods for visualization that amplify the underlying annotation of genes and proteins to facilitate the study of molecular interaction networks. BIND has maintained an open database policy since its inception in 1999. Data growth has proceeded at a tremendous rate, approaching over 100 000 records. New services provided include a new BIND Query and Submission interface, a Standard Object Access Protocol service and the Small Molecule Interaction Database (http://smid.blueprint.org) that allows users to determine probable small molecule binding sites of new sequences and examine conserved binding residues. PMID:15608229
Young, Jasmine Y; Westbrook, John D; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R; Berrisford, John M; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter M S; Hudson, Brian P; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R; Shao, Chenghua; Swaminathan, G Jawahar; Tan, Lihua; Ulrich, Eldon L; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A; Quesada, Martha; Kleywegt, Gerard J; Berman, Helen M; Markley, John L; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K
2017-03-07
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the PDB archive, has been developed as a global collaboration by the worldwide PDB (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments. Published by Elsevier Ltd.
The HEASARC in 2013 and Beyond: NuSTAR, Astro-H, NICER..
NASA Astrophysics Data System (ADS)
Drake, Stephen A.; Smale, A. P.; McGlynn, T. A.; Arnaud, K. A.
2013-04-01
The High Energy Astrophysics Archival Research Center or HEASARC (http://heasarc.gsfc.nasa.gov/) is in its third decade as the NASA astrophysics discipline node supporting multi-mission cosmic X-ray and gamma-ray astronomy research. It provides a unified archive and software structure aimed at 'legacy' missions such as Einstein, EXOSAT, ROSAT and RXTE, contemporary missions such as Fermi, Swift, Suzaku and Chandra, and upcoming missions such as NuSTAR, Astro-H and NICER. The HEASARC's high-energy astronomy archive has grown so that it presently contains 45 TB of data from 28 orbital missions. The HEASARC is the designated archive which supports NASA's Physics of the Cosmos theme (http://pcos.gsfc.nasa.gov/). We discuss some of the upcoming new initiatives and developments for the HEASARC, including the arrival of public data from the hard X-ray imaging NuSTAR mission in the summer of 2013, and the ongoing preparations to support the JAXA/NASA Astro-H mission and the NASA MoO Neutron Star Interior Composition Explorer (NICER), which are expected to become operational in 2015-2016. We also highlight some of the new software capabilities of the HEASARC, such as Xamin, a next-generation archive interface which will eventually supersede Browse, and the latest update of XSPEC (v 12.8.0).
Patton, John M.; Ketchum, David C.; Guy, Michelle R.
2015-11-02
This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.
GPS data exploration for seismologists and geodesists
NASA Astrophysics Data System (ADS)
Webb, F.; Bock, Y.; Kedar, S.; Dong, D.; Jamason, P.; Chang, R.; Prawirodirdjo, L.; MacLeod, I.; Wadsworth, G.
2007-12-01
Over the past decade, GPS and seismic networks spanning the western US plate boundaries have produced vast amounts of data that need to be made accessible to both the geodesy and seismology communities. Unlike seismic data, raw geodetic data require significant processing before geophysical interpretations can be made. This requires the generation of data products (time series, velocities and strain maps) and dissemination strategies to bridge these differences and assure efficient use of data across traditionally separate communities. "GPS DATA PRODUCTS FOR SOLID EARTH SCIENCE" (GDPSES) is a multi-year NASA-funded project designed to produce and deliver high quality GPS time series, velocities, and strain fields, derived from multiple GPS networks along the western US plate boundary, and to make these products easily accessible to geophysicists. Our GPS product dissemination is through modern web-based IT methodology. Product browsing is facilitated through a web tool known as GPS Explorer, and continuous streams of GPS time series are provided using web services to the seismic archive, where they can be accessed by seismologists using traditional seismic data viewing and manipulation tools. GPS Explorer enables users to efficiently browse several layers of data products, from raw data through time series, velocities and strain, by providing the user with a web interface which seamlessly interacts with a continuously updated database of these data products through the use of web services. The current archive contains GDPSES data products beginning in 1995, and includes observations from GPS stations in EarthScope's Plate Boundary Observatory (PBO), as well as from real-time CGPS stations. The generic, standards-based approach used in this project enables GDPSES to seamlessly expand indefinitely to include other space-time-dependent data products from additional GPS networks. The prototype GPS Explorer provides users with a personalized working environment in which the user may zoom in and access subsets of the data via web services. It provides users with a variety of interactive web tools interconnected in a portlet environment to explore and save datasets of interest to return to at a later date. At the same time, the GPS time series are also made available through the seismic data archive, where the GPS networks are treated as regular seismic networks whose data are made available in data formats used by seismic utilities such as SEED readers and SAC. A key challenge, stemming from the fundamental differences between seismic and geodetic time series, is the representation of reprocessed GPS data in the seismic archive. As GPS processing algorithms evolve and their accuracy increases, a periodic complete recreation of the GPS time series archive is necessary.
Dissociable meta-analytic brain networks contribute to coordinated emotional processing.
Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R
2018-06-01
Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli are integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Coleman, D. F.
2012-12-01
Most research vessels are equipped with satellite Internet services with bandwidths capable of being upgraded to support telepresence technologies and live shore-based participation. This capability can be used for real-time data transmission to shore, where it can be distributed, managed, processed, and archived. The University of Rhode Island Inner Space Center utilizes telepresence technologies and a growing network of command centers on Internet2 to participate live with a variety of research vessels and their ocean observing and sampling systems. High-bandwidth video streaming, voice-over-IP telecommunications, and real-time data feeds and file transfers enable users on shore to take part in the oceanographic expeditions as if they were present on the ship, working in the lab. Telepresence-enabled systematic ocean exploration and similar programs represent a significant and growing paradigm shift that can change the future of seagoing ocean observations using research vessels. The required platform is the ship itself, and users of the technology rely on the ship-based technical teams, but remote and distributed shore-based science users, students, educators, and the general public can now take part by being aboard virtually.
The IRIS Data Management Center: Enabling Access to Observational Time Series Spanning Decades
NASA Astrophysics Data System (ADS)
Ahern, T.; Benson, R.; Trabant, C.
2009-04-01
The Incorporated Research Institutions for Seismology (IRIS) is funded by the National Science Foundation (NSF) to operate the facilities to generate, archive, and distribute seismological data to research communities in the United States and internationally. The IRIS Data Management System (DMS) is responsible for the ingestion, archiving, curation and distribution of these data. The IRIS Data Management Center (DMC) manages data from more than 100 permanent seismic networks and hundreds of temporary seismic deployments, as well as data from other geophysical observing networks such as magnetotelluric sensors, ocean bottom sensors, superconducting gravimeters, strainmeters, surface meteorological measurements, and in-situ atmospheric pressure measurements. The IRIS DMC has data from more than 20 different types of sensors. The IRIS DMC manages approximately 100 terabytes of primary observational data. These data are archived in multiple distributed storage systems that ensure data availability independent of any single catastrophic failure. Storage systems include both RAID systems of greater than 100 terabytes and robotic tape libraries of petabyte capacity. IRIS performs routine transcription of the data to new media and storage systems to ensure the long-term viability of the scientific data. IRIS adheres to the OAIS Data Preservation Model in most cases. The IRIS data model requires the availability of metadata describing the characteristics and geographic location of sensors before data can be fully archived. IRIS works with the International Federation of Digital Seismographic Networks (FDSN) in the definition and evolution of the metadata. The metadata ensure that the data remain useful to both current and future generations of earth scientists. Curation of the metadata and time series is one of the most important activities at the IRIS DMC. Data analysts and an automated quality assurance system monitor the quality of the incoming data. This ensures the data are of acceptably high quality. The formats and data structures used by the seismological community are esoteric. IRIS and its FDSN partners are developing web services that can transform the data holdings into structures that are more easily used by broader scientific communities. For instance, atmospheric scientists are interested in using global observations of microbarograph data, but that community does not understand the methods of applying instrument corrections to the observations. Web processing services under development at IRIS will transform these data in a manner that allows direct use within such analysis tools as MATLAB®, already in use by that community. By continuing to develop web-service based methods of data discovery and access, IRIS is enabling broader access to its data holdings. We currently support data discovery using many of the Open Geospatial Consortium (OGC) web mapping services. We are involved in portal technologies to support data discovery and distribution for all data from the EarthScope project. We are working with computer scientists at several universities, including the University of Washington as part of a DataNet proposal, and we intend to enhance metadata, further develop ontologies, develop a Registry Service to aid in the discovery of data sets and services, and in general improve the semantic interoperability of the data managed at the IRIS DMC. Finally, IRIS has been identified as one of four scientific organizations that the External Research Division of Microsoft wants to work with in the development of web services, and specifically in the development of a scientific workflow engine. More specific details of current and future developments at the IRIS DMC will be included in this presentation.
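As a present-day example of such web-service access (the FDSN web services and the open-source ObsPy client postdate this abstract; the station and channel codes below are arbitrary), a few lines suffice to fetch and instrument-correct a time series from the IRIS DMC:

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("IRIS")
    start = UTCDateTime("2010-02-27T06:30:00")
    # One hour of broadband vertical-component data, with the instrument
    # response attached so a correction can be applied.
    stream = client.get_waveforms("IU", "ANMO", "00", "BHZ",
                                  start, start + 3600, attach_response=True)
    stream.remove_response(output="VEL")  # counts -> ground velocity (m/s)
    print(stream)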
Development of an Oceanographic Data Archiving and Service System for the Korean Researchers
NASA Astrophysics Data System (ADS)
Kim, Sung Dae; Park, Hyuk Min; Baek, Sang Ho
2014-05-01
The Oceanographic Data and Information Center of the Korea Institute of Ocean Science and Technology (KIOST) started to develop an oceanographic data archiving and service system in 2010 to support Korean ocean researchers by providing quality-controlled data continuously. Many physical oceanographic data available in the public domain, as well as Korean domestic data, were collected periodically, quality controlled, manipulated, and provided to ocean modelers who need ocean data continuously and to marine biologists who are less familiar with physical data but need it. The northern and southern limits of the spatial coverage are 55°N and 20°N, and the western and eastern limits are 110°E and 150°E, respectively. To archive TS (Temperature and Salinity) profile data, ARGO data were gathered from the ARGO GDACs (France and USA), and many historical TS profile data observed by CTD, OSD and BT were retrieved from World Ocean Database 2009. Quality control software for TS profile data, which meets the QC criteria suggested by the ARGO program and the GTSPP (Global Temperature-Salinity Profile Program), was programmed and applied to the collected data. By the end of 2013, the total number of vertical profile data from the ARGO GDACs was 59,642 and the total number of station data from WOD 2009 was 1,604,422. We also collected the global satellite SST data produced by NCDC and global SSH data from AVISO every day. An automatic program was coded to collect the satellite data, extract sub data sets for the North West Pacific area, and produce distribution maps. The total number of collected satellite data sets was 3,613 by the end of 2013. We use three different data services to provide archived data to Korean experts. An FTP service was prepared to allow data users to download data in the original format. We developed a TS database system using Oracle RDBMS to contain all collected temperature and salinity data and to support SQL data retrieval with various conditions. The KIOST ocean data portal was used as the data retrieval service for the TS DB, which uses a GIS interface built with open-source GIS software. We also installed the Live Access Server developed by US PMEL to serve the satellite netCDF data files, which supports on-the-fly visualization and an OPeNDAP (Open-source Project for a Network Data Access Protocol) service for remote connection and sub-setting of large data sets.
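The ARGO/GTSPP-style quality control mentioned above combines checks of this general shape (the thresholds below are illustrative, not the programs' actual limits): a gross physical-range test plus a spike test against neighbouring levels.

    def gross_range_flags(temps, t_min=-2.5, t_max=40.0):
        """Flag temperatures outside a plausible ocean range (deg C)."""
        return [not (t_min <= t <= t_max) for t in temps]

    def spike_flags(temps, max_jump=6.0):
        """Flag a level that departs sharply from both neighbours."""
        flags = [False] * len(temps)
        for i in range(1, len(temps) - 1):
            spike = abs(temps[i] - (temps[i - 1] + temps[i + 1]) / 2.0)
            flags[i] = spike > max_jump
        return flags

    profile = [18.2, 18.1, 29.9, 17.8, 12.4, 8.9]  # one suspicious level
    print(gross_range_flags(profile))  # all False
    print(spike_flags(profile))        # flags only the 29.9 level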
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1987-01-01
This quarterly publication (July-September 1987) provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations. This work is performed for NASA's Office of Space Tracking and Data Systems (OSTDS). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the Search for Extraterrestrial Intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum. The latter two programs are performed for NASA's Office of Space Science and Applications (OSSA).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1986-01-01
This quarterly publication (July-Sept. 1986) provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations. This work is performed for NASA's Office of Space Tracking and Data Systems (OSTDS). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the search for extraterrestrial intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum. The latter two programs are performed for NASA's Office of Space Science and Applications (OSSA).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Operations (OSO). The TDA Office also performs work funded by two other NASA program offices through and with the cooperation of the OSO. These are the Orbital Debris Radar Program and 21st Century Communication Studies.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) in the following areas: space communications, radio navigation, radio science, and ground-based radio and radar astronomy. This document also reports on the activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC). The TDA Office also performs work funded by another NASA program office through and with the cooperation of OSC. This is the Orbital Debris Radar Program with the Office of Space Systems Development.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1995-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The Orbital Debris Radar Program, funded by the Office of Space Systems Development, makes use of the planetary radar capability when the antennas are configured as science instruments making direct observations of planets, their satellites, and asteroids of our solar system.
Nekton Interaction Monitoring System
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-03-15
The software provides a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) extracts and archives tracking and backscatter statistics data from a real-time stream of data from a sonar device. NIMS also sends real-time tracking messages over the network that can be used by other systems to generate other metrics or to trigger instruments such as an optical video camera. A web-based user interface provides remote monitoring and control. NIMS currently supports three popular sonar devices: M3 multi-beam sonar (Kongsberg), EK60 split-beam echo-sounder (Simrad) and BlueView acoustic camera (Teledyne).
gkmSVM: an R package for gapped-kmer SVM.
Ghandi, Mahmoud; Mohammad-Noori, Morteza; Ghareghani, Narges; Lee, Dongwon; Garraway, Levi; Beer, Michael A
2016-07-15
We present a new R package for training gapped-kmer SVM classifiers for DNA and protein sequences. We describe an improved algorithm for kernel matrix calculation that speeds run time by about 2- to 5-fold over our original gkmSVM algorithm. This package supports several sequence kernels, including: gkmSVM, kmer-SVM, mismatch kernel and wildcard kernel. The gkmSVM package is freely available through the Comprehensive R Archive Network (CRAN), for Linux, Mac OS and Windows platforms. The C++ implementation is available at www.beerlab.org/gkmsvm. Contact: mghandi@gmail.com or mbeer@jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Silva, Augusto F. d.; Costa, Carlos; Abrantes, Pedro; Gama, Vasco; Den Boer, Ad
1998-07-01
This paper describes an integrated system designed to provide efficient means for DICOM-compliant cardiac imaging archival, transmission and visualization, based on a communications backbone matching recent enabling telematic technologies such as Asynchronous Transfer Mode (ATM) and switched Local Area Networks (LANs). Within a distributed client-server framework, the system was conceived on a modality-based, bottom-up approach, aiming at ultrafast access to short term archives and seamless retrieval of cardiac video sequences throughout review stations located at the outpatient referral rooms, intensive and intermediate care units and operating theaters.
NASA GIBS Use in Live Planetarium Shows
NASA Astrophysics Data System (ADS)
Emmart, C. B.
2015-12-01
The American Museum of Natural History's Hayden Planetarium was rebuilt in the year 2000 as an immersive theater for scientific data visualization to show the universe in context to our planet. Specific astrophysical movie productions provide the main daily programming, but interactive control software developed at AMNH allows immersive presentation within a data aggregation of astronomical catalogs called the Digital Universe 3D Atlas. Since 2006, WMS globe browsing capabilities have been built into a software development collaboration with Sweden's Linkoping University (LiU). The resulting Uniview software, now a product of the company SCISS, is operated by about fifty planetariums around the world, with the ability to network amongst the sites for global presentations. Public presentation of NASA GIBS has allowed authoritative narratives to be presented within the range of data available, in context with other sources such as Science on a Sphere, NASA Earth Observatory and Google Earth KML resources. Specifically, the NOAA-supported World Views Network conducted a series of presentations across the US that focused on local ecological issues that could then be expanded in the course of presentation to national and global scales of examination. NASA support for GIBS resources in an easily accessed, multi-scale streaming format like WMS has tremendously enabled facile presentations of global monitoring like never before. Global networking of theaters for distributed presentations broadens the potential impact of this medium. Archiving and refinement of these presentations has already begun to inform new types of documentary productions that examine pertinent global interdependency topics.
New Capabilities in the Astrophysics Multispectral Archive Search Engine
NASA Astrophysics Data System (ADS)
Cheung, C. Y.; Kelley, S.; Roussopoulos, N.
The Astrophysics Multispectral Archive Search Engine (AMASE) uses object-oriented database techniques to provide a uniform multi-mission and multi-spectral interface to search for data in the distributed archives. We describe our experience of porting AMASE from the Illustra object-relational DBMS to the Informix Universal Data Server. New capabilities and utilities have been developed, including a spatial datablade that supports Nearest Neighbor queries.
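A nearest-neighbor query of the kind the datablade supports amounts to minimizing angular separation on the sky. A plain-Python sketch of that computation (the Informix datablade's actual SQL interface is not reproduced here; the catalog below is illustrative):

    import math

    def angular_sep(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees between two sky positions."""
        r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
        cos_sep = (math.sin(d1) * math.sin(d2)
                   + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
        return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

    catalog = {"M31": (10.68, 41.27), "M33": (23.46, 30.66), "M81": (148.89, 69.07)}
    target = (11.0, 41.0)
    nearest = min(catalog.items(), key=lambda kv: angular_sep(*target, *kv[1]))
    print(nearest[0])  # M31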
NASA Astrophysics Data System (ADS)
Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles
2015-04-01
In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are: • Management processes that govern the operation of an archive. These management processes include archival governance (preservation state management, selection of archival candidates and strategic management). • Operational processes that constitute the core activities of the archive which maintain the value of research assets. These operational processes are the acquisition, ingestion, deletion, generation of metadata and preservation activities. • Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and National Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We provide discussion of the following core model entities and their relationships: • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives. • Risk entities, which act as drivers for change within the data lifecycle. These include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks. • Plan entities, which detail the actions to bring about change within an archive. These include Acquisition Plans, Preservation Plans and Monitoring Plans. • Result entities, which describe the successful outcomes of the executed plans. These include Acquisitions, Mitigations and Accepted Risks.
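The entity-relationship description above translates naturally into a small data model. A minimal sketch (field names are our reading of the abstract, not CEDA's schema):

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class PreservationObjective:
        description: str

    @dataclass
    class DataEntity:  # aspirational entity
        name: str
        objectives: List[PreservationObjective] = field(default_factory=list)

    @dataclass
    class Risk:  # driver for change within the data lifecycle
        kind: str  # acquisitional | technical | strategic | external
        description: str

    @dataclass
    class Plan:  # action to bring about change within the archive
        kind: str  # acquisition | preservation | monitoring
        mitigates: List[Risk] = field(default_factory=list)

    @dataclass
    class Result:  # successful outcome of an executed plan
        kind: str  # acquisition | mitigation | accepted risk
        plan: Optional[Plan] = None

    entity = DataEntity("CMIP5 model output",
                        [PreservationObjective("keep usable for 20 years")])
    risk = Risk("technical", "format obsolescence of legacy GRIB files")
    plan = Plan("preservation", mitigates=[risk])
    print(Result("mitigation", plan=plan))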
Standard formatted data units-control authority procedures
NASA Technical Reports Server (NTRS)
1991-01-01
The purpose of this document is to establish a set of minimum and optional requirements for the implementation of Control Authority (CA) organizations within and among the Agencies participating in the Consultative Committee for Space Data Systems (CCSDS). By satisfying these requirements, the resultant cooperating set of CA organizations will produce a global CA service supporting information transfer with digital data under the Standard Formatted Data Unit (SFDU) concept. This service is primarily accomplished through the registration, permanent archiving, and dissemination of metadata in the form of Metadata Objects (MDO) that assist in the interpretation of data objects received in SFDU form. This Recommendation addresses the responsibilities, services, and interface protocols for a hierarchy of CA organizations. The top level, consisting of the CCSDS Secretariat and its operational agent, is unique and primarily provides a global coordination function. The lower levels are Agency CA organizations that have primary responsibility for the registration, archiving, and dissemination of MDOs. As experience is gained and technology evolves, the CA Procedures will be extended to include enhanced services and their supporting protocols. In particular, it is anticipated that eventually CA organizations will be linked via networks on a global basis, and will provide requestors with online automated access to CA services. While this Recommendation does not preclude such operations, it also does not recommend the specific protocols to be used to ensure global compatibility of these services. These recommendations will be generated as experience is gained.
NASA Technical Reports Server (NTRS)
Hilland, Jeffrey E.; Collins, Donald J.; Nichols, David A.
1991-01-01
The Distributed Active Archive Center (DAAC) at the Jet Propulsion Laboratory will support scientists specializing in physical oceanography and air-sea interaction. As part of the NASA Earth Observing System Data and Information System Version 0, the DAAC will build on existing capabilities to provide services for data product generation, archiving, distribution, and management of information about data. To meet scientists' immediate needs for data, existing data sets from missions such as Seasat, Geosat, the NOAA series of satellites, and the Global Positioning Satellite system will be distributed to investigators upon request. In 1992, ocean topography, wave, and surface roughness data from the Topex/Poseidon radar altimeter mission will be archived and distributed. New data products will be derived from Topex/Poseidon and other sensor systems based on recommendations of the science community. In 1995, ocean wind field measurements from the NASA Scatterometer will be supported by the DAAC.
Young, Jasmine Y.; Westbrook, John D.; Feng, Zukang; Sala, Raul; Peisach, Ezra; Oldfield, Thomas J.; Sen, Sanchayita; Gutmanas, Aleksandras; Armstrong, David R.; Berrisford, John M.; Chen, Li; Chen, Minyu; Di Costanzo, Luigi; Dimitropoulos, Dimitris; Gao, Guanghua; Ghosh, Sutapa; Gore, Swanand; Guranovic, Vladimir; Hendrickx, Pieter MS; Hudson, Brian P.; Igarashi, Reiko; Ikegawa, Yasuyo; Kobayashi, Naohiro; Lawson, Catherine L.; Liang, Yuhe; Mading, Steve; Mak, Lora; Mir, M. Saqib; Mukhopadhyay, Abhik; Patwardhan, Ardan; Persikova, Irina; Rinaldi, Luana; Sanz-Garcia, Eduardo; Sekharan, Monica R.; Shao, Chenghua; Swaminathan, G. Jawahar; Tan, Lihua; Ulrich, Eldon L.; van Ginkel, Glen; Yamashita, Reiko; Yang, Huanwang; Zhuravleva, Marina A.; Quesada, Martha; Kleywegt, Gerard J.; Berman, Helen M.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer; Burley, Stephen K.
2017-01-01
OneDep, a unified system for deposition, biocuration, and validation of experimentally determined structures of biological macromolecules to the Protein Data Bank (PDB) archive, has been developed as a global collaboration by the Worldwide Protein Data Bank (wwPDB) partners. This new system was designed to ensure that the wwPDB could meet the evolving archiving requirements of the scientific community over the coming decades. OneDep unifies deposition, biocuration, and validation pipelines across all wwPDB, EMDB, and BMRB deposition sites with improved focus on data quality and completeness in these archives, while supporting growth in the number of depositions and increases in their average size and complexity. In this paper, we describe the design, functional operation, and supporting infrastructure of the OneDep system, and provide initial performance assessments.
NASA Astrophysics Data System (ADS)
Reynolds, D.; Hall, I. R.; Slater, S. M.; Scourse, J. D.; Wanamaker, A. D.; Halloran, P. R.; Garry, F. K.
2017-12-01
Spatial network analyses of precisely dated, annually resolved tree-ring proxy records have facilitated robust reconstructions of past atmospheric climate variability and of the mechanisms and forcings that drive it. In contrast, a lack of similarly dated marine archives has constrained the use of such techniques in the marine realm, despite the potential for developing a more robust understanding of the role basin-scale ocean dynamics play in the global climate system. Here we show that a spatial network of marine molluscan sclerochronological oxygen isotope (δ18Oshell) series spanning the North Atlantic region provides a skilful reconstruction of basin-scale North Atlantic sea surface temperatures (SSTs). Our analyses demonstrate that the composite marine series (referred to as δ18Oproxy_PC1) is significantly sensitive to inter-annual variability in North Atlantic SSTs (R = -0.61, P < 0.01) and surface air temperatures (SATs; R = -0.67, P < 0.01) over the 20th century. Subpolar gyre (SPG) SSTs dominate variability in the δ18Oproxy_PC1 series at sub-centennial frequencies (R = -0.51, P < 0.01). Comparison of the δ18Oproxy_PC1 series against variability in the strength of the European Slope Current and the maximum North Atlantic meridional overturning circulation derived from numerical climate models (CMIP5) indicates that variability in the SPG region, associated with the strength of the surface currents of the North Atlantic, plays a significant role in shaping multi-decadal-scale SST variability over the industrial era. These analyses demonstrate that spatial networks developed from sclerochronological archives can provide powerful baseline records of past ocean variability, facilitating a quantitative understanding of the role the oceans play in the global climate system and constraining uncertainties in numerical climate models.
Quantifying travel time variability in transportation networks.
DOT National Transportation Integrated Search
2010-03-01
Nonrecurring congestion creates significant delay on freeways in urban areas, lending importance to the study of facility reliability. In locations where traffic detectors record and archive data, approximate probability distributions for travel ...
NASA Astrophysics Data System (ADS)
Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.
2011-12-01
Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011.
New website design:
• The SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content, and improve our ability to manage and update the site.
New data holdings:
• Post-processing of the El Mayor Cucapah 7.2 sequence continues. To date, 11847 events have been reviewed; updates are available in the earthquake catalog immediately.
• A double-difference catalog (Hauksson et al. 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP.
• A focal mechanism catalog determined by Yang et al. 2011 is available for distribution at www.data.scec.org.
• Waveforms from Southern California NetQuake stations are now stored in the SCEDC archive and available via STP as event-associated waveforms. Amplitudes from these stations are also stored in the archive and used by ShakeMap.
• As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP.
Improvements in the user tool STP:
• STP SAC output now includes picks from the SCSN.
New archival methods:
• The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression; time gaps between time series are padded with null values, which substantially increases search efficiency by making the records uniform in length (see the sketch below).
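The following sketch reproduces the gap-padding step with ObsPy, a standard Python library for seismic waveforms. The file name is an assumption; the SCEDC's actual cloud client is not described in the abstract.

```python
# Sketch: decompress a gzip-compressed miniSEED file, read it with ObsPy,
# and pad time gaps with a null fill value so each channel becomes one
# uniform-length record, as described above.
import gzip
import shutil

from obspy import read

with gzip.open("day_volume.mseed.gz", "rb") as src, open("day_volume.mseed", "wb") as dst:
    shutil.copyfileobj(src, dst)

stream = read("day_volume.mseed")

# Merge traces per channel, filling gaps with zeros (the null value).
stream.merge(method=0, fill_value=0)

for trace in stream:
    print(trace.id, trace.stats.starttime, trace.stats.npts)
```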
Evolution of Archival Storage (from Tape to Memory)
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.
2015-01-01
Over the last three decades, there has been a significant evolution in storage technologies supporting archival of remote sensing data. This section provides a brief survey of how these technologies have evolved. Three main technologies are considered: tape, hard disk, and solid-state disk. Their historical evolution is traced, summarizing how reductions in cost have made it possible to store larger volumes of data on faster media. The cost per GB of media is only one consideration in determining the best approach to archival storage. Active archives generally require faster response to user requests for data than permanent archives. Archive costs must also account for facilities and other capital costs, operations costs, software licenses, utilities, and so on. For meeting requirements in any organization, typically a mix of technologies is needed.
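A back-of-the-envelope cost model makes the point that media price is only one term in the total. All numbers below are placeholders, not figures from the text.

```python
# Illustrative archive cost model: media is one term among several.
def annual_archive_cost(volume_gb,
                        media_cost_per_gb=0.03,     # assumed media price
                        facility_cost=50_000.0,     # power, cooling, floor space
                        operations_cost=120_000.0,  # staff, maintenance
                        software_licenses=20_000.0):
    """Total yearly cost of keeping `volume_gb` on line (toy model)."""
    return (volume_gb * media_cost_per_gb
            + facility_cost + operations_cost + software_licenses)

volume = 500_000  # 500 TB expressed in GB
total = annual_archive_cost(volume)
print(f"media: ${volume * 0.03:,.0f} of ${total:,.0f} total")
```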
A Framework to Manage Information Models
NASA Astrophysics Data System (ADS)
Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.
2008-05-01
The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.
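Since the framework exports modeling information to semantic web languages, the sketch below shows what a minimal class-plus-relationship export to RDF might look like using the rdflib library. The namespace and class names are invented for illustration; they are not the PDS or SPASE vocabularies.

```python
# Minimal sketch of exporting an information-model class to RDF/Turtle.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/infomodel#")  # assumed namespace

g = Graph()
g.bind("ex", EX)

# An entity, an attribute, and a relationship, as an information model defines.
g.add((EX.Observatory, RDF.type, RDFS.Class))
g.add((EX.Instrument, RDF.type, RDFS.Class))
g.add((EX.hostsInstrument, RDF.type, RDF.Property))
g.add((EX.hostsInstrument, RDFS.domain, EX.Observatory))
g.add((EX.hostsInstrument, RDFS.range, EX.Instrument))
g.add((EX.Observatory, RDFS.comment,
       Literal("An entity that hosts one or more instruments.")))

print(g.serialize(format="turtle"))
```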
NASA Astrophysics Data System (ADS)
Boden, T. A.; Krassovski, M.; Yang, B.
2013-06-01
The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, composed primarily of a relational database, a PHP-based data interface, and an FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize, and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent network database, generating high-level derived products to meet the current demands from a broad user group, and developing new products in anticipation of future needs. In this paper, we share our approaches to meeting the challenges of standardizing, archiving, and delivering quality, well-documented AmeriFlux data worldwide, to benefit others facing similar challenges with diverse climate change data, to heighten awareness and use of an outstanding ecological data resource, and to highlight expanded software engineering applications being used for climate change measurement data.
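The sketch below shows the shape of a simple range-based quality-control check of the kind applied to continuous flux and meteorological data before ingest. The variable names and physical limits are illustrative assumptions, not CDIAC's actual QC algorithms.

```python
# Sketch of a range-based QC flag for one observation.
import math

PHYSICAL_LIMITS = {
    "air_temperature_C": (-60.0, 60.0),
    "co2_flux_umol_m2_s": (-100.0, 100.0),
}

def qc_flag(variable, value):
    """Return 'ok', 'out_of_range', or 'missing' for one observation."""
    if value is None or (isinstance(value, float) and math.isnan(value)):
        return "missing"
    low, high = PHYSICAL_LIMITS[variable]
    return "ok" if low <= value <= high else "out_of_range"

print(qc_flag("air_temperature_C", 21.4))    # ok
print(qc_flag("co2_flux_umol_m2_s", 250.0))  # out_of_range
```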
NASA Technical Reports Server (NTRS)
Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.
2003-01-01
A four-station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed, and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical, and kinematic properties of tropical convection. In addition, the long time series of observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
NASA Technical Reports Server (NTRS)
Leblanc, T.; Godin-Beekmann, S.; Payen, G.; Gabarrot, Franck; van Gijsel, Anne; Bandoro, J.; Sica, R.; Trickl, T.
2012-01-01
The international Network for the Detection of Atmospheric Composition Change (NDACC) is a global network of high-quality, remote-sensing research stations for observing and understanding the physical and chemical state of the Earth's atmosphere. As part of NDACC, over 20 ground-based lidar instruments are dedicated to the long-term monitoring of atmospheric composition and to the validation of space-borne measurements of the atmosphere from environmental satellites such as Aura and ENVISAT. One caveat of large networks such as NDACC is the difficulty of archiving measurement and analysis information consistently from one research group (or instrument) to another [1][2][3]. Yet the need for consistent definitions has strengthened as datasets of various origins (e.g., satellite and ground-based) are increasingly used for intercomparisons and validation, and are ingested together in global assimilation systems. In the framework of the 2010 Call for Proposals by the International Space Science Institute (ISSI) located in Bern, Switzerland, a team of lidar experts was created to address existing issues in three critical aspects of the NDACC lidar ozone and temperature data retrievals: signal filtering and the vertical filtering of the retrieved profiles, the quantification and propagation of the uncertainties, and the consistent definition and reporting of filtering and uncertainties in the NDACC-archived products. Additional experts from the satellite and global data standards communities complement the team to help address issues specific to the latter aspect.
NOAA tsunami water level archive - scientific perspectives and discoveries
NASA Astrophysics Data System (ADS)
Mungov, G.; Eble, M. C.; McLean, S. J.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the co-located World Data Service for Geophysics (WDS) provide long-term archiving, data management, and access for national and global tsunami data. Currently, NGDC archives and processes high-resolution data recorded by the Deep-ocean Assessment and Reporting of Tsunami (DART) network, the coastal tide-gauge network of the National Ocean Service (NOS), and tide-gauge data recorded by all gauges in the two National Weather Service (NWS) Tsunami Warning Centers' (TWCs) regional networks. The challenge in processing these data is that observations from the deep ocean, the Pacific Islands, the Alaska region, and the United States West and East Coasts display commonalities but at the same time differ significantly, especially when extreme events are considered. The focus of this work is on how time integration of raw observations (10-second to 1-minute) can mask extreme water levels. Analysis of the statistical and spectral characteristics obtained from records with different integration time steps will be presented. Results show the need to precisely calibrate the despiking procedure against raw data, owing to the significant differences in the variability of deep-ocean and coastal tide-gauge observations. Special attention should be paid to the very strong water-level declines associated with the passage of North Atlantic cyclones. Strong changes in deep-ocean and West Coast records have implications for data quality, but the same features are typical of the East Coast regime.
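To illustrate the kind of despiking pass that must be calibrated against raw records, the sketch below uses a running-median filter. The thresholds are illustrative assumptions; NGDC's operational procedure is not specified in the abstract.

```python
# Sketch of a running-median despiking pass for a water-level series.
import numpy as np
from scipy.signal import medfilt

def despike(series, kernel=5, threshold=3.0):
    """Replace points far from a running median with the median value."""
    series = np.asarray(series, dtype=float)
    baseline = medfilt(series, kernel_size=kernel)
    residual = series - baseline
    mad = np.median(np.abs(residual)) or 1e-9  # robust spread estimate
    spikes = np.abs(residual) > threshold * mad
    cleaned = series.copy()
    cleaned[spikes] = baseline[spikes]
    return cleaned, spikes

water_level = [0.0, 0.1, 0.1, 5.0, 0.2, 0.1, 0.0]  # one obvious spike
cleaned, spikes = despike(water_level)
print(cleaned, spikes.nonzero()[0])
```

Calibrating `kernel` and `threshold` against raw observations matters because deep-ocean and coastal records have very different variability, as the abstract notes.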
NASA Astrophysics Data System (ADS)
Pesaresi, D.; Busby, R.
2013-08-01
The number and quality of seismic stations and networks in Europe continually improve; nevertheless, there is always scope to optimize their performance. In this session we welcomed contributions on all aspects of seismic network installation, operation, and management, including site selection; equipment testing and installation; planning and implementing communication paths; policies for redundancy in data acquisition, processing, and archiving; and integration of different datasets, including GPS and OBS.
The HEASARC in 2016: 25 Years and Counting
NASA Astrophysics Data System (ADS)
Drake, Stephen Alan; Smale, Alan P.
2016-04-01
The High Energy Astrophysics Archival Research Center or HEASARC (http://heasarc.gsfc.nasa.gov/) has been the NASA astrophysics discipline archive supporting multi-mission cosmic X-ray and gamma-ray astronomy research for 25 years and, through its LAMBDA (Legacy Archive for Microwave Background Data Analysis: http://lambda.gsfc.nasa.gov/) component, the archive for cosmic microwave background data for the last 8 years. The HEASARC is the designated archive supporting NASA's Physics of the Cosmos theme (http://pcos.gsfc.nasa.gov/). The HEASARC provides a unified archive and software structure aimed at 'legacy' high-energy missions such as Einstein, EXOSAT, ROSAT, RXTE, and Suzaku; contemporary missions such as Fermi, Swift, XMM-Newton, Chandra, and NuSTAR; and upcoming missions such as Astro-H and NICER. The HEASARC's high-energy astronomy archive has grown so that it presently contains more than 80 terabytes (TB) of data from 30 past and present orbital missions. The user community downloaded 160 TB of high-energy data from the HEASARC last year, an amount equivalent to twice the size of the archive. We discuss some of the upcoming initiatives and developments for the HEASARC, including the arrival of public data from the JAXA/NASA Astro-H mission, launched in February 2016, and the NASA mission of opportunity Neutron Star Interior Composition Explorer (NICER), expected to be deployed in late summer 2016. We also highlight some of the new software and web initiatives of the HEASARC, and discuss our plans for the next 3 years.
Flexible server-side processing of climate archives
NASA Astrophysics Data System (ADS)
Juckes, Martin; Stephens, Ag; Damasio da Costa, Eduardo
2014-05-01
The flexibility and interoperability of OGC Web Processing Services are combined with an extensive range of data processing operations supported by the Climate Data Operators (CDO) library to facilitate processing of the CMIP5 climate data archive. The challenges posed by this peta-scale archive allow us to test and develop systems which will help us to deal with approaching exa-scale challenges. The CEDA WPS package allows users to manipulate data in the archive and export the results without first downloading the data -- in some cases this can drastically reduce the data volumes which need to be transferred and greatly reduce the time needed for the scientists to get their results. Reductions in data transfer are achieved at the expense of an additional computational load imposed on the archive (or near-archive) infrastructure. This is managed with a load balancing system. Short jobs may be run in near real-time, longer jobs will be queued. When jobs are queued the user is provided with a web dashboard displaying job status. A clean split between the data manipulation software and the request management software is achieved by exploiting the extensive CDO library. This library has a long history of development to support the needs of the climate science community. Use of the library ensures that operations run on data by the system can be reproduced by users using the same operators installed on their own computers. Examples using the system deployed for the CMIP5 archive will be shown and issues which need to be addressed as archive volumes expand into the exa-scale will be discussed.
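The following sketch shows the kind of server-side operation such a WPS exposes: applying a Climate Data Operators (CDO) chain to an archived file and returning only the reduced result. The file names are placeholders; the WPS wrapper, job queue, and dashboard described above are not shown.

```python
# Sketch: run a CDO operator chain server-side so only the reduced
# result crosses the network.
import subprocess

def run_cdo(operator_chain, infile, outfile):
    """Run a CDO operator chain on an archived NetCDF file."""
    cmd = ["cdo", *operator_chain, infile, outfile]
    subprocess.run(cmd, check=True)

# Reduce a large CMIP5-style file to a single area-mean time series:
# monmean computes monthly means, fldmean collapses the spatial dimensions.
run_cdo(["-fldmean", "-monmean"], "tas_day_model_historical.nc", "tas_fldmean.nc")
```

Because users can install the same CDO operators locally, results produced by the service are reproducible on their own machines, which is the reproducibility argument the abstract makes.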
Smith, E M; Wandtke, J; Robinson, A
1999-05-01
The Medical Information, Communication and Archive System (MICAS) is a multivendor, incremental approach to a picture archiving and communications system (PACS). It is a multimodality integrated image management system that is seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor, open-system, incremental approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.
Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output
NASA Astrophysics Data System (ADS)
Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.
2017-12-01
Weather-related research often requires synthesizing vast amounts of data, which calls for archival solutions that are both economical and viable during and beyond the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) or private clouds managed by research institutions are providing object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each one containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, assessing the accuracy of the HRRR forecasts, and providing initial and boundary conditions for research simulations. The archive is accessible interactively and through automated download procedures for researchers at other institutions, which can be tailored by the user to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system storage or tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
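Extracting an individual two-dimensional grid from a large compressed file is typically done with an HTTP byte-range request against the object store, guided by a sidecar index that records the byte offset of each GRIB2 record. The URL and offsets below are placeholders for illustration.

```python
# Sketch: pull one GRIB2 record (one 2-D field) via an HTTP Range request.
import requests

GRIB_URL = "https://archive.example.edu/hrrr/20170801/hrrr.t00z.wrfsfcf00.grib2"  # assumed

def fetch_record(url, start_byte, end_byte, outfile):
    """Download a single GRIB2 record instead of the whole file."""
    headers = {"Range": f"bytes={start_byte}-{end_byte}"}
    r = requests.get(url, headers=headers, timeout=60)
    r.raise_for_status()  # expect 206 Partial Content
    with open(outfile, "wb") as f:
        f.write(r.content)

# Offsets would normally come from the GRIB index sidecar file;
# these numbers are invented for illustration.
fetch_record(GRIB_URL, 1_234_567, 2_345_678, "t2m_analysis.grib2")
```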
A generic archive protocol and an implementation
NASA Technical Reports Server (NTRS)
Jordan, J. M.; Jennings, D. G.; Mcglynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.
1992-01-01
Archiving vast amounts of data has become a major part of every scientific space mission today. The Generic Archive/Retrieval Services Protocol (GRASP) addresses the question of how to archive the data collected in an environment where the underlying hardware archives may be rapidly changing. GRASP is a device-independent specification defining a set of functions for storing and retrieving data from an archive, as well as other support functions. GRASP is divided into two levels: the Transfer Interface and the Action Interface. The Transfer Interface is computer/archive-independent code, while the Action Interface contains code dedicated to each archive/computer addressed. Implementations of the GRASP specification are currently available for DECstations running Ultrix, Sparcstations running SunOS, and microVAX/VAXstation 3100s. The underlying archive is assumed to function as a standard Unix or VMS file system. The code, written in C, is a single suite of files. Preprocessing commands define the machine-unique code sections in the device interface. The implementation was written, to the greatest extent possible, using only ANSI standard C functions.
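The two-level split is the heart of the design. GRASP itself is C with preprocessor-selected device code; the Python rendering below is only meant to show the layering, not the real API.

```python
# Sketch of a device-independent Transfer Interface delegating to
# per-archive Action Interfaces, in the spirit of GRASP's two levels.
import os
from abc import ABC, abstractmethod

class ActionInterface(ABC):
    """Archive/computer-specific operations (one implementation per device)."""
    @abstractmethod
    def store(self, name: str, data: bytes) -> None: ...
    @abstractmethod
    def retrieve(self, name: str) -> bytes: ...

class UnixFileArchive(ActionInterface):
    """Action Interface backed by an ordinary file system."""
    def __init__(self, root="/tmp/archive"):
        self.root = root
        os.makedirs(root, exist_ok=True)
    def store(self, name, data):
        with open(os.path.join(self.root, name), "wb") as f:
            f.write(data)
    def retrieve(self, name):
        with open(os.path.join(self.root, name), "rb") as f:
            return f.read()

class TransferInterface:
    """Device-independent layer; knows nothing about the underlying archive."""
    def __init__(self, action: ActionInterface):
        self.action = action
    def put(self, name, data):
        self.action.store(name, data)
    def get(self, name):
        return self.action.retrieve(name)

archive = TransferInterface(UnixFileArchive())
archive.put("dataset.fits", b"...")
print(archive.get("dataset.fits"))
```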
Assortative model for social networks
NASA Astrophysics Data System (ADS)
Catanzaro, Michele; Caldarelli, Guido; Pietronero, Luciano
2004-09-01
In this Brief Report we present a version of a network growth model, generalized in order to describe the behavior of social networks. The case study considered is the preprint archive at cul.arxiv.org. Each node corresponds to a scientist, and a link is present whenever two authors have written a paper together. This graph is a nice example of a degree-assortative network, that is to say, a network where sites with similar degree are connected to each other. The model presented is one of the few able to reproduce such behavior, giving some insight into the microscopic dynamics at the basis of the graph structure.
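Degree assortativity is directly measurable on such a coauthorship graph; the sketch below does so with NetworkX on a toy example. A positive coefficient means similar-degree nodes tend to connect, the behavior the model reproduces.

```python
# Sketch: measure degree assortativity of a small collaboration graph.
import networkx as nx

G = nx.Graph()
papers = [
    ("alice", "bob"),
    ("alice", "carol"),
    ("bob", "carol"),
    ("dave", "erin"),
    ("carol", "dave"),
]
G.add_edges_from(papers)  # nodes are authors, edges join coauthors

r = nx.degree_assortativity_coefficient(G)
print(f"degree assortativity r = {r:.3f}")
```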
75 FR 52992 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... NATIONAL ARCHIVES AND RECORDS ADMINISTRATION Agency Information Collection Activities: Proposed Collection; Comment Request AGENCY: National Archives and Records Administration (NARA). ACTION: Notice... records matter. The information will support adjustments in this offering that will improve the overall...
Espinal, Allyson C; Wang, Dan; Yan, Li; Liu, Song; Tang, Li; Hu, Qiang; Morrison, Carl D; Ambrosone, Christine B; Higgins, Michael J; Sucheston-Campbell, Lara E
2017-02-28
DNA from archival formalin-fixed and paraffin-embedded (FFPE) tissue is an invaluable resource for genome-wide methylation studies, although concerns about poor quality may limit its use. In this study, we compared DNA methylation profiles of breast tumors using DNA from fresh-frozen (FF) tissues and three types of matched FFPE samples. Restored FFPE and matched FF samples were profiled using the Illumina Infinium HumanMethylation450K platform. Methylation levels (β-values) across all loci, and across the top 100 loci previously shown to differentiate tumors by estrogen receptor status (ER+ or ER-) in a larger FF study, were compared between matched FF and FFPE samples using Pearson's correlation, hierarchical clustering, and weighted gene correlation network analysis (WGCNA). Positive predictive values and sensitivity for detecting differentially methylated loci (DML) in FF samples were calculated in an independent FFPE cohort. For 9/10 patients, correlation and unsupervised clustering analysis revealed that the FF and FFPE samples were consistently correlated with each other and clustered into distinct subgroups. Greater than 84% of the top 100 loci previously shown to differentiate ER+ and ER- tumors in FF tissues were also FFPE DML. WGCNA grouped the DML into 16 modules in FF tissue, with approximately 85% of the module membership preserved across tissue types. FFPE breast tumor samples show lower overall detection of DML than FF; however, FFPE and FF DML compare favorably. These results support the emerging consensus that the 450K platform can be employed to investigate epigenetics in large sets of archival FFPE tissues.
Greenland Ice Sheet Monitoring Network (GLISN): Contributions to Science and Society
NASA Astrophysics Data System (ADS)
Anderson, K. R.; Bonaime, S.; Clinton, J. F.; Dahl-Jensen, T.; Debski, W. M.; Giardini, D.; Govoni, A.; Kanao, M.; Larsen, T. B.; Lasocki, S.; Lee, W. S.; McCormack, D. A.; Mykkeltveit, S.; Nettles, M.; Stutzmann, E.; Strollo, A.; Sweet, J. R.; Tsuboi, S.; Vallee, M.
2017-12-01
The Greenland Ice Sheet Monitoring Network (GLISN) is a broadband, multi-use seismological network, enhanced by selected geodetic observations, designed with the capability to allow researchers to understand the changes currently occurring in the Arctic, and with the operational characteristics necessary to enable response to those changes as understanding improves. GLISN was established through an international collaboration, with 10 nations coordinating their efforts to develop the current 34-station observing network over the last eight years. All of the data collected are freely and openly available in near-real time. The network was designed to transform the community's capability for recording, analysis, and interpretation of seismic signals generated by discrete events in Greenland and the Arctic, as well as those traversing the region. Data from the network support a wide range of uses, including estimation of the properties of the solid Earth that control isostatic adjustment rates and set key boundary conditions for ice-sheet evolution; analysis of tectonic earthquakes throughout Greenland and the Arctic; study of the seismic signals associated with large calving events and changing glacier dynamics; and variations in ice and snow properties within the Greenland Ice Sheet. Recordings from the network have also provided invaluable data for rapid evaluation and understanding of the devastating landslide and tsunami that occurred near Nuugaatsiaq, Greenland, in June 2017. The GLISN strategy of maximizing data quality from a network of approximately evenly distributed stations, delivering data in near-real time, and archiving a continuous data stream easily accessible to researchers allows continuous discovery of new uses while also facilitating the generation of data products, such as catalogs of tectonic and glacial earthquakes and GPS-based estimates of snow height, that allow for assessment of change over time.
Integrating a local database into the StarView distributed user interface
NASA Technical Reports Server (NTRS)
Silberberg, D. P.
1992-01-01
A distributed user interface to the Space Telescope Data Archive and Distribution Service (DADS) known as StarView is being developed. The DADS architecture consists of the data archive as well as a relational database catalog describing the archive. StarView is a client/server system in which the user interface is the front-end client to the DADS catalog and archive servers. Users query the DADS catalog from the StarView interface. Query commands are transmitted via a network and evaluated by the database. The results are returned via the network and displayed on StarView forms. Based on the results, users decide which data sets to retrieve from the DADS archive. Archive requests are packaged by StarView and sent to DADS, which returns the requested data sets to the users. The advantages of distributed client/server user interfaces over traditional one-machine systems are well known. Since users run software on machines separate from the database, the overall client response time is much faster. Also, since the server is free to process only database requests, the database response time is much faster. Disadvantages inherent in this architecture are slow overall database access time due to network delays, the lack of a 'get previous row' command, and the fact that refinements of a previously issued query must be submitted to the database server even though the domain of values has already been returned by the previous query. This architecture also does not allow users to cross-correlate DADS catalog data with other catalogs. Clearly, a distributed user interface would be more powerful if it overcame these disadvantages. A local database is being integrated into StarView to overcome them. When a query is made through a StarView form, which is often composed of fields from multiple tables, it is translated to an SQL query and issued to the DADS catalog. At the same time, a local database table is created to contain the resulting rows of the query. The returned rows are displayed on the form as well as inserted into the local database table. Identical results are produced by reissuing the query to either the DADS catalog or the local table. Relational databases do not provide a 'get previous row' function because of the inherent complexity of retrieving previous rows of multiple-table joins. However, since this function is easily implemented on a single table, StarView uses the local table to retrieve the previous row. Also, StarView issues subsequent query refinements to the local table instead of the DADS catalog, eliminating the network transmission overhead. Finally, other catalogs can be imported into the local database for cross-correlation with local tables. Overall, we believe this is a more powerful architecture for distributed database user interfaces.
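The local-table strategy is easy to see in miniature. The sketch below uses SQLite as the local store; the remote query function is a stub, since the DADS catalog protocol is not shown in the text.

```python
# Sketch: cache remote query results in a local table so refinements and
# "previous row" navigation run locally, without network round trips.
import sqlite3

def remote_query(sql):
    """Stand-in for the network round trip to the remote catalog."""
    return [("obs001", "WFPC2", 1200.0), ("obs002", "FOC", 800.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (obs_id TEXT, instrument TEXT, exptime REAL)")

# Issue the query once over the network, mirror the rows locally.
rows = remote_query("SELECT obs_id, instrument, exptime FROM catalog WHERE ...")
conn.executemany("INSERT INTO results VALUES (?, ?, ?)", rows)

# Refinements now run against the local table -- no network delay.
refined = conn.execute(
    "SELECT obs_id FROM results WHERE exptime > ?", (1000.0,)
).fetchall()
print(refined)  # [('obs001',)]

# "Previous row" is trivial on a single local table via rowid.
prev = conn.execute("SELECT * FROM results WHERE rowid = ?", (1,)).fetchone()
print(prev)
```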
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory
1999-01-01
The Goddard Distributed Active Archive Center (DAAC), as an integral part of the Earth Observing System Data and Information System (EOSDIS), is the official source of data for several important earth remote sensing missions. These include the Sea-viewing Wide-Field-of-view Sensor (SeaWiFS) launched in August 1997, the Tropical Rainfall Measuring Mission (TRMM) launched in November 1997, and the Moderate Resolution Imaging Spectroradiometer (MODIS) scheduled for launch in mid-1999 as part of the EOS AM-1 instrumentation package. The data generated from these missions support a host of users in the hydrological, land biosphere, and oceanographic research and applications communities. The volume and nature of the data present unique challenges to an Earth science data archive and distribution system such as the DAAC. The DAAC system receives, archives, and distributes a large number of standard data products on a daily basis, including data files that have been reprocessed with updated calibration data or improved analytical algorithms. A World Wide Web interface is provided, allowing interactive data selection and automatic data subscriptions as distribution options. The DAAC also creates customized and value-added data products, which allow additional user flexibility and reduced data volume. Another significant part of our overall mission is to provide ancillary data support services and archive support for worldwide field campaigns designed to validate the results from the various satellite-derived measurements. In addition to direct data services, accompanying documentation, WWW links to related resources, support for EOSDIS data formats, and informed responses to inquiries are routinely provided to users. The current GDAAC WWW search and order system is being restructured to provide users with simplified, hierarchical access to data. Data browsers have been developed for several data sets to aid users in ordering data. These browsers allow users to specify spatial, temporal, and other parameter criteria in searching for and previewing data.
Indicators of Suicide Found on Social Networks: Phase 2
2015-10-01
Defense Personnel and Security Research Center, Defense Manpower Data Center, Technical Report 15-04, October 2015. Author: Andrée E. Rose. Approved for public distribution; distribution unlimited.
Building the Joint Battlespace Infosphere. Volume 1: Summary
1999-12-17
Integrity guarantees: the information staff will conduct audits and other routine procedures to ensure the integrity of the JBI platform and its clients. When a client deletes an object, the object is no longer available to other JBI clients, but it is retained in an archive for auditing.
Providing Web Interfaces to the NSF EarthScope USArray Transportable Array
NASA Astrophysics Data System (ADS)
Vernon, Frank; Newman, Robert; Lindquist, Kent
2010-05-01
Since April 2004 the EarthScope USArray seismic network has grown to over 850 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform-agnostic, low-latency, web-based applications. We present a framework of tools that mediate between the world wide web and the Boulder Real Time Technologies Antelope Environmental Monitoring System data acquisition and archival software. These tools provide comprehensive information to audiences ranging from network operators and geoscience researchers to funding agencies and the general public: network-wide and station-specific metadata, state-of-health metrics, event detection rates, archival data, and dynamic report generation over a station's two-year life span. Leveraging open-source website development frameworks for both the server side (Perl, Python, and PHP) and the client side (Flickr, Google Maps/Earth, and jQuery) facilitates the development of a robust, extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web standards. Typical seismic data warehouses allow online users to query and download data collected from regional networks without the scientist directly assessing data coverage and/or quality. Using a suite of web-based protocols, we have recently developed an online seismic waveform interface that directly queries and displays data from a relational database through a web browser. Using the Python interface to Datascope and the Python-based Twisted network package on the server side, and the jQuery Javascript framework on the client side to send and receive asynchronous waveform queries, we display broadband seismic data using the HTML Canvas element, accessible to anyone using a modern web browser. We are currently creating additional interface tools to create a rich-client interface for accessing and displaying seismic data that can be deployed to any system running the Antelope Real Time System. The software is freely available from the Antelope contributed code Git repository (http://www.antelopeusersgroup.org).
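As a rough illustration of the server side of such a browser waveform viewer, the sketch below serves decimated samples as JSON for an HTML canvas to draw, using only the Python standard library as a stand-in for the Twisted/Datascope stack. The station name, query parameters, and waveform loader are assumptions.

```python
# Sketch: HTTP endpoint returning decimated waveform samples as JSON.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def load_waveform(station):
    """Stub: real code would query the Antelope/Datascope archive."""
    return [i % 100 for i in range(10_000)]

class WaveformHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        station = query.get("sta", ["TA.109C"])[0]
        stride = int(query.get("stride", ["10"])[0])
        samples = load_waveform(station)[::stride]  # decimate for display
        body = json.dumps({"sta": station, "samples": samples}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), WaveformHandler).serve_forever()
```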
ASF archive issues: Current status, past history, and questions for the future
NASA Technical Reports Server (NTRS)
Goula, Crystal A.; Wales, Carl
1994-01-01
The Alaska SAR Facility (ASF) collects, processes, archives, and distributes data from synthetic aperture radar (SAR) satellites in support of scientific research. ASF has been in operation since 1991 and presently has an archive of over 100 terabytes of data. ASF is performing an analysis of its magnetic tape storage system to ensure long-term preservation of this archive. Future satellite missions could double or triple the volume of data that ASF acquires. ASF is examining the current data systems and high-volume storage, and exploring future concerns and solutions.
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process the data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to the data providers' computer systems, and away from the data consumers' systems, can improve turnaround times for data processing workflows. Middleware services were implemented to provide interactive access to archive data. The goal of this middleware development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions, instead of receiving links to large volumes of data to download and process. Exposing data and metadata through web-based services enables machine-driven queries and data interaction. Product quality information can also be integrated to enable additional filtering and subsetting. Only the reduced content required to complete an analysis is then transferred to the user.
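The sketch below shows the shape of the machine-driven request such middleware enables: ask the provider-side service for a quality-filtered subset and receive only the reduced content. The endpoint and parameter names are invented for illustration.

```python
# Sketch: request a server-side, QA-filtered subset instead of whole files.
import requests

SUBSET_ENDPOINT = "https://lpdaac.example.gov/services/subset"  # assumed

payload = {
    "product": "MOD13Q1",          # assumed product name
    "variable": "NDVI",
    "bbox": [-116.0, 43.0, -115.0, 44.0],  # lon/lat box of interest
    "time": ["2014-06-01", "2014-09-01"],
    "quality": "good_only",        # server-side QA filtering
}

r = requests.post(SUBSET_ENDPOINT, json=payload, timeout=120)
r.raise_for_status()
with open("ndvi_subset.nc", "wb") as f:
    f.write(r.content)  # only the reduced subset crosses the network
```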
Henri, C J; Cox, R D; Bret, P M
1997-08-01
This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co., Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has now been filmless for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data is now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).
Reference Model for an Open Archival Information System
NASA Technical Reports Server (NTRS)
1997-01-01
This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinite long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context, and it should promote greater vendor awareness of, and support for, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification of this document may occur. This report is therefore subject to CCSDS document management and change control procedures.
Simple, Scalable, Script-based, Science Processor for Measurements - Data Mining Edition (S4PM-DME)
NASA Astrophysics Data System (ADS)
Pham, L. B.; Eng, E. K.; Lynnes, C. S.; Berrick, S. W.; Vollmer, B. E.
2005-12-01
The S4PM-DME is the Goddard Earth Sciences Distributed Active Archive Center's (GES DAAC) web-based data mining environment. The S4PM-DME replaces the Near-line Archive Data Mining (NADM) system with a better web environment and a richer set of production rules. S4PM-DME enables registered users to submit and execute custom data mining algorithms. The S4PM-DME system uses the GES DAAC-developed Simple Scalable Script-based Science Processor for Measurements (S4PM) to automate tasks and perform the actual data processing. A web interface allows the user to access the S4PM-DME system. The user first develops a personalized data mining algorithm on his/her home platform and then uploads it to the S4PM-DME system. Algorithms in C and FORTRAN are currently supported. The user-developed algorithm is automatically audited for potential security problems before it is installed within the S4PM-DME system and made available to the user. Once the algorithm has been installed, the user can promote it to the "operational" environment. From here the user can search and order the data available in the GES DAAC archive for his/her science algorithm. The user can also set up a processing subscription, which will automatically process new data as they become available in the GES DAAC archive. The generated mined data products are then made available for FTP pickup. The benefits of using S4PM-DME are 1) reduced time spent transferring GES DAAC data to the user's system, offloading heavy network traffic, 2) reduced processing load on the user's own system, and 3) ready access to the rich and abundant ocean and atmosphere data from the MODIS and AIRS instruments available from the GES DAAC.
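The subscription idea reduces to a simple loop: poll the archive for newly available granules and run the user's installed algorithm on each. The function names below are placeholders; S4PM's actual station and work-order mechanics are not shown.

```python
# Sketch of a data-subscription poller driving a user algorithm.
import time

def new_granules(last_seen):
    """Stub: real code would query the archive inventory."""
    return []  # list of (granule_id, path) newer than last_seen

def run_user_algorithm(path):
    """Stub: invoke the audited, installed C/FORTRAN executable."""
    print("mining", path)

last_seen = None
while True:
    for granule_id, path in new_granules(last_seen):
        run_user_algorithm(path)
        last_seen = granule_id
    time.sleep(600)  # poll every ten minutes
```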
Networking of Bibliographical Information: Lessons learned for the Virtual Observatory
NASA Astrophysics Data System (ADS)
Genova, Françoise; Egret, Daniel
Networking of bibliographic information is particularly remarkable in astronomy. On-line journals, the ADS bibliographic database, SIMBAD, and NED are everyday tools for research, and provide easy navigation from one resource to another. Tables are published on line, in close collaboration with data centers. Recent developments include the links between observatory archives and the ADS, as well as the large-scale prototyping of object links between Astronomy and Astrophysics and SIMBAD, following those implemented a few years ago with New Astronomy and the International Bulletin of Variable Stars. This networking has been made possible by close collaboration between the ADS, data centers such as the CDS and NED, and the journals; this partnership is now being extended to observatory archives. Simple, de facto exchange standards, like the bibcode used to refer to a published paper, have been the key for building links and exchanging data. This partnership, in which practitioners from different disciplines agree to link their resources and to work together to define useful and usable standards, has produced a revolution in scientists' practice. It is an excellent model for the Virtual Observatory projects.
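The bibcode that underpins these links is a fixed-width 19-character key (format YYYYJJJJJVVVVMPPPPA: year, journal, volume, qualifier, page, author initial, with '.' as padding). The sketch below parses one; error handling is minimal.

```python
# Sketch: split a 19-character bibcode into its fixed-width fields.
def parse_bibcode(bibcode):
    assert len(bibcode) == 19, "bibcodes are fixed-width 19-character keys"
    return {
        "year": bibcode[0:4],
        "journal": bibcode[4:9].strip("."),
        "volume": bibcode[9:13].strip("."),
        "qualifier": bibcode[13],           # section letter or '.'
        "page": bibcode[14:18].strip("."),
        "author_initial": bibcode[18],
    }

print(parse_bibcode("1995A&A...296L...1D"))
# {'year': '1995', 'journal': 'A&A', 'volume': '296', 'qualifier': 'L',
#  'page': '1', 'author_initial': 'D'}
```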
Goldszal, A F; Brown, G K; McDonald, H J; Vucich, J J; Staab, E V
2001-06-01
In this work, we describe the digital imaging network (DIN), picture archival and communication system (PACS), and radiology information system (RIS) currently being implemented at the Clinical Center, National Institutes of Health (NIH). These systems are presently in clinical operation. The DIN is a redundant meshed network designed to address gigabit density and the expected high bandwidth requirements for image transfer and server aggregation. The PACS projected workload is 5.0 TB of new imaging data per year. Its architecture consists of a central, high-throughput Digital Imaging and Communications in Medicine (DICOM) data repository and distributed redundant array of inexpensive disks (RAID) servers employing fiber-channel technology for immediate delivery of imaging data. On-demand distribution of images and reports to clinicians and researchers is accomplished via a clustered web server. The RIS follows a client-server model and provides tools to order exams, schedule resources, retrieve and review results, and generate management reports. The RIS-hospital information system (HIS) interfaces include admissions, discharges, and transfers (ADTs)/demographics, orders, appointment notifications, physician updates, and results.
NASA Astrophysics Data System (ADS)
Smale, Alan P.
2018-06-01
The High Energy Astrophysics Science Archive Research Center (HEASARC) is NASA's primary archive for high energy astrophysics and cosmic microwave background (CMB) data, supporting the broad science goals of NASA's Physics of the Cosmos theme. It provides vital scientific infrastructure to the community by standardizing science data formats and analysis programs, providing open access to NASA resources, and implementing powerful archive interfaces. These enable multimission studies of key astronomical targets, and deliver a major cost savings to NASA and proposing mission teams in terms of a reusable science infrastructure, as well as a time savings to the astronomical community through not having to learn a new analysis system for each new mission. The HEASARC archive holdings are currently in excess of 100 TB, supporting seven active missions (Chandra, Fermi, INTEGRAL, NICER, NuSTAR, Swift, and XMM-Newton), and providing continuing access to data from over 40 missions that are no longer in operation. HEASARC scientists are also engaged with the upcoming IXPE and XARM missions, and with many other Probe, Explorer, SmallSat, and CubeSat proposing teams. Within the HEASARC, the LAMBDA CMB thematic archive provides a permanent archive for NASA mission data from WMAP, COBE, IRAS, SWAS, and a wide selection of suborbital missions and experiments, and hosts many other CMB-related datasets, tools, and resources. In this talk I will summarize the current activities of the HEASARC and our plans for the coming decade. In addition to mission support, we will expand our software and user interfaces to provide astronomers with new capabilities to access and analyze HEASARC data, and continue to work with our Virtual Observatory partners to develop and implement standards to enable improved interrogation and analysis of data regardless of wavelength regime, mission, or archive boundaries. The future looks bright for high energy astrophysics, and the HEASARC looks forward to continuing its central role in the community.
NASA's Astrophysics Data Archives
NASA Astrophysics Data System (ADS)
Hasan, H.; Hanisch, R.; Bredekamp, J.
2000-09-01
The NASA Office of Space Science has established a series of archival centers where science data acquired through its space science missions is deposited. The availability of high-quality data to the general public through these open archives enables the maximization of the science return of the flight missions. The Astrophysics Data Centers Coordinating Council, an informal collaboration of archival centers, coordinates data from five archival centers distinguished primarily by the wavelength range of the data deposited there. Data are available in FITS format. An overview of NASA's data centers and services is presented in this paper. A standard front-end interface called `Astrobrowse' is described. Other catalog browsers and tools include WISARD and AMASE, supported by the National Space Science Data Center, as well as ISAIA, a follow-on to Astrobrowse.
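Since the archives standardize on FITS, a minimal example of opening a FITS file, here with the astropy library and a placeholder file name, shows what a user receives: self-describing header/data units.

    from astropy.io import fits

    with fits.open("example_image.fits") as hdul:
        hdul.info()                     # list the HDUs (header-data units)
        header = hdul[0].header         # keyword/value metadata of the primary HDU
        data = hdul[0].data             # image or table data as a NumPy array
        print(header.get("TELESCOP", "unknown"),
              data.shape if data is not None else None)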
Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive
NASA Astrophysics Data System (ADS)
Baker, Scott; Meertens, Charles; Crosby, Christopher
2017-04-01
UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The seamless synthetic aperture radar archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observation System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structural data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE, as well as examples for reading and converting the data products into other formats. Digital object identifiers (DOIs) have been incorporated into the InSAR archive, allowing users to assign a permanent location to their processed result and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query, providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data, with particular importance for geohazards and event response.
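A minimal sketch of the pattern described, writing a grid-style dataset and attaching a DOI as file metadata with the h5py library, is shown below. The group layout and attribute name are illustrative assumptions, not the actual UNAVCO product structure.

    import h5py
    import numpy as np

    with h5py.File("insar_product.h5", "w") as f:
        # Geocoded "Grid"-style layout; intermediate groups are created as needed.
        grid = f.create_group("HDFEOS/GRIDS/unwrapped_phase")
        grid.create_dataset("phase",
                            data=np.zeros((100, 100), dtype="f4"),
                            compression="gzip")   # internal compression, an HDF5 benefit
        # Record the minted DOI as a file-level metadata attribute.
        f.attrs["digital_object_identifier"] = "10.0000/example-doi"  # hypothetical DOI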
The status of the international Halley watch
NASA Technical Reports Server (NTRS)
Newburn, Ray L., Jr.; Rahe, Juergen
1987-01-01
More than 1000 professional astronomers worldwide actually observed Halley's comet from the ground. Preliminary logs from the observers indicate that 20-40 Gbytes of data were acquired in eight professional disciplines and as much as 5 Gbytes in the amateur network. The latter will be used to fill in gaps in the Archive and to provide a visual light curve. In addition, roughly 400 Mbytes of data were taken on Comet Giacobini-Zinner. Data will be accepted for archiving until early 1989. The permanent archive will consist of a set of CD-ROMs and a set of books, publication of both to be completed by mid-1990. Data from the space missions will be included, but only on the CDs. From every indication, the ground-based effort and the space missions complemented each other beautifully, both directly in the solution of spacecraft navigation problems and indirectly in the solution of scientific problems. The major remaining concern is that scientists submit their data to the Archive before the 1989 deadline.
Landsat International Cooperators and Global Archive Consolidation
,
2016-04-07
Landsat missions have always been an important component of U.S. foreign policy, as well as science and technology policy. The program’s longstanding network of International Cooperators (ICs), which operate numerous International Ground Stations (IGS) around the world, embodies the United States’ policy of peaceful use of outer space and the worldwide dissemination of civil space technology for public benefit. Thus, the ICs provide an essential dimension to the Landsat mission. In 2010, the Landsat Global Archive Consolidation (LGAC) effort began, with the goals of consolidating the Landsat data archives of all international ground stations, making the data more accessible to the global Landsat community, and significantly increasing the frequency of observations over a given area of interest to improve scientific uses such as change detection and analysis.
NASA Technical Reports Server (NTRS)
Blakeslee, Rich; Bailey, Jeff; Koshak, Bill
1999-01-01
A four-station Advanced Lightning Direction Finder (ALDF) network was recently established in the state of Rondonia in western Brazil through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed, and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center (MSFC) in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the internet. The network will remain deployed for several years to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite, which was launched in November 1997. The measurements will also be used to investigate the relationship between the electrical, microphysical, and kinematic properties of tropical convection. In addition, the long-term observations from this network will contribute to establishing a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at NASA/MSFC are now being applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The processing methodology and the initial results from an analysis of the first 6 months of network operations will be presented.
Request redirection paradigm in medical image archive implementation.
Dragan, Dinu; Ivetić, Dragan
2012-08-01
It is widely recognized that JPEG2000 addresses key issues in medical imaging: storage, communication, sharing, remote access, interoperability, and presentation scalability. Therefore, JPEG2000 support was added to the DICOM standard in Supplement 61. Two approaches to supporting JPEG2000 medical images are explicitly defined by the DICOM standard: replacing the DICOM image format with the corresponding JPEG2000 codestream, or using the Pixel Data Provider service of DICOM Supplement 106. The latter involves a two-step retrieval of the medical image: a DICOM request and response from a DICOM server, followed by a JPIP request and response from a JPEG2000 server. We propose a novel strategy for transmission of scalable JPEG2000 images extracted from a single codestream over a DICOM network using the DICOM Private Data Element, without sacrificing system interoperability. It employs the request redirection paradigm: the DICOM request and response pass to the JPEG2000 server through the DICOM server. The paper presents a programming solution that implements the request redirection paradigm in a DICOM-transparent manner.
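The sketch below illustrates, with the pydicom library, how a retrieval pointer could ride in a Private Data Element in the spirit of the strategy above. The tag numbers, private-creator string, and JPIP URL are hypothetical, and this is not the authors' implementation.

    from pydicom.dataset import Dataset

    ds = Dataset()
    # Reserve private block (0009,0010) with a private-creator identifier.
    ds.add_new(0x00090010, "LO", "REQUEST REDIRECT")
    # Store the JPIP target in an element of that private block (VR "UR" = URI).
    ds.add_new(0x00091001, "UR", "jpip://j2k.example.org/codestream?fsiz=512,512")
    print(ds)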
Visual identification system for homeland security and law enforcement support
NASA Astrophysics Data System (ADS)
Samuel, Todd J.; Edwards, Don; Knopf, Michael
2005-05-01
This paper describes the basic configuration for a visual identification system (VIS) for Homeland Security and law enforcement support. Security and law enforcement systems with an integrated VIS will accurately and rapidly provide identification of vehicles or containers that have entered, exited or passed through a specific monitoring location. The VIS system stores all images and makes them available for recall for approximately one week. Images of alarming vehicles will be archived indefinitely as part of the alarming vehicle's or cargo container's record. Depending on user needs, the digital imaging information will be provided electronically to the individual inspectors, supervisors, and/or control center at the customer's office. The key components of the VIS are the high-resolution cameras that capture images of vehicles, lights, presence sensors, image cataloging software, and image recognition software. In addition to the cameras, the physical integration and network communications of the VIS components with the balance of the security system and client must be ensured.
Improved discovery of NEON data and samples through vocabularies, workflows, and web tools
NASA Astrophysics Data System (ADS)
Laney, C. M.; Elmendorf, S.; Flagg, C.; Harris, T.; Lunch, C. K.; Gulbransen, T.
2017-12-01
The National Ecological Observatory Network (NEON) is a continental-scale ecological observation facility sponsored by the National Science Foundation and operated by Battelle. NEON supports research on the impacts of invasive species, land use change, and environmental change on natural resources and ecosystems by gathering and disseminating a full suite of observational, instrumented, and airborne datasets from field sites across the U.S. NEON also collects thousands of samples from soil, water, and organisms every year, and partners with numerous institutions to analyze and archive samples. We have developed numerous new technologies to support processing and discovery of this highly diverse collection of data. These technologies include applications for data collection and sample management, processing pipelines specific to each collection system (field observations, installed sensors, and airborne instruments), and publication pipelines. NEON data and metadata are discoverable and downloadable via both a public API and data portal. We solicit continued engagement and advice from the informatics and environmental research communities, particularly in the areas of data versioning, usability, and visualization.
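As a small illustration of the public API mentioned above, the Python sketch below lists data products with the requests library. The endpoint follows NEON's documented v0 pattern, but the exact URL and response fields should be treated as assumptions that may have changed.

    import requests

    resp = requests.get("https://data.neonscience.org/api/v0/products", timeout=30)
    resp.raise_for_status()
    products = resp.json()["data"]          # list of data-product descriptions
    for p in products[:5]:
        print(p["productCode"], "-", p["productName"])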
Design of real-time encryption module for secure data protection of wearable healthcare devices.
Kim, Jungchae; Lee, Byuck Jin; Yoo, Sun K
2013-01-01
Wearable devices for biomedical instrumentation can generate medical data and transmit it to a repository on a cloud service through wireless networks. In this process, the private medical data could be disclosed by a man-in-the-middle attack. Moreover, the archived data for healthcare services are otherwise protected only by the non-standardized security policies of each healthcare service provider (HSP), because HIPAA defines only general security rules. In this paper, we adopted the Advanced Encryption Standard (AES) for a security framework on wearable devices, so that healthcare applications using this framework can support confidentiality easily. The framework was developed as a dynamically loadable module targeted at lightweight microcontrollers, such as the MSP430, within an embedded operating system. Performance tests showed that the module can support real-time encryption of electrocardiogram and photoplethysmogram signals. In this way, the processing load for enabling security is distributed to the wearable devices, and a customized data protection method can be composed by the HSP for a trusted healthcare service.
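To make the confidentiality layer concrete, the sketch below encrypts a biosignal buffer with AES in counter mode using the Python cryptography package. The paper's module runs on an MSP430 in C, so this is an illustration of the technique, not the authors' code; counter mode is shown because it needs no padding and suits streaming samples.

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)                    # 128-bit AES key, shared with the HSP out of band
    nonce = os.urandom(16)                  # per-session counter-mode nonce
    ecg_samples = bytes(range(64))          # stand-in for a raw ECG/PPG sample buffer

    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ciphertext = encryptor.update(ecg_samples) + encryptor.finalize()

    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    assert decryptor.update(ciphertext) + decryptor.finalize() == ecg_samples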
NASA Technical Reports Server (NTRS)
White, Nicholas (Technical Monitor); Murray, Stephen S.
2003-01-01
(1) Chandra Archive: SAO has maintained the interfaces through which HEASARC gains access to the Chandra Data Archive. At HEASARC's request, we have implemented an anonymous ftp copy of a major part of the public archive, and we keep that archive up-to-date. SAO has participated in the ADEC interoperability working group, establishing guidelines for interoperability standards and prototyping such interfaces. We have provided an NVO-based prototype interface, intended to serve the HEASARC-led NVO demo project. HEASARC's Astrobrowse interface was maintained and updated. In addition, we have participated in design discussions surrounding HEASARC's Caldb project. We have attended the HEASARC Users Group meeting and presented CDA status and developments. (2) Chandra CALDB: SAO has maintained and expanded the Chandra CALDB by including four new data file types, defining the corresponding CALDB keyword/identification structures. We have provided CALDB upgrades for the public (CIAO) and for Standard Data Processing. Approximately 40 new files have been added to the CALDB in these version releases. There have been ten of these CALDB upgrades in the past year, each with a unique index configuration. In addition, with inputs from software, archive, and calibration scientists, as well as CIAO/SDP software developers, we have defined a generalized expansion of the existing CALDB interface and indexing structure. The purpose of this is to make the CALDB more generally applicable and useful in new and future missions that will be supported archivally by HEASARC. The generalized interface will identify additional configurational keywords and permit more extensive calibration parameter and boundary condition specifications for unique file selection. HEASARC scientists and developers from SAO and GSFC have become involved in this work, which is expected to produce a new interface for general use within the current year. (3) DS9: One of the decisions that came from last year's HEADCC meeting was to make the ds9 image display program the primary vehicle for displaying line graphics (as well as images). The first step required to make this possible was to enhance the line graphics capabilities of ds9. SAO therefore spent considerable effort upgrading ds9 to use Tcl 8.4 so that the BLT line graphics package could be built and imported into ds9 from source code, rather than from a pre-built (and generally outdated) shared library. This task, which is nearly complete, allows us to extend BLT as needed for the HEAD community. Following HEADCC discussion concerning archiving and the display of archived data, we extended ds9 to support full access to many astronomical Web-based archive sites, including HEASARC, MAST, CHANDRA, SKYVIEW, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. Using ds9's new internal Web access capabilities, these archives can be accessed via their Web pages. FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed directly and easily in ds9. For more information, see: http://hea-www.harvard.edu/saord/ds9. Also, after the HEADCC discussion concerning region filtering, we extended the Funtools sample implementation of region filtering as described at: http://hea-www.harvard.edu/saord/funtools/regions.html. In particular, we added several new composite regions for event and image filtering, including elliptical and box annuli. We also extended the panda (Pie AND Annulus) region support to include box pandas and elliptical pandas.
These new composite regions are especially useful in programs that need to count photons in each separate region using only a single pass through the data. Support for these new regions was added to ds9. In the same vein, we developed new region support for filtering images using simple FITS image masks, i.e., 8-bit or 16-bit FITS images where the value of a pixel is the region id number for that pixel. Other important enhancements to DS9 this year include support for multiple world coordinate systems, three-dimensional event file binning, image smoothing, region groups and tags, the ability to save images in a number of image formats (such as JPEG, TIFF, PNG, and FITS), improvements in support for integrating external analysis tools, and support for the virtual observatory. In particular, a full-featured web browser has been implemented within DS9. This provides support for full access to HEASARC archive sites such as SKYVIEW and W3BROWSE, in addition to other astronomical archive sites such as MAST, CHANDRA, ADS, NED, SIMBAD, IRAS, NVRO, SAO TDC, and FIRST. From within DS9, the archives can be searched, and FITS images, plots, spectra, and journal abstracts can be referenced, downloaded, and displayed. The web browser provides the basis for the built-in help facility. All DS9 documentation, including the reference manual, FAQ, Known Features, and contact information, is now available to the user without the need for external display applications. New versions of DS9 may be downloaded and installed using this facility. Two important features used in the analysis of high-energy astronomical data have been implemented in the past year. The first is support for binning photon event data in three dimensions. By binning the third dimension in time or energy, users are easily able to detect variable x-ray sources and identify other physical properties of their data. Second, a number of fast smoothing algorithms have been implemented in DS9, which allow users to smooth their data in real time. Algorithms for boxcar, tophat, and Gaussian smoothing are supported.
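The three-dimensional event binning mentioned above amounts to a single histogram pass over the photon list. A sketch with synthetic NumPy data:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    x = rng.uniform(0, 512, n)          # detector/sky x coordinate of each photon
    y = rng.uniform(0, 512, n)          # detector/sky y coordinate
    t = rng.uniform(0, 1000.0, n)       # photon arrival time in seconds

    cube, edges = np.histogramdd(
        np.column_stack([x, y, t]),
        bins=(64, 64, 20))              # a 64x64 image with 20 time slices

    print(cube.shape)                   # (64, 64, 20)

Slicing the cube along the third axis yields a per-pixel light curve, which is how variable x-ray sources become visible.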
First Light for ASTROVIRTEL Project
NASA Astrophysics Data System (ADS)
2000-04-01
Astronomical data archives increasingly resemble virtual gold mines of information. A new project, known as ASTROVIRTEL ("Accessing Astronomical Archives as Virtual Telescopes"), aims to exploit these astronomical treasure troves by allowing scientists to use the archives as virtual telescopes. The competition for observing time on large space- and ground-based observatories such as the ESA/NASA Hubble Space Telescope and the ESO Very Large Telescope (VLT) is intense. On average, less than a quarter of applications for observing time are successful. The fortunate scientist who obtains observing time usually has one year of so-called proprietary time to work with the data before they are made publicly accessible and can be used by other astronomers. Precious data from these large research facilities retain their value far beyond their first birthday and may still be useful decades after they were first collected. The enormous quantity of valuable astronomical data now stored in the archives of the European Southern Observatory (ESO) and the Space Telescope-European Coordinating Facility (ST-ECF) is increasingly attracting the attention of astronomers. Scientists are aware that one set of observations can serve many different scientific purposes, including some that were not considered at all when the observations were first made. The project is supported by the European Commission (EC) within the "Access to Research Infrastructures" action under the "Improving Human Potential & the Socio-economic Knowledge Base" programme of the EU Fifth Framework Programme. ASTROVIRTEL has been established on behalf of the European Space Agency (ESA) and the European Southern Observatory (ESO) in response to rapid developments currently taking place in the fields of telescope and detector construction, computer hardware, data processing, archiving, and telescope operation. Nowadays astronomical telescopes can image increasingly large areas of the sky. They use more and more different instruments and are equipped with ever-larger detectors. The quantity of astronomical data collected is rising dramatically, generating a corresponding increase in potentially interesting research projects. These large collections of valuable data have led to the useful concept of "data mining", whereby large astronomical databases are exploited to support original research. However, it has become obvious that scientists need additional support to cope efficiently with the massive amounts of data available and so to exploit the true potential of the databases. ASTROVIRTEL is the first virtual astronomical telescope dedicated to data mining. It is currently being established at the joint ESO/Space Telescope-European Coordinating Facility Archive in Garching (Germany). Scientists from EC member countries and associated states will be able to apply for support for a scientific project based on access to and analysis of data from the Hubble Space Telescope (HST), Very Large Telescope (VLT), New Technology Telescope (NTT), and Wide Field Imager (WFI) archives, as well as a number of other related archives, including the Infrared Space Observatory (ISO) archive. Scientists will be able to visit the archive site and collaborate with the archive specialists there.
Special software tools that incorporate advanced methods for exploring the enormous quantities of information available will be developed. The project coordinator, Piero Benvenuti, Head of the ST-ECF, elaborates on the advantages of ASTROVIRTEL: "The observations by the ESA/NASA Hubble Space Telescope and, more recently, by the ESO Very Large Telescope have already been made available on-line to the astronomical community, once the proprietary period of one year has elapsed. ASTROVIRTEL is different, in that astronomers are now invited to regard the archive as an 'observatory' in its own right: a facility that, when properly used, may provide an answer to their specific scientific questions. The architecture of the archives as well as their suite of software tools may have to evolve to respond to the new demand. ASTROVIRTEL will try to drive this evolution on the basis of the scientific needs of its users." Peter Quinn, the Head of ESO's Data Management and Operations Division, is of the same opinion: "The ESO/HST Archive Facility at ESO Headquarters in Garching is currently the most rapidly growing astronomical archive resource in the world. This archive is projected to contain more than 100 Terabytes (100,000,000,000,000 bytes) of data within the next four years. The software and hardware technologies for the archive will be jointly developed and operated by ESA and ESO staff and will be common to both HST and ESO data archives. The ASTROVIRTEL project will provide us with real examples of scientific research programs that will push the capabilities of the archive and allow us to identify and develop new software tools for data mining. The growing archive facility will provide the European astronomical community with new digital windows on the Universe." Note [1]: This is a joint press release by the European Southern Observatory (ESO) and the Space Telescope European Coordinating Facility (ST-ECF). More information about ASTROVIRTEL can be found at the dedicated website: http://www.stecf.org/astrovirtel. The European Southern Observatory (ESO) is an intergovernmental organisation supported by eight European countries: Belgium, Denmark, France, Germany, Italy, The Netherlands, Sweden and Switzerland. The European Space Agency is an intergovernmental organisation supported by 15 European countries: Austria, Belgium, Denmark, Finland, France, Germany, Ireland, Italy, Netherlands, Norway, Portugal, Spain, Sweden, Switzerland and the United Kingdom. The Space Telescope European Coordinating Facility (ST-ECF) is a co-operation between the European Space Agency and the European Southern Observatory. The Hubble Space Telescope (HST) is a project of international co-operation between NASA and ESA.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC). The TDA Office also performs work funded by other NASA program offices through and with the cooperation of OSC. Finally, tasks funded under the JPL Director's Discretionary Fund and the Caltech President's Fund that involve the TDA Office are included.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1991-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN). Also included is standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. In the search for extraterrestrial intelligence (SETI), 'The TDA Progress Report' reports on implementation and operations for searching the microwave spectrum. In solar system radar, it reports on the uses of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets. In radio astronomy, the areas of support include spectroscopy, very long baseline interferometry, and astrometry.
Digital data preservation for scholarly publications in astronomy
NASA Astrophysics Data System (ADS)
Choudhury, Sayeed; di Lauro, Tim; Szalay, Alex; Vishniac, Ethan; Hanisch, Robert; Steffen, Julie; Milkey, Robert; Ehling, Teresa; Plante, Ray
2007-11-01
Astronomy is similar to other scientific disciplines in that scholarly publication relies on the presentation and interpretation of data. But although astronomy now has archives for its primary research telescopes and associated surveys, the highly processed data that is presented in the peer-reviewed journals and is the basis for final analysis and interpretation is generally not archived and has no permanent repository. We have initiated a project whose goal is to implement an end-to-end prototype system which, through a partnership of a professional society, that society's scholarly publications/publishers, research libraries, and an information technology substrate provided by the Virtual Observatory, will capture high-level digital data as part of the publication process and establish a distributed network of curated, permanent data repositories. The data in this network will be accessible through the research journals, astronomy data centers, and Virtual Observatory data discovery portals.
Image storage in radiation oncology: What did we learn from diagnostic radiology?
NASA Astrophysics Data System (ADS)
Blodgett, Kurt; Luick, Marc; Colonias, Athanasios; Gayou, Olivier; Karlovits, Stephen; Werts, E. Day
2009-02-01
The Digital Imaging and Communications in Medicine (DICOM) standard was developed by the National Electrical Manufacturers Association (NEMA) and the American College of Radiology (ACR) for medical image archiving and retrieval. An extension to this standard, named DICOM-RT, was implemented for use in radiation oncology. There are currently seven radiotherapy-specific DICOM objects: RT Structure Set, RT Plan, RT Dose, RT Image, RT Beams Treatment Record, RT Brachy Treatment Record, and RT Treatment Summary Record. The types of data associated with DICOM-RT include (1) radiation treatment planning datasets (CT, MRI, PET) with radiation treatment plans showing beam arrangements, isodose distributions, and dose volume histograms of targets/normal tissues, and (2) image-guided radiation modalities such as Siemens MVision mega-voltage cone beam CT (MV-CBCT). With the advent of such advancing technologies, there has been an exponential increase in the image data collected for each patient, and the need for reliable and accessible image storage has become critical. A potential solution is a radiation-oncology-specific picture archiving and communication system (PACS) that would allow data storage from multiple vendor devices and support the storage and retrieval needs not only of a single site but of a large, multi-facility network of radiation oncology clinics. This PACS system must be reliable, expandable, and cost-effective to operate while protecting sensitive patient image information in a Health Insurance Portability and Accountability Act (HIPAA)-compliant environment. This paper emphasizes the expanding DICOM-RT storage requirements across our network of 8 radiation oncology clinics and the initiatives we undertook to address the increased volume of data by using the ImageGrid (CANDELiS Inc., Irvine, CA) server and the IGViewer license (CANDELiS Inc., Irvine, CA) to create a DICOM-RT compatible PACS system.
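For illustration, the sketch below inspects one of the seven RT objects, an RT Plan, with the pydicom library; the file name is a placeholder, and the attributes shown are standard RT Plan fields.

    import pydicom

    ds = pydicom.dcmread("rtplan.dcm")           # placeholder file name
    assert ds.Modality == "RTPLAN"               # one of the seven RT-specific objects
    print(ds.RTPlanLabel)                        # plan identifier
    for beam in ds.BeamSequence:                 # the beam arrangement described above
        print(beam.BeamNumber, beam.BeamName, beam.TreatmentMachineName)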
Oceans 2.0 API: Programmatic access to Ocean Networks Canada's sensor data.
NASA Astrophysics Data System (ADS)
Heesemann, M.; Ross, R.; Hoeberechts, M.; Pirenne, B.; MacArthur, M.; Jeffries, M. A.; Morley, M. G.
2017-12-01
Ocean Networks Canada (ONC) is a not-for-profit society that operates and manages innovative cabled observatories on behalf of the University of Victoria. These observatories supply continuous power and Internet connectivity to various scientific instruments located in coastal, deep-ocean, and Arctic environments. The data from the instruments are relayed to the University of Victoria, where they are archived, quality-controlled, and made freely available to researchers, educators, and the public. The Oceans 2.0 data management system currently contains over 500 terabytes of data collected over 11 years from thousands of sensors. In order to facilitate access to the data, particularly for large datasets and long time series of high-resolution data, a project was started in 2016 to create a comprehensive Application Programming Interface, the "Oceans 2.0 API," to provide programmatic access to all ONC data products. The development is part of a project entitled "A Research Platform for User-Defined Oceanographic Data Products," funded through CANARIE, a Canadian organization responsible for the design and delivery of digital infrastructure for research, education, and innovation [1]. Providing quick and easy access to ONC data products from within custom software solutions allows researchers, modelers, and decision makers to focus on what is important: solving their problems, answering their questions, and making informed decisions. In this paper, we discuss how to access ONC's vast archive of data programmatically through the Oceans 2.0 API, covering in particular: access to ONC data products; access to ONC sensor data in near real-time; programming language support; and use cases. References: [1] CANARIE. Internet: https://www.canarie.ca/; accessed March 6, 2017.
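A small sketch of a programmatic call, using Python's requests library against the ONC locations discovery service, is shown below. The endpoint and parameter names follow ONC's published REST pattern but are assumptions here, and a real user token is required.

    import requests

    resp = requests.get(
        "https://data.oceannetworks.ca/api/locations",
        params={"method": "get", "token": "YOUR_ONC_TOKEN"},   # placeholder token
        timeout=30,
    )
    resp.raise_for_status()
    for loc in resp.json()[:5]:
        print(loc["locationCode"], "-", loc["locationName"])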
The European Radiobiology Archives (ERA)--content, structure and use illustrated by an example.
Gerber, G B; Wick, R R; Kellerer, A M; Hopewell, J W; Di Majo, V; Dudoignon, N; Gössner, W; Stather, J
2006-01-01
The European Radiobiology Archives (ERA), supported by the European Commission and the European Late Effect Project Group (EULEP), together with the US National Radiobiology Archives (NRA) and the Japanese Radiobiology Archives (JRA), have collected all information still available on long-term animal experiments, including some selected human studies. The archives consist of a database in Microsoft Access, a website, databases of references, and information on the use of the database. At present, the archives contain a description of the exposure conditions, animal strains, etc. for approximately 350,000 individuals; data on survival and pathology are available for approximately 200,000 individuals. Care has been taken to render pathological diagnoses compatible among different studies and to allow the lumping of pathological diagnoses into more general classes. 'Forms' in Access, with underlying computer code, facilitate the use of the database. This paper describes the structure and content of the archives and illustrates a possible analysis of such data with an example.
NASA's Planetary Data System: Support for the Delivery of Derived Data Sets at the Atmospheres Node
NASA Astrophysics Data System (ADS)
Chanover, Nancy J.; Beebe, Reta; Neakrase, Lynn; Huber, Lyle; Rees, Shannon; Hornung, Danae
2015-11-01
NASA’s Planetary Data System is charged with archiving electronic data products from NASA planetary missions that are sponsored by NASA’s Science Mission Directorate. This archive, currently organized by science disciplines, uses standards for describing and storing data that are designed to enable future scientists who are unfamiliar with the original experiments to analyze the data, and to do this using a variety of computer platforms, with no additional support. These standards address the data structure, description contents, and media design. The new requirement in the NASA ROSES-2015 Research Announcement to include a Data Management Plan will result in an increase in the number of derived data sets that are being delivered to the PDS. These data sets may come from the Planetary Data Archiving, Restoration and Tools (PDART) program, other Data Analysis Programs (DAPs) or be volunteered by individuals who are publishing the results of their analysis. In response to this increase, the PDS Atmospheres Node is developing a set of guidelines and user tools to make the process of archiving these derived data products more efficient. Here we provide a description of Atmospheres Node resources, including a letter of support for the proposal stage, a communication schedule for the planned archive effort, product label samples and templates in extensible markup language (XML), documentation templates, and validation tools necessary for producing a PDS4-compliant derived data bundle(s) efficiently and accurately.
He, Longjun; Ming, Xing; Liu, Qian
2014-04-01
With computing capability and display size growing, mobile devices have come to be used as tools to help clinicians view patient information and medical images anywhere and anytime. However, for direct interactive 3D visualization, which plays an important role in radiological diagnosis, the mobile device alone cannot provide a satisfactory quality of experience for radiologists. This paper presents a medical system that retrieves medical images from the picture archiving and communication system (PACS) on a mobile device over the wireless network. In the proposed application, the mobile device obtains patient information and medical images through a proxy server connected to the PACS server. The proxy server integrates a range of 3D visualization techniques, including maximum intensity projection, multi-planar reconstruction, and direct volume rendering, to provide the shape, brightness, depth, and location information generated from the original sectional images to radiologists. Furthermore, an algorithm that changes remote render parameters automatically to adapt to the network status is employed to improve the quality of experience. Finally, performance issues regarding the remote 3D visualization of medical images over the wireless network are also discussed. The results demonstrate that the proposed medical application can provide a smooth interactive experience over WLAN and 3G networks.
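The adaptive-parameter idea reduces to a feedback loop: measure the per-frame round-trip time and step the requested render resolution down when the network is congested and back up when it recovers. The sketch below is a generic illustration with invented thresholds, not the paper's algorithm.

    RESOLUTIONS = [(256, 256), (384, 384), (512, 512)]  # coarse -> fine

    def adapt_resolution(current_idx: int, rtt_ms: float,
                         slow_ms: float = 200.0, fast_ms: float = 80.0) -> int:
        """Return the index of the resolution to request for the next frame."""
        if rtt_ms > slow_ms and current_idx > 0:
            return current_idx - 1          # network congested: render smaller frames
        if rtt_ms < fast_ms and current_idx < len(RESOLUTIONS) - 1:
            return current_idx + 1          # network fast: restore image quality
        return current_idx

    idx = 2
    for rtt in [60, 250, 300, 90, 70]:      # simulated per-frame round-trip times (ms)
        idx = adapt_resolution(idx, rtt)
        print(rtt, "ms ->", RESOLUTIONS[idx])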
NASA Astrophysics Data System (ADS)
Smirnov, Alexander; Petrenko, Maksym; Ichoku, Charles; Holben, Brent N.
2017-10-01
The paper reports on the current status of the Maritime Aerosol Network (MAN), which is a component of the Aerosol Robotic Network (AERONET). A public-domain, web-based data archive dedicated to MAN activity can be found at https://aeronet.gsfc.nasa.gov/new_web/maritime_aerosol_network.html . Since 2006, over 450 cruises have been completed, and the data archive consists of more than 6000 measurement days. In this work, we present MAN observations collocated with MODIS Terra, MODIS Aqua, MISR, POLDER, SeaWiFS, OMI, and CALIOP spaceborne aerosol products using a modified version of the Multi-Sensor Aerosol Products Sampling System (MAPSS) framework. Because of the different spatio-temporal characteristics of the analyzed products, the number of MAN data points collocated with spaceborne retrievals varied between 1500 matchups for MODIS and 39 for CALIOP (as of August 2016). Despite these unavoidable sampling biases, latitudinal dependencies of AOD differences for all satellite sensors, except for SeaWiFS and POLDER, showed positive biases against ground truth (i.e., MAN) in the southern latitudes (<50° S), and substantial scatter in the Northern Atlantic "dust belt" (5°-15° N). Our analysis did not intend to determine whether satellite retrievals are within the claimed uncertainty boundaries, but rather to show where bias exists and corrections are needed.
The Monterey Ocean Observing System Development Program
NASA Astrophysics Data System (ADS)
Chaffey, M.; Graybeal, J. B.; O'Reilly, T.; Ryan, J.
2004-12-01
The Monterey Bay Aquarium Research Institute (MBARI) has a major development program underway to design, build, test, and apply technology suitable for deep-ocean observatories. The Monterey Ocean Observing System (MOOS) program is designed to form a large-scale instrument network that provides generic interfaces, intelligent instrument support, data archiving, and near-real-time interaction for observatory experiments. The MOOS mooring system is designed as a portable, surface-mooring-based seafloor observatory that provides data and power connections to both seafloor and ocean-surface instruments through a specialty anchor cable. The surface mooring collects solar and wind energy for powering instruments and transmits data to shore-side researchers using a satellite communications modem. The use of a high-modulus anchor cable to reach seafloor instrument networks is a high-risk development effort that is critical to the overall success of the portable observatory concept. An aggressive field test program off the California coast is underway to improve anchor cable constructions as well as to test the overall end-to-end system design. The overall MOOS observatory system view is presented, and the results of our field tests completed to date are summarized.
ROSETTA: How to archive more than 10 years of mission
NASA Astrophysics Data System (ADS)
Barthelemy, Maud; Heather, D.; Grotheer, E.; Besse, S.; Andres, R.; Vallejo, F.; Barnes, T.; Kolokolova, L.; O'Rourke, L.; Fraga, D.; A'Hearn, M. F.; Martin, P.; Taylor, M. G. G. T.
2018-01-01
The Rosetta spacecraft was launched in 2004 and, after several planetary and two asteroid fly-bys, arrived at comet 67P/Churyumov-Gerasimenko in August 2014. After escorting the comet for two years and executing its scientific observations, the mission ended on 30 September 2016 with a touchdown on the comet's surface. This paper describes how the Planetary Science Archive (PSA) and the Planetary Data System - Small Bodies Node (PDS-SBN) worked with the Rosetta instrument teams to prepare the science data collected over the course of the Rosetta mission for inclusion in the science archive. As Rosetta is an international mission in collaboration between ESA and NASA, all science data from the mission are fully archived within both the PSA and the PDS. The Rosetta archiving process, supporting tools, archiving systems, and their evolution throughout the mission are described, along with a discussion of a number of the challenges faced during the Rosetta implementation. The paper then presents the current status of the archive for each of the science instruments, before looking to the improvements planned both for the archive itself and for the Rosetta data content. The lessons learned from the first 13 years of archiving on Rosetta are finally discussed, with the aim of helping future missions plan and implement their science archives.
Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?
NASA Astrophysics Data System (ADS)
Asadzadeh, M.; Sahraei, S.
2016-12-01
Multi-objective optimization (MO) aids in supporting the decision-making process in water resources engineering and design problems. One of the main goals of solving an MO problem is to archive a set of solutions that is well-distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process is computationally expensive in problems that have quick-to-evaluate objective functions, where archiving can come to dominate the total run time. This research explores the applicability of a similar but computationally more efficient approach to respect the desired precision level of all objectives in the solution archiving process. In this alternative approach, each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions, which already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions when solving mathematical test problems and hydrologic model calibration problems.
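The two approaches can be contrasted in a few lines of Python, for minimization problems. The epsilon archive keys solutions by their grid cell (floor of each objective over epsilon) and keeps at most one per cell; the alternative rounds objectives to the desired precision first and then applies a plain non-dominated filter. This is an illustrative sketch, not the authors' implementation.

    from math import floor

    def box(fs, eps):
        return tuple(floor(f / e) for f, e in zip(fs, eps))

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def eps_archive(solutions, eps):
        """Keep at most one non-dominated solution per epsilon grid cell."""
        archive = {}
        for fs in solutions:
            b = box(fs, eps)
            if not any(dominates(box(kept, eps), b) for kept in archive.values()):
                archive = {k: v for k, v in archive.items()
                           if not dominates(b, box(v, eps))}
                archive[b] = fs   # within an occupied cell a tie-break rule would apply
        return list(archive.values())

    def rounded_archive(solutions, eps):
        """Round objectives first, then apply an ordinary non-dominated filter."""
        rounded = [tuple(round(f / e) * e for f, e in zip(fs, eps)) for fs in solutions]
        return [fs for fs, r in zip(solutions, rounded)
                if not any(dominates(q, r) for q in rounded if q != r)]

    pts = [(0.12, 0.91), (0.13, 0.90), (0.55, 0.40), (0.56, 0.41)]
    print(eps_archive(pts, (0.05, 0.05)))
    print(rounded_archive(pts, (0.05, 0.05)))

The rounding variant avoids the per-cell bookkeeping of the epsilon archive, which is the efficiency gain the study investigates.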
Alygizakis, Nikiforos A; Samanipour, Saer; Hollender, Juliane; Ibáñez, María; Kaserzon, Sarit; Kokkali, Varvara; van Leerdam, Jan A; Mueller, Jochen F; Pijnappels, Martijn; Reid, Malcolm J; Schymanski, Emma L; Slobodnik, Jaroslav; Thomaidis, Nikolaos S; Thomas, Kevin V
2018-05-01
A key challenge in the environmental and exposure sciences is to establish experimental evidence of the role of chemical exposure in human and environmental systems. High resolution and accurate tandem mass spectrometry (HRMS) is increasingly being used for the analysis of environmental samples. One lauded benefit of HRMS is the possibility to retrospectively process data for (previously omitted) compounds that has led to the archiving of HRMS data. Archived HRMS data affords the possibility of exploiting historical data to rapidly and effectively establish the temporal and spatial occurrence of newly identified contaminants through retrospective suspect screening. We propose to establish a global emerging contaminant early warning network to rapidly assess the spatial and temporal distribution of contaminants of emerging concern in environmental samples through performing retrospective analysis on HRMS data. The effectiveness of such a network is demonstrated through a pilot study, where eight reference laboratories with available archived HRMS data retrospectively screened data acquired from aqueous environmental samples collected in 14 countries on 3 different continents. The widespread spatial occurrence of several surfactants (e.g., polyethylene glycols (PEGs) and C12AEO-PEGs), transformation products of selected drugs (e.g., gabapentin-lactam, metoprolol-acid, carbamazepine-10-hydroxy, omeprazole-4-hydroxy-sulfide, and 2-benzothiazole-sulfonic-acid), and industrial chemicals (3-nitrobenzenesulfonate and bisphenol-S) was revealed. Obtaining identifications of increased reliability through retrospective suspect screening is challenging, and recommendations for dealing with issues such as broad chromatographic peaks, data acquisition, and sensitivity are provided.
The Italian National Seismic Network
NASA Astrophysics Data System (ADS)
Michelini, Alberto
2016-04-01
The Italian National Seismic Network is composed of about 400 stations, mainly broadband, installed in the country and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center), CNT, has installed and operates most of these stations, although a considerable number of stations contributing to the INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite, wireless connections, and wired lines. The SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near-real-time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real-time, hypocenter parameters, which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation describes the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution, and specialized products.
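Quality-checked continuous data archived at the INGV EIDA node can be fetched programmatically, for example with ObsPy's FDSN client as sketched below. The station code is an example and availability varies, so treat the exact request as an assumption.

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("INGV")                     # INGV is a registered FDSN data centre
    t0 = UTCDateTime("2016-01-01T00:00:00")
    st = client.get_waveforms(network="IV",     # Italian National Seismic Network code
                              station="BDI",    # example station; availability varies
                              location="*", channel="HHZ",
                              starttime=t0, endtime=t0 + 600)
    print(st)                                   # the returned 10-minute Trace objects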
Development and implementation of ultrasound picture archiving and communication system
NASA Astrophysics Data System (ADS)
Weinberg, Wolfram S.; Tessler, Franklin N.; Grant, Edward G.; Kangarloo, Hooshang; Huang, H. K.
1990-08-01
The Department of Radiological Sciences at the UCLA School of Medicine is developing an archiving and communication system (PACS) for digitized ultrasound images. In its final stage the system will involve the acquisition and archiving of ultrasound studies from four different locations, including the Center for Health Sciences, the Department for Mental Health, and the Outpatient Radiology and Endoscopy Departments, with a total of 200-250 patient studies per week. The concept comprises two stages of image manipulation for each ultrasound work area. The first station is located close to the examination site; it accommodates the acquisition of digital images from up to five ultrasound devices and provides for instantaneous display, primary viewing, and image selection. Completed patient studies are transferred to a main workstation for secondary review, further analysis, and comparison studies. The review station has an on-line storage capacity of 10,000 images with a resolution of 512x512 8-bit data, allowing immediate retrieval of active patient studies for up to two weeks. The main workstations are connected through the general network and use one central archive for long-term storage and a film printer for hardcopy output. First-phase development efforts concentrate on the implementation and testing of a system at one location, consisting of a number of ultrasound units with video digitizers and network interfaces, and a microcomputer workstation as host for the display station with two color monitors, each allowing simultaneous display of four 512x512 images. The discussion emphasizes functionality, performance, and acceptance of the system in the clinical environment.
NASA Astrophysics Data System (ADS)
Schneider, Uwe; Strack, Ruediger
1992-04-01
apART reflects the structure of an open, distributed environment. In keeping with the general trend in imaging, network-capable, general-purpose workstations with open-system image communication and image input capabilities are used. Several heterogeneous components, such as CCD cameras, slide scanners, and image archives, can be accessed. The system is driven by an object-oriented user interface in which devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.
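The audit-trail mechanism is essentially deferred execution: operations are recorded while the user works on a low-resolution derivative and replayed later on the original. A generic Python sketch of the pattern, with invented names and toy operations:

    import numpy as np

    class AuditTrail:
        """Record interactive operations for later off-line replay."""
        def __init__(self):
            self.ops = []                        # ordered list of (name, function)

        def record(self, name, fn):
            self.ops.append((name, fn))
            return fn

        def replay(self, image):
            for _name, fn in self.ops:           # apply in recorded order
                image = fn(image)
            return image

    def invert(im):
        return 1.0 - im

    def stretch(im):
        span = (im.max() - im.min()) or 1.0      # avoid dividing by zero
        return (im - im.min()) / span

    trail = AuditTrail()
    preview = np.random.rand(64, 64)             # low-resolution derivative
    original = np.random.rand(4096, 4096)        # extremely high-resolution raster

    # Interactive phase: operations are tried out cheaply on the preview.
    for name, fn in [("invert", invert), ("stretch", stretch)]:
        preview = trail.record(name, fn)(preview)

    # Off-line phase: the same operations are replayed on the original.
    result = trail.replay(original)
    print(result.shape)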
DTS: The NOAO Data Transport System
NASA Astrophysics Data System (ADS)
Fitzpatrick, M.; Semple, T.
2014-05-01
The NOAO Data Transport System (DTS) provides high-throughput, reliable data transfer between telescopes, pipelines, and archive centers located in the Northern and Southern hemispheres. It is a distributed application using XML-RPC for command and control, and either parallel-TCP or UDT protocols for bulk data transport. The system is data-agnostic, allowing arbitrary files or directories to be moved using the same infrastructure. Data paths are configured in the system by connecting nodes as the source or destination of data in a queue. Each leg of a data path may be configured independently based on the network environment between the sites. A queueing model is currently implemented to manage the automatic movement of data; a streaming model is planned to support arbitrarily large transfers (e.g., as in a disk recovery scenario) or to provide a 'pass-thru' interface to minimize overheads. A web-based monitor allows anyone to get a graphical overview of the DTS system as it runs; operators are able to control individual nodes in the system. Through careful tuning of the network paths, DTS is able to achieve in excess of 80 percent of the nominal wire speed using only commodity networks, making it ideal for long-haul transport of large volumes of data.
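Command and control over XML-RPC, as DTS uses, can be shown with Python's standard library alone. The method name, port, and status fields below are invented for illustration; DTS's actual RPC surface is not reproduced here.

    import threading
    import xmlrpc.client
    from xmlrpc.server import SimpleXMLRPCServer

    def queue_status(queue_name: str) -> dict:
        """A toy control method a transport node might expose."""
        return {"queue": queue_name, "pending": 3, "active": 1}

    server = SimpleXMLRPCServer(("localhost", 8901), allow_none=True, logRequests=False)
    server.register_function(queue_status)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    proxy = xmlrpc.client.ServerProxy("http://localhost:8901")
    print(proxy.queue_status("mosaic_ingest"))   # {'queue': 'mosaic_ingest', ...}
    server.shutdown()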
Digital preservation of a highway photolog film archive in Connecticut.
DOT National Transportation Integrated Search
2014-01-28
The Connecticut Department of Transportation has been photologging their transportation network : for over forty years. Photologging at a minimum refers to the use of an instrumented vehicle, which is : designed to capture successive photographs of t...
The global Landsat archive: Status, consolidation, and direction
Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.
2016-01-01
New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility, as with Landsat-8, a network of receiving stations was utilized: one operated by the U.S. government, the others operated by a community of International Cooperators (ICs). ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data was held outside the archive of the United States Geological Survey (USGS) than was held inside, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015, there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to that of having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate, as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.
Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1
NASA Technical Reports Server (NTRS)
Estes, Ronald H. (Editor)
1993-01-01
This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of the GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of the mission operations center.
[Space for the new. Archive - library - study center].
Weber, Danny
2014-01-01
This article features a short outline of both the architectural history and the inventories of the Leopoldina's archive and library. Moreover, the article presents the construction plans that will, when implemented in the near future, generate and provide outstanding working facilities in the form of a building ensemble consisting of an archive, library, and study center. The future infrastructure of these Leopoldina buildings, located in the area of Emil-Abderhalden-/August-Bebel-Strasse, will sustainably foster and support the establishment of research projects at the Leopoldina Study Center.
Communications among data and science centers
NASA Technical Reports Server (NTRS)
Green, James L.
1990-01-01
The ability to electronically access and query the contents of remote computer archives is of singular importance in the space and earth sciences; this evaluation of the development status of such on-line information networks foresees swift expansion of their data capabilities and complexity, in view of the volumes of data that NASA missions will continue to generate. The U.S. National Space Science Data Center (NSSDC) manages NASA's largest science computer network, the Space Physics Analysis Network; a comprehensive account is given of the structure of international access to the NSSDC through BITNET, and of connections to the NSSDC available in the Americas via the International X.25 network.
The Fermi Science Support Center Data Servers and Archive
NASA Astrophysics Data System (ADS)
Reustle, Alexander; Fermi Science Support Center
2018-01-01
The Fermi Science Support Center (FSSC) provides the scientific community with access to Fermi data and other products. The Gamma-Ray Burst Monitor (GBM) data is stored at NASA's High Energy Astrophysics Science Archive Research Center (HEASARC) and is accessible through their searchable Browse web interface. The Large Area Telescope (LAT) data is distributed through a custom FSSC interface where users can request all photons detected from a region on the sky over a specified time and energy range. Through its website the FSSC also provides planning and scheduling products, such as long and short term observing timelines, spacecraft position and attitude histories, and exposure maps. We present an overview of the different data products provided by the FSSC, how they can be accessed, and statistics on the archive usage since launch.
TCIA: An information resource to enable open science.
Prior, Fred W; Clark, Ken; Commean, Paul; Freymann, John; Jaffe, Carl; Kirby, Justin; Moore, Stephen; Smith, Kirk; Tarbox, Lawrence; Vendt, Bruce; Marquez, Guillermo
2013-01-01
Reusable, publicly available data is a pillar of open science. The Cancer Imaging Archive (TCIA) is an open image archive service supporting cancer research. TCIA collects, de-identifies, curates and manages rich collections of oncology image data. Image data sets have been contributed by 28 institutions and additional image collections are underway. Since June of 2011, more than 2,000 users have registered to search and access data from this freely available resource. TCIA encourages and supports cancer-related open science communities by hosting and managing the image archive, providing project wiki space and searchable metadata repositories. The success of TCIA is measured by the number of active research projects it enables (>40) and the number of scientific publications and presentations that are produced using data from TCIA collections (39).
The Archivists' Toolkit: Another Step toward Streamlined Archival Processing
ERIC Educational Resources Information Center
Westbrook, Bradley D.; Mandell, Lee; Shepherd, Kelcy; Stevens, Brian; Varghese, Jason
2006-01-01
The Archivists' Toolkit is a software application currently in development and designed to support the creation and management of archival information. This article summarizes the development of the application, including some of the problems the application is designed to resolve. Primary emphasis is placed on describing the application's…
36 CFR § 1254.24 - Whom does NARA allow in research rooms?
Code of Federal Regulations, 2013 CFR
2013-07-01
[Garbled excerpt from the CFR index; recoverable context: 36 CFR § 1254.24 ("Whom does NARA allow in research rooms?") falls under Parks, Forests, and Public Property; National Archives and Records Administration; Public Availability and Use; Using Records and Donated Historical Materials. Applicants are directed to the Research Support Branch (NWCC2) or, for regional archives and Presidential libraries, to the appropriate facility.]
NASA Astrophysics Data System (ADS)
Clark, O.; Rice, A. L.
2017-12-01
Carbon dioxide (CO2) is the most abundant anthropogenically forced greenhouse gas (GHG) in the global atmosphere. Emissions of CO2 account for approximately 75% of the world's total GHG emissions. Atmospheric concentrations of CO2 are higher now than they have been at any other time in the past 800,000 years; currently, the global mean concentration exceeds 400 ppm. Today, global networks regularly monitor CO2 concentrations and isotopic composition (δ13C and δ18O), but past data are sparse. Over 200 ambient air samples from Cape Meares, Oregon (45.5°N, 124.0°W), a coastal site in the western United States, were obtained by researchers at the Oregon Graduate Institute of Science and Technology (OGI, now part of Oregon Health & Science University) between 1977 and 1998, as part of a global monitoring program spanning six sites in the polar, middle, and tropical latitudes of the Northern and Southern Hemispheres. Air liquefaction was used to compress approximately 1000 L of air (STP) to 30 bar into 33 L electropolished (SUMMA) stainless steel canisters. Select archived air samples from the original network are maintained at the Portland State University (PSU) Department of Physics. These archived samples provide a valuable record of changing atmospheric concentrations of CO2 and δ13C, which can contribute to a better understanding of changes in sources over this period. CO2 concentrations and δ13C of CO2 were measured at PSU with a Picarro cavity ring-down spectrometer (model G1101-i) analytical system. This study presents the analytical methods used, calibration techniques, precision, and reproducibility. Measurements of select samples from the archive show rising CO2 concentrations and falling δ13C over the 1977 to 1998 period, consistent with previous observations and rising anthropogenic sources of CO2. The resulting data set was statistically analyzed in MATLAB. Results of preliminary seasonal and secular trends from the archive samples are presented.
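The seasonal and secular trend extraction mentioned above was done in MATLAB; a minimal sketch of the usual approach, an ordinary least-squares fit of a linear trend plus an annual harmonic, is shown here in Python with hypothetical sample values (not the Cape Meares data):

    # Sketch of a seasonal + secular trend fit for a flask-archive CO2 record.
    # Data values are hypothetical placeholders; the original analysis used MATLAB.
    import numpy as np

    t = np.array([1977.2, 1980.5, 1984.1, 1989.7, 1993.3, 1998.0])  # decimal years
    co2 = np.array([333.1, 338.9, 344.6, 353.2, 357.9, 366.5])      # ppm

    # Design matrix: intercept, linear (secular) trend, annual sine/cosine harmonic.
    X = np.column_stack([
        np.ones_like(t),
        t - t.mean(),
        np.sin(2 * np.pi * t),
        np.cos(2 * np.pi * t),
    ])
    coef, *_ = np.linalg.lstsq(X, co2, rcond=None)
    print(f"secular trend: {coef[1]:.2f} ppm/yr")
    print(f"seasonal amplitude: {np.hypot(coef[2], coef[3]):.2f} ppm")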
Ensuring long-term reliability of the data storage on optical disc
NASA Astrophysics Data System (ADS)
Chen, Ken; Pan, Longfa; Xu, Bin; Liu, Wei
2008-12-01
"Quality requirements and handling regulation of archival optical disc for electronic records filing" is released by The State Archives Administration of the People's Republic of China (SAAC) on its network in March 2007. This document established a complete operative managing process for optical disc data storage in archives departments. The quality requirements of the optical disc used in archives departments are stipulated. Quality check of the recorded disc before filing is considered to be necessary and the threshold of the parameter of the qualified filing disc is set down. The handling regulations for the staffs in the archives departments are described. Recommended environment conditions of the disc preservation, recording, accessing and testing are presented. The block error rate of the disc is selected as main monitoring parameter of the lifetime of the filing disc and three classes pre-alarm lines are created for marking of different quality check intervals. The strategy of monitoring the variation of the error rate curve of the filing discs and moving the data to a new disc or a new media when the error rate of the disc reaches the third class pre-alarm line will effectively guarantee the data migration before permanent loss. Only when every step of the procedure is strictly implemented, it is believed that long-term reliability of the data storage on optical disc for archives departments can be effectively ensured.
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wandtke, John; Robinson, Arvin E.
1999-07-01
The Medical Information, Communication and Archive System (MICAS) is a multi-modality integrated image management system that is seamlessly integrated with the Radiology Information System (RIS). This project was initiated in the summer of 1995, with the first phase installed during the first half of 1997 and the second phase installed during the summer of 1998. Phase II enhancements include a permanent archive, automated workflow including modality worklist, study caches, and NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. This multi-vendor phased approach to PACS implementation is designed as an enterprise-wide PACS to provide images and reports throughout our healthcare network. MICAS demonstrates that a multi-vendor, open-system, phased approach to PACS is feasible, cost-effective, and has significant advantages over a single-vendor implementation.
Migration of medical image data archived using mini-PACS to full-PACS.
Jung, Haijo; Kim, Hee-Joung; Kang, Won-Suk; Lee, Sang-Ho; Kim, Sae-Rome; Ji, Chang Lyong; Kim, Jung-Han; Yoo, Sun Kook; Kim, Ki-Hwang
2004-06-01
This study evaluated the migration to full-PACS of medical image data archived using mini-PACS at two hospitals of the Yonsei University Medical Center, Seoul, Korea. A major concern in the migration of medical data is to match the image data from the mini-PACS with the hospital OCS (Order Communication System). Prior to carrying out the actual migration process, the principles, methods, and anticipated results for the migration were evaluated with respect to both cost and effectiveness. Migration gateway workstations were established and a migration software tool was developed. The actual migration process was performed based on the results of several migration simulations. Our conclusion was that a migration plan should be carefully prepared and tailored to the individual hospital environment, because the server system, archive media, network, OCS, and policy for data management may be unique.
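The central matching step the authors describe, reconciling migrated studies against OCS records, might look like the following minimal sketch; the matching keys (patient ID, study date, modality) are hypothetical illustrations, as the paper does not specify its actual keys:

    # Sketch of matching mini-PACS studies to hospital OCS orders during migration.
    # Field names are hypothetical; a real migration tool would use site-specific keys.
    def match_study_to_order(study: dict, orders: list) -> dict:
        """Return the OCS order matching a migrated study, or None if unmatched."""
        for order in orders:
            if (order["patient_id"] == study["patient_id"]
                    and order["study_date"] == study["study_date"]
                    and order["modality"] == study["modality"]):
                return order
        return None  # unmatched studies go to a manual reconciliation queue

    orders = [{"patient_id": "12345", "study_date": "2003-11-02",
               "modality": "CT", "accession": "A-991"}]
    study = {"patient_id": "12345", "study_date": "2003-11-02", "modality": "CT"}
    print(match_study_to_order(study, orders)["accession"])  # -> A-991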
Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Bodden, Lee; Pease, Phil; Bedet, Jean-Jacques; Rosen, Wayne
1993-01-01
The Goddard Space Flight Center Version 0 Distributed Active Archive Center (GSFC V0 DAAC) is being developed to enhance and improve scientific research and productivity by consolidating access to remote sensor earth science data in the pre-EOS time frame. In cooperation with scientists from the science labs at GSFC, other NASA facilities, universities, and other government agencies, the DAAC will support data acquisition, validation, archive and distribution. The DAAC is being developed in response to EOSDIS Project Functional Requirements as well as from requirements originating from individual science projects such as SeaWiFS, Meteor3/TOMS2, AVHRR Pathfinder, TOVS Pathfinder, and UARS. The GSFC V0 DAAC has begun operational support for the AVHRR Pathfinder (as of April, 1993), TOVS Pathfinder (as of July, 1993) and the UARS (September, 1993) Projects, and is preparing to provide operational support for SeaWiFS (August, 1994) data. The GSFC V0 DAAC has also incorporated the existing data, services, and functionality of the DAAC/Climate, DAAC/Land, and the Coastal Zone Color Scanner (CZCS) Systems.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1989-01-01
Archival reports on developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data Acquisition are provided. Space communications, radio navigation, radio science, and ground based radio and radio astronomy are discussed. Deep Space Network projects are also discussed.
Neuroimaging Data Sharing on the Neuroinformatics Database Platform
Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D
2015-01-01
We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects sharing data as it becomes available. As demonstrated by making several large datasets available, NiDB is an extensible platform appropriately suited to archive and distribute shared neuroimaging data. PMID:25888923
Moll, F H; Krischel, M; Zajaczkowski, T; Rathert, P
2010-10-01
A source in the archives of the German Society of Urology gives us a vivid insight into the situation in Berlin during the 1930s from the perspective of a young Polish doctor, and presents the situation at one of the leading urology institutions of the time in Germany. Furthermore, we learn about the social situation in hospitals as well as the discourse and networking taking place in the scientific community at that time.
NASA Technical Reports Server (NTRS)
Tuey, Richard C.; Lane, Robert; Hart, Susan V.
1995-01-01
The NASA Scientific and Technical Information Office was assigned the responsibility to continue with the expansion of the NASAwide networked electronic duplicating effort by including the Goddard Space Flight Center (GSFC) as an additional node to the existing configuration of networked electronic duplicating systems within NASA. The subject of this report is the evaluation of a networked electronic duplicating system which meets the duplicating requirements and expands electronic publishing capabilities without increasing current operating costs. This report continues the evaluation reported in 'NASA Electronic Publishing System - Electronic Printing and Duplicating Evaluation Report' (NASA TM-106242) and 'NASA Electronic Publishing System - Stage 1 Evaluation Report' (NASA TM-106510). This report differs from the previous reports through the inclusion of an external networked desktop editing, archival, and publishing functionality which did not exist with the previous networked electronic duplicating system. Additionally, a two-phase approach to the evaluation was undertaken; the first was a paper study justifying a 90-day, on-site evaluation, and the second phase was to validate, during the 90-day evaluation, the cost benefits and productivity increases that could be achieved in an operational mode. A benchmark of the functionality of the networked electronic publishing system and external networked desktop editing, archival, and publishing system was performed under a simulated daily production environment. This report can be used to guide others in determining the most cost effective duplicating/publishing alternative through the use of cost/benefit analysis and return on investment techniques. A treatise on the use of these techniques can be found by referring to 'NASA Electronic Publishing System -Cost/Benefit Methodology' (NASA TM-106662).
Puncture-proof picture archiving and communication system.
Willis, C E; McCluggage, C W; Orand, M R; Parker, B R
2001-06-01
As we become increasingly dependent on our picture archiving and communications system (PACS) for the clinical practice of medicine, the demand for improved reliability becomes urgent. Borrowing principles from the discipline of Reliability Engineering, we have identified components of our system that constitute single points of failure and have endeavored to eliminate these through redundant components and manual work-around procedures. To assess the adequacy of our preparations, we have identified a set of plausible events that could interfere with the function of one or more of our PACS components. These events could be as simple as the loss of the network connection to a single component or as broad as the loss of our central data center. We have identified the need to continue to operate during adverse conditions, as well as the requirement to recover rapidly from major disruptions in service. This assessment led us to modify the physical locations of central PACS components within our physical plant. We are also taking advantage of actual disruptive events, coincident with a major expansion of our facility, to test our recovery procedures. Based on our recognition of the vital nature of our electronic images for patient care, we are now recording electronic images in two copies on disparate media. The image database is critical to both continued operations and recovery. Restoration of the database from periodic tape backups with a 24-hour cycle time may not support our clinical scenario: acquisition modalities have a limited local storage capacity, some of which will not contain the daily workload. Restoration of the database from the archived media is an exceedingly slow process that will likely not meet our requirement to restore clinical operations without significant delay. Our PACS vendor is working on concurrent image databases that would be capable of nearly immediate switchover and recovery.
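The move from single points of failure to two copies on disparate media follows standard reliability arithmetic: two independent copies fail together far less often than either alone. A hedged worked example (the 0.99 availability figure is a hypothetical illustration, not a measured value from the paper):

    # Worked example of why duplicated components remove single points of failure.
    # The single-copy availability of 0.99 is an assumed, illustrative figure.
    a = 0.99                   # availability of one archive copy (assumed)
    single = a                 # one copy: roughly 3.7 days of downtime per year
    duplex = 1 - (1 - a) ** 2  # two independent copies on disparate media
    print(f"single copy: {single:.4f}, duplicated: {duplex:.4f}")
    # -> single copy: 0.9900, duplicated: 0.9999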
Meeting Archival Standards in the Astronomical Photographic Data Archive at PARI
NASA Astrophysics Data System (ADS)
Cline, J. D.; Castelaz, M. W.; Barker, T.; Rottler, L.
2013-01-01
The Astronomical Photographic Data Archive (APDA), located at the Pisgah Astronomical Research Institute (PARI), was established in November 2007. APDA is dedicated to the task of collecting, restoring, preserving and storing astronomical photographic data and continues to accept collections. APDA is also tasked with scanning each image and establishing a database of images that can be accessed via the Internet by the global community of scientists, researchers and students. APDA is a new type of astronomical observatory: one that harnesses analog data of the night sky taken over more than a century and makes those data digitally available. APDA is housed in a newly renovated Research Building on the PARI campus. An award from the NSF allowed renovation of the heating and air conditioning. Plates in APDA are kept in a 20 C +/- 1 C area with humidity at 38% +/- 3%. Renovation of the electrical system, with backup power, allows for support of a data center with a networked storage system and software donated by EMC Corp. The storage system can hold more than 400 terabytes of research data, which can be accessed through multi-gigabit connectivity to the Internet. APDA has a collection of more than 200,000 photographic plates and films from more than 40 collections, as well as major instrumentation, from NASA, the STScI, the US Naval Observatory, the Harvard-Smithsonian CfA and others. APDA possesses two high-precision glass plate scanners, GAMMA I and GAMMA II, built for NASA and the Space Telescope Science Institute (STScI). The scanners were used to develop the HST Guide Star Catalog and Digitized Sky Survey. GAMMA II has been rebuilt and we will report on its status as an astrometric measuring instrument.
Yan, Li; Liu, Song; Tang, Li; Hu, Qiang; Morrison, Carl D.; Ambrosone, Christine B.; Higgins, Michael J.; Sucheston-Campbell, Lara E.
2017-01-01
Background: DNA from archival formalin-fixed and paraffin-embedded (FFPE) tissue is an invaluable resource for genome-wide methylation studies, although concerns about poor quality may limit its use. In this study, we compared DNA methylation profiles of breast tumors using DNA from fresh-frozen (FF) tissues and three types of matched FFPE samples. Results: For 9/10 patients, correlation and unsupervised clustering analysis revealed that the FF and FFPE samples were consistently correlated with each other and clustered into distinct subgroups. Greater than 84% of the top 100 loci previously shown to differentiate ER+ and ER− tumors in FF tissues were also FFPE DML. Weighted Gene Correlation Network Analysis (WGCNA) grouped the DML into 16 modules in FF tissue, with ~85% of the module membership preserved across tissue types. Materials and Methods: Restored FFPE and matched FF samples were profiled using the Illumina Infinium HumanMethylation450K platform. Methylation levels (β-values) across all loci, and the top 100 loci previously shown to differentiate tumors by estrogen receptor status (ER+ or ER−) in a larger FF study, were compared between matched FF and FFPE samples using Pearson's correlation, hierarchical clustering and WGCNA. Positive predictive values and sensitivity levels for detecting differentially methylated loci (DML) in FF samples were calculated in an independent FFPE cohort. Conclusions: FFPE breast tumor samples show lower overall detection of DMLs versus FF; however, FFPE and FF DMLs compare favorably. These results support the emerging consensus that the 450K platform can be employed to investigate epigenetics in large sets of archival FFPE tissues. PMID:28118602
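The FF-versus-FFPE comparison above rests on per-pair Pearson correlation of β-values across loci. A minimal sketch of that computation, with short hypothetical arrays standing in for real 450K profiles (~485,000 loci per sample):

    # Sketch of the matched FF vs. FFPE comparison: Pearson correlation of
    # methylation beta-values across loci. Arrays are hypothetical placeholders.
    import numpy as np

    beta_ff = np.array([0.91, 0.12, 0.55, 0.78, 0.05, 0.63])    # fresh-frozen
    beta_ffpe = np.array([0.88, 0.15, 0.51, 0.80, 0.09, 0.60])  # matched FFPE

    r = np.corrcoef(beta_ff, beta_ffpe)[0, 1]
    print(f"Pearson r between matched FF and FFPE profiles: {r:.3f}")

In the study itself this comparison was run across the full locus set for every matched pair, alongside hierarchical clustering and WGCNA.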
The Canadian Astronomy Data Centre
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.; Schade, D.; Astronomy Data Centre, Canadian
2011-01-01
The Canadian Astronomy Data Centre (CADC) is the world's largest astronomical data center, holding over 0.5 Petabytes of information, and serving nearly 3000 astronomers worldwide. Its current data collections include BLAST, CFHT, CGPS, FUSE, Gemini, HST, JCMT, MACHO, MOST, and numerous other archives and services. It provides extensive data archiving, curation, and processing expertise, via projects such as MegaPipe, and enables substantial day-to-day collaboration between resident astronomers and computer specialists. It is a stable, powerful, persistent, and properly supported environment for the storage and processing of large volumes of data, a condition that is now absolutely vital for their science potential to be exploited by the community. Through initiatives such as the Common Archive Observation Model (CAOM), the Canadian Virtual Observatory (CVO), and the Canadian Advanced Network for Astronomical Research (CANFAR), the CADC is at the global forefront of advancing astronomical research through improved data services. The CAOM aims to provide homogeneous data access, and hence viable interoperability between a potentially unlimited number of different data collections, at many wavelengths. It is active in the definition of numerous emerging standards within the International Virtual Observatory, and several datasets are already available. The CANFAR project is an initiative to make cloud computing for storage and data-intensive processing available to the community. It does this via a Virtual Machine environment that is equivalent to managing a local desktop. Several groups are already processing science data. CADC is also at the forefront of advanced astronomical data analysis, driven by the science requirements of astronomers both locally and further afield. The emergence of 'Astroinformatics' promises to provide not only utility items like object classifications, but to directly enable new science by accessing previously undiscovered or intractable information. We are currently in the early stages of implementing Astroinformatics tools, such as machine learning, on CANFAR.
SPASE 2010 - Providing Access to the Heliophysics Data Environment
NASA Astrophysics Data System (ADS)
Thieman, J. R.; King, T. A.; Roberts, D.; Spase Consortium
2010-12-01
The Heliophysics division of NASA has adopted the Space Physics Archive Search and Extract (SPASE) Data Model for use within the Heliophysics Data Environment, which is composed of virtual observatories, value-added services, resident and active archives, and other data providers. The SPASE Data Model has also been adopted by Japan's Inter-university Upper atmosphere Global Observation NETwork (IUGONET), NOAA's National Geophysics Data Center (NGDC), and the Canadian Space Science Data Portal (CSSDP). Europe's HELIO project harvests information from SPASE descriptions of resources, as does the Planetary Plasma Interactions (PPI) Node of NASA's Planetary Data System (PDS). All of the data sets in the Heliophysics Data Environment are intended to be described by the SPASE Data Model, and many have already been described in this way. The current version of the SPASE Data Model (2.2.0) may be found on the SPASE web site at http://www.spase-group.org. SPASE data set descriptions are not as difficult to create as it might seem: help is available in both the documentation and the many tools created to support SPASE description creators, and there are now a number of very experienced users who are willing to help as well. The SPASE consortium has advanced to the next step in the odyssey to achieve well-coordinated federation of resource providers by designing and implementing a set of core services to facilitate the exchange of metadata and delivery of data packages; an example is the registry service shown at http://vmo.igpp.ucla.edu/registry. SPASE also incorporates new technologies that are useful to the overall effort, such as cloud storage. A review of the advances, uses of the SPASE data model, and role of services in a federated environment is presented.
NASA Astrophysics Data System (ADS)
Driscoll, Brandon; Jaffray, David; Coolens, Catherine
2014-03-01
Purpose: To provide clinicians and researchers participating in multi-centre clinical trials with a central repository for large-volume dynamic imaging data, as well as a set of tools providing end-to-end testing and image analysis standards of practice. Methods: There are three main pieces to the data archiving and analysis system: the PACS server, the data analysis computer(s), and the high-speed networks that connect them. Each clinical trial is anonymized using a customizable anonymizer and is stored on a PACS accessible only by AE-title access control. The remote analysis station consists of a single virtual machine per trial, running on a powerful PC supporting multiple simultaneous instances. Imaging data management and analysis is performed within ClearCanvas Workstation® using custom-designed plug-ins for kinetic modelling (The DCE-Tool®), quality assurance (The DCE-QA Tool) and RECIST. Results: A framework has been set up currently serving seven clinical trials spanning five hospitals, with three more trials to be added over the next six months. After initial rapid image transfer (over 2 MB/s), all data analysis is done server-side, making it robust and rapid. This has provided the ability to perform computationally expensive operations, such as voxel-wise kinetic modelling, on very large data archives (over 20 GB and 50k images per patient) remotely, with minimal end-user hardware. Conclusions: This system is currently in its proof-of-concept stage but has been used successfully to send and analyze data from remote hospitals. Next steps will involve scaling up the system with a more powerful PACS and multiple high-powered analysis machines, as well as adding real-time review capabilities.
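The customizable anonymizer itself is not detailed in the abstract; as an illustration of the kind of DICOM de-identification step involved, here is a minimal sketch using the pydicom library. This is a stand-in, not the trial framework's actual code, and the tag choices and file names are illustrative:

    # Minimal DICOM de-identification sketch using pydicom (a stand-in for the
    # framework's customizable anonymizer; tag selection here is illustrative).
    import pydicom

    def anonymize(in_path: str, out_path: str, subject_code: str) -> None:
        ds = pydicom.dcmread(in_path)
        ds.PatientName = subject_code   # replace identity with a trial code
        ds.PatientID = subject_code
        ds.PatientBirthDate = ""        # blank direct identifiers
        ds.remove_private_tags()        # drop vendor-private elements
        ds.save_as(out_path)

    # anonymize("study_0001.dcm", "anon_0001.dcm", "TRIAL-SITE3-017")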
NASA Astrophysics Data System (ADS)
Cusma, Jack T.; Spero, Laurence A.; Groshong, Bennett R.; Cho, Teddy; Bashore, Thomas M.
1993-09-01
An economical and practical digital solution for the replacement of 35 mm cine film as the archive medium in the cardiac x-ray imaging environment has remained lacking to date, due to the demanding requirements of high capacity, high acquisition rate, high transfer rate, and the need for application in a distributed environment. A clinical digital image library and network based on the D2 digital video format has been installed in the Duke University Cardiac Catheterization Laboratory. The system architecture includes a central image library with digital video recorders and robotic tape retrieval, three acquisition stations, and remote review stations connected via a serial image network. The library has a capacity for over 20,000 gigabytes of uncompressed image data, equivalent to records for approximately 20,000 patients. Image acquisition in the clinical laboratories is via a real-time digital interface between the digital angiography system and a local digital recorder. Images are transferred to the library over the serial network at a rate of 14.3 Mbytes/sec and permanently stored for later review. The image library and network are currently undergoing a clinical comparison with cine film for visual and quantitative assessment of coronary artery disease. At the conclusion of the evaluation, the configuration will be expanded to include four additional catheterization laboratories and remote review stations throughout the hospital.
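The quoted figures imply roughly 1 GB of uncompressed image data per patient, so a patient record crosses the 14.3 MB/s serial network in about a minute. A quick check of that arithmetic, derived only from the numbers in the abstract:

    # Back-of-envelope check of the archive figures quoted above.
    capacity_gb = 20_000   # library capacity, gigabytes
    patients = 20_000      # stated patient equivalent
    per_patient_mb = capacity_gb * 1_000 / patients  # ~1000 MB per patient record
    transfer_s = per_patient_mb / 14.3               # at 14.3 MB/s network rate
    print(f"~{per_patient_mb:.0f} MB/patient, ~{transfer_s:.0f} s to transfer")
    # -> ~1000 MB/patient, ~70 s to transfer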
Strawman Philosophical Guide for Developing International Network of GPM GV Sites
NASA Technical Reports Server (NTRS)
Smith, Eric A.
2005-01-01
The creation of an international network of ground validation (GV) sites that will support the Global Precipitation Measurement (GPM) Mission's international science programme will require detailed planning of mechanisms for exchanging technical information, GV data products, and scientific results. An important component of the planning will be the philosophical guide under which the network will grow and emerge as a successful element of the GPM Mission. This philosophical guide should be able to serve the mission in developing scientific pathways for ground validation research which will ensure the highest possible quality measurement record of global precipitation products. The philosophical issues, in this regard, partly stem from the financial architecture under which the GV network will be developed, i.e., each participating country will provide its own financial support through committed institutions, regardless of whether a national or international space agency is involved. At the 1st International GPM Ground Validation Workshop, held in Abingdon, UK in November 2003, most of the basic tenets behind the development of the international GV network were identified and discussed. Therefore, with this progress in mind, this presentation is intended to put forth a strawman philosophical guide supporting the development of the international network of GPM GV sites, noting that the initial progress has been reported in the Proceedings of the 1st International GPM GV Workshop, available online. The central philosophical issues all flow from the fact that each participating institution can only bring to the table GV facilities and scientific personnel that are affordable to the sanctioning (funding) national agency (be that a research, research-support, or operational agency). This situation imposes on the network heterogeneity in the measuring sensors, data collection periods, data collection procedures, data latencies, and data reporting capabilities. Therefore, in order for the network to be effective in supporting the central scientific goals of the GPM mission, there must be a basic agreed-upon doctrine under which the network participants function vis-a-vis: (1) an overriding set of general scientific requirements, (2) a minimal set of policies governing the free flow of GV data between the scientific participants, (3) a few basic definitions concerning the prioritization of measurements and their respective value to the mission, (4) a few basic procedures concerning data formats, data reporting procedures, data access, and data archiving, and (5) a simple means to differentiate GV sites according to their level of effort and ability to perform near real-time data acquisition and data reporting tasks. Most important, should a site choose to operate as a near real-time data collection and distribution site, it would be expected to operate under a fairly narrowly defined protocol needed to ensure smooth GV support operations. This presentation will suggest measures responsive to items (1)-(5) from which to proceed. In addition, this presentation will seek to stimulate discussion and debate concerning how much heterogeneity is tolerable within the eventual GV site network, given that any individual GV site can only be considered scientifically useful if it supports the achievement of the central GPM Mission goals.
Only ground validation research that has a direct connection to the space mission should be considered justifiable given the overarching scientific goals of the mission. Therefore each site will have to seek some level of accommodation to what the GPM Mission requires in the way of retrieval error characterization, retrieval error detection and reporting, and generation of GV data products that support assessment and improvement of the mission's standard precipitation retrieval algorithms. These are all important scientific issues that will be best resolved in open scientific debate.
NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data
NASA Astrophysics Data System (ADS)
Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.
2011-12-01
Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving, from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, service-based, message-oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow, or to develop a new workflow to suit a provider's needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata, as well as supporting the ADAM (Airborne Data for Assessing Models) site.
Status of worldwide Landsat archive
Warriner, Howard W.
1987-01-01
In cooperation with the international Landsat community, and through the Landsat Technical Working Group (LTWG), NOAA is assembling information about the status of the worldwide Landsat archive. During LTWG 9, member nations agreed to participate in a survey of international Landsat data holdings and of their archive experiences with Landsat data. The goal of the effort was twofold: first, to document the Landsat archive to date, and, second, to ensure that specific nations' experience with long-term Landsat archival problems was available to others. The survey requested details such as the amount of data held; the format of the archive holdings by spacecraft/sensor and acquisition years; the estimated costs to accumulate, process, and replace the data (if necessary); the storage space required; and any member nation's plans to ensure continuing quality. As a group, the LTWG nations are concerned about the characteristics and reliability of long-term magnetic media storage. Each nation's experience with older data retrieval is solicited in the survey. This information will allow nations to anticipate and plan for required changes to their archival holdings. Also solicited were reports of any upgrades to a nation's archival system that are currently planned, and all results of attempts to reduce archive holdings, including methodology, current status, and the planned access rates and product support anticipated for responding to future archival usage.
Survey of Special Collections and Archives in the United Kingdom and Ireland
ERIC Educational Resources Information Center
Dooley, Jackie M.; Beckett, Rachel; Cullingford, Alison; Sambrook, Katie; Sheppard, Chris; Worrall, Sue
2013-01-01
It has become widely recognised across the academic and research libraries sector that special collections and archives play a key role in differentiating each institution from its peers. In recognition of this, Research Libraries UK (RLUK) established the workstrand "Unique and Distinctive Collections" (UDC) in support of its strategic…
Supporting Student Research with Semantic Technologies and Digital Archives
ERIC Educational Resources Information Center
Martinez-Garcia, Agustina; Corti, Louise
2012-01-01
This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…
Commercial imagery archive product development
NASA Astrophysics Data System (ADS)
Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate imagery in 'push' or 'pull' modes, combining commercial components with its own adapted and newly developed algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data at bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
Multi-provider architecture for cloud outsourcing of medical imaging repositories.
Godinho, Tiago Marques; Bastião Silva, Luís A; Costa, Carlos; Oliveira, José Luís
2014-01-01
Over the last few years, the extended usage of medical imaging procedures has drawn the medical community's attention towards the optimization of imaging workflows. More recently, the federation of multiple institutions into a seamless distribution network has brought hope of increased-quality healthcare services along with more efficient resource management. As a result, medical institutions are constantly looking for the best infrastructure on which to deploy their imaging archives. In this scenario, public cloud infrastructures arise as major candidates, as they offer elastic storage space and optimal data availability without significant maintenance costs or IT personnel requirements, in a pay-as-you-go model. However, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. This document proposes a multi-provider architecture for the integration of outsourced archives with in-house PACS resources, taking advantage of external providers to store medical imaging studies without disregarding security. It enables the retrieval of images from multiple archives simultaneously, improving performance and data availability while avoiding the vendor lock-in problem. Moreover, it enables load balancing and caching techniques.
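The simultaneous multi-archive retrieval idea can be sketched as a thread pool that queries every provider at once and keeps the first copy that arrives. Provider names are hypothetical, and fetch_study() here only simulates what would really be a DICOM query/retrieve against each archive:

    # Sketch of simultaneous retrieval from multiple outsourced archives, in the
    # spirit of the proposed multi-provider architecture. Endpoints are hypothetical.
    import random
    import time
    from concurrent.futures import ThreadPoolExecutor, as_completed

    PROVIDERS = ["cloud-a.example.org", "cloud-b.example.org", "inhouse-pacs.local"]

    def fetch_study(provider: str, study_uid: str) -> bytes:
        time.sleep(random.uniform(0.05, 0.2))  # simulated network latency
        return f"{study_uid} from {provider}".encode()

    def retrieve_first(study_uid: str) -> bytes:
        """Ask every archive in parallel; return the first copy that arrives."""
        with ThreadPoolExecutor(max_workers=len(PROVIDERS)) as pool:
            futures = [pool.submit(fetch_study, p, study_uid) for p in PROVIDERS]
            for done in as_completed(futures):
                if done.exception() is None:
                    return done.result()
        raise RuntimeError("study not available from any provider")

    print(retrieve_first("1.2.840.99999.1.1").decode())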
NASA Astrophysics Data System (ADS)
Chun, Francis; Tippets, Roger; Della-Rose, Devin J.; Polsgrove, Daniel; Gresham, Kimberlee; Barnaby, David A.
2015-01-01
The Falcon Telescope Network (FTN) is a global network of small aperture telescopes developed by the Center for Space Situational Awareness Research in the Department of Physics at the United States Air Force Academy (USAFA). Consisting of commercially available equipment, the FTN is a collaborative effort between USAFA and other educational institutions ranging from two- and four-year colleges to major research universities. USAFA provides the equipment (e.g. telescope, mount, camera, filter wheel, dome, weather station, computers and storage devices) while the educational partners provide the building and infrastructure to support an observatory. The user base includes USAFA along with K-12 and higher education faculty and students. The diversity of the users implies a wide variety of observing interests, and thus the FTN collects images on diverse objects, including satellites, galactic and extragalactic objects, and objects popular for education and public outreach. The raw imagery, all in the public domain, will be accessible to FTN partners and will be archived at USAFA. USAFA cadets use the FTN to continue a tradition of satellite characterization and astronomical research; this tradition is the model used for designing the network to serve undergraduate research needs. Additionally, cadets have led the development of the FTN by investigating observation priority schemes and conducting a 'day-in-the-life' study of the FTN in regards to satellite observations. With respect to K-12 outreach, cadets have provided feedback to K-12 students and teachers through evaluation of first-light proposals. In this paper, we present the current status of the network and results from student participation in the project.
The Impact of Developing Technology on Media Communications.
ERIC Educational Resources Information Center
MacDonald, Lindsay W.
1997-01-01
Examines changes in media communications resulting from new information technologies: communications technologies (networks, World Wide Web, digital set-top box); graphic arts (digital photography, CD and digital archives, desktop design and publishing, printing technology); television and video (digital editing, interactive television, news and…
Novel carboxamides as potential mosquito repellents.
USDA-ARS?s Scientific Manuscript database
A model was developed using 167 carboxamide compounds, from the US Department of Agriculture archival database, that were tested as arthropod repellents over the past 60 years. An artificial neural network utilizing CODESSA PRO descriptors was used to construct a Quantitative Structure-Activity Re...
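The truncated abstract describes an artificial neural network trained on molecular descriptors. A minimal sketch of that kind of QSAR model using scikit-learn follows; the descriptor matrix and repellency scores are random placeholders, not the USDA measurements, and the original work used CODESSA PRO descriptors rather than the 12 generic features assumed here:

    # Sketch of a descriptor-based QSAR neural network, in the spirit of the study
    # (167 carboxamides). All data below are random placeholders.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(167, 12))  # 12 hypothetical molecular descriptors
    y = X @ rng.normal(size=12) + rng.normal(scale=0.1, size=167)  # repellency score

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_tr, y_tr)
    print(f"held-out R^2: {model.score(X_te, y_te):.2f}")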
32 CFR 2001.1 - Purpose and scope.
Code of Federal Regulations, 2014 CFR
2014-07-01
[Garbled index excerpt; recoverable context: Information Security Oversight Office, National Archives and Records Administration; Classified National Security Information, Scope of Part § 2001. The fragment also lists contents entries such as "Telecommunications, automated information systems, and network security" and "2001.51 Technical security".]
32 CFR 2001.1 - Purpose and scope.
Code of Federal Regulations, 2012 CFR
2012-07-01
[Garbled index excerpt; recoverable context: Information Security Oversight Office, National Archives and Records Administration; Classified National Security Information, Scope of Part § 2001. The fragment also lists contents entries such as "Telecommunications, automated information systems, and network security" and "2001.51 Technical security".]
32 CFR 2001.1 - Purpose and scope.
Code of Federal Regulations, 2011 CFR
2011-07-01
[Garbled index excerpt; recoverable context: Information Security Oversight Office, National Archives and Records Administration; Classified National Security Information, Scope of Part § 2001. The fragment also lists contents entries such as "Telecommunications, automated information systems, and network security" and "2001.51 Technical security".]
32 CFR 2001.1 - Purpose and scope.
Code of Federal Regulations, 2013 CFR
2013-07-01
[Garbled index excerpt; recoverable context: Information Security Oversight Office, National Archives and Records Administration; Classified National Security Information, Scope of Part § 2001. The fragment also lists contents entries such as "Telecommunications, automated information systems, and network security" and "2001.51 Technical security".]
32 CFR 2001.1 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
[Garbled index excerpt; recoverable context: Information Security Oversight Office, National Archives and Records Administration; Classified National Security Information, Scope of Part § 2001. The fragment also lists contents entries such as "Telecommunications, automated information systems, and network security" and "2001.51 Technical security".]
NASA Technical Reports Server (NTRS)
Hearty, Thomas; Savtchenko, Andrey; Vollmer, Bruce; Albayrak, Arif; Theobald, Mike; Esfandiari, Ed; Wei, Jennifer
2015-01-01
This talk will describe the support and distribution of CO2 data products from OCO-2, AIRS, and ACOS that are archived and distributed by the Goddard Earth Sciences Data and Information Services Center. We will provide a brief summary of the current online archive and distribution metrics for the OCO-2 Level 1 products, and plans for the Level 2 products. We will also describe collaborative data sets and services (e.g., matchups with other sensors) and solicit feedback for potential future services.
CARDS: A blueprint and environment for domain-specific software reuse
NASA Technical Reports Server (NTRS)
Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine
1992-01-01
CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'
U.S. Tsunami Warning System: Advancements since the 2004 Indian Ocean Tsunami (Invited)
NASA Astrophysics Data System (ADS)
Whitmore, P.
2009-12-01
The U.S. government embarked on a strengthening program for the U.S. Tsunami Warning System (TWS) in the aftermath of the disastrous 2004 Indian Ocean tsunami. The program was designed to improve several facets of the U.S. TWS, including: upgrade of the coastal sea level network - 16 new stations plus higher transmission rates; expansion of the deep ocean tsunameter network - 7 sites increased to 39; upgrade of seismic networks - both USGS and Tsunami Warning Center (TWC); increase of TWC staff to allow 24x7 coverage at two centers; development of an improved tsunami forecast system; increased preparedness in coastal communities; expansion of the Pacific Tsunami Warning Center facility; and improvement of the tsunami data archive effort at the National Geophysical Data Center. The strengthening program has been completed and has contributed to the many improvements attained in the U.S. TWS since 2004. Some of the more significant enhancements to the program are: the number of sea level and seismic sites worldwide available to the TWCs has more than doubled; the TWC areas-of-responsibility expanded to include the U.S./Canadian Atlantic coasts, Indian Ocean, Caribbean Sea, Gulf of Mexico, and U.S. Arctic coast; event response time decreased by approximately one-half; product accuracy has improved; a tsunami forecast system developed by NOAA capable of forecasting inundation during an event has been delivered to the TWCs; warning areas are now defined by pre-computed or forecasted threat versus distance or travel time, significantly reducing the amount of coast put in a warning; new warning dissemination techniques have been implemented to reach a broader audience in less time; tsunami product content better reflects the expected impact level; the number of TsunamiReady communities has quadrupled; and the historical data archive has increased in quantity and accuracy. In addition to the strengthening program, the U.S. National Tsunami Hazard Mitigation Program (NTHMP) has expanded its efforts since 2004 and improved tsunami preparedness throughout U.S. coastal communities. The NTHMP is a partnership of federal agencies and state tsunami response agencies whose efforts include: development of inundation and evacuation maps for most highly threatened communities; tsunami evacuation and educational signage for coastal communities; support for tsunami educational, awareness and planning seminars; increased number of local tsunami warning dissemination devices such as sirens; and support for regional tsunami exercises. These activities are major factors that have contributed to the increase of TsunamiReady communities throughout the country.
Towards more Global Coordination of Atmospheric Electricity Measurements (GloCAEM)
NASA Astrophysics Data System (ADS)
Nicoll, Keri; Harrison, Giles
2017-04-01
Earth's atmospheric electrical environment has been studied since the 1750s, but its more recent applications to science questions around clouds and climate highlight the incompleteness of our understanding, in part due to a lack of suitable global measurements. The Global Electric Circuit (GEC) sustains the near-surface fair weather (FW) electric field, which is present globally in regions that are not strongly electrically disturbed by weather or pollution. It can be measured routinely at the surface using well-established instrumentation such as electric field mills. Despite the central role of lightning as a weather hazard and the potentially widespread importance of charge for atmospheric processes, research is hampered by the fragmented nature of surface atmospheric electricity measurements. This makes anything other than local studies in fortuitous fair weather conditions difficult. In contrast to detection of global lightning using satellite measurements and ground-based radio networks, the FW electric field and GEC cannot be measured by remote sensing, and no similar measurement networks exist for their study. This presents an opportunity: many researchers worldwide now make high-temporal-resolution measurements of the FW electric field routinely, yet this activity is neither coordinated nor exploited. The GloCAEM (Global Coordination of Atmospheric Electricity Measurements) project is currently bringing some of these experts together to make the first steps towards an effective global network for FW atmospheric electricity monitoring. A specific objective of the project is to establish the first modern archive of international FW atmospheric electric field data in close to real time, to allow global studies of atmospheric electricity to be performed straightforwardly and robustly. Data will be archived through the UK Centre for Environmental Data Analysis (CEDA) and will be available for download by users from early 2018. Both 1-second and 1-minute electric field data will be archived, along with meteorological measurements (if available) for ease of interpretation of the electrical measurements. Although the primary aim of the project is to provide a close to real time electric field database, archiving of existing historical electric field datasets is also planned, to extend the range of studies possible. This presentation will provide a summary of progress with the GloCAEM project.
Between Oais and Agile a Dynamic Data Management Approach
NASA Astrophysics Data System (ADS)
Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.
2015-12-01
In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions, in the form of risk management and review procedures that produce actionable plans responsive to change. We then proceed by describing the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and discuss the following core model entities and their relationships: Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives; Risk entities, which act as drivers for change within the data lifecycle and include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks; Plan entities, which detail the actions to bring about change within an archive and include Acquisition Plans, Preservation Plans and Monitoring Plans that support responsive interactions with the community; and Result entities, which describe the outcomes of the plans and include Acquisitions, Mitigations and Accepted Risks, with risk acceptance permitting imperfect but functional solutions that can be realistically supported within an archive's resource levels.
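One way to read the four entity groups above is as a simple typed model. A minimal sketch in Python dataclasses follows; the entity names track the abstract, but the fields and relationships are a simplification assumed for illustration:

    # Sketch of the dynamic data management information model's core entities.
    # Field choices simplify the relationships described in the abstract.
    from dataclasses import dataclass, field

    @dataclass
    class PreservationObjective:      # aspirational entity
        description: str

    @dataclass
    class DataEntity:                 # aspirational entity
        name: str
        objectives: list = field(default_factory=list)

    @dataclass
    class Risk:                       # acquisitional/technical/strategic/external
        kind: str
        target: DataEntity

    @dataclass
    class Plan:                       # acquisition, preservation, or monitoring
        driver: Risk
        actions: list

    @dataclass
    class Result:                     # acquisition, mitigation, or accepted risk
        plan: Plan
        outcome: str                  # e.g. "mitigated", "accepted"

    fmt_risk = Risk("technical: obsolete format", DataEntity("EO level-1 products"))
    plan = Plan(fmt_risk, ["convert holdings to a supported format"])
    print(Result(plan, "mitigated"))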
Park, Bongsoo; Park, Jongsun; Cheong, Kyeong-Chae; Choi, Jaeyoung; Jung, Kyongyong; Kim, Donghan; Lee, Yong-Hwan; Ward, Todd J.; O'Donnell, Kerry; Geiser, David M.; Kang, Seogchan
2011-01-01
The fungal genus Fusarium includes many plant and/or animal pathogenic species and produces diverse toxins. Although accurate species identification is critical for managing such threats, it is difficult to identify Fusarium morphologically. Fortunately, extensive molecular phylogenetic studies, founded on well-preserved culture collections, have established a robust foundation for Fusarium classification. Genomes of four Fusarium species have been published with more being currently sequenced. The Cyber infrastructure for Fusarium (CiF; http://www.fusariumdb.org/) was built to support archiving and utilization of rapidly increasing data and knowledge and consists of Fusarium-ID, Fusarium Comparative Genomics Platform (FCGP) and Fusarium Community Platform (FCP). The Fusarium-ID archives phylogenetic marker sequences from most known species along with information associated with characterized isolates and supports strain identification and phylogenetic analyses. The FCGP currently archives five genomes from four species. Besides supporting genome browsing and analysis, the FCGP presents computed characteristics of multiple gene families and functional groups. The Cart/Favorite function allows users to collect sequences from Fusarium-ID and the FCGP and analyze them later using multiple tools without requiring repeated copying-and-pasting of sequences. The FCP is designed to serve as an online community forum for sharing and preserving accumulated experience and knowledge to support future research and education. PMID:21087991
Historical Archives in Italian Astronomical Observatories: The ``Specola 2000'' Project
NASA Astrophysics Data System (ADS)
Chinnici, I.; Mandrino, A.; Bònoli, F.
2006-12-01
Italy's well-consolidated tradition in astronomy is fully witnessed by its rich archival heritage. Astronomical records are stored in many observatories and universities, as well as in libraries and in private institutions. In 2000 a project was promoted to arrange and produce inventories of all material kept in Italian observatory archives. The project was planned by the Società Astronomica Italiana, and financial support was provided by the Italian Ministero per i Beni e le Attività Culturali. In this paper, the results obtained thus far are presented and commented on.
The Future of the Protein Data Bank
Berman, Helen M.; Kleywegt, Gerard J.; Nakamura, Haruki; Markley, John L.
2013-01-01
The Worldwide Protein Data Bank (wwPDB) is the international collaboration that manages the deposition, processing and distribution of the PDB archive. The wwPDB’s mission is to maintain a single archive of macromolecular structural data that are freely and publicly available to the global community. Its members [RCSB PDB (USA), PDBe (Europe), PDBj (Japan), and BMRB (USA)] host data-deposition sites and mirror the PDB ftp archive. To support future developments in structural biology, the wwPDB partners are addressing organizational, scientific, and technical challenges. PMID:23023942
NASA Astrophysics Data System (ADS)
Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.
2012-04-01
The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on these systems, and developed as part of the Ultra-High Resolution Climate Modeling Project, allows users of OLCF resources to efficiently share simulated data, often multi-terabyte in volume, as well as the results from the modeling experiments and various synthesized products derived from these simulations. The final objective in the exercise is to ensure that the simulation results and the enhanced understanding will serve the needs of a diverse group of stakeholders across the world, including our research partners in U.S. Department of Energy laboratories & universities, domain scientists, students (K-12 as well as higher education), resource managers, decision makers, and the general public.
Neural networks: Application to medical imaging
NASA Technical Reports Server (NTRS)
Clarke, Laurence P.
1994-01-01
The research mission is the development of computer assisted diagnostic (CAD) methods for improved diagnosis of medical images including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high convergence neural networks for feature detection and VLSI implementation of neural networks for real time analysis. Other missions include (1) implementation of CAD methods on hospital based picture archiving computer systems (PACS) and information networks for central and remote diagnosis and (2) collaboration with defense and medical industry, NASA, and federal laboratories in the area of dual use technology conversion from defense or aerospace to medicine.
A Routing Mechanism for Cloud Outsourcing of Medical Imaging Repositories.
Godinho, Tiago Marques; Viana-Ferreira, Carlos; Bastião Silva, Luís A; Costa, Carlos
2016-01-01
Web-based technologies have been increasingly used in picture archive and communication systems (PACS), in services related to storage, distribution, and visualization of medical images. Nowadays, many healthcare institutions are outsourcing their repositories to the cloud. However, managing communications between multiple geo-distributed locations is still challenging due to the complexity of dealing with huge volumes of data and bandwidth requirements. Moreover, standard methodologies still do not take full advantage of outsourced archives, namely because their integration with other in-house solutions is troublesome. In order to improve the performance of distributed medical imaging networks, a smart routing mechanism was developed. This includes an innovative cache system based on splitting and dynamic management of Digital Imaging and Communications in Medicine (DICOM) objects. The proposed solution was successfully deployed in a regional PACS archive. The results show that it outperforms conventional approaches, reducing both remote access latency and the required cache storage space.
A technique that couples lead (Pb) isotopes and multi-element concentrations with meteorological analysis was used to assess source contributions to precipitation samples at the Bondville, Illinois USA National Trends Network (NTN) site. Precipitation samples collected over a 16 ...
Education for Australia's Information Future
ERIC Educational Resources Information Center
Burford, Sally; Partridge, Helen; Brown, Sarah; Hider, Philip; Ellis, Leonie
2015-01-01
Digital disruption and an increasingly networked society drive rapid change in many professions and a corresponding need for change in tertiary education. Across the world, information education has, to date, prepared graduates for employment in discrete professions, such as librarianship, records management, archives and teacher librarianship.…
MODIS land data at the EROS data center DAAC
Jenkerson, Calli B.; Reed, B.C.
2001-01-01
The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services supports the use of MODIS land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.
The Practical Obstacles of Data Transfer: Why researchers still love scp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nam, Hai Ah; Hill, Jason J; Parete-Koon, Suzanne T
The importance of computing facilities is heralded every six months with the announcement of the new Top500 list, showcasing the world's fastest supercomputers. Unfortunately, with great computing capability does not come great long-term data storage capacity, which often means users must move their data to their local site archive, to remote sites where they may be doing future computation or analysis, or back to their home institution, else face the dreaded data purge that most HPC centers employ to keep utilization of large parallel filesystems low to manage performance and capacity. At HPC centers, data transfer is crucial to the scientific workflow and will increase in importance as computing systems grow in size. The Energy Sciences Network (ESnet) recently launched its fifth generation network, a 100 Gbps high-performance, unclassified national network connecting more than 40 DOE research sites to support scientific research and collaboration. Despite the tenfold increase in bandwidth to DOE research sites amenable to multiple data transfer streams and high throughput, in practice, researchers often under-utilize the network and resort to painfully-slow single stream transfer methods such as scp to avoid the complexity of using multiple stream tools such as GridFTP and bbcp, and contend with frustration from the lack of consistency of available tools between sites. In this study we survey and assess the data transfer methods provided at several DOE supported computing facilities, including both leadership-computing facilities, connected through ESnet. We present observed transfer rates, suggested optimizations, and discuss the obstacles the tools must overcome to receive wide-spread adoption over scp.
Interoperability in the Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios Diaz, C.
2017-09-01
The protocols and standards currently supported by the recently released new version of the Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. We explore these protocols in more detail, providing scientifically useful examples of their usage within the PSA.
ERIC Educational Resources Information Center
Stackpole, Laurie
2001-01-01
The Naval Research Laboratory Library has made significant progress providing its distributed user community with a single point-of-access to information needed to support scientific research through TORPEDO "Ultra," a digital archive that in many respects functions as an electronic counterpart of a traditional library. It consists of…
NASA Astrophysics Data System (ADS)
Budden, A. E.; Arzayus, K. M.; Baker-Yeboah, S.; Casey, K. S.; Dozier, J.; Jones, C. S.; Jones, M. B.; Schildhauer, M.; Walker, L.
2016-12-01
The newly established NSF Arctic Data Center plays a critical support role in archiving and curating the data and software generated by Arctic researchers from diverse disciplines. The Arctic community, comprising Earth science, archaeology, geography, anthropology, and other social science researchers, is supported through data curation services and domain agnostic tools and infrastructure, ensuring data are accessible in the most transparent and usable way possible. This interoperability across diverse disciplines within the Arctic community facilitates collaborative research and is mirrored by interoperability between the Arctic Data Center infrastructure and other large scale cyberinfrastructure initiatives. The Arctic Data Center leverages the DataONE federation to standardize access to and replication of data and metadata to other repositories, specifically NOAA's National Centers for Environmental Information (NCEI). This approach promotes long-term preservation of the data and metadata, as well as opening the door for other data repositories to leverage this replication infrastructure with NCEI and other DataONE member repositories. The Arctic Data Center uses rich, detailed metadata following widely recognized standards. Particularly, measurement-level and provenance metadata provide scientists the details necessary to integrate datasets across studies and across repositories while enabling a full understanding of the provenance of data used in the system. The Arctic Data Center gains this deep metadata and provenance support by simply adopting DataONE services, which results in significant efficiency gains by eliminating the need to develop systems de novo. Similarly, the advanced search tool developed by the Knowledge Network for Biocomplexity and extended for data submission by the Arctic Data Center, can be used by other DataONE-compliant repositories without further development. By standardizing interfaces and leveraging the DataONE federation, the Arctic Data Center has advanced rapidly and can itself contribute to raising the capabilities of all members of the federation.
NASA Astrophysics Data System (ADS)
Tomlin, M. C.; Jenkyns, R.
2015-12-01
Ocean Networks Canada (ONC) collects data from observatories in the northeast Pacific, Salish Sea, Arctic Ocean, Atlantic Ocean, and land-based sites in British Columbia. Data are streamed, collected autonomously, or transmitted via satellite from a variety of instruments. The Software Engineering group at ONC develops and maintains Oceans 2.0, an in-house software system that acquires and archives data from sensors, and makes data available to scientists, the public, government and non-government agencies. The Oceans 2.0 workflow tool was developed by ONC to manage a large volume of tasks and processes required for instrument installation, recovery and maintenance activities. Since 2013, the workflow tool has supported 70 expeditions and grown to include 30 different workflow processes for the increasing complexity of infrastructures at ONC. The workflow tool strives to keep pace with an increasing heterogeneity of sensors, connections and environments by supporting versioning of existing workflows, and allowing the creation of new processes and tasks. Despite challenges in training and gaining mutual support from multidisciplinary teams, the workflow tool has become invaluable in project management in an innovative setting. It provides a collective place to contribute to ONC's diverse projects and expeditions and encourages more repeatable processes, while promoting interactions between the multidisciplinary teams who manage various aspects of instrument development and the data they produce. The workflow tool inspires documentation of terminologies and procedures, and effectively links to other tools at ONC such as JIRA, Alfresco and Wiki. Motivated by growing sensor schemes, modes of collecting data, archiving, and data distribution at ONC, the workflow tool ensures that infrastructure is managed completely from instrument purchase to data distribution. It integrates all areas of expertise and helps fulfill ONC's mandate to offer quality data to users.
NASA Technical Reports Server (NTRS)
Stevens, Grady H.
1992-01-01
The Data Distribution Satellite (DDS), operating in conjunction with the planned space network, the National Research and Education Network and its commercial derivatives, would play a key role in networking the emerging supercomputing facilities, national archives, academic, industrial, and government institutions. Centrally located over the United States in geostationary orbit, DDS would carry sophisticated on-board switching and make use of advanced antennas to provide an array of special services. Institutions needing continuous high data rate service would be networked together by use of a microwave switching matrix and electronically steered hopping beams. Simultaneously, DDS would use other beams and on-board processing to interconnect other institutions with lesser, low rate, intermittent needs. Dedicated links to White Sands and other facilities would enable direct access to space payloads and sensor data. Intersatellite links to a second generation ATDRS, called the Advanced Space Data Acquisition and Communications System (ASDACS), would eliminate one satellite hop and enhance controllability of experimental payloads by reducing path delay. Similarly, direct access would be available to the supercomputing facilities and national data archives. Economies with DDS would be derived from its ability to switch high rate facilities amongst users as needed. At the same time, having a CONUS view, DDS could interconnect with any institution regardless of how remote; whether one needed high rate or low rate service would be immaterial. With the capability to assign resources on demand, DDS would need to carry only a portion of the resources that dedicated facilities would require. Efficiently switching resources to users as needed, DDS would become a very feasible spacecraft, even though it would tie together the space network, the terrestrial network, remote sites, thousands of small users, and those few who need very large data links intermittently.
Ground-Based Network and Supersite Observations to Complement and Enrich EOS Research
NASA Technical Reports Server (NTRS)
Tsay, Si-Chee; Holben, Brent N.; Welton, Ellsworth J.
2011-01-01
Since 1997 NASA has been successfully launching a series of satellites - the Earth Observing System (EOS) - to intensively study, and gain a better understanding of, the Earth as an integrated system. Space-borne remote sensing observations, however, are often plagued by contamination of surface signatures. Thus, ground-based in-situ and remote-sensing measurements, where signals come directly from atmospheric constituents, the sun, and/or the Earth-atmosphere interactions, provide additional information content for comparisons that confirm quantitatively the usefulness of the integrated surface, aircraft, and satellite datasets. Through numerous participations, particularly but not limited to the EOS remote-sensing/retrieval and validation projects over the years, NASA/GSFC has developed and continuously refined ground-based networks and mobile observatories that proved to be vital in providing high temporal measurements, which complement and enrich the satellite observations. These are: AERONET (AErosol RObotic NETwork), a federation of ground-based, globally distributed spectral sun-sky photometers; MPLNET (Micro-Pulse Lidar NETwork), a similarly organized network of micro-pulse lidar systems measuring aerosol and cloud vertical structure continuously; and SMART-COMMIT (Surface-sensing Measurements for Atmospheric Radiative Transfer - Chemical, Optical & Microphysical Measurements of In-situ Troposphere), mobile observatories comprising a suite of spectral radiometers and in-situ probes acquiring supersite measurements. Most MPLNET sites are collocated with those of AERONET, and both networks always support the deployment of SMART-COMMIT worldwide. These data products follow the data structure of EOS conventions: Level-0, instrument-archived raw data; Level-1 (or 1.5), real-time data with no (or limited) quality assurance; Level-2, non-real-time data with high temporal and spectral resolutions. In this talk, we will present NASA/GSFC ground-based facilities, serving as network or supersite observations, which have been playing key roles in major international research projects over diverse aerosol regimes to complement and enrich the EOS scientific research.
NASA Technical Reports Server (NTRS)
Bilitza, D.; King, J. H.
1988-01-01
The activities and services of the National Space Science Data Center (NSSDC) and the World Data Center A for Rockets and Satellites (WDC-A-R and S) are described with special emphasis on ionospheric physics. The present catalog/archive system is explained and future developments are indicated. In addition to the basic data acquisition, archiving, and dissemination functions, ongoing activities include the Central Online Data Directory (CODD), the Coordinated Data Analysis Workshops (CDAW), the Space Physics Analysis Network (SPAN), advanced data management systems (CD/DIS, NCDS, PLDS), and publication of the NSSDC News, the SPACEWARN Bulletin, and several NSSDC reports.
Internet Services for Professional Astronomy
NASA Astrophysics Data System (ADS)
Andernach, H.
A (subjective) overview of Internet resources relevant to professional astronomers is given. Special emphasis is put on databases of astronomical objects and servers providing general information, e.g. on astronomical catalogues, finding charts from sky surveys, bibliographies, directories, browsers through multi-wavelength observational archives, etc. Archives of specific observational data will be discussed in more detail in other chapters of this book, dealing with the corresponding part of the electromagnetic spectrum. About 200 different links are mentioned, and every attempt was made to make this report as up-to-date as possible. As the field is rapidly growing with improved network technology, it will be just a snapshot of the situation in mid-1998.
Archiving Space Geodesy Data for 20+ Years at the CDDIS
NASA Technical Reports Server (NTRS)
Noll, Carey E.; Dube, M. P.
2004-01-01
Since 1982, the Crustal Dynamics Data Information System (CDDIS) has supported the archive and distribution of geodetic data products acquired by NASA programs. These data include GPS (Global Positioning System), GLONASS (GLObal NAvigation Satellite System), SLR (Satellite Laser Ranging), VLBI (Very Long Baseline Interferometry), and DORIS (Doppler Orbitography and Radiolocation Integrated by Satellite). The data archive supports NASA's space geodesy activities through the Solid Earth and Natural Hazards (SENH) program. The CDDIS data system and its archive have become increasingly important to many national and international programs, particularly several of the operational services within the International Association of Geodesy (IAG), including the International GPS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), the International DORIS Service (IDS), and the International Earth Rotation Service (IERS). The CDDIS provides easy and ready access to a variety of data sets, products, and information about these data. The specialized nature of the CDDIS lends itself well to enhancement and thus can accommodate diverse data sets and user requirements. All data sets and metadata extracted from these data sets are accessible to scientists through ftp and the web; general information about each data set is accessible via the web. The CDDIS, including background information about the system and its user communities, the computer architecture, archive contents, available metadata, and future plans will be discussed.
Archiving 40+ Years of Planetary Mission Data - Lessons Learned
NASA Astrophysics Data System (ADS)
Simmons, K. E.
2012-12-01
NASA has invested billions of dollars and millions of man-hours in obtaining information about our planet and its neighbors. Will the data obtained from those investments be accessible in 50 years, nay 20 or even 10? Will scientists be able to look back at the record and understand what stayed the same or has changed? Saving the data is critical, as we all understand, and keeping it reformatted to maintain usability is a given. But what is easily more critical is saving the information that allows a future person to use these data. This work explores the difficulties, costs and heartaches encountered with archiving data from six major NASA missions spanning 40+ years: Mariner 6, 7 and 9, Pioneer Venus, Voyager and Galileo. Some of these lessons are a) a central archive for mission documents needs to be established, b) metadata from the early stages of a mission are frequently poorly recorded, c) instrument microprocessors improve science flexibility but make documenting harder, d) archiving observation designs improves data recovery, e) more post-mission time and dollars need to be allocated to archiving, f) additional PDS node funding would support more timely data ingestion, faster peer review and quicker public access and g) trained archivists should be part of mission teams at all levels. This work is supported by ROSES grant NNX09AM04GS04.
Satellite image analysis using neural networks
NASA Technical Reports Server (NTRS)
Sheldon, Roger A.
1990-01-01
The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
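The four-step classification process that SIANN describes can be illustrated with a minimal Python sketch. SIANN itself predates these libraries, so scipy/scikit-learn and all names here are stand-ins for the approach, not the original implementation.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.neural_network import MLPClassifier

    def extract_features(img):
        # Step 2: toy feature extraction -- 8x8 block means as a feature vector
        return img.reshape(8, 8, 8, 8).mean(axis=(1, 3)).ravel()

    rng = np.random.default_rng(0)
    scenes = rng.random((100, 64, 64))       # stand-ins for satellite images
    labels = rng.integers(0, 2, size=100)    # 1 = scene of interest

    # Step 1: filtering and enhancement (noise suppression)
    filtered = np.stack([gaussian_filter(s, sigma=1.0) for s in scenes])

    # Step 3: configure and train a neural network on the extracted features
    X = np.stack([extract_features(s) for s in filtered])
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, labels)

    # Step 4: classify images by applying the trained network
    print(net.predict(X[:5]))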
NASA Technical Reports Server (NTRS)
Danielsen, Edwin F.; Pfister, Leonhard; Hipskind, R. Stephen; Gaines, Steven E.
1990-01-01
The purpose of this task is the acquisition, distribution, archival, and analysis of data collected during and in support of the Upper Atmospheric Research Program (UARP) field experiments. Meteorological and U2 data from the 1984 Stratosphere-Troposphere Exchange Project (STEP) were analyzed to determine characteristics of internal atmospheric waves. CD-ROMs containing data from the 1987 STEP, the 1987 Airborne Antarctic Ozone Expedition (AAOE), and the 1989 Airborne Arctic Stratospheric Expedition (AASE) were produced for archival and distribution of those data sets. The AASE CD-ROM contains preliminary data and a final release is planned for February 1990. Comparisons of data from the NASA ER-2 Meteorological Measurement System (MMS) with radar tracking and radiosonde data show good agreement. Planning for a Meteorological Support Facility continues. We are investigating existing and proposed hardware and software to receive, manipulate, and display satellite imagery and standard meteorological analyses, forecasts, and radiosonde data.
WFIRST Science Operations at STScI
NASA Astrophysics Data System (ADS)
Gilbert, Karoline; STScI WFIRST Team
2018-06-01
With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST Mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.
Secure and Privacy-Preserving Distributed Information Brokering
ERIC Educational Resources Information Center
Li, Fengjun
2010-01-01
As enormous structured, semi-structured and unstructured data are collected and archived by organizations in many realms ranging from business to health networks to government agencies, the needs for efficient yet secure inter-organization information sharing naturally arise. Unlike early information sharing approaches that only involve a small…
TOTAL PRECIPITATION DATA - U.S. HISTORICAL CLIMATOLOGY NETWORK (HCN)
The Carbon Dioxide Information Analysis Center, which includes the World Data Center-A for Atmospheric Trace Gases, is the primary global-change data and information analysis center of the U.S. Department of Energy (DOE). More than just an archive of data sets and publications, ...
Unobtrusive Social Network Data From Email
2008-12-01
... Outlook archived files and stores that data into an SQL database. A Visual Basic for Applications (VBA) program was installed on the personal computers (PCs) of all participants, in the session window of their Microsoft Outlook. ...
Development of acylpiperidine and carboxamide repellents using structure-activity models
USDA-ARS?s Scientific Manuscript database
The United States Department of Agriculture (USDA) has developed repellents and insecticides for the U.S. military since 1942. Data for over 30,000 compounds are contained within the USDA archive. Repellency data from similarly structured compounds were used to develop artificial neural network (ANN...
Using the Web To Explore the Great Depression.
ERIC Educational Resources Information Center
Chamberlin, Paul
2001-01-01
Presents an annotated list of Web sites that focus on the Great Depression. Includes the American Experience, American Memory, the National Archives and Records Administration, and the New Deal Network Web sites. Offers additional sites covering topics such as the Jersey homesteads and labor history. (CMK)
ERIC Educational Resources Information Center
Ramaswami, Rama
2008-01-01
The Storage Networking Industry Association (SNIA) does not mince words when describing the looming data storage problem. In its 2007 report, "Solving the Coming Archive Crisis--the 100-Year Dilemma," the trade group asserts that the volume of disparate digital information sources being kept online for long-term preservation is overwhelming and…
Deep-Learning Convolutional Neural Networks Accurately Classify Genetic Mutations in Gliomas.
Chang, P; Grinband, J; Weinberg, B D; Bardis, M; Khy, M; Cadena, G; Su, M-Y; Cha, S; Filippi, C G; Bota, D; Baldi, P; Poisson, L M; Jain, R; Chow, D
2018-05-10
The World Health Organization has recently placed new emphasis on the integration of genetic information for gliomas. While tissue sampling remains the criterion standard, noninvasive imaging techniques may provide complementary insight into clinically relevant genetic mutations. Our aim was to train a convolutional neural network to independently predict underlying molecular genetic mutation status in gliomas with high accuracy and identify the most predictive imaging features for each mutation. MR imaging data and molecular information were retrospectively obtained from The Cancer Imaging Archive for 259 patients with either low- or high-grade gliomas. A convolutional neural network was trained to classify isocitrate dehydrogenase 1 (IDH1) mutation status, 1p/19q codeletion, and O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation status. Principal component analysis of the final convolutional neural network layer was used to extract the key imaging features critical for successful classification. Classification had high accuracy: IDH1 mutation status, 94%; 1p/19q codeletion, 92%; and MGMT promoter methylation status, 83%. Each genetic category was also associated with distinctive imaging features such as definition of tumor margins, T1 and FLAIR suppression, extent of edema, extent of necrosis, and textural features. Our results indicate that for The Cancer Imaging Archive dataset, machine-learning approaches allow classification of individual genetic mutations of both low- and high-grade gliomas. We show that relevant MR imaging features acquired from an added dimensionality-reduction technique demonstrate that neural networks are capable of learning key imaging components without prior feature selection or human-directed training. © 2018 by American Journal of Neuroradiology.
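The dimensionality-reduction step described above, principal component analysis of the final network layer, can be sketched as follows. The activation matrix here is simulated; the actual network and The Cancer Imaging Archive data are not reproduced, and the feature count is an assumption.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    # Simulated final-layer activations: one row per subject, one column per
    # learned feature (259 subjects as in the study; 512 features assumed)
    activations = rng.normal(size=(259, 512))

    # Project the activations onto their leading principal components to find
    # the feature directions that dominate the classification
    pca = PCA(n_components=10).fit(activations)
    scores = pca.transform(activations)           # per-subject component scores
    print(pca.explained_variance_ratio_[:3])      # variance carried by each component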
The Genome Austria Tissue Bank (GATiB).
Asslaber, M; Abuja, P M; Stark, K; Eder, J; Gottweis, H; Trauner, M; Samonigg, H; Mischinger, H J; Schippinger, W; Berghold, A; Denk, H; Zatloukal, K
2007-01-01
In the context of the Austrian Genome Program, a tissue bank is being established (Genome Austria Tissue Bank, GATiB) which is based on a collection of diseased and corresponding normal tissues representing a great variety of diseases at their natural frequency of occurrence from a non-selected Central European population of more than 700,000 patients. Major emphasis is put on annotation of archival tissue with comprehensive clinical data, including follow-up data. A specific IT infrastructure supports sample annotation, tracking of sample usage as well as sample and data storage. Innovative data protection tools were developed which prevent sample donor re-identification, particularly if detailed medical and genetic data are combined. For quality control of old archival tissues, new techniques were established to check RNA quality and antigen stability. Since 2003, GATiB has changed from a population-based tissue bank to a disease-focused biobank comprising major cancers such as colon, breast, liver, as well as metabolic liver diseases and organs affected by the metabolic syndrome. Prospectively collected tissues are associated with blood samples and detailed data on the sample donor's disease, lifestyle and environmental exposure, following standard operating procedures. Major emphasis is also placed on ethical, legal and social issues (ELSI) related to biobanks. A specific research project and an international advisory board ensure the proper embedding of GATiB in society and facilitate international networking. (c) 2007 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Schwarz, Joseph; Raffi, Gianni
2002-12-01
The Atacama Large Millimeter Array (ALMA) is a joint project involving astronomical organizations in Europe and North America. ALMA will consist of at least 64 12-meter antennas operating in the millimeter and sub-millimeter range. It will be located at an altitude of about 5000m in the Chilean Atacama desert. The primary challenge to the development of the software architecture is the fact that both its development and runtime environments will be distributed. Groups at different institutes will develop the key elements such as Proposal Preparation tools, Instrument operation, On-line calibration and reduction, and Archiving. The Proposal Preparation software will be used primarily at scientists' home institutions (or on their laptops), while Instrument Operations will execute on a set of networked computers at the ALMA Operations Support Facility. The ALMA Science Archive, itself to be replicated at several sites, will serve astronomers worldwide. Building upon the existing ALMA Common Software (ACS), the system architects will prepare a robust framework that will use XML-encoded entity objects to provide an effective solution to the persistence needs of this system, while remaining largely independent of any underlying DBMS technology. Independence of distributed subsystems will be facilitated by an XML- and CORBA-based pass-by-value mechanism for exchange of objects. Proof of concept (as well as a guide to subsystem developers) will come from a prototype whose details will be presented.
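The pass-by-value exchange of XML-encoded entity objects described above can be illustrated with a minimal sketch. ALMA's actual framework is CORBA-based and built on ACS; the function and entity names below are assumptions for illustration only.

    import xml.etree.ElementTree as ET

    def to_xml(entity_type, fields):
        # Encode an entity object as XML so it can be passed by value
        root = ET.Element(entity_type)
        for key, value in fields.items():
            ET.SubElement(root, key).text = str(value)
        return ET.tostring(root, encoding="unicode")

    def from_xml(payload):
        # The receiving subsystem rebuilds a copy; no live reference is shared,
        # so subsystems stay independent of each other's runtime state
        root = ET.fromstring(payload)
        return root.tag, {child.tag: child.text for child in root}

    wire = to_xml("SchedBlock", {"id": "sb-001", "status": "ready"})
    print(from_xml(wire))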
Kalvelage, T.; Willems, Jennifer
2003-01-01
The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community was discussed. Several integrated retrieval, processing and distribution capabilities were explained. The value of these functions to users was described, and potential future improvements were laid out. Users were interested in having the retrieval, processing and archiving systems integrated so that they can get the data they want in the format and via the delivery mechanism of their choice.
Strategy of Planetary Data Archives in Japanese Missions for Planetary Data System
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Murakami, S. Y.
2017-12-01
To preserve data acquired by Japanese planetary explorations for a long time, we need a data archiving strategy in a form suitable for available resources. The Planetary Data System (PDS) developed by NASA is an excellent system for saving data over a long period. Especially with the current version 4 (PDS4), it is possible to create a data archive with high completeness using information technology. Historically, the Japanese planetary missions have archived data by scientists in their own ways, but in the past decade, JAXA has been aiming to conform data to PDS considering long-term preservation. Hayabusa and Akatsuki data are archived in PDS3. Kaguya (SELENE) data have been newly converted from the original format to PDS3. Hayabusa2, BepiColombo, and future planetary explorations will release data in PDS4. The cooperation of engineers who are familiar with information technology is indispensable for creating data archives with scientists. In addition, experience, information sharing, and a system to support them are essential; building such a system remains a challenge in Japan.
Code of Federal Regulations, 2010 CFR
2010-07-01
... use personal paper-to-paper copying equipment in the Textual Research Room (Room 2000). Requests must be made in writing to the chief of the Research Support Branch (NWCC2), National Archives and Records... copying projects in the research room at one time, with Federal agencies given priority over other users...
Berklee College of Music Archives: Preserving the Past and Learning for the Future
ERIC Educational Resources Information Center
Esty, Anna
2012-01-01
In this article, the author discusses how the Berklee College of Music preserves its past and learns for the future. Though the library has received support from the college administration for the creation of an archive, it has been difficult for the interest in preserving the college's history to compete with more urgent needs, such as being able…
NASA Astrophysics Data System (ADS)
Do, Hong; Gudmundsson, Lukas; Leonard, Michael; Westra, Seth; Seneviratne, Sonia
2017-04-01
In-situ observations of daily streamflow with global coverage are a crucial asset for understanding large-scale freshwater resources, which are an essential component of the Earth system and a prerequisite for societal development. Here we present the Global Streamflow Indices and Metadata archive (G-SIM), a collection of indices derived from more than 20,000 daily streamflow time series across the globe. These indices are designed to support global assessments of change in wet and dry extremes, and have been compiled from 12 free-to-access online databases (seven national databases and five international collections). The G-SIM archive also includes significant metadata to help support detailed understanding of streamflow dynamics, with the inclusion of drainage area shapefiles and many essential catchment properties such as land cover type, soil and topographic characteristics. The automated procedure in data handling and quality control makes G-SIM a reproducible, extendible archive that can be utilised for many purposes in large-scale hydrology. Some potential applications include the identification of observational trends in hydrological extremes, the assessment of climate change impacts on streamflow regimes, and the validation of global hydrological models.
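Indices of the kind collected in G-SIM are summary statistics over daily discharge series. A minimal pandas sketch of two such indices follows; the data are synthetic and the index definitions are illustrative rather than G-SIM's exact ones.

    import numpy as np
    import pandas as pd

    # Synthetic daily discharge record standing in for one gauge's time series
    days = pd.date_range("1990-01-01", "1999-12-31", freq="D")
    rng = np.random.default_rng(2)
    flow = pd.Series(rng.gamma(shape=2.0, scale=10.0, size=len(days)), index=days)

    # A wet-extreme index: the maximum daily flow observed in each year
    annual_max = flow.groupby(flow.index.year).max()

    # A dry-extreme index: the flow exceeded on 95% of days (5th percentile)
    q95 = flow.quantile(0.05)

    print(annual_max.head())
    print(q95)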
Automated extraction of metadata from remotely sensed satellite imagery
NASA Technical Reports Server (NTRS)
Cromp, Robert F.
1991-01-01
The paper discusses research in the Intelligent Data Management project at the NASA/Goddard Space Flight Center, with emphasis on recent improvements in low-level feature detection algorithms for performing real-time characterization of images. Images, including MSS and TM data, are characterized using neural networks and the interpretation of the neural network output by an expert system for subsequent archiving in an object-oriented data base. The data show the applicability of this approach to different arrangements of low-level remote sensing channels. The technique works well when the neural network is trained on data similar to the data used for testing.
A flexible, open, decentralized system for digital pathology networks.
Schuler, Robert; Smith, David E; Kumaraguruparan, Gowri; Chervenak, Ann; Lewis, Anne D; Hyde, Dallas M; Kesselman, Carl
2012-01-01
High-resolution digital imaging is enabling digital archiving and sharing of digitized microscopy slides and new methods for digital pathology. Collaborative research centers, outsourced medical services, and multi-site organizations stand to benefit from sharing pathology data in a digital pathology network. Yet significant technological challenges remain due to the large size and volume of digitized whole slide images. While information systems do exist for managing local pathology laboratories, they tend to be oriented toward narrow clinical use cases or offer closed ecosystems around proprietary formats. Few solutions exist for networking digital pathology operations. Here we present a system architecture and implementation of a digital pathology network and share results from a production system that federates major research centers.
NNI Supplement to the President's 2013 Budget | Nano
Information Retrieval Using ADABAS-NATURAL (with Applications for Television and Radio).
ERIC Educational Resources Information Center
Silbergeld, I.; Kutok, P.
1984-01-01
Describes use of the software ADABAS (general purpose database management system) and NATURAL (interactive programing language) in development and implementation of an information retrieval system for the National Television and Radio Network of Israel. General design considerations, files contained in each archive, search strategies, and keywords…
Secondary Lessons from Indiana's Underground Railroad Institute (July 22-27, 2001).
ERIC Educational Resources Information Center
Indiana Univ.-Purdue Univ., Indianapolis. Geography Educators' Network of Indiana.
The Geography Educator's Network of Indiana's 2001 Exploring and Teaching Institute series led 23 educators from around the state on a six day traveling adventure. Participants explored art, literature/folklore, historical sites and archives, physical environments, architecture, economics, politics, and cultures associated with the Underground…
Elementary Lessons from Indiana's Underground Railroad Institute (July 22-27, 2001).
ERIC Educational Resources Information Center
Indiana Univ.-Purdue Univ., Indianapolis. Geography Educators' Network of Indiana.
The Geography Educators' Network of Indiana's 2001 Exploring and Teaching Institute led 23 educators from around the state on a six day traveling adventure. Participants explored art, literature/folklore, historical sites and archives, physical environments, architecture, economics, politics, and cultures associated with the Underground Railroad…
ERIC Educational Resources Information Center
Fantini, M.; And Others
1990-01-01
Describes the architecture of the prototype of an image management system that has been used to develop an application concerning images of frescoes in the Sistine Chapel in the Vatican. Hardware and software design are described, the use of local area networks (LANs) is discussed, and data organization is explained. (15 references) (LRW)
ERIC Educational Resources Information Center
Richerme, Lauren Kapalka
2011-01-01
An examination of the 2005-2010 online archives of major American network news stations and newspapers reveals a troubling picture for music education. News stories frequently mention the disappearance of music education. When the media mention the existence of music education, they often promote it as a means of raising standardized test scores…
ERIC Educational Resources Information Center
de Lange, Naydene; Mnisi, Thoko; Mitchell, Claudia; Park, Eun G.
2010-01-01
The partnerships, especially university-community partnerships, that sustain globally networked learning environments often face challenges in mobilizing research to empower local communities to effect change. This article examines these challenges by describing a university-community partnership involving researchers and graduate students in…
Visitor's Computer Guidelines | CTIO
Use of NASA Near Real-Time and Archived Satellite Data to Support Disaster Assessment
NASA Technical Reports Server (NTRS)
McGrath, Kevin M.; Molthan, Andrew L.; Burks, Jason E.
2014-01-01
NASA's Short-term Prediction Research and Transition (SPoRT) Center partners with the NWS to provide near real-time data in support of a variety of weather applications, including disasters. SPoRT supports NASA's Applied Sciences Program: Disasters focus area by developing techniques that will aid the disaster monitoring, response, and assessment communities. SPoRT has explored a variety of techniques for utilizing archived and near real-time NASA satellite data. An increasing number of end-users - such as the NWS Damage Assessment Toolkit (DAT) - access geospatial data via a Web Mapping Service (WMS). SPoRT has begun developing open-standard Geographic Information Systems (GIS) data sets via WMS to respond to end-user needs.
DICOM-compliant PACS with CD-based image archival
NASA Astrophysics Data System (ADS)
Cox, Robert D.; Henri, Christopher J.; Rubin, Richard K.; Bret, Patrice M.
1998-07-01
This paper describes the design and implementation of a low-cost PACS conforming to the DICOM 3.0 standard. The goal was to provide an efficient image archival and management solution on a heterogeneous hospital network as a basis for filmless radiology. The system follows a distributed, client/server model and was implemented at a fraction of the cost of a commercial PACS. It provides reliable archiving on recordable CD and allows access to digital images throughout the hospital and on the Internet. Dedicated servers have been designed for short-term storage, CD-based archival, data retrieval and remote data access or teleradiology. The short-term storage devices provide DICOM storage and query/retrieve services to scanners and workstations and approximately twelve weeks of 'on-line' image data. The CD-based archival and data retrieval processes are fully automated with the exception of CD loading and unloading. The system employs lossless compression on both short- and long-term storage devices. All servers communicate via the DICOM protocol in conjunction with both local and 'master' SQL-patient databases. Records are transferred from the local to the master database independently, ensuring that storage devices will still function if the master database server cannot be reached. The system features rules-based work-flow management and WWW servers to provide multi-platform remote data access. The WWW server system is distributed on the storage, retrieval and teleradiology servers allowing viewing of locally stored image data directly in a WWW browser without the need for data transfer to a central WWW server. An independent system monitors disk usage, processes, network and CPU load on each server and reports errors to the image management team via email. The PACS was implemented using a combination of off-the-shelf hardware, freely available software and applications developed in-house. The system has enabled filmless operation in CT, MR and ultrasound within the radiology department and throughout the hospital. The use of WWW technology has enabled the development of an intuitive web-based teleradiology and image management solution that provides complete access to image data.
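The DICOM storage service negotiated between scanners and storage servers above can be sketched, in present-day terms, with the pydicom/pynetdicom libraries. These libraries, the host name, the port, and the file name are assumptions for illustration; the 1998 system described here used in-house DICOM implementations.

    from pydicom import dcmread
    from pynetdicom import AE
    from pynetdicom.sop_class import CTImageStorage

    ds = dcmread("slice0001.dcm")              # an image produced by a modality

    ae = AE(ae_title="WORKSTATION")
    ae.add_requested_context(CTImageStorage)   # negotiate the CT storage service

    # Associate with the short-term storage server and issue a C-STORE request
    assoc = ae.associate("pacs.example.org", 11112)
    if assoc.is_established:
        status = assoc.send_c_store(ds)
        print("C-STORE status:", status.Status if status else "failed")
        assoc.release()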
NASA Astrophysics Data System (ADS)
Martinez, E.; Glassy, J. M.; Fowler, D. K.; Khayat, M.; Olding, S. W.
2014-12-01
The NASA Earth Science Data Systems Working Groups (ESDSWG) focus on improving technologies and processes related to science discovery and preservation. One particular group, Data Preservation Practices, is defining a set of guidelines to aid data providers in planning both what to submit for archival and when to submit artifacts, so that the archival process can begin early in the project's life cycle. This has the benefit of leveraging knowledge within the project before staff roll off to other work. In this poster we describe various project archival use cases and identify possible archival life cycles that map closely to the pace and flow of work. To understand "archival life cycles", i.e., distinct project phases that produce archival artifacts such as instrument capabilities, calibration reports, and science data products, the working group initially mapped the archival requirements defined in the Preservation Content Specification to the typical NASA project life cycle. As described in the poster, this work resulted in a well-defined archival life cycle, but only for some types of projects; it did not fit well for the condensed project life cycles experienced within airborne and balloon campaigns. To understand the archival process for projects with compressed cycles, the working group gathered use cases from various communities. This poster will describe selected use cases that provided insight into the unique flow of these projects, as well as proposing archival life cycles that map artifacts to projects with compressed timelines. Finally, the poster will conclude with some early recommendations for data providers, which will be captured in a formal Guidelines document to be published in 2015.
Applications of hypermedia systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lennon, J.; Maurer, H.
1995-05-01
In this paper, we consider several new aspects of modern hypermedia systems. The applications discussed include: (1) General Information and Communication Systems: Distributed information systems for businesses, schools and universities, museums, libraries, health systems, etc. (2) Electronic orientation and information displays: Electronic guided tours, public information kiosks, and publicity dissemination with archive facilities. (3) Lecturing: A system going beyond the traditional to empower both teachers and learners. (4) Libraries: A further step towards fully electronic library systems. (5) Directories of all kinds: Staff, telephone, and all sorts of generic directories. (6) Administration: A fully integrated system such as the one proposed will mean efficient data processing and valuable statistical data. (7) Research: Material can now be accessed from databases all around the world. The effects of networking and computer-supported collaborative work are discussed, and examples of new scientific visualization programs are quoted. The paper concludes with a section entitled "Future Directions".
Toward a Virtual Solar Observatory: Starting Before the Petabytes Fall
NASA Technical Reports Server (NTRS)
Gurman, Joseph; Fisher, Richard R. (Technical Monitor)
2001-01-01
Although a few, large, space- and groundbased solar physics databases exist at selected locations, there is as yet only limited standardization or interoperability. I describe the outline of a plan to facilitate access to a distributed network of online solar data archives, both large and small. The underlying principle is that the user need not know where the data are, only how to specify which data are desired. At the least, such an approach could considerably simplify the scientific user's access to the enormous amount of solar physics data to be obtained in the next decade. At best, it might mean the withering away of traditional data centers, and all the bureaucracy they entail. This work is supported by the Sun-Earth Connections Division of NASA Office of Space Science, thanks to an anomalous act of largess on the part of the 2001 SEC Senior Review.
ontologyX: a suite of R packages for working with ontological data.
Greene, Daniel; Richardson, Sylvia; Turro, Ernest
2017-04-01
Ontologies are widely used constructs for encoding and analyzing biomedical data, but the absence of simple and consistent tools has made exploratory and systematic analysis of such data unnecessarily difficult. Here we present three packages which aim to simplify such procedures. The ontologyIndex package enables arbitrary ontologies to be read into R, supports representation of ontological objects by native R types, and provides a parsimonious set of performant functions for querying ontologies. ontologySimilarity and ontologyPlot extend ontologyIndex with functionality for straightforward visualization and semantic similarity calculations, including statistical routines. ontologyIndex, ontologyPlot and ontologySimilarity are all available on the Comprehensive R Archive Network website under https://cran.r-project.org/web/packages/. Contact: dg333@cam.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
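The ancestor-based queries and semantic-similarity scores these packages provide are R functions. As a language-neutral illustration of the underlying idea, the following Python sketch computes one simple similarity measure over a toy ontology; the terms and the Jaccard measure are illustrative, not the packages' API.

    # A toy ontology as a child -> parents map (a tiny directed acyclic graph)
    PARENTS = {
        "arachnodactyly": ["long fingers"],
        "long fingers": ["abnormal fingers"],
        "abnormal fingers": ["abnormal hand"],
        "abnormal hand": ["root"],
        "root": [],
    }

    def ancestors(term):
        # The term and everything it implies (transitive closure over parents)
        out = {term}
        for parent in PARENTS[term]:
            out |= ancestors(parent)
        return out

    def similarity(a, b):
        # Jaccard overlap of ancestor sets: one simple semantic-similarity score
        A, B = ancestors(a), ancestors(b)
        return len(A & B) / len(A | B)

    print(similarity("arachnodactyly", "abnormal hand"))  # 2 shared / 5 total = 0.4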
Toward a Virtual Solar Observatory: Starting Before the Petabytes Fall
NASA Astrophysics Data System (ADS)
Gurman, J. B.
2001-12-01
Although a few, large, space- and groundbased solar physics databases exist at selected locations, there is as yet only limited standardization or interoperability. I describe the outline of a plan to facilitate access to a distributed network of online solar data archives, both large and small. The underlying principle is that the user need not know where the data are, only how to specify which data are desired. At the least, such an approach could considerably simplify the scientific user's access to the enormous amount of solar physics data to be obtained in the next decade. At best, it might mean the withering away of traditional data centers, and all the bureaucracy they entail. This work is supported by the Sun-Earth Connections Division of NASA Office of Space Science, thanks to an anomalous act of largess on the part of the 2001 SEC Senior Review.
Picture archiving and communication in radiology.
Napoli, Marzia; Nanni, Marinella; Cimarra, Stefania; Crisafulli, Letizia; Campioni, Paolo; Marano, Pasquale
2003-01-01
After over 80 years of exclusively archiving radiologic films, digital archiving is now increasingly gaining ground in Radiology. Digital archiving allows a considerable reduction in costs and space but, most importantly, makes immediate or remote consultation of all examinations and reports feasible in the hospital's clinical wards. The RIS is, in this case, the starting point of the electronic archiving process, which is however the task of the PACS. The latter can be used as a legally compliant radiologic archive provided that it conforms to certain specifications, such as the use of long-term optical storage media or an electronic audit trail of changes. The PACS archives, in a hierarchical system, all digital images produced by each diagnostic imaging modality. Images and patient data can be retrieved and used for consultation or remote consultation by the reporting radiologist, who requires images and reports of previous radiologic examinations, or by the referring physician on the ward. Thanks to a web server, modern PACS allow greatly simplified remote access to images and data while still ensuring the required regulatory compliance and access protections. Since the PACS enables simpler data communication within the hospital, security and patient privacy must be protected. A secure and reliable PACS should minimize the risk of accidental data destruction and should prevent unauthorized access to the archive, with security measures adequate to current knowledge and kept up to date with technological advances. Archiving the data produced by modern digital imaging is a problem now present even in small radiology services. The technology can readily solve problems that were extremely complex only a few years ago, such as the connection between equipment and the archiving system, thanks in part to the universal adoption of the DICOM 3.0 standard. The evolution of communication networks and the use of standard protocols such as TCP/IP can minimize problems of remote data and image transmission within the healthcare enterprise as well as across the territory. However, new problems are appearing, such as digital data security profiles and the different systems that should ensure them. Among these, electronic signature algorithms should be mentioned; in Italy they are validated by law and can therefore be used in legally compliant digital archives.
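As a concrete illustration of the kind of standardized connectivity the DICOM 3.0 standard provides, here is a minimal Python sketch using the pydicom library; the file name is a hypothetical placeholder. It reads a DICOM object and inspects the patient and study metadata a PACS would typically index:

```python
# Minimal sketch, assuming pydicom is installed and "example.dcm"
# is a DICOM file exported by some imaging modality.
import pydicom

ds = pydicom.dcmread("example.dcm")   # parse the DICOM data set

# Metadata a PACS typically indexes for retrieval:
print("Patient ID:   ", ds.PatientID)
print("Study date:   ", ds.StudyDate)
print("Modality:     ", ds.Modality)
print("SOP Class UID:", ds.SOPClassUID)

# The pixel data itself is exposed as a NumPy array:
pixels = ds.pixel_array
print("Image size:", pixels.shape)
```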
Commercial imagery archive, management, exploitation, and distribution project development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-10-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (1) a potentially unbounded data archive (5000 TB range); (2) automated workflow management for increased user productivity; (3) automatic tracking and management of files stored on shelves; (4) the ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (5) access through a thin client-to-server network environment; (6) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (7) scalability that maintains information throughput performance as the size of the digital library grows.
Commercial imagery archive, management, exploitation, and distribution product development
NASA Astrophysics Data System (ADS)
Hollinger, Bruce; Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience on the U.S. Government's IDEX II (Imagery Dissemination and Exploitation) system. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze and disseminate, in 'push' or 'pull' modes, imagery, data and data products using a variety of sources and formats. LM selected 'best of breed' hardware and software components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System (ILS)™, satisfies requirements for (a) a potentially unbounded data archive (5000 TB range); (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process and disseminate data volumes with bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds, whether from archived images or in real time; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
EMMA—mouse mutant resources for the international scientific community
Wilkinson, Phil; Sengerova, Jitka; Matteoni, Raffaele; Chen, Chao-Kung; Soulat, Gaetan; Ureta-Vidal, Abel; Fessele, Sabine; Hagn, Michael; Massimi, Marzia; Pickford, Karen; Butler, Richard H.; Marschall, Susan; Mallon, Ann-Marie; Pickard, Amanda; Raspa, Marcello; Scavizzi, Ferdinando; Fray, Martin; Larrigaldie, Vanessa; Leyritz, Johan; Birney, Ewan; Tocchini-Valentini, Glauco P.; Brown, Steve; Herault, Yann; Montoliu, Lluis; de Angelis, Martin Hrabé; Smedley, Damian
2010-01-01
The laboratory mouse is the premier animal model for studying human disease and thousands of mutants have been identified or produced, most recently through gene-specific mutagenesis approaches. High-throughput strategies by the International Knockout Mouse Consortium (IKMC) are producing mutants for all protein coding genes. Generating a knock-out line involves huge monetary and time costs, so capturing the data describing each mutant, alongside archiving the line for distribution to future researchers, is critical. The European Mouse Mutant Archive (EMMA) is a leading international network infrastructure for archiving and worldwide provision of mouse mutant strains. It operates in collaboration with the other members of the Federation of International Mouse Resources (FIMRe), EMMA being the European component. Additionally, EMMA is one of four repositories involved in the IKMC, and therefore the current figure of 1700 archived lines will rise markedly. The EMMA database gathers and curates extensive data on each line and presents it through a user-friendly website. A BioMart interface allows advanced searching, including integrated querying with other resources, e.g. Ensembl. Other resources are able to display EMMA data by accessing our Distributed Annotation System server. EMMA database access is publicly available at http://www.emmanet.org. PMID:19783817
Overview of NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
McDonald, Kenneth
2004-01-01
Over the last 15 years, NASA's Earth Science Enterprise (ESE) has devoted a tremendous effort to design and build the Earth Observing System (EOS) Data and Information System (EOSDIS) to acquire, process, archive and distribute the data of the EOS series of satellites and other ESE missions and field programs. The development of EOSDIS began with an early prototype to support NASA data from heritage missions and progressed through a formal development process to today's system that supports the data from multiple missions including Landsat 7, Terra, Aqua, SORCE and ICESat. The system is deployed at multiple Distributed Active Archive Centers (DAACs) and its current holdings are approximately 4.5 petabytes. The current set of unique users requesting EOS data and information products exceeds 2 million. While EOSDIS has been the centerpiece of NASA's Earth Science Data Systems, other initiatives have augmented the services of EOSDIS and have impacted its evolution and the future directions of data systems within the ESE. ESDIS has had an active prototyping effort and has continued to be involved in the activities of the Earth Science Technology Office (ESTO). In response to concerns from the science community that EOSDIS was too large and monolithic, the ESE initiated the Earth Science Information Partners (ESIP) Federation Experiment that funded a series of projects to develop specialized products and services to support Earth science research and applications. Last year, the enterprise made 41 awards to successful proposals to the Research, Education and Applications Solutions Network (REASoN) Cooperative Agreement Notice to continue and extend the ESIP activity. The ESE has also sponsored a formulation activity called the Strategy for the Evolution of ESE Data Systems (SEEDS) to develop approaches and decision support processes for the management of the collection of data system and service providers of the enterprise. Throughout the development of its earth science data systems, NASA has had an active collaboration with a number of interagency and international partners. One of the mechanisms that has been extremely helpful in initiating and promoting this collaboration has been NASA's participation in the Committee on Earth Observation Satellites (CEOS) and its Working Group on Information Systems and Services (WGISS). The CEOS members, working together, have implemented an International Directory Network that enables users to locate collections of earth science data held by the international community and an International Catalog System to search and order specific data products. CEOS WGISS has also promoted international interest in the Open GIS Consortium's specifications that further advance the access and use of geospatial data and the interoperation of GIS components. These are just a few highlights of the benefits that member agencies gain from CEOS participation.
NASA Astrophysics Data System (ADS)
Moore, J.; Serreze, M. C.; Middleton, D.; Ramamurthy, M. K.; Yarmey, L.
2013-12-01
The NSF funds the Advanced Cooperative Arctic Data and Information System (ACADIS) (http://www.aoncadis.org/). It serves the growing and increasingly diverse data management needs of NSF's arctic research community. The ACADIS investigator team combines experienced data managers, curators and software engineers from NSIDC, UCAR and NCAR. ACADIS fosters scientific synthesis and discovery by providing a secure long-term data archive to NSF investigators. The system provides discovery and access to arctic related data from this and other archives. This paper updates the technical components of ACADIS, the implementation of best practices, the value of ACADIS to the community and the major challenges facing this archive in handling the diverse data coming from NSF Arctic investigators. ACADIS provides sustainable data management, data stewardship services and leadership for the NSF Arctic research community through open data sharing, adherence to best practices and standards, capitalizing on appropriate evolving technologies, community support and engagement. ACADIS leverages other pertinent projects, capitalizing on appropriate emerging technologies and participating in emerging cyberinfrastructure initiatives. The key elements of ACADIS user services to the NSF Arctic community include: data and metadata upload; support for datasets with special requirements; metadata and documentation generation; interoperability and initiatives with other archives; and science support to investigators and the community. Providing a self-service data publishing platform requiring minimal curation oversight while maintaining rich metadata for discovery, access and preservation is challenging. Implementing metadata standards is a first step towards consistent content. The ACADIS Gateway and ADE offer users choices for data discovery and access with the clear objective of increasing discovery and use of all Arctic data, especially for analysis activities. Metadata is at the core of ACADIS activities, from capturing metadata at the point of data submission to ensuring interoperability, providing data citations, and supporting data discovery. ACADIS metadata efforts include: (1) evolution of the ACADIS metadata profile to increase flexibility in search; (2) documentation guidelines; and (3) metadata standardization efforts. A major activity is now underway to ensure consistency in the metadata profile across all archived datasets. ACADIS is also embarking on a critical activity to create Digital Object Identifiers (DOIs) for all its holdings. The data services offered by ACADIS focus on meeting the needs of the data providers, providing dynamic search capabilities to peruse the ACADIS and related cryospheric data repositories, efficient data download and some special services including dataset reformatting and visualization. The service is built around the following key technical elements: the ACADIS Gateway, housed at NCAR, developed to support NSF Arctic data coming from AON and now more broadly from across PLR/ARC and related archives; the Arctic Data Explorer (ADE), developed at NSIDC, an integral service of ACADIS bringing the rich archive from NSIDC together with catalogs from ACADIS and international partners in Arctic research; and Rosetta and the Digital Object Identifier (DOI) generation scheme, tools available to the community to help publish and utilize datasets in integration, synthesis and publication.
NASA Technical Reports Server (NTRS)
Mah, G. R.; Myers, J.
1993-01-01
The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAACs) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65,000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California, Cuprite, Nevada, and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEMs) for the topographic information.
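The band-simulation step described above can be illustrated with a short sketch. The Python fragment below is a simplified illustration of the standard approach (convolving narrow imaging-spectrometer channels with a target instrument's spectral response to synthesize a broader band), not the DAAC's actual pipeline; the Gaussian response function and the radiance cube are synthetic stand-ins:

```python
import numpy as np

# Synthetic stand-ins: 224 AVIRIS-like channels covering 0.4-2.5 microns.
wavelengths = np.linspace(0.4, 2.5, 224)     # channel centers (microns)
radiance = np.random.rand(224, 128, 128)     # (band, row, col) cube

def simulate_band(cube, centers, response_center, response_fwhm):
    """Synthesize one broad band as a response-weighted average of
    narrow channels (Gaussian spectral response assumed)."""
    sigma = response_fwhm / 2.355            # FWHM -> standard deviation
    weights = np.exp(-0.5 * ((centers - response_center) / sigma) ** 2)
    weights /= weights.sum()                 # normalize the response
    return np.tensordot(weights, cube, axes=(0, 0))

# ASTER VNIR band 2 is centered near 0.66 micron (red).
aster_b2 = simulate_band(radiance, wavelengths, 0.66, 0.06)
print(aster_b2.shape)                        # (128, 128) simulated band image
```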
Protein Data Bank (PDB): The Single Global Macromolecular Structure Archive
Burley, Stephen K.; Berman, Helen M.; Kleywegt, Gerard J.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer
2018-01-01
The Protein Data Bank (PDB)—the single global repository of experimentally determined 3D structures of biological macromolecules and their complexes—was established in 1971, becoming the first open-access digital resource in the biological sciences. The PDB archive currently houses ~130,000 entries (May 2017). It is managed by the Worldwide Protein Data Bank organization (wwPDB; wwpdb.org), which includes the RCSB Protein Data Bank (RCSB PDB; rcsb.org), the Protein Data Bank Japan (PDBj; pdbj.org), the Protein Data Bank in Europe (PDBe; pdbe.org), and BioMagResBank (BMRB; www.bmrb.wisc.edu). The four wwPDB partners operate a unified global software system that enforces community-agreed data standards and supports data Deposition, Biocuration, and Validation of ~11,000 new PDB entries annually (deposit.wwpdb.org). The RCSB PDB currently acts as the archive keeper, ensuring disaster recovery of PDB data and coordinating weekly updates. wwPDB partners disseminate the same archival data from multiple FTP sites, while operating complementary websites that provide their own views of PDB data with selected value-added information and links to related data resources. At present, the PDB archives experimental data, associated metadata, and 3D-atomic level structural models derived from three well-established methods: crystallography, nuclear magnetic resonance spectroscopy (NMR), and electron microscopy (3DEM). wwPDB partners are working closely with experts in related experimental areas (small-angle scattering, chemical cross-linking/mass spectrometry, Förster resonance energy transfer or FRET, etc.) to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods. PMID:28573592
Protein Data Bank (PDB): The Single Global Macromolecular Structure Archive.
Burley, Stephen K; Berman, Helen M; Kleywegt, Gerard J; Markley, John L; Nakamura, Haruki; Velankar, Sameer
2017-01-01
The Protein Data Bank (PDB)--the single global repository of experimentally determined 3D structures of biological macromolecules and their complexes--was established in 1971, becoming the first open-access digital resource in the biological sciences. The PDB archive currently houses ~130,000 entries (May 2017). It is managed by the Worldwide Protein Data Bank organization (wwPDB; wwpdb.org), which includes the RCSB Protein Data Bank (RCSB PDB; rcsb.org), the Protein Data Bank Japan (PDBj; pdbj.org), the Protein Data Bank in Europe (PDBe; pdbe.org), and BioMagResBank (BMRB; www.bmrb.wisc.edu). The four wwPDB partners operate a unified global software system that enforces community-agreed data standards and supports data Deposition, Biocuration, and Validation of ~11,000 new PDB entries annually (deposit.wwpdb.org). The RCSB PDB currently acts as the archive keeper, ensuring disaster recovery of PDB data and coordinating weekly updates. wwPDB partners disseminate the same archival data from multiple FTP sites, while operating complementary websites that provide their own views of PDB data with selected value-added information and links to related data resources. At present, the PDB archives experimental data, associated metadata, and 3D-atomic level structural models derived from three well-established methods: crystallography, nuclear magnetic resonance spectroscopy (NMR), and electron microscopy (3DEM). wwPDB partners are working closely with experts in related experimental areas (small-angle scattering, chemical cross-linking/mass spectrometry, Förster resonance energy transfer or FRET, etc.) to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods.
The design of a petabyte archive and distribution system for the NASA ECS project
NASA Technical Reports Server (NTRS)
Caulk, Parris M.
1994-01-01
The NASA EOS Data and Information System (EOSDIS) Core System (ECS) will contain one of the largest data management systems ever built - the ECS Science and Data Processing System (SDPS). SDPS is designed to support long term Global Change Research by acquiring, producing, and storing earth science data, and by providing efficient means for accessing and manipulating that data. The first two releases of SDPS, Release A and Release B, will be operational in 1997 and 1998, respectively. Release B will be deployed at eight Distributed Active Archive Centers (DAACs). Individual DAACs will archive different collections of earth science data, and will vary in archive capacity. The storage and management of these data collections is the responsibility of the SDPS Data Server subsystem. It is anticipated that by the year 2001, the Data Server subsystem at the Goddard DAAC must support a near-line data storage capacity of one petabyte. The development of SDPS is a system integration effort in which COTS products will be used in favor of custom components in every possible way. Some software and hardware capabilities required to meet ECS data volume and storage management requirements beyond 1999 are not yet supported by available COTS products. The ECS project will not undertake major custom development efforts to provide these capabilities. Instead, SDPS and its Data Server subsystem are designed to support initial implementations with current products, and provide an evolutionary framework that facilitates the introduction of advanced COTS products as they become available. This paper provides a high-level description of the Data Server subsystem design from a COTS integration standpoint, and discusses some of the major issues driving the design. The paper focuses on features of the design that will make the system scalable and adaptable to changing technologies.
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Browning, D. M.
2014-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) is a national-scale science and monitoring initiative focused on phenology - the study of seasonal life-cycle events such as leafing, flowering, reproduction, and migration - as a tool to understand the response of biodiversity to environmental variation and change. USA-NPN provides a hierarchical, national monitoring framework that enables other organizations to leverage the capacity of the Network for their own applications - minimizing investment and duplication of effort - while promoting interoperability. Network participants can leverage: (1) Standardized monitoring protocols that have been broadly vetted, tested and published; (2) A centralized National Phenology Database (NPDb) for maintaining, archiving and replicating data, with standard metadata, terms-of-use, web-services, and documentation of QA/QC, plus tools for discovery, visualization and download of raw data and derived data products; and/or (3) A national in-situ, multi-taxa phenological monitoring system, Nature's Notebook, which enables participants to observe and record phenology of plants and animals - based on the protocols and information management system (IMS) described above - via either web or mobile applications. The protocols, NPDb and IMS, and Nature's Notebook represent a hierarchy of opportunities for involvement by a broad range of interested stakeholders, from individuals to agencies. For example, some organizations have adopted (e.g., the National Ecological Observatory Network or NEON) -- or are considering adopting (e.g., the Long-Term Agroecosystems Network or LTAR) -- the USA-NPN standardized protocols, but will develop their own database and IMS with web services to promote sharing of data with the NPDb. Other organizations (e.g., the Inventory and Monitoring Programs of the National Wildlife Refuge System and the National Park Service) have elected to use Nature's Notebook to support their phenological monitoring programs. We highlight the challenges and benefits of integrating phenology monitoring within existing and emerging national monitoring networks, and showcase opportunities that exist when standardized protocols are adopted and implemented to promote data interoperability and sharing.
NASA Technical Reports Server (NTRS)
Noll, Carey; Michael, Patrick; Dube, Maurice P.; Pollack, N.
2012-01-01
The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. The CDDIS data system and its archive have become increasingly important to many national and international science communities, in particular several of the operational services within the International Association of Geodesy (IAG) and its project the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation Service (IERS). The CDDIS has recently expanded its archive to support the IGS Multi-GNSS Experiment (MGEX). The archive now contains daily and hourly 30-second and subhourly 1-second data from an additional 35+ stations in RINEX V3 format. The CDDIS will soon install an Ntrip broadcast relay to support the activities of the IGS Real-Time Pilot Project (RTPP) and the future Real-Time IGS Service. The CDDIS has also developed a new web-based application to aid users in data discovery, both within the current community and beyond. To enable this data discovery application, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. This poster will include background information about the system and its user communities, archive contents and updates, enhancements for data discovery, new system architecture, and future plans.
Myneni, Sahiti; Cobb, Nathan K; Cohen, Trevor
2013-01-01
Unhealthy behaviors increase individual health risks and are a socioeconomic burden. Harnessing social influence is perceived as fundamental for interventions to influence health-related behaviors. However, the mechanisms through which social influence occurs are poorly understood. Online social networks provide the opportunity to understand these mechanisms as they digitally archive communication between members. In this paper, we present a methodology for content-based social network analysis, combining qualitative coding, automated text analysis, and formal network analysis such that network structure is determined by the content of messages exchanged between members. We apply this approach to characterize the communication between members of QuitNet, an online social network for smoking cessation. Results indicate that the method identifies meaningful theme-based social sub-networks. Modeling social network data using this method can provide us with theme-specific insights such as the identities of opinion leaders and sub-community clusters. Implications for design of targeted social interventions are discussed.
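A minimal sketch of the content-based network construction described here might look like the following Python fragment (using the networkx library; the messages, theme labels, and member names are invented for illustration, and the qualitative coding step is assumed to have already happened):

```python
import networkx as nx

# Invented sample of coded messages: (sender, recipient, theme).
coded_messages = [
    ("alice", "bob",   "cravings"),
    ("bob",   "carol", "cravings"),
    ("alice", "carol", "social_support"),
    ("dave",  "alice", "social_support"),
]

# Build one sub-network per theme: an edge exists only when a message
# with that theme was exchanged between two members.
theme_graphs = {}
for sender, recipient, theme in coded_messages:
    g = theme_graphs.setdefault(theme, nx.Graph())
    g.add_edge(sender, recipient)

# Degree centrality within a theme-specific sub-network suggests
# candidate opinion leaders for that theme.
for theme, g in theme_graphs.items():
    leaders = sorted(nx.degree_centrality(g).items(), key=lambda kv: -kv[1])
    print(theme, "->", leaders[0])
```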
Learning and Networking: Utilization of a Primary Care Listserv by Pharmacists
Trinacty, Melanie; Farrell, Barbara; Schindel, Theresa J; Sunstrum, Lisa; Dolovich, Lisa; Kennie, Natalie; Russell, Grant; Waite, Nancy
2014-01-01
Background: Expanding into new types of practice, such as family health teams, presents challenges for practising pharmacists. The Primary Care Pharmacy Specialty Network (PC-PSN) was established in 2007 to support collaboration among pharmacists working in primary care. The PC-PSN offers its members a listserv (also referred to as an electronic mailing list) jointly hosted by the Canadian Society of Hospital Pharmacists and the Canadian Pharmacists Association. Objectives: To characterize PC-PSN membership and participation in the listserv and to examine how the listserv is used by analyzing questions posted, concerns raised, and issues discussed. Methods: Qualitative content analysis was used to examine 1 year of archived PC-PSN listserv posts from the year 2010. Two coders used NVivo software to classify the content of posts. Research team members reviewed and discussed the coding reports to confirm themes emerging from the data. Results: Overall, 129 people (52.9% of the 244 listserv members registered at the end of the calendar year) posted to the listserv during the study period. These participants worked in various practice settings, with over half residing in Ontario (68/129 [52.7%]). A total of 623 posts were coded. Agreement between coders, for a sample of posts from 10 users, was acceptable (kappa = 0.78). The listserv was used to share information on a diverse set of topics, to support decision-making and acquire solutions for complex problems, and as a forum for mentorship. Conclusions: The qualitative content analysis of the PC-PSN listserv posts for the year 2010 showed that the listserv was a medium for information-sharing and for providing and receiving support, through mentorship from colleagues. Apparent learning needs included effective question-posing skills and application of evidence to individual patients. PMID:25364016
Smith, Vincent S; Rycroft, Simon D; Harman, Kehan T; Scott, Ben; Roberts, David
2009-01-01
Background: Natural History science is characterised by a single immense goal (to document, describe and synthesise all facets pertaining to the diversity of life) that can only be addressed through a seemingly infinite series of smaller studies. The discipline's failure to meaningfully connect these small studies with natural history's goal has made it hard to demonstrate the value of natural history to a wider scientific community. Digital technologies provide the means to bridge this gap. Results: We describe the system architecture and template design of "Scratchpads", a data-publishing framework for groups of people to create their own social networks supporting natural history science. Scratchpads cater to the particular needs of individual research communities through a common database and system architecture. This is flexible and scalable enough to support multiple networks, each with its own choice of features, visual design, and constituent data. Our data model supports web services on standardised data elements that might be used by related initiatives such as GBIF and the Encyclopedia of Life. A Scratchpad allows users to organise data around user-defined or imported ontologies, including biological classifications. Automated semantic annotation and indexing is applied to all content, allowing users to navigate intuitively and curate diverse biological data, including content drawn from third party resources. A system of archiving citable pages allows stable referencing with unique identifiers and provides credit to contributors through normal citation processes. Conclusion: Our framework currently serves more than 1,100 registered users across 100 sites, spanning academic, amateur and citizen-science audiences. These users have generated more than 130,000 nodes of content in the first two years of use. The template of our architecture may serve as a model to other research communities developing data publishing frameworks outside biodiversity research. PMID:19900302
Toward a standardized soil carbon database platform in the US Critical Zone Observatory Network
NASA Astrophysics Data System (ADS)
Filley, T. R.; Marini, L.; Todd-Brown, K. E.; Malhotra, A.; Harden, J. W.; Kumar, P.
2017-12-01
Within the soil carbon community of the US Critical Zone Observatory (CZO) Network, efforts are underway to promote network-level data syntheses and modeling projects and to identify barriers to data intercomparability. This represents a challenging goal given the diversity of soil carbon sampling methodologies, spatial and vertical resolution, carbon pool isolation protocols, subsequent measurement techniques, and matrix terminology. During the last annual meeting of the CZO SOC Working Group, on Dec 11, 2016, it was decided that integration with, and potentially adoption of, a widely used, active, and mature data aggregation, archival, and visualization platform was the easiest route to achieve this ultimate goal. Additionally, to assess the state of deep and shallow soil C data among the CZO sites, it was recommended that a comprehensive survey be undertaken to identify data gaps and catalog the various soil sampling and analysis methodologies. The International Soil Carbon Network (ISCN) has a long history of leadership in the development of soil C data aggregation, archiving, and visualization tools and currently houses data for over 70,000 soil cores contributed by the international soil carbon community. Over the past year, members of the CZO network and the ISCN have met to discuss the logistics of adopting the ISCN template within the CZO. Collaborative efforts among all of the CZO site data managers, led by the Intensively Managed Landscapes CZO, will evaluate the feasibility of adopting the ISCN template, or some modification thereof, and distributing it to the appropriate soil scientists for data upload and aggregation. Partnering with ISCN also ensures that soil characteristics from the US CZO are placed in a developing global soil context and paves the way for future integration of data from other international CZO networks. This poster will provide an update of this overall effort along with a summary of data products, partnering networks, and recommendations for the data template language and future CZO APIs.
Democratizing LGBTQ History Online: Digitizing Public History in "U.S. Homophile Internationalism".
de Szegheo Lang, Tamara
2017-01-01
This article argues that the online archive and exhibit "U.S. Homophile Internationalism" effectively contributes to the democratizing effects that digital archives and online initiatives are having on the practice of history. "U.S. Homophile Internationalism" is an online archive of over 800 digitized articles, letters, advertisements, and other materials from the U.S. homophile press that reference six non-U.S. regions of the world. It also provides visitors with introductory regional essays, annotated bibliographies, and an interactive map feature. This essay weaves "U.S. Homophile Internationalism" into the debates in community-run LGBTQ archives regarding the digitization of archival materials and the possibilities presented by digital public history. In doing so, it outlines the structure and content of "U.S. Homophile Internationalism," highlighting how it increases the public accessibility of primary sources, encourages historical research on regions of the world that have not been adequately represented in LGBTQ history writing, and creates interactive components to support public engagements with the Web site.
Supplementing the Digitized Sky Survey for UV-Mission Planning
NASA Technical Reports Server (NTRS)
McLean, Brian
2004-01-01
The Space Telescope Science Institute worked on a project to augment the Digitized Sky Survey archive by completing the scanning and processing of the POSS-I blue survey. This will provide an additional valuable resource to support UV-mission planning. All of the data will be made available through the NASA optical/UV archive (MAST) at STScI. The activities completed during this project are included.
Interoperability In The New Planetary Science Archive (PSA)
NASA Astrophysics Data System (ADS)
Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.
2015-12-01
As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly being used globally. For this purpose, the development of the new Planetary Science Archive (PSA), by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, as well as other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA consists of a Geoserver (an open-source map server), the goal of which is to support use cases such as the distribution of search results, sharing and processing data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats like Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end-users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
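As an illustration of how a client might retrieve search results through the OGC interfaces mentioned above, the following Python sketch issues a standard WFS GetFeature request and asks for JSON output. The endpoint URL and feature type name are hypothetical placeholders, not the PSA's actual service addresses:

```python
import requests

# Hypothetical endpoint and layer name, for illustration only.
WFS_URL = "https://example.esa.int/geoserver/wfs"

params = {
    "service": "WFS",             # standard OGC WFS parameters
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:products",  # hypothetical feature type
    "outputFormat": "application/json",
    "count": 10,                  # limit the number of features returned
}

resp = requests.get(WFS_URL, params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json()["features"]:
    print(feature["id"], feature["properties"])
```

The same request with `outputFormat` set to KML or CSV would feed the other client applications the abstract mentions; that pass-through of standard formats is the point of fronting the archive with a map server.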
Application service provider (ASP) financial models for off-site PACS archiving
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Liu, Brent J.; McCoy, J. Michael; Enzmann, Dieter R.
2003-05-01
For the replacement of its legacy Picture Archiving and Communication System (approximate annual workload of 300,000 procedures), UCLA Medical Center has evaluated and adopted an off-site data-warehousing solution based on an ASP financial model with a one-time single payment per study archived. Different financial models for long-term data archive services were compared to the traditional capital/operational costs of on-site digital archives. Total cost of ownership (TCO), including direct and indirect expenses and savings, was compared for each model. Financial parameters were considered alongside the logistic and operational advantages and disadvantages of ASP models versus traditional archiving systems. Our initial analysis demonstrated that the traditional linear ASP business model for data storage is unsuitable for large institutions: the overall cost markedly exceeds the TCO of an in-house archive infrastructure (when support and maintenance costs are included). We demonstrated, however, that non-linear ASP pricing models can be cost-effective alternatives for large-scale data storage, particularly if they are based on a scalable off-site data-warehousing service and the prices are adapted to the specific size of a given institution. The added value of ASP is that it does not require iterative data migrations from legacy media to new storage media at regular intervals.
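The distinction between linear and non-linear ASP pricing can be made concrete with a toy model. All dollar figures below are invented for illustration and are not UCLA's actual costs; the sketch simply shows why a flat per-study fee scales poorly for a high-volume institution while a volume-discounted (non-linear) tariff can stay competitive with in-house TCO:

```python
# Toy cost model; every dollar figure here is invented for illustration.
ANNUAL_STUDIES = 300_000          # approximate annual workload cited above
YEARS = 5

def linear_asp(fee_per_study):
    """Flat per-study ASP fee: cost grows linearly with volume."""
    return fee_per_study * ANNUAL_STUDIES * YEARS

def tiered_asp(base_fee, discount, tier=100_000):
    """Non-linear ASP tariff: each additional volume tier is discounted."""
    total, fee = 0.0, base_fee
    remaining = ANNUAL_STUDIES * YEARS
    while remaining > 0:
        total += fee * min(remaining, tier)
        remaining -= tier
        fee *= (1 - discount)     # price drops for each additional tier
    return total

def in_house(capital, annual_support):
    """Traditional archive: capital outlay plus yearly support/maintenance."""
    return capital + annual_support * YEARS

print(f"linear ASP : ${linear_asp(2.00):,.0f}")
print(f"tiered ASP : ${tiered_asp(2.00, 0.15):,.0f}")
print(f"in-house   : ${in_house(1_500_000, 300_000):,.0f}")
```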
From Revolution to Evolution: 21 Years of the IRIS Data Management Facility
NASA Astrophysics Data System (ADS)
Benson, R. B.; Ahern, T. K.; Trabant, C. M.; Casey, R.
2009-12-01
A revolution is defined as a "drastic and far-reaching change in the way of thinking and behaving", and this describes the effect that the IRIS Data Management System (DMS) has had on the framework of seismological research. The IRIS Data Management Center (DMC) prototype began operation at the University of Texas at Austin in 1988 and has evolved into a fully functioning facility at the University of Washington in Seattle. The focus of this submission is to share an historical timeline of events related to the DMC, a bit of scrap-booking perhaps, centered around the business of waveform data management conducted by IRIS and its partners. We hope to illustrate how the successful philosophy of sharing unrestricted data, openly collaborating with a global network of networks, and partnering with computer and software industry vendors that have supported our community's academic research has enabled generations of seismologists to both curate and continually "mine" the ever-growing 45 years of digital data currently stored in the perpetual archive located in Seattle, WA. At the time the Loma Prieta Earthquake shook and burned San Francisco on October 17, 1989, there were only 3 stations with dial-up modems enabling what was then considered "fast access" to waveform data delivery back to the DMC. This was considered the first test of the DMC, since waveform data were collected and distributed within days of this event. But this was only a foretelling of things to come, since the rates of archiving and distribution have continually grown to a current 1900+ stations and a 17,000+ channel count of real-time data streaming in and out of the DMC daily, totaling 50 Terabytes of data shipped in 2009 alone. We will attempt to summarize the important technological challenges, solutions, and growth of the DMC with the goal of illustrating that "if you build it, they will come", and how this could be useful to other organizations.
VETA x ray data acquisition and control system
NASA Technical Reports Server (NTRS)
Brissenden, Roger J. V.; Jones, Mark T.; Ljungberg, Malin; Nguyen, Dan T.; Roll, John B., Jr.
1992-01-01
We describe the X-ray Data Acquisition and Control System (XDACS) used together with the X-ray Detection System (XDS) to characterize the X-ray image during testing of the AXAF P1/H1 mirror pair at the MSFC X-ray Calibration Facility. A variety of X-ray data were acquired, analyzed and archived during the testing, including: mirror alignment, encircled energy, effective area, point spread function, system housekeeping and proportional counter window uniformity data. The system architecture is presented with emphasis placed on key features that include a layered UNIX tool approach, dedicated subsystem controllers, real-time X-window displays, flexibility in combining tools, network connectivity and system extensibility. The VETA test data archive is also described.
Clinical applications of an ATM/Ethernet network in departments of neuroradiology and radiotherapy.
Cimino, C; Pizzi, R; Fusca, M; Bruzzone, M G; Casolino, D; Sicurello, F
1997-01-01
An integrated system for the multimedia management of images and clinical information has been developed at the Istituto Nazionale Neurologico C. Besta in Milan. The Institute's physicians have a daily need to consult images coming from various modalities. The high volume of archived material and the need to retrieve and display new and past images and clinical information motivated the development of a Picture Archiving and Communication System (PACS) for the automatic management of images and clinical data, related not only to the Radiology Department, but also to the Radiotherapy Department for 3D virtual simulation, to remote teleconsulting, and subsequently to all the wards, outpatient clinics and labs.
NASA Technical Reports Server (NTRS)
Herman, J. R.; Hudson, R. D.; Serafino, G.
1990-01-01
Arguments are presented showing that the basic empirical model of the solar backscatter UV (SBUV) instrument degradation used by Cebula et al. (1988) in their analysis of the SBUV data is likely to lead to an incorrect estimate of the ozone trend. A correction factor is given as a function of time and altitude that brings the SBUV data into approximate agreement with the SAGE, SME, and Dobson network ozone trends. It is suggested that the currently archived SBUV ozone data should be used with caution for periods of analysis exceeding 1 yr, since it is likely that the yearly decreases contained in the archived data are too large.
Observations and modeling of seismic background noise
Peterson, Jon R.
1993-01-01
The preparation of this report had two purposes. One was to present a catalog of seismic background noise spectra obtained from a worldwide network of seismograph stations. The other purpose was to refine and document models of seismic background noise that have been in use for several years. The second objective was, in fact, the principal reason that this study was initiated and influenced the procedures used in collecting and processing the data.

With a single exception, all of the data used in this study were extracted from the digital data archive at the U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL). This archive dates from 1972 when ASL first began deploying digital seismograph systems and collecting and distributing digital data under the sponsorship of the Defense Advanced Research Projects Agency (DARPA). There have been many changes and additions to the global seismograph networks during the past twenty years, but perhaps none as significant as the current deployment of very broadband seismographs by the U.S. Geological Survey (USGS) and the University of California San Diego (UCSD) under the scientific direction of the IRIS consortium. The new data acquisition systems have extended the bandwidth and resolution of seismic recording, and they utilize high-density recording media that permit the continuous recording of broadband data. The data improvements and continuous recording greatly benefit and simplify surveys of seismic background noise.

Although there are many other sources of digital data, the ASL archive data were used almost exclusively because of accessibility and because the data systems and their calibration are well documented for the most part. Fortunately, the ASL archive contains high-quality data from other stations in addition to those deployed by the USGS. Included are data from UCSD IRIS/IDA stations, the Regional Seismic Test Network (RSTN) deployed by Sandia National Laboratories (SNL), and the TERRAscope network deployed by the California Institute of Technology in cooperation with other institutions.

A map showing the approximate locations of the stations used in this study is provided in Figure 1. One might hope for a better distribution of stations in the southern hemisphere, especially Africa and South America, in order to look for regional variations in seismic noise (apart from the major differences between continental, coastal and island sites). Unfortunately, anyone looking for subtle regional variations in seismic noise is probably going to be disappointed by the spectral data presented in this report because much of the station data appear to be dominated by local disturbances caused by instrumental, environmental, cultural, or surf noise. Better instruments and better instrument siting, or a well-funded field program, will be needed before a global isoseismal noise map can be produced. However, by assembling a composite of background noise from a large network of stations, many of the local station variables are masked, and it is possible to create generalized spectral plots of Earth noise for hypothetical quiet and noisy station sites.
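The generalized noise models this report established (the Peterson new low- and high-noise models) are now bundled with common seismology toolkits. The following Python sketch, assuming the obspy package is installed, retrieves and plots them; it reproduces the models themselves, not the station-by-station survey described above:

```python
import matplotlib.pyplot as plt
from obspy.signal.spectral_estimation import get_nlnm, get_nhnm

# Peterson (1993) New Low/High Noise Models: period (s) versus
# acceleration power spectral density in dB.
nlnm_periods, nlnm_power = get_nlnm()
nhnm_periods, nhnm_power = get_nhnm()

plt.semilogx(nlnm_periods, nlnm_power, label="NLNM (quiet sites)")
plt.semilogx(nhnm_periods, nhnm_power, label="NHNM (noisy sites)")
plt.xlabel("Period (s)")
plt.ylabel("Power (dB)")
plt.legend()
plt.show()
```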
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchal Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the formatmore » was original designed for use in MATLAB, broader use is encouraged.« less
Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.
1991-01-01
The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development, and archiving; data set preservation, maintenance, and conversion to more durable and accessible archive media; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.
Picture archiving and communication system--Part one: Filmless radiology and distance radiology.
De Backer, A I; Mortelé, K J; De Keulenaer, B L
2004-01-01
Picture archiving and communication system (PACS) is a collection of technologies used to carry out digital medical imaging. PACS is used to digitally acquire medical images from the various modalities, such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound, and digital projection radiography. The image data and pertinent information are transmitted to other and possibly remote locations over networks, where they may be displayed on computer workstations for soft copy viewing in multiple locations, thus permitting simultaneous consultations and almost instant reporting from radiologists at a distance. Data are secured and archived on digital media such as optical disks or tape, and may be automatically retrieved as necessary. Close integration with the hospital information system (HIS)--radiology information system (RIS) is critical for system functionality. Medical image management systems are maturing, providing access outside of the radiology department to images throughout the hospital via the Ethernet, at different hospitals, or from a home workstation if teleradiology has been implemented.
Recent advances and plans in processing and geocoding of SAR data at the DFD
NASA Technical Reports Server (NTRS)
Noack, W.
1993-01-01
Because of the needs of future projects like ENVISAT and the experience gained with the current operational ERS-1 facilities, a radical change in synthetic aperture radar (SAR) processing scenarios can be predicted for the next few years. At the German PAF, several new developments were initiated, driven mainly either by user needs or by system and operational constraints ('lessons learned'). The end result will be a major simplification and unification of all the computer systems used. In particular, the following changes are likely to be implemented at the German PAF: transcription before archiving, processing of all standard products with high throughput directly at the receiving stations, processing of special 'high-valued' products at the PAF, usage of a single type of processor hardware, implementation of a large and fast on-line data archive, and an improved and unified fast data network between the processing and archiving facilities. A short description of the current operational SAR facilities as well as the future implementations is given.
High resolution global gridded data for use in population studies
NASA Astrophysics Data System (ADS)
Lloyd, Christopher T.; Sorichetta, Alessandro; Tatem, Andrew J.
2017-01-01
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. Datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.
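A short sketch of how such harmonised layers are typically combined: using the rasterio library, read a co-registered topography tile and the ISO 3166 country-identification grid, then mask one country's pixels. The file names and the numeric country code below are hypothetical placeholders, not WorldPop's actual product names:

```python
import numpy as np
import rasterio

# Hypothetical file names for two co-registered 30 arc-second grids.
with rasterio.open("topography_30arcsec.tif") as topo_src:
    topo = topo_src.read(1)              # elevation band

with rasterio.open("country_id_30arcsec.tif") as id_src:
    country_id = id_src.read(1)          # ISO 3166 numeric codes

# Because the layers share grid, extent, and resolution (the point of
# harmonisation), a per-country mask is a simple element-wise test.
CODE = 566                               # e.g. one ISO 3166 numeric code
mask = country_id == CODE
print("pixels in country:  ", int(mask.sum()))
print("mean elevation (m): ", float(np.nanmean(np.where(mask, topo, np.nan))))
```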
High resolution global gridded data for use in population studies.
Lloyd, Christopher T; Sorichetta, Alessandro; Tatem, Andrew J
2017-01-31
Recent years have seen substantial growth in openly available satellite and other geospatial data layers, which represent a range of metrics relevant to global human population mapping at fine spatial scales. The specifications of such data differ widely and therefore the harmonisation of data layers is a prerequisite to constructing detailed and contemporary spatial datasets which accurately describe population distributions. Such datasets are vital to measure impacts of population growth, monitor change, and plan interventions. To this end the WorldPop Project has produced an open access archive of 3 and 30 arc-second resolution gridded data. Four tiled raster datasets form the basis of the archive: (i) Viewfinder Panoramas topography clipped to Global ADMinistrative area (GADM) coastlines; (ii) a matching ISO 3166 country identification grid; (iii) country area; and (iv) a slope layer. Further layers include transport networks, landcover, nightlights, precipitation, travel time to major cities, and waterways. Datasets and production methodology are described here. The archive can be downloaded both from the WorldPop Dataverse Repository and the WorldPop Project website.
Remotely Sensed Imagery from USGS: Update on Products and Portals
NASA Astrophysics Data System (ADS)
Lamb, R.; Lemig, K.
2016-12-01
The USGS Earth Resources Observation and Science (EROS) Center has recently implemented a number of additions and changes to its existing suite of products and user access systems. Together, these changes will enhance the accessibility, breadth, and usability of the remotely sensed image products and delivery mechanisms available from USGS. As of late 2016, several new image products are available for public download at no charge from the USGS/EROS Center. These new products include: (1) global Level 1T (precision terrain-corrected) products from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), provided via NASA's Land Processes Distributed Active Archive Center (LP DAAC); and (2) Sentinel-2 Multispectral Instrument (MSI) products, available through a collaborative effort with the European Space Agency (ESA). Other new products are also planned to become available soon. In an effort to enable future scientific analysis of the full 40+ year Landsat archive, the USGS also introduced a new "Collection Management" strategy for all Landsat Level 1 products. This new archive and access schema involves quality-based tier designations that will support future time-series analysis of the historic Landsat archive at the pixel level. Along with the quality tier designations, the USGS has also implemented a number of other Level 1 product improvements to support Landsat science applications, including: enhanced metadata, improved geometric processing, refined quality assessment information, and angle coefficient files. The full USGS Landsat archive is now being reprocessed in accordance with the new "Collection 1" specifications. Several USGS data access and visualization systems have also seen major upgrades. These user interfaces include a new version of the USGS LandsatLook Viewer, released in Fall 2016 to provide enhanced functionality and Sentinel-2 visualization and access support. A beta version of the USGS Global Visualization Tool ("GloVis Next") was also released in Fall 2016, with many new features including data visualization at full resolution. The USGS also introduced a time-enabled web mapping service (WMS) to support time-based access to the existing LandsatLook "natural color" full-resolution browse image services.
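A time-enabled WMS is driven like any OGC WMS GetMap request, with an added TIME parameter selecting the browse interval. The sketch below is a generic WMS 1.3.0 request in Python; the endpoint URL, layer name, and bounding box are placeholders, not the published USGS service parameters.

```python
# Generic OGC WMS 1.3.0 GetMap request with a TIME dimension (endpoint and
# layer name are placeholders, not the actual USGS service parameters).
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "landsatlook_natural_color",  # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "37.0,-123.0,39.0,-121.0",      # lat,lon axis order for EPSG:4326
    "WIDTH": "1024",
    "HEIGHT": "1024",
    "FORMAT": "image/png",
    "TIME": "2016-06-01/2016-06-30",        # restrict browse imagery to June 2016
}
resp = requests.get("https://example.usgs.gov/wms", params=params, timeout=60)
with open("june_2016_browse.png", "wb") as f:
    f.write(resp.content)
```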
NASA Astrophysics Data System (ADS)
Hoffmann, Friederike; Meyer, Stefanie; de Vareilles, Mahaut
2017-04-01
In the past years there has been a strong push in Norway for increasing participation in the EU Framework Programmes for Research and Innovation. EU projects coordinated by the University of Bergen (UiB) usually receive management support from the central administration (mostly financial) in collaboration with a full- or part-time scientific project manager working on a fixed-term contract at the same institute as the project's principal scientist. With an increasing number of granted EU projects, the number of scientific project managers employed across the whole university has also increased, and a need for coordination and professionalization of this service became obvious. Until recently, UiB had no unified structures and routines for training newly recruited project managers, or for archiving and transferring routines and skills after the end of the project and the manager's employment contract. To overcome this administrative knowledge gap, the "Forum for scientific EU project managers at UiB" was founded in spring 2016 as an informal communication platform. Its purpose is to bring together current and previous scientific EU project managers from different disciplines to share their experiences. The main aim of the forum is to transfer and improve knowledge, skills and routines for effective management of EU-funded projects, but also to function as a discussion forum where issues arising from handling international consortia can be reviewed. The group meets monthly and discusses current challenges from on-going EU projects as well as routines for specific project operation tasks. These routines are archived in an online best-practice guide which the group is currently developing. The regular personal meetings are supplemented by intensive communication via a group mailing list and several individual mail and phone meetings. Since lessons learned during project implementation may improve future proposals, UiB research advisors for proposal support frequently interact with the members of the forum. The forum is also used to spread relevant information received from other sources. We already see that the forum and its products lead to increased competence of scientific EU project managers and research advisors at UiB. To further harvest these synergy effects, we aim to increase our interaction with similar groups, networks, and online platforms in and beyond Europe.
NASA Technical Reports Server (NTRS)
Barry, Matthew R.
2006-01-01
The X-Windows Process Validation Table (PVT) Widget Class ("Class" is used here in the object-oriented-programming sense of the word) was devised to simplify the task of implementing network registration services for Information Sharing Protocol (ISP) graphical-user-interface (GUI) computer programs. Heretofore, ISP PVT programming tasks have required many method calls to identify, query, and interpret the connections and messages exchanged between a client and a PVT server. Normally, programmers have utilized direct access to UNIX socket libraries to implement the PVT protocol queries, necessitating many lines of source code to perform frequent tasks. Now, the X-Windows PVT Widget Class encapsulates ISP client-server network registration management tasks within the framework of an X-Windows widget. Use of the widget framework enables an X-Windows GUI program to interact with PVT services in an abstract way and in the same manner as that of other graphical widgets, making it easier to program PVT clients. Wrapping the PVT services inside the widget framework enables a programmer to treat a PVT server interface as though it were a GUI. Moreover, an alternate subclass could implement another service in a widget of the same type. This program was written by Matthew R. Barry of United Space Alliance for Johnson Space Center. For further information, contact the Johnson Technology Transfer Office at (281) 483-3809. MSC-23582
Shuttle Data Center File-Processing Tool in Java
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular-expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
NASA Astrophysics Data System (ADS)
Lawrence, B.; Bennett, V.; Callaghan, S.; Juckes, M. N.; Pepler, S.
2013-12-01
The UK Centre for Environmental Data Archival (CEDA) hosts a number of formal data centres, including the British Atmospheric Data Centre (BADC), and is a partner in a range of national and international data federations, including the InfraStructure for the European Network for Earth system Simulation, the Earth System Grid Federation, and the distributed IPCC Data Distribution Centres. The mission of CEDA is to formally curate data from, and facilitate the doing of, environmental science. The twin aims are symbiotic: data curation helps facilitate science, and facilitating science helps with data curation. Here we cover how CEDA delivers this strategy through established internal processes supplemented by short-term projects, supported by staff with a range of roles. We show how CEDA adds value to data in the curated archive and how it supports science, with examples of the aforementioned symbiosis. We begin by discussing curation: CEDA has the formal responsibility for curating the data products of atmospheric science and earth observation research funded by the UK Natural Environment Research Council (NERC). However, curation is not just about the provider community; the consumer communities matter too, and the consumers of these data cross the boundaries of science, including engineers and medics, as well as the gamut of the environmental sciences. There is a small but growing cohort of non-science users. For both producers and consumers of data, information about data is crucial, and a range of CEDA staff have long worked on tools and techniques for creating, managing, and delivering metadata (as well as data). CEDA "science support" staff work with scientists to help them prepare and document data for curation. As one of a spectrum of activities, CEDA has worked on data Publication as a method of both adding value to some data and rewarding the effort put into the production of quality datasets. As such, we see this activity as both a curation and a facilitation activity. A range of more focused facilitation activities are carried out, from providing a computing platform suitable for big-data analytics (the Joint Analysis System, JASMIN), to working on distributed data analysis (EXARCH), and the acquisition of third-party data to support science and impact (e.g. in the context of the facility for Climate and Environmental Monitoring from Space, CEMS). We conclude by confronting the view of Parsons and Fox (2013) that metaphors such as Data Publication, Big Iron, and Science Support are limiting, and suggest the CEDA experience is that these sorts of activities can and do co-exist, much as they conclude they should. However, we also believe that within co-existing metaphors, production systems need to be limited in their scope, even if they are on a road to a more joined-up infrastructure. We shouldn't confuse what we can do now with what we might want to do in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
- PNNL, Harold Trease
2012-10-10
ASSA is a software application that processes binary data into summarized index tables that can be used to organize features contained within the data. ASSA's index tables can also be used to search for user specified features. ASSA is designed to organize and search for patterns in unstructured binary data streams or archives, such as video, images, audio, and network traffic. ASSA is basically a very general search engine used to search for any pattern in any binary data stream. It has uses in video analytics, image analysis, audio analysis, searching hard-drives, monitoring network traffic, etc.
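The general idea of a summarized index table can be illustrated with a toy sketch (Python; an illustration of the concept only, not the PNNL implementation): summarize a binary stream as a table mapping short byte n-grams to their offsets, then answer feature searches from the table instead of rescanning the stream.

```python
# Toy index-table search over an unstructured binary stream (not ASSA code):
# build a table from 4-byte n-grams to offsets, then verify full patterns
# only at the candidate offsets the table returns.
from collections import defaultdict

def build_index(data: bytes, n: int = 4) -> dict:
    index = defaultdict(list)
    for i in range(len(data) - n + 1):
        index[data[i:i + n]].append(i)
    return index

def search(index: dict, data: bytes, pattern: bytes, n: int = 4) -> list:
    # candidate offsets come from the index; confirm the full pattern there
    return [i for i in index.get(pattern[:n], ())
            if data[i:i + len(pattern)] == pattern]

stream = bytes(range(256)) * 4
idx = build_index(stream)
print(search(idx, stream, bytes([10, 11, 12, 13, 14])))
```

A real system would summarize far richer features (e.g., hashes of video frames or audio windows), but the organize-then-search flow is the same.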
Research and Development in Very Long Baseline Interferometry (VLBI)
NASA Technical Reports Server (NTRS)
Himwich, William E.
2004-01-01
Contents include the following: 1. Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.
Beginning with the Particular: Reimagining Professional Development as a Feminist Practice
ERIC Educational Resources Information Center
Schultz, Katherine
2011-01-01
This article analyzes the work of a long-term network of teachers, the Philadelphia Teachers Learning Cooperative, with a focus on their descriptive practices. Drawing on three years of ethnographic documentation of weekly meetings and a historical archive of meetings over 30 years, I characterize the teachers' knowledge about teaching and…
Use of MCIDAS as an earth science information systems tool
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Karitani, Shogo; Parker, Karen G.; Stooksbury, Laura M.; Wilson, Gregory S.
1988-01-01
The application of the man computer interactive data access system (MCIDAS) to information processing is examined. The computer systems that interface with the MCIDAS are discussed. Consideration is given to the computer networking of MCIDAS, data base archival, and the collection and distribution of real-time special sensor microwave/imager data.
ERIC Educational Resources Information Center
Nelson, Carnot E.
1973-01-01
Identifies two major problems--first, both the formal and informal communication networks are extremely diffuse; and second, the interval from the start of a piece of research until its integration into the archival body of scientific knowledge is long--and presents some suggestions for alleviating them. (Author/JM)
Brooklyn Historical Society and the New York State Historical Documents Inventory, 1985-2007
ERIC Educational Resources Information Center
Pettit, Marilyn H.
2008-01-01
This article summarizes the New York State Historical Documents Inventory as experienced at Brooklyn Historical Society. The archives and manuscripts, dating from the seventeenth century and surveyed by the Historical Documents Inventory in the 1980s, were cataloged as Historical Documents Inventory/Research Libraries Information Network records…
Research Library Issues: A Quarterly Report from ARL, CNI, and SPARC. RLI 279
ERIC Educational Resources Information Center
Baughman, M. Sue, Ed.
2012-01-01
"Research Library Issues" ("RLI") is a quarterly report from ARL (Association of Research Libraries), CNI (Coalition of Networked Information), and SPARC (Scholarly Publishing and Academic Resources Coalition). This issue includes the following articles: (1) Digitization of Special Collections and Archives: Legal and Contractual Issues (Peter B.…
Web Camera Use in Developing Biology, Molecular Biology and Biochemistry Laboratories
ERIC Educational Resources Information Center
Ogren, Paul J.; Deibel, Michael; Kelly, Ian; Mulnix, Amy B.; Peck, Charlie
2004-01-01
The use of a network-ready color camera is described which is primarily marketed as a security device and is used for experiments in developmental biology, genetics and biochemistry laboratories and in special student research projects. Acquiring and analyzing project and archiving images is very important in microscopy, electrophoresis and…
Data archiving and serving system implementation in CLEP's GRAS Core System
NASA Astrophysics Data System (ADS)
Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai
2017-04-01
The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management and application, and it also serves as the operation control center during satellite in-orbit and payload operation management. Chang'E-1, Chang'E-2 and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all required elements to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming it to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution aims at storing data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem and the Scientific Application & Research Subsystem. The serving solution aims at serving data to the various business systems, scientific researchers and public users. Data-driven and component-clustering methods were adopted in this system; the former is used to solve real-time data archiving and data persistence services, while the latter is used to keep the archive and service continuously able to support new data from Chang'E missions. Meanwhile, it saves software development cost as well.
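A minimal sketch of the client-to-archive flow (Python; the function name, label fields, and JSON label format are invented for illustration; real PDS products carry ODL/PDS labels, and this is not the GRAS client API):

```python
# Hypothetical client-side call that persists an acquired data block together
# with a detached label (JSON here stands in for a real PDS label).
import json
import pathlib
from datetime import datetime, timezone

ARCHIVE = pathlib.Path("archive")

def archive_product(product_id: str, payload: bytes, meta: dict) -> None:
    ARCHIVE.mkdir(exist_ok=True)
    (ARCHIVE / f"{product_id}.dat").write_bytes(payload)   # data file
    label = {
        "PRODUCT_ID": product_id,
        "RECORD_BYTES": len(payload),
        "PRODUCT_CREATION_TIME": datetime.now(timezone.utc).isoformat(),
        **meta,
    }
    (ARCHIVE / f"{product_id}.lbl").write_text(json.dumps(label, indent=2))

archive_product("CE3_PAYLOAD_0001", b"\x00" * 1024,
                {"MISSION": "Chang'E-3", "INSTRUMENT": "PCAM"})
```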
NASA Astrophysics Data System (ADS)
Petitjean, Gilles; de Hauteclocque, Bertrand
2004-06-01
EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in the Earth Observation and Meteorology fields. The concern of valuable data owners is both the long-term preservation of their data and the integration of the archive into their information system, in particular with efficient access to archived data for their user community. The system integrator answers this requirement with a methodology combining understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability. The system integrator completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on the analysis of user needs. The system integrator will recommend the best compromise for implementing an archive management facility, thanks to its knowledge and independence of the storage market and through the analysis of user requirements. It will provide a solution which is able to evolve to take advantage of storage technology progress. But preserving data for the long term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situations: multiple data set copies using operational procedures, active quality control of the archived data, and a migration policy optimising the cost of ownership.
Zong, W; Wang, P; Leung, B; Moody, G B; Mark, R G
2002-01-01
The advent of implantable cardioverter defibrillators (ICDs) has resulted in significant reductions in mortality in patients at high risk for sudden cardiac death. Extensive related basic research and clinical investigation continue. ICDs typically record intracardiac electrograms and inter-beat intervals along with device settings during episodes of device delivery of therapy. Researchers wishing to study these data further have until now been limited to viewing paper plots. In support of multi-center clinical studies of patients with ICDs, we have developed a web-based searchable ICD data archiving system, which allows users to use a web browser to upload ICD data from diskettes to a server where the data are automatically processed and archived. Users can view and download the archived ICD data directly via the web. The entire system is built from open-source software. At present more than 500 patient ICD data sets have been uploaded to and archived in the system. This project will be of value not only to those who wish to conduct research using ICD data, but also to clinicians who need to archive and review ICD data collected from their patients.
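The upload-and-archive pattern described here can be sketched as a small web endpoint (Python/Flask; the route, form field, and directory names are hypothetical, and this is not the authors' code):

```python
# Minimal upload-and-archive endpoint: a browser form posts an ICD data file,
# the server stores it for later processing and web-based review.
from pathlib import Path
from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
ARCHIVE_DIR = Path("icd_archive")
ARCHIVE_DIR.mkdir(exist_ok=True)

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files["icd_data"]                      # file field from the form
    dest = ARCHIVE_DIR / secure_filename(f.filename)
    f.save(dest)
    # a real system would now parse electrograms and interval data, then
    # index the data set so it can be viewed and downloaded via the web
    return f"archived {dest.name} ({dest.stat().st_size} bytes)\n"

if __name__ == "__main__":
    app.run()
```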
Acquisition plan for Digital Document Storage (DDS) prototype system
NASA Technical Reports Server (NTRS)
1990-01-01
NASA Headquarters maintains a continuing interest in and commitment to exploring the use of new technology to support productivity improvements in meeting service requirements tasked to the NASA Scientific and Technical Information (STI) Facility, and to support cost effective approaches to the development and delivery of enhanced levels of service provided by the STI Facility. The DDS project has been pursued with this interest and commitment in mind. It is believed that DDS will provide improved archival blowback quality and service for ad hoc requests for paper copies of documents archived and serviced centrally at the STI Facility. It will also develop an operating capability to scan, digitize, store, and reproduce paper copies of 5000 NASA technical reports archived annually at the STI Facility and serviced to the user community. Additionally, it will provide NASA Headquarters and field installations with on-demand, remote, electronic retrieval of digitized, bilevel, bit mapped report images along with branched, nonsequential retrieval of report subparts.
The RCSB protein data bank: integrative view of protein, gene and 3D structural information
Rose, Peter W.; Prlić, Andreas; Altunkaya, Ali; Bi, Chunxiao; Bradley, Anthony R.; Christie, Cole H.; Costanzo, Luigi Di; Duarte, Jose M.; Dutta, Shuchismita; Feng, Zukang; Green, Rachel Kramer; Goodsell, David S.; Hudson, Brian; Kalro, Tara; Lowe, Robert; Peisach, Ezra; Randle, Christopher; Rose, Alexander S.; Shao, Chenghua; Tao, Yi-Ping; Valasatava, Yana; Voigt, Maria; Westbrook, John D.; Woo, Jesse; Yang, Huangwang; Young, Jasmine Y.; Zardecki, Christine; Berman, Helen M.; Burley, Stephen K.
2017-01-01
The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB, http://rcsb.org), the US data center for the global PDB archive, makes PDB data freely available to all users, from structural biologists to computational biologists and beyond. New tools and resources have been added to the RCSB PDB web portal in support of a ‘Structural View of Biology.’ Recent developments have improved the user experience, including the high-speed NGL Viewer that provides 3D molecular visualization in any web browser, improved support for data file download and enhanced organization of website pages for query, reporting and individual structure exploration. Structure validation information is now visible for all archival entries. PDB data have been integrated with external biological resources, including chromosomal position within the human genome; protein modifications; and metabolic pathways. PDB-101 educational materials have been reorganized into a searchable website and expanded to include new features such as the Geis Digital Archive. PMID:27794042
Accessing northern California earthquake data via Internet
NASA Astrophysics Data System (ADS)
Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David
The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.
Georgia's Stream-Water-Quality Monitoring Network, 2006
Nobles, Patricia L.; ,
2006-01-01
The USGS stream-water-quality monitoring network for Georgia is an aggregation of smaller networks and individual monitoring stations that have been established in cooperation with Federal, State, and local agencies. These networks collectively provide data from 130 sites, 62 of which are monitored continuously in real time using specialized equipment that transmits these data via satellite to a centralized location for processing and storage. These data are made available on the Web in near real time at http://waterdata.usgs.gov/ga/nwis/. Ninety-eight stations are sampled periodically for a more extensive suite of chemical and biological constituents that require laboratory analysis. Both the continuous and the periodic water-quality data are archived and maintained in the USGS National Water Information System and are available to cooperators, water-resource managers, and the public.
Meyer, Patrick E; Lafitte, Frédéric; Bontempi, Gianluca
2008-10-29
This paper presents the R/Bioconductor package minet (version 1.1.6) which provides a set of functions to infer mutual information networks from a dataset. Once fed with a microarray dataset, the package returns a network where nodes denote genes, edges model statistical dependencies between genes and the weight of an edge quantifies the statistical evidence of a specific (e.g transcriptional) gene-to-gene interaction. Four different entropy estimators are made available in the package minet (empirical, Miller-Madow, Schurmann-Grassberger and shrink) as well as four different inference methods, namely relevance networks, ARACNE, CLR and MRNET. Also, the package integrates accuracy assessment tools, like F-scores, PR-curves and ROC-curves in order to compare the inferred network with a reference one. The package minet provides a series of tools for inferring transcriptional networks from microarray data. It is freely available from the Comprehensive R Archive Network (CRAN) as well as from the Bioconductor website.
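Since minet itself is an R/Bioconductor package, the following is only a language-neutral illustration of the relevance-network idea it implements, written in Python with an empirical (histogram-binned) mutual-information estimator; the binning and the threshold are arbitrary:

```python
# Relevance-network sketch: connect gene pairs whose empirical mutual
# information exceeds a threshold (illustrative only; not the minet API).
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
expr = rng.normal(size=(100, 5))                      # 100 samples x 5 "genes"
expr[:, 1] = expr[:, 0] + 0.1 * rng.normal(size=100)  # one dependent pair

def mi(x, y, bins=8):
    # empirical MI after equal-width binning of each expression profile
    return mutual_info_score(np.digitize(x, np.histogram_bin_edges(x, bins)),
                             np.digitize(y, np.histogram_bin_edges(y, bins)))

n = expr.shape[1]
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if mi(expr[:, i], expr[:, j]) > 0.5]
print(edges)  # the correlated pair (0, 1) should stand out
```

ARACNE, CLR and MRNET, also provided by the package, post-process this same pairwise MI matrix to prune indirect interactions.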
Formats for Digital Preservation: A Review of Alternatives and Issues
2007-03-01
During 2006, several commercial companies produced products supporting the creation, migration, and validation of PDF/A files. To what degree does the archive want or need to ensure flexibility in the re-use of content in the future?
Web Services and Data Enhancements at the Northern California Earthquake Data Center
NASA Astrophysics Data System (ADS)
Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.
2013-12-01
The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through web services. NCEDC web services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest-quality waveform available from the archive.
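Because the services follow the FDSN specifications, any compliant client can use them. A short sketch with ObsPy's FDSN client (the station, channel, and time window are illustrative):

```python
# Exercise the fdsn-dataselect and fdsn-station services via ObsPy's
# FDSN-compliant client; the selection below is illustrative.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("NCEDC")                 # ObsPy maps this key to the NCEDC base URL
t0 = UTCDateTime("2014-08-24T10:20:44")  # South Napa earthquake origin time
st = client.get_waveforms(network="BK", station="CMB", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 300)
inv = client.get_stations(network="BK", station="CMB", level="response")
st.remove_response(inventory=inv)        # apply metadata from fdsn-station
print(st)
```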
An Intelligent Archive Testbed Incorporating Data Mining
NASA Technical Reports Server (NTRS)
Ramapriyan, H.; Isaac, D.; Yang, W.; Bonnlander, B.; Danks, D.
2009-01-01
Many significant advances have occurred during the last two decades in remote sensing instrumentation, computation, storage, and communication technology. A series of Earth observing satellites have been launched by U.S. and international agencies and have been operating and collecting global data on a regular basis. These advances have created a data-rich environment for scientific research and applications. NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) has been operational since August 1994 with support for pre-EOS data. Currently, EOSDIS supports all the EOS missions including Terra (1999), Aqua (2002), ICESat (2002) and Aura (2004). EOSDIS has been effectively capturing, processing and archiving several terabytes of standard data products each day. It has also been distributing these data products at a rate of several terabytes per day to a diverse and globally distributed user community (Ramapriyan et al. 2009). There are other NASA-sponsored data system activities, including measurement-based systems such as the Ocean Data Processing System and the Precipitation Processing System, and several projects under the Research, Education and Applications Solutions Network (REASoN), Making Earth Science Data Records for Use in Research Environments (MEaSUREs), and the Advancing Collaborative Connections for Earth-Sun System Science (ACCESS) programs. Together, these activities provide a rich set of resources constituting a value chain for users to obtain data at various levels ranging from raw radiances to interdisciplinary model outputs. The result has been a significant leap in our understanding of the Earth systems that all humans depend on for their enjoyment, livelihood, and survival. The trend in the community today is towards many distributed sets of providers of data and services. Despite this, visions for the future include users being able to locate, fuse and utilize data with location transparency and a high degree of interoperability, and being able to convert data to information and usable knowledge in an efficient, convenient manner, aided significantly by automation (Ramapriyan et al. 2004; NASA 2005). We can look upon the distributed provider environment with capabilities to convert data to information and to knowledge as an Intelligent Archive in the Context of a Knowledge Building System (IA-KBS). Some of the key capabilities of an IA-KBS are: Virtual Product Generation, Significant Event Detection, Automated Data Quality Assessment, Large-Scale Data Mining, Dynamic Feedback Loop, and Data Discovery and Efficient Requesting (Ramapriyan et al. 2004).
A LANGUAGE FOR MODULAR SPATIO-TEMPORAL SIMULATION (R824766)
Creating an effective environment for collaborative spatio-temporal model development will require computational systems that provide support for the user in three key areas: (1) Support for modular, hierarchical model construction and archiving/linking of simulation modules; (2)...
Intelligent Systems Technologies and Utilization of Earth Observation Data
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.
2004-01-01
The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas to improve data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.
Policies and Procedures for Accessing Archived NASA Lunar Data via the Web
NASA Technical Reports Server (NTRS)
James, Nathan L.; Williams, David R.
2011-01-01
The National Space Science Data Center (NSSDC) was established by NASA to provide for the preservation and dissemination of scientific data from NASA missions. This paper describes the policies specifically related to lunar science data. NSSDC presently archives 660 lunar data collections. Most of these data (423 units) are stored offline in analog format. The remainder of this collection consists of magnetic tapes and discs containing approximately 1.7 TB of digital lunar data. The active archive for NASA lunar data is the Planetary Data System (PDS). NSSDC has an agreement with the PDS Lunar Data Node to assist in the restoration and preparation of NSSDC-resident lunar data upon request for access and distribution via the PDS archival system. Though much of NSSDC's digital store also resides in PDS, NSSDC has many analog data collections and some digital lunar data sets that are not in PDS. NSSDC stands ready to make these archived lunar data accessible to both the research community and the general public upon request as resources allow. Newly requested offline lunar data are digitized and moved to near-line storage devices called digital linear tape jukeboxes. The data are then packaged and made network-accessible via FTP for the convenience of a growing segment of the user community. This publication will 1) discuss the NSSDC processes and policies that govern how NASA lunar data is preserved, restored, and made accessible via the web and 2) highlight examples of special lunar data requests.
NASA Astrophysics Data System (ADS)
Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.
2010-12-01
Space geodetic science and other disciplines using geodetic products have benefited immensely from the open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC and UNAVCO) have been funded to expand the original GSAC capability for multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into their daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example by identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.
High-performance mass storage system for workstations
NASA Technical Reports Server (NTRS)
Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.
1993-01-01
Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O) intensive applications, RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation of I/O related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capabilities to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as: SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8mm helical scan tape to form a hierarchical storage structure. Commonly used files are kept on the magnetic disks for fast retrieval. The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. UniTree software keeps track of all files in the system, automatically migrates lesser-used files to archive media, and stages the files when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with signal and image processing requirements, long-term data archiving and distribution, and image analysis and enhancement).
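The migration policy at the heart of such a hierarchy can be stated compactly. A toy sketch in Python (the directory names and size budget are invented; this is not the UniTree package): move the least recently accessed files from the disk tier to the archive tier once the disk tier exceeds its budget.

```python
# Toy hierarchical-storage migration: evict least-recently-accessed files
# from the fast disk tier to the archive tier when over the size budget.
import shutil
from pathlib import Path

DISK, ARCHIVE = Path("disk_tier"), Path("archive_tier")
LIMIT = 500 * 1024**2  # keep at most ~500 MB on the fast tier

def migrate():
    DISK.mkdir(exist_ok=True)
    ARCHIVE.mkdir(exist_ok=True)
    files = sorted((p for p in DISK.rglob("*") if p.is_file()),
                   key=lambda p: p.stat().st_atime)      # oldest access first
    total = sum(p.stat().st_size for p in files)
    for p in files:
        if total <= LIMIT:
            break
        total -= p.stat().st_size
        shutil.move(str(p), ARCHIVE / p.name)  # a real HSM would leave a stub

migrate()
```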
Monitoring of stability of ASG-EUPOS network coordinates
NASA Astrophysics Data System (ADS)
Figurski, M.; Szafranek, K.; Wrona, M.
2009-04-01
ASG-EUPOS (Active Geodetic Network - European Position Determination System) is the national system for precise satellite positioning in Poland; it increases the density of regional and global GNSS networks and is widely used by public administration, national institutions, entrepreneurs and citizens (especially surveyors). In the near future ASG-EUPOS is to take on the role of the main national network. Monitoring the proper operation of the stations and the realization of ETRS'89 is a necessity. Users of the system need to be sure that observation quality and coordinate accuracy are sufficiently high. Coordinates of IGS (International GNSS Service) and EPN (European Permanent Network) stations are precisely determined and any changes are monitored all the time. Observations are verified before they are archived in regional and global databases. The same applies to ASG-EUPOS. This paper concerns the standardization of GNSS observations from different stations (uniform adjustment), the examination of solution correctness according to IGS and EPN standards, and the stability of solutions and site activity.
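A simple form of such a stability examination can be illustrated as follows (Python, synthetic daily coordinates; the screening threshold is arbitrary, not an EPN/IGS requirement): fit a linear trend to one coordinate component and screen on the residual scatter.

```python
# Illustrative coordinate-stability check on a synthetic daily time series.
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(365.0)
north_mm = 0.04 * days + rng.normal(0, 1.5, days.size)  # ~14.6 mm/yr + noise

vel, offset = np.polyfit(days, north_mm, 1)   # mm/day and mm
resid = north_mm - (vel * days + offset)
rms = resid.std()

print(f"velocity: {vel * 365.25:.1f} mm/yr, residual RMS: {rms:.1f} mm")
if rms > 3.0:  # arbitrary screening threshold
    print("station flagged for inspection")
```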
Analog-to-digital clinical data collection on networked workstations with graphic user interface.
Lunt, D
1991-02-01
An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.
NASA Astrophysics Data System (ADS)
Campbell, J. D.; Heilman, P.; Goodrich, D. C.; Sadler, J.
2015-12-01
The objective of the USDA Long-Term Agroecosystem Research (LTAR) network Common Observatory Repository (CORe) is to provide data management services including archive, discovery, and access for consistently observed data across all 18 nodes. LTAR members have an average of 56 years of diverse historic data. Each LTAR location has designated a representative 'permanent' site as its common meteorological observatory. CORe implementation is phased, starting with meteorology, then adding hydrology, eddy flux, soil, and biology data. A design goal was to adopt existing best practices while minimizing the additional data management duties for the researchers. LTAR is providing support for data management specialists at the locations, and the National Agricultural Library is providing central data management services. Maintaining continuity with historical observations is essential, so observations from both the legacy and new common methods are included in CORe. International standards are used to store robust descriptive metadata (ISO 19115) for the observation station and surrounding locale (WMO), sensors (SensorML), and activity (e.g., re-calibration, locale changes) to provide sufficient detail for novel data re-use for the next 50 years. To facilitate data submission, a simple text format was designed. Datasets in CORe will receive DOIs to encourage citations giving fair credit to data providers. Data and metadata access are designed to support multiple formats and naming conventions. An automated QC process is being developed to enhance comparability among LTAR locations and to generate QC process metadata. Data provenance is maintained with a permanent record of changes, including those made by local scientists reviewing the automated QC results. Lessons learned so far include an increase in site acceptance of CORe following the decision to store data from both legacy and new common methods. A larger than anticipated variety of currently used methods, with potentially significant differences for future data use, was found. Cooperative peer support among locations with the same sensors, coupled with central support, has reduced redundancy in procedural and data documentation.
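A hedged sketch of one automated QC pass (Python/pandas; the column names, physical limits, and flag codes are invented, not the CORe specification):

```python
# Range-check QC over a meteorological submission (columns, limits, and
# flag values are illustrative, not the actual CORe rules).
import pandas as pd

LIMITS = {"air_temp_c": (-50.0, 60.0), "precip_mm": (0.0, 500.0)}

def qc(df: pd.DataFrame) -> pd.DataFrame:
    for col, (lo, hi) in LIMITS.items():
        # G = within physical limits, R = range failure needing review
        df[col + "_qc"] = df[col].between(lo, hi).map({True: "G", False: "R"})
    return df

df = pd.DataFrame({"air_temp_c": [21.3, 72.0, -12.5],
                   "precip_mm": [0.0, 3.2, -1.0]})
print(qc(df))
```

Flags like these, together with any manual overrides, would themselves be archived as the QC process metadata the abstract describes.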
EOSDIS: Archive and Distribution Systems in the Year 2000
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; Lake, Alla
2000-01-01
Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousands of non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to support approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive. It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.
A Complete Public Archive for the Einstein Imaging Proportional Counter
NASA Technical Reports Server (NTRS)
Helfand, David J.
1996-01-01
Consistent with our proposal to the Astrophysics Data Program in 1992, we have completed the design, construction, documentation, and distribution of a flexible and complete archive of the data collected by the Einstein Imaging Proportional Counter. Along with software and data delivered to the High Energy Astrophysics Science Archive Research Center at Goddard Space Flight Center, we have compiled and, where appropriate, published catalogs of point sources, soft sources, hard sources, extended sources, and transient flares detected in the database along with extensive analyses of the instrument's backgrounds and other anomalies. We include in this document a brief summary of the archive's functionality, a description of the scientific catalogs and other results, a bibliography of publications supported in whole or in part under this contract, and a list of personnel whose pre- and post-doctoral education consisted in part in participation in this project.
Lunar Data Node: Apollo Data Restoration and Archiving Update
NASA Technical Reports Server (NTRS)
Williams, David R.; Hills, Howard K.; Guiness, Edward A.; Taylor, Patrick T.; McBride, Marie Julia
2013-01-01
The Lunar Data Node (LDN) of the Planetary Data System (PDS) is responsible for the restoration and archiving of Apollo data. The LDN is located at the National Space Science Data Center (NSSDC), which holds much of the extant Apollo data on microfilm, microfiche, hard-copy documents, and magnetic tapes in older formats. The goal of the restoration effort is to convert the data into user-accessible PDS formats, create a full set of explanatory supporting data (metadata), archive the full data sets through PDS, and post the data online at the PDS Geosciences Node. This will both enable easy use of the data by current researchers and ensure that the data and metadata are securely preserved for future use. We are also attempting to locate and preserve Apollo data which were never archived at NSSDC. We will give a progress report on the data sets we have been restoring and future work.
The NASA Ames Life Sciences Data Archive: Biobanking for the Final Frontier
NASA Technical Reports Server (NTRS)
Rask, Jon; Chakravarty, Kaushik; French, Alison J.; Choi, Sungshin; Stewart, Helen J.
2017-01-01
The NASA Ames Institutional Scientific Collection involves the Ames Life Sciences Data Archive (ALSDA) and a biospecimen repository, which are responsible for archiving information and non-human biospecimens collected from spaceflight and matching ground control experiments. The ALSDA also manages a biospecimen sharing program, performs curation and long-term storage operations, and facilitates distribution of biospecimens for research purposes via a public website (https://lsda.jsc.nasa.gov). As part of our best practices, a tissue viability testing plan has been developed for the repository, which will assess the quality of samples subjected to long-term storage. We expect that the test results will confirm usability of the samples, enable broader science community interest, and verify operational efficiency of the archives. This work will also support NASA open science initiatives and guide development of NASA directives and policy for curation of biological collections.
NASA Astrophysics Data System (ADS)
Galetzka, J.; Feaux, K.; Cabral, E.; Salazar-Tlaczani, L.; Adams, D. K.; Serra, Y. L.; Mattioli, G. S.; Miller, M. M.
2014-12-01
TLALOCNet is a combined atmospheric and tectonic cGPS-Met network in Mexico designed for the investigation of climate, atmospheric processes, the earthquake cycle, and tectonics. While EarthScope-Plate Boundary Observatory (conterminous US, Alaska, Puerto Rico) is among the networks poised to become a nucleus for hemisphere-scale GPS observations, the completion of TLALOCNet at the end of 2015 will close a gap between PBO and other Latin American GPS networks that include COCONet (Central America, Caribbean, and Northern South America), CAnTO, CAP, and IGS extending from Alaska to Patagonia. The National Science Foundation funded the construction and operation of TLALOCNet, with significant matching funds and resources provided by the Universidad Nacional Autónoma de México (UNAM). The project will involve the construction or refurbishment of 38 cGPS-Met stations in Mexico built to PBO standards. The first three TLALOCNet stations were installed in the northern Mexican states of Sonora and Chihuahua in July 2014, following the North American Monsoon GPS Transect Experiment 2013. Together these observations better characterize critical components of water transport in the region. Data from these stations are now available through the UNAVCO data archive and can be downloaded from http://facility.unavco.org/data/dai2/app/dai2.html#. By the end of 2014, TLALOCNet data, together with complementary data from other regional cGPS networks in Mexico, will also be openly available through a Mexico-based data center. We will present the status of the project to date, including an overview of the station hardware, data communications, data flow, construction schedule, and science objectives. We will also present some of the challenges encountered, including regional logistics, shipping and importation, site security, and other issues associated with the construction and operation of a large continuous GPS network.
NASA Astrophysics Data System (ADS)
Addison, J. A.
2015-12-01
The Past Global Changes (PAGES) project of IGBP and Future Earth supports research to understand the Earth's past environment to improve future climate predictions and inform strategies for sustainability. Within this framework, the PAGES 2k Network was established to provide a focus on the past 2000 years, a period that encompasses Medieval Climate Anomaly warming, Little Ice Age cooling, and recent anthropogenically-forced climate change. The results of these studies are used for testing earth system models, and for understanding decadal- to centennial-scale variability, which is needed for long-term planning. International coordination and cooperation among the nine regional Working Groups that make up the 2k Network has been critical to the success of PAGES 2k. The collaborative approach is moving toward scientific achievements across the regional groups, including: (i) the development of a community-driven open-access proxy climate database; (ii) integration of multi-resolution proxy records; (iii) development of multivariate climate reconstructions; and (iv) a leap forward in the spatial resolution of paleoclimate reconstructions. The latest addition to the 2k Network, the Ocean2k Working Group, has further innovated the collaborative approach by: (1) creating an open, receptive environment to discuss ideas exclusively in the virtual space; (2) employing an array of real-time collaborative software tools to enable communication, group document writing, and data analysis; (3) consolidating executive leadership teams to oversee project development and manage grassroots-style volunteer pools; and (4) embracing the value-added role that international and interdisciplinary science can play in advancing paleoclimate hypotheses critical to understanding future change. Ongoing efforts for the PAGES 2k Network are focused on developing new standards for data quality control and archiving. These tasks will provide the foundation for new and continuing "trans-regional" 2k projects which address paleoclimate science that transcends regional boundaries. The PAGES 2k Network encourages participation by all investigators interested in this community-wide project.
NASA Astrophysics Data System (ADS)
Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.
2006-12-01
The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing from triggering and locating through magnitude and moment tensor calculation and ShakeMap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves both realtime processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.
Data driven innovations in structural health monitoring
NASA Astrophysics Data System (ADS)
Rosales, M. J.; Liyanapathirana, R.
2017-05-01
At present, substantial investments are being allocated to civil infrastructures, which are also considered valuable assets at a national or global scale. Structural Health Monitoring (SHM) is an indispensable tool required to ensure the performance and safety of these structures based on measured response parameters. The research to date on damage assessment has tended to focus on the utilization of wireless sensor networks (WSN), as they prove to be the best alternative to traditional visual inspections and their tethered or wired counterparts. Over the last decade, the structural health and behaviour of innumerable infrastructures have been measured and evaluated owing to several successful ventures in implementing these sensor networks. Various monitoring systems have the capability to rapidly transmit, measure, and store large volumes of data. The amount of data collected from these networks has eventually become unmanageable, which paved the way to other relevant issues such as data quality, relevance, re-use, and decision support. There is an increasing need to integrate new technologies in order to automate the evaluation processes as well as to enhance the objectivity of data assessment routines. This paper aims to identify feasible methodologies for the application of time-series analysis techniques to judiciously exploit the vast amount of readily available as well as upcoming data resources. It continues the momentum of a greater effort to collect and archive SHM approaches that will serve as data-driven innovations for the assessment of damage through efficient algorithms and data analytics.
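One standard time-series technique of the kind surveyed here (a generic AR-residual novelty check, not a method proposed by the authors): fit an autoregressive model to a healthy baseline record and flag later windows whose one-step prediction residuals inflate. A synthetic-data sketch in Python:

```python
# AR-residual novelty detection for a single SHM channel (synthetic data).
import numpy as np

def ar_fit(x, p=4):
    # least-squares AR(p): predict x[t] from the previous p samples
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def residual_rms(x, coef):
    p = len(coef)
    pred = np.array([x[t - p:t][::-1] @ coef for t in range(p, len(x))])
    return np.sqrt(np.mean((x[p:] - pred) ** 2))

rng = np.random.default_rng(0)
baseline = np.sin(0.20 * np.arange(2000)) + 0.05 * rng.normal(size=2000)
damaged = np.sin(0.25 * np.arange(500)) + 0.05 * rng.normal(size=500)  # shifted resonance

coef = ar_fit(baseline)
print(residual_rms(baseline[:500], coef), residual_rms(damaged, coef))
# residual RMS rises on the "damaged" record -> candidate damage indicator
```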
2016 update of the PRIDE database and its related tools
Vizcaíno, Juan Antonio; Csordas, Attila; del-Toro, Noemi; Dianes, José A.; Griss, Johannes; Lavidas, Ilias; Mayer, Gerhard; Perez-Riverol, Yasset; Reisinger, Florian; Ternent, Tobias; Xu, Qing-Wei; Wang, Rui; Hermjakob, Henning
2016-01-01
The PRoteomics IDEntifications (PRIDE) database is one of the world-leading data repositories of mass spectrometry (MS)-based proteomics data. Since the beginning of 2014, PRIDE Archive (http://www.ebi.ac.uk/pride/archive/) has been the new PRIDE archival system, replacing the original PRIDE database. Here we summarize the developments in PRIDE resources and related tools since the previous update manuscript in the Database Issue in 2013. PRIDE Archive constitutes a complete redevelopment of the original PRIDE, comprising a new storage backend, data submission system, and web interface, among other components. PRIDE Archive supports the most widely used PSI (Proteomics Standards Initiative) data standard formats (mzML and mzIdentML) and implements the data requirements and guidelines of the ProteomeXchange Consortium. The wide adoption of ProteomeXchange within the community has triggered an unprecedented increase in the number of submitted data sets (around 150 data sets per month). We outline some statistics on the current PRIDE Archive data contents. We also report on the status of the PRIDE-related stand-alone tools: PRIDE Inspector, PRIDE Converter 2, and the ProteomeXchange submission tool. Finally, we give a brief update on the resources under development, ‘PRIDE Cluster’ and ‘PRIDE Proteomes’, which provide a complementary view and quality-scored information of the peptide and protein identification data available in PRIDE Archive. PMID:26527722
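Because PRIDE Archive stores spectra in the PSI mzML format, submitted data can be inspected with standard parsers. A minimal sketch, assuming the third-party pyteomics library and an already-downloaded file (the file name is illustrative):

```python
from pyteomics import mzml  # third-party parser for the PSI mzML standard

# Iterate over spectra and report basic statistics for MS2 scans;
# dictionary keys follow pyteomics' mzML conventions.
with mzml.read("example_dataset.mzML") as spectra:
    for spectrum in spectra:
        if spectrum.get("ms level") == 2:
            mz = spectrum["m/z array"]
            print(spectrum["id"], len(mz), "peaks,",
                  f"{mz.min():.2f}-{mz.max():.2f} m/z")
```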
Improvements in Space Geodesy Data Discovery at the CDDIS
NASA Technical Reports Server (NTRS)
Noll, C.; Pollack, N.; Michael, P.
2011-01-01
The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. The activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integration into the Global Geodetic Observing System (GGOS) portal.
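A GSAC deployment exposes its holdings through simple HTTP queries. A sketch of a site search, with an entirely hypothetical host and parameter names loosely modeled on GSAC conventions (the abstract gives no endpoint specification):

```python
import requests

# Hypothetical GSAC-style site query; host, path, and parameters are placeholders.
BASE = "https://archive.example.org/gsacws/gsacapi/site/search"
params = {"output": "site.csv", "site.code": "GODE", "limit": 10}

response = requests.get(BASE, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # first rows of the CSV site listing
```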
Digital information management: a progress report on the National Digital Mammography Archive
NASA Astrophysics Data System (ADS)
Beckerman, Barbara G.; Schnall, Mitchell D.
2002-05-01
Digital mammography creates very large images, which require new approaches to storage, retrieval, management, and security. The National Digital Mammography Archive (NDMA) project, funded by the National Library of Medicine (NLM), is developing a limited testbed that demonstrates the feasibility of a national breast imaging archive, with access to prior exams; patient information; computer aids for image processing, teaching, and testing tools; and security components to ensure confidentiality of patient information. There will be significant benefits to patients and clinicians, in terms of accessible data with which to make a diagnosis, and to researchers performing studies on breast cancer. Mammography was chosen for the project because standards were already available for digital images, report formats, and structures. New standards have been created for communications protocols between devices, the front-end portal, and the archive. NDMA is a distributed computing concept that provides for sharing and access across corporate entities. Privacy, auditing, and patient consent are all integrated into the system. Five sites (the Universities of Pennsylvania, Chicago, North Carolina, and Toronto, and BWXT Y12) are connected through high-speed networks to demonstrate functionality. We will review progress, including technical challenges, innovative research and development activities, standards and protocols being implemented, and potential benefits to healthcare systems.
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.; Jonestrask, L.; Shaar, R.
2014-12-01
Earth science grand challenges often require interdisciplinary and geographically distributed scientific collaboration to make significant progress. However, this organic collaboration between researchers, educators, and students only flourishes with the reduction or elimination of technological barriers. The Magnetics Information Consortium (http://earthref.org/MagIC/) is a grass-roots cyberinfrastructure effort envisioned by the geo-, paleo-, and rock magnetic scientific community to archive its wealth of peer-reviewed raw data and interpretations from studies on natural and synthetic samples. MagIC is dedicated to facilitating scientific progress towards several highly multidisciplinary grand challenges, and the MagIC Database team is currently beta testing a new MagIC Search Interface and API designed to be flexible enough for the incorporation of large heterogeneous datasets and for horizontal scalability to tens of millions of records and hundreds of requests per second. In an effort to reduce the barriers to effective collaboration, the search interface includes a simplified data model and upload procedure, support for online editing of datasets amongst team members, commenting by reviewers and colleagues, and automated contribution workflows and data retrieval through the API. This web application has been designed to generalize to other databases in MagIC's umbrella website (EarthRef.org), so the Geochemical Earth Reference Model (http://earthref.org/GERM/) portal, the Seamount Biogeosciences Network (http://earthref.org/SBN/), the EarthRef Digital Archive (http://earthref.org/ERDA/), and the EarthRef Reference Database (http://earthref.org/ERR/) will benefit from its development.
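A sketch of the kind of programmatic data retrieval the API is meant to enable; the endpoint and parameter names below are assumptions for illustration, since the interface was still in beta at the time:

```python
import requests

# Assumed search endpoint under EarthRef.org; not a documented URL from the abstract.
response = requests.get(
    "https://api.earthref.org/v1/MagIC/search",
    params={"query": "paleointensity", "n_max_rows": 5},
    timeout=30,
)
response.raise_for_status()
for record in response.json().get("results", []):
    print(record.get("contribution_id"), record.get("citation"))
```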
RESIF national datacentre: new features and forthcoming evolutions
NASA Astrophysics Data System (ADS)
Pequegnat, C.; Volcke, P.; le Tanou, J.; Wolyniec, D.; Lecointre, A.; Guéguen, P.
2013-12-01
RESIF is a nationwide French project aimed at building a high-quality system to observe and understand the inner Earth. The goal is to create a network throughout mainland France comprising 750 seismometers and geodetic measurement instruments, 250 of which will be mobile, to enable the observation network to be focussed on specific investigation subjects and geographic locations. The RESIF data distribution centre, which is part of the global project, is operated by the Université Joseph Fourier (Grenoble, France) and has been in implementation for two years. Data from the French broadband permanent network, the strong-motion permanent network, and mobile seismological antennas are freely accessible as realtime streams and continuous validated data, along with instrumental metadata, delivered using widely known formats and request tools. New features of the datacentre are: new modern distribution tools (two FDSN web services have been implemented and deliver data and metadata); new data and datasets (the number of permanent stations rose by over 40 percent in one year, and the RESIF archive now includes past data, back to 1995, and data from new networks; moreover, data from mobile experiments prior to 2011 is progressively being released, and data from new mobile experiments in the Alps and the Pyrenees is progressively being integrated); and new infrastructures: (1) the RESIF databank is about to be connected to the grid storage of the university's High Performance Computing (HPC) centre, where, as a scientific use case of this datacentre facility, a focus is made on intensive exploitation of combined data from permanent and mobile networks; (2) the RESIF databank will be progressively hosted on a new shared storage facility operated by the Université Joseph Fourier. This infrastructure offers high-availability data storage (in both block and file modes) as well as backup and long-term archival capabilities, and will be fully operational at the beginning of 2014.
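The two FDSN web services make the archive reachable with generic clients. A minimal sketch using the third-party ObsPy library (an assumption; the abstract names the services but not a client), querying the French permanent broadband network:

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# "RESIF" is a standard ObsPy FDSN shortcut for the RESIF datacentre.
client = Client("RESIF")

# Discover broadband stations of the FR permanent network...
inventory = client.get_stations(network="FR", channel="HHZ", level="station")
print(inventory)

# ...then request an hour of continuous data from one station
# (station code and time window are illustrative and may return no data).
t0 = UTCDateTime("2013-06-01T00:00:00")
stream = client.get_waveforms("FR", "OGDI", "*", "HHZ", t0, t0 + 3600)
print(stream)
```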
NASA Technical Reports Server (NTRS)
2002-01-01
The Goddard Earth Sciences Distributed Active Archive Center (DAAC) is the designated archive for all of the ocean color data produced by NASA satellite missions. The DAAC is a long-term, high-volume, secure repository for many different kinds of environmental data. With respect to ocean color, the Goddard DAAC holds all the data obtained during the eight-year mission of the Coastal Zone Color Scanner (CZCS). The DAAC is currently receiving data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and the MODIS-Terra instrument. The DAAC recently received reformatted data from the Ocean Color and Temperature Scanner (OCTS) and will also archive MODIS-Aqua ocean products. In addition to its archive and distribution services, the Goddard DAAC strives to improve data access, ease of use, and data applicability for a broad spectrum of customers. The DAAC's data support teams play dual roles, both ensuring the integrity of the DAAC data archive and serving the user community with answers to user inquiries, online and print documentation, and customized data services.
Data management and digital delivery of analog data
Miller, W.A.; Longhenry, Ryan; Smith, T.
2008-01-01
The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years. This remote sensing archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in preserving data, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that these collections remain accessible. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated, high-performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium-resolution image files. Single-frame metadata records are also created to improve access that otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.
NASA Astrophysics Data System (ADS)
Novellino, Antonio; Gorringe, Patrick; Schaap, Dick; Pouliquen, Sylvie; Rickards, Lesley; Manzella, Giuseppe
2014-05-01
The EMODnet Physics preparatory action (MARE/2010/02 - Lot [SI2.579120]) had the overall objectives to provide access to archived and near real-time data on physical conditions as monitored by fixed stations and Ferrybox lines in all the European sea basins and oceans, and to determine how well the data meet the needs of users. The existing EMODnet-Physics portal, www.emodnet-physics.eu, includes systems for physical data from the whole of Europe (wave height and period, temperature of the water column, wind speed and direction, salinity of the water column, horizontal velocity of the water column, light attenuation, and sea level) provided mainly by fixed stations and ferry-box platforms, enabling discovery of related data sets (both near real-time and historical) and viewing and downloading of the data from about 470 platforms across the European sea basins. It makes layers of physical data and their metadata available for use and contributes towards the definition of an operational European Marine Observation and Data Network (EMODnet). It is based on a strong collaboration between EuroGOOS member institutes and its regional operational oceanographic systems (ROOSs), and it brings together two marine, but different, communities: the "real time" ocean observing institutes and centers, and the National Oceanographic Data Centres (NODCs), which are in charge of archived ocean data validation, quality checking, and continuous updating of data archives for marine environmental monitoring. EMODnet Physics is a Marine Observation and Data Information System that provides a single point of access to near real-time and historical archived data; it is built on existing infrastructure by adding value and avoiding any unnecessary complexity; it provides data access to any relevant user; and it aims to attract new data holders and provide better and more data. With a long-term vision for a sustained pan-European Ocean Observation System, EMODnet Physics is supporting the coordination of the EuroGOOS ROOSs and the empowerment and improvement of their observing and data management infrastructure. The on-going EMODnet Physics preparatory action has recently been extended (MARE/2012/06 - Lot 6) with the aim of enlarging the coverage with additional monitoring systems (e.g., Argos, gliders, HF radars, etc.) and products, and of strengthening the underlying infrastructure. The presentation will show how to exploit the EMODnet portal and how to access the metadata and data of connected platforms.
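A sketch of how a client might pull platform metadata from such a portal; the endpoint and field names are placeholders, as the abstract describes the portal but not a machine interface:

```python
import requests

# Purely illustrative endpoint and parameters; not a documented EMODnet API.
BASE = "https://www.emodnet-physics.eu/api/platforms"
resp = requests.get(BASE, params={"parameter": "TEMP", "basin": "North Sea"}, timeout=30)
resp.raise_for_status()
for platform in resp.json():
    print(platform.get("id"), platform.get("lastUpdate"))
```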
Warfighter Visualizations Compilations
2013-05-01
list of the user’s favorite websites or other textual content, sub-categorized into types, such as blogs, social networking sites, comics, videos ... available: The example in the prototype shows a random archived comic from the website. Other options include thumbnail strips of imagery or dynamic ... varied, and range from serving as statistical benchmarks, to increasing social consciousness and interaction, to improving educational interactions
Situated Knowledge and Visual Education: Patrick Geddes and Reclus's Geography (1886-1932)
ERIC Educational Resources Information Center
Ferretti, Federico
2017-01-01
This article addresses Patrick Geddes's relationship with geography and visual education by focusing on his collaboration with the network of the anarchist geographers Élie, Élisée, and Paul Reclus. Drawing on empirical archival research, it contributes to the current debates on geographies of anarchist education and on geographic teaching. The…
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1989-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are presented. Activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) related to DSN advanced systems, systems implementation, and DSN operations are addressed. In addition, recent developments in the NASA SETI (Search for Extraterrestrial Intelligence) sky survey are summarized.
Faculty Recommendations for Web Tools: Implications for Course Management Systems
ERIC Educational Resources Information Center
Oliver, Kevin; Moore, John
2008-01-01
A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…
ERIC Educational Resources Information Center
Hsieh, Betina
2017-01-01
Research on social media use in education indicates that network-based connections can enable powerful teacher learning opportunities. Using a connectivist theoretical framework (Siemens, 2005), this study focuses on secondary teacher candidates (TCs) who completed, archived, and reflected upon 1-hour Twitter chats (N = 39) to explore the promise…
2010-06-01
corridor as the “Silk Road.” On this trade-road network, merchants, missionaries, and conquistadors carried silk, gems, pottery, tea, paper, medicines ... 2010). Atwell, Kyle. “Yanukovich: Ukraine will be a bridge between East and West” Atlantic Review, Feb 19, 2010, http://atlanticreview.org/archives
Observing Ocean Ecosystems with Sonar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzner, Shari; Maxwell, Adam R.; Ham, Kenneth D.
2016-12-01
We present a real-time processing system for sonar to detect and track animals, and to extract water column biomass statistics in order to facilitate continuous monitoring of an underwater environment. The Nekton Interaction Monitoring System (NIMS) is built to connect to an instrumentation network, where it consumes a real-time stream of sonar data and archives tracking and biomass data.
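A sketch of the consumer side of such a system: a loop that reads length-prefixed sonar frames from an instrumentation-network socket and hands them to an archiving routine. The frame layout and function names are assumptions, not the NIMS wire format:

```python
import socket
import struct

def archive_frame(payload: bytes) -> None:
    # Placeholder for detection/tracking and biomass-statistics extraction.
    print(f"received sonar frame of {len(payload)} bytes")

def consume(host: str = "localhost", port: int = 5000) -> None:
    """Read length-prefixed frames from a socket until the stream closes."""
    with socket.create_connection((host, port)) as sock:
        while True:
            header = sock.recv(4)
            if len(header) < 4:
                return  # stream closed
            (length,) = struct.unpack("!I", header)  # 4-byte big-endian length
            payload = b""
            while len(payload) < length:
                chunk = sock.recv(length - len(payload))
                if not chunk:
                    return
                payload += chunk
            archive_frame(payload)
```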
Data archiving and network system of Bisei Spaceguard center
NASA Astrophysics Data System (ADS)
Terazono, J.-Y.; Asami, A.; Asher, D.; Hashimoto, N.; Nakano, S.; Nishiyama, K.; Oshima, Y.; Umehara, H.; Urata, T.; Yoshikawa, M.; Isobe, S.
2002-09-01
Bisei Spaceguard Center, Japan's first facility for observations of space debris and Near-Earth Objects (NEOs), will produce large amounts of data. In this paper, we describe details of the data transfer and processing system we are now developing. We also present a software system devoted to the discovery of asteroids, mainly by high school students.
Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network
NASA Astrophysics Data System (ADS)
Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.
2011-12-01
The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the next decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by ~50 realtime strong-motion stations. The strong-motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and temporary sub-networks for local monitoring of microseismicity (e.g. at geothermal sites). The variety of seismic monitoring responsibilities as well as the anticipated densification of our network demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring software package developed by the GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. It is at its core open source, and is becoming a community-standard software for earthquake detection and waveform processing for regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. In order to fulfill the requirements of a local network recording moderate seismicity, SED has tuned configurations and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional, and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. To be consistent with existing processing procedures, the NonLinLoc algorithm was implemented for manual and automatic locations using 1D and 3D velocity models; plugins for improved automatic phase picking and Ml computation were developed; and the graphical user interface for manual review was extended (including pick uncertainty definition, first-motion focal mechanisms, interactive review of station magnitude waveforms, and full inclusion of strong-motion data). SC3 locations are fully compatible with those derived from the existing in-house processing tools and are stored in a database derived from the QuakeML data model. The database is shared with the SED alerting software, which merges origins from both SC3 and external sources in realtime and handles the alerting procedure. With the monitoring software being transitioned to SeisComP3, acquisition, archival, and dissemination of SED waveform data now conform to the seedlink and ArcLink protocols, and continuous archives can be accessed via the SED and all EIDA (European Integrated Data Archives) websites. Further, an SC3 module for waveform parameterisation has been developed, allowing rapid computation of peak ground-motion values and other engineering parameters within minutes of a new event. An output of this module is USGS ShakeMap XML.
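Because the event database derives from the QuakeML data model, exported events can be read with standard tooling. A minimal sketch using the third-party ObsPy library (an assumption, not named in the abstract; the file name is illustrative):

```python
from obspy import read_events

# Parse a QuakeML export and print a one-line summary per event.
catalog = read_events("sed_events.xml")
for event in catalog:
    origin = event.preferred_origin() or event.origins[0]
    magnitude = event.preferred_magnitude() or event.magnitudes[0]
    print(origin.time, origin.latitude, origin.longitude,
          magnitude.mag, magnitude.magnitude_type)
```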