Advances in Spatial Data Infrastructure, Acquisition, Analysis, Archiving and Dissemination
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuran K.; Rochon, Gilbert L.; Duerr, Ruth; Rank, Robert; Nativi, Stefano; Stocker, Erich Franz
2010-01-01
The authors review recent contributions to the state-of-the-science and benign proliferation of satellite remote sensing, spatial data infrastructure, near-real-time data acquisition, analysis on high-performance computing platforms, sapient archiving, and multi-modal dissemination and utilization for a wide array of scientific applications. The authors also address advances in Geoinformatics and its growing ubiquity, as evidenced by its inclusion as a focus area within the American Geophysical Union (AGU) and the European Geosciences Union (EGU), as well as by the evolution of the IEEE Geoscience and Remote Sensing Society's (GRSS) Data Archiving and Distribution Technical Committee (DAD TC).
ERIC Educational Resources Information Center
Miller, Warren; Tanter, Raymond
The International Relations Archive undertakes as its primary goals the acquisition, management and dissemination of international affairs data. The first document enclosed is a copy of the final machine readable codebook prepared for the data from the Political Events Project, 1948-1965. Also included is a copy of the final machine-readable…
NASA Astrophysics Data System (ADS)
Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles
2015-04-01
In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high-level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are:
• Management processes that govern the operation of an archive, including archival governance (preservation state management, selection of archival candidates and strategic management).
• Operational processes that constitute the core activities of the archive and maintain the value of research assets: acquisition, ingestion, deletion, metadata generation and preservation activities.
• Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment.
We then describe the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle. We discuss the following core model entities and their relationships:
• Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives.
• Risk entities, which act as drivers for change within the data lifecycle. These include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks.
• Plan entities, which detail the actions to bring about change within an archive. These include Acquisition Plans, Preservation Plans and Monitoring Plans.
• Result entities, which describe the successful outcomes of the executed plans. These include Acquisitions, Mitigations and Accepted Risks.
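A minimal sketch of how the model's four entity groups could be represented in code is given below; every class, field and example value is our own illustrative assumption, not the authors' implementation.

```python
# Minimal sketch (assumed names, not the authors' implementation) of the
# dynamic data management information model: aspirational entities set
# objectives, risk entities drive change, plan entities act, and result
# entities record outcomes.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class RiskKind(Enum):
    ACQUISITIONAL = "acquisitional"
    TECHNICAL = "technical"
    STRATEGIC = "strategic"
    EXTERNAL = "external"

@dataclass
class PreservationObjective:          # aspirational entity
    description: str

@dataclass
class DataEntity:                     # aspirational entity
    name: str
    objectives: List[PreservationObjective] = field(default_factory=list)

@dataclass
class Risk:                           # risk entity: a driver for change
    kind: RiskKind
    description: str

@dataclass
class Plan:                           # plan entity: actions addressing risks
    name: str
    addresses: List[Risk] = field(default_factory=list)

@dataclass
class Result:                         # result entity: outcome of an executed plan
    plan: Plan
    outcome: str                      # e.g. "acquisition", "mitigation", "accepted risk"

# Example: a format-based technical risk drives a preservation plan.
risk = Risk(RiskKind.TECHNICAL, "proprietary format nearing obsolescence")
plan = Plan("migrate holdings to NetCDF", addresses=[risk])
result = Result(plan, "mitigation")
```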
Patton, John M.; Ketchum, David C.; Guy, Michelle R.
2015-11-02
This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1990-01-01
Archival reports on developments in programs managed by the JPL Office of Telecommunications and Data Acquisition (TDA) are provided. Topics covered include: DSN advanced systems (tracking and ground-based navigation; communications, spacecraft-ground; and station control and system technology) and DSN systems implementation (capabilities for existing projects; capabilities for new projects; TDA program management and analysis; and Goldstone solar system radar).
NASA Astrophysics Data System (ADS)
Smith, Edward M.; Wright, Jeffrey; Fontaine, Marc T.; Robinson, Arvin E.
1998-07-01
The Medical Information, Communication and Archive System (MICAS) is a multi-vendor, incremental approach to PACS. MICAS is a multi-modality integrated image management system that incorporates the radiology information system (RIS) and radiology image database (RID), with future 'hooks' to other hospital databases. Even though this approach to PACS is riskier than a single-vendor turn-key approach, it offers significant advantages. The vendors involved in the initial phase of MICAS are IDX Corp., ImageLabs, Inc. and Digital Equipment Corp. (DEC). The network architecture operates at 100 Mbits per second except between the modalities and the stackable intelligent switch, which is used to segment MICAS by modality. Each modality segment contains the acquisition engine for the modality, a temporary archive and one or more diagnostic workstations. All archived studies are available at all workstations, but there is no permanent archive at this time. At present, the RIS vendor is responsible for study acquisition and workflow as well as maintenance of the temporary archive. Management of study acquisition, workflow and the permanent archive will become the responsibility of the archive vendor when the archive is installed in the second quarter of 1998. The modalities currently interfaced to MICAS are MRI, CT and a Howtek film digitizer, with Nuclear Medicine and computed radiography (CR) to be added when the permanent archive is installed. There are six dual-monitor diagnostic workstations running ImageLabs Shared Vision viewer software: four in the MRI, CT, Nuclear Medicine and musculoskeletal reading areas, and two in Radiology's main reading area. One of the major lessons learned to date is that the permanent archive should have been part of the initial MICAS installation, and that the archive vendor, rather than the RIS vendor, should have been responsible for image acquisition. An archive vendor is currently being selected who will be responsible for management of the archive plus the HIS/RIS interface, image acquisition, the modality worklist manager, and interfacing to the current DICOM viewer software. The next phase of MICAS will include interfacing ultrasound, locating servers outside the Radiology LAN to support the distribution of images and reports to the clinical floors and physician offices both within and outside the University of Rochester Medical Center (URMC) campus, and the teaching archive.
ERIC Educational Resources Information Center
Dixon, Jennifer J.
2012-01-01
This study explores No Child Left Behind's required timetable for English language learners (ELLs) to reach English language proficiency within five years, as outlined in the Annual Measurable Achievement Outcomes (AMAOs), despite the lack of research evidence to support this as a reasonable expectation. Analysis was conducted on the archived data…
Research and Development in Very Long Baseline Interferometry (VLBI)
NASA Technical Reports Server (NTRS)
Himwich, William E.
2004-01-01
Contents include the following: 1. Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.
NASA Technical Reports Server (NTRS)
1993-01-01
The Second International Symposium featured 135 oral presentations in these 12 categories: Future Missions and Operations; System-Level Architectures; Mission-Specific Systems; Mission and Science Planning and Sequencing; Mission Control; Operations Automation and Emerging Technologies; Data Acquisition; Navigation; Operations Support Services; Engineering Data Analysis of Space Vehicle and Ground Systems; Telemetry Processing, Mission Data Management, and Data Archiving; and Operations Management. Topics focused on improvements in the productivity, effectiveness, efficiency, and quality of mission operations, ground systems, and data acquisition. Also emphasized were accomplishments in management of human factors; use of information systems to improve data retrieval, reporting, and archiving; design and implementation of logistics support for mission operations; and the use of telescience and teleoperations.
A Waveform Archiving System for the GE Solar 8000i Bedside Monitor.
Fanelli, Andrea; Jaishankar, Rohan; Filippidis, Aristotelis; Holsapple, James; Heldt, Thomas
2018-01-01
Our objective was to develop, deploy, and test a data-acquisition system for the reliable and robust archiving of high-resolution physiological waveform data from a variety of bedside monitoring devices, including the GE Solar 8000i patient monitor, and for the logging of ancillary clinical and demographic information. The data-acquisition system consists of a computer-based archiving unit and a GE Tram Rac 4A that connects to the GE Solar 8000i monitor. Standard physiological front-end sensors connect directly to the Tram Rac, which serves as a port replicator for the GE monitor and provides access to these waveform signals through an analog data interface. Together with the GE monitoring data streams, we simultaneously collect the cerebral blood flow velocity envelope from a transcranial Doppler ultrasound system and a non-invasive arterial blood pressure waveform along a common time axis. All waveform signals are digitized and archived through a LabVIEW-controlled interface that also allows for the logging of relevant meta-data such as clinical and patient demographic information. The acquisition system was certified for hospital use by the clinical engineering team at Boston Medical Center, Boston, MA, USA. Over a 12-month period, we collected 57 datasets from 11 neuro-ICU patients. The system provided reliable and failure-free waveform archiving. We measured an average temporal drift between waveforms from different monitoring devices of 1 ms every 66 min of recorded data. The waveform acquisition system allows for robust real-time data acquisition, processing, and archiving of waveforms. The temporal drift between waveforms archived from different devices is entirely negligible, even for long-term recording.
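A quick arithmetic check of the drift figure quoted above, using only the reported numbers (illustrative Python, not part of the authors' system):

```python
# Illustrative arithmetic for the reported inter-device clock drift:
# 1 ms of drift accumulated every 66 minutes of recording.
drift_s = 1e-3                 # observed drift (seconds)
interval_s = 66 * 60           # over this much recorded data (seconds)

rate = drift_s / interval_s    # dimensionless drift rate
print(f"drift rate: {rate:.2e} ({rate * 1e6:.2f} ppm)")

# Accumulated offset over a 24-hour recording at this rate:
offset_24h = rate * 24 * 3600
print(f"offset after 24 h: {offset_24h * 1e3:.1f} ms")
```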
Second Annual Conference on Astronomical Data Analysis Software and Systems. Abstracts
NASA Technical Reports Server (NTRS)
1992-01-01
Abstracts from the conference are presented. The topics covered include the following: next generation software systems and languages; databases, catalogs, and archives; user interfaces/visualization; real-time data acquisition/scheduling; and IRAF/STSDAS/PROS status reports.
Image acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Reardon, Frank J.; Salutz, James R.
1991-07-01
The Mayo Clinic and IBM Rochester, Minnesota, have jointly developed a picture archiving, distribution and viewing system for use with Mayo's CT and MRI imaging modalities. Images are retrieved from the modalities and sent over the Mayo city-wide token ring network to optical storage subsystems for archiving, and to server subsystems for viewing on image review stations. Images may also be retrieved from archive and transmitted back to the modalities. The subsystems that interface to the modalities and communicate to the other components of the system are termed Image Acquisition Units (IAUs). The IAUs are IBM Personal System/2 (PS/2) computers with specially developed software. They operate independently in a network of cooperative subsystems and communicate with the modalities, archive subsystems, image review server subsystems, and a central subsystem that maintains information about the content and location of images. This paper provides a detailed description of the function and design of the Image Acquisition Units.
Remote sensing data acquisition, analysis and archival. Volume 1. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stringer, W.J.; Dean, K.G.; Groves, J.E.
1993-03-25
The project specialized in the acquisition and dissemination of satellite imagery and its utilization for case-specific and statistical analyses of offshore environmental conditions, particularly those involving sea ice. During the duration of this contract, 854 Landsat Multispectral Scanner and 2 Landsat Thematic Mapper scenes, 8,576 Advanced Very High Resolution Radiometer images, and 31,000 European Earth Resources Satellite Synthetic Aperture Radar images were archived. Direct assistance was provided to eight Minerals Management Service (MMS)-sponsored studies, including analyses of Port Moller circulation; Bowhead whale migration, distribution, population and behavioral studies; Beaufort Sea fisheries; oil spill trajectory model development; and Kasegaluk Lagoon environmental assessments. In addition, under this Cooperative Agreement several complete studies were undertaken based on analysis of satellite imagery. The topics included: Kasegaluk Lagoon transport, the effect of winter storms on arctic ice, the relationship between ice surface temperatures as measured by buoys and passive microwave imagery, unusual cloud forms following lead-openings, and analyses of Chukchi and Bering sea polynyas.
NASA Technical Reports Server (NTRS)
Bilitza, D.; King, J. H.
1988-01-01
The activities and services of the National Space Science Data Center (NSSDC) and the World Data Center A for Rockets and Satellites (WDC-A-R&S) are described, with special emphasis on ionospheric physics. The present catalog/archive system is explained and future developments are indicated. In addition to the basic data acquisition, archiving, and dissemination functions, ongoing activities include the Central Online Data Directory (CODD), the Coordinated Data Analysis Workshops (CDAW), the Space Physics Analysis Network (SPAN), advanced data management systems (CD/DIS, NCDS, PLDS), and publication of the NSSDC News, the SPACEWARN Bulletin, and several NSSDC reports.
WRATS Integrated Data Acquisition System
NASA Technical Reports Server (NTRS)
Piatak, David J.
2008-01-01
The Wing and Rotor Aeroelastic Test System (WRATS) data acquisition system (DAS) is a 64-channel data acquisition, display, and analysis system specifically designed for use with the WRATS 1/5-scale V-22 tiltrotor model of the Bell Osprey. It is the primary data acquisition system for experimental aeroelastic testing of the WRATS model for the purpose of characterizing the aeromechanical and aeroelastic stability of prototype tiltrotor configurations. The WRATS DAS was also used during aeroelastic testing of Bell Helicopter Textron's Quad-Tiltrotor (QTR) design concept, a test which received international attention. The LabVIEW-based design is portable and capable of powering and conditioning over 64 channels of dynamic data at sampling rates up to 1,000 Hz. The system includes a 60-second circular data archive, an integrated model swashplate excitation system, a moving block damping application for calculation of whirl flutter mode subcritical damping, a loads and safety monitor, a pilot-control console display, data analysis capabilities, and instrumentation calibration functions. Three networked computers running custom-designed LabVIEW software acquire data through National Instruments data acquisition hardware. The aeroelastic model (see figure) was tested with the DAS at two facilities at NASA Langley, the Transonic Dynamics Tunnel (TDT) and the Rotorcraft Hover Test Facility (RHTF). Because of the need for seamless transition between testing at these facilities, the DAS is portable. The software is capable of harmonic analysis of periodic time history data, Fast Fourier Transform calculations, power spectral density calculations, and on-line calibration of test instrumentation. The DAS has a circular buffer archive to ensure critical data are not lost in the event of a model failure/incident, as well as a sample-and-hold capability for phase-correct time history data.
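The 60-second circular archive with sample-and-hold is a standard ring-buffer pattern; the sketch below is a generic illustration in Python/NumPy (not the LabVIEW implementation, and all sizes are taken from the abstract's figures).

```python
import numpy as np

class CircularArchive:
    """Keep the most recent `seconds` of multi-channel data.
    Generic ring-buffer sketch, not the WRATS LabVIEW code."""
    def __init__(self, channels=64, rate_hz=1000, seconds=60):
        self.buf = np.zeros((seconds * rate_hz, channels))
        self.i = 0          # next write position
        self.frozen = False # sample-and-hold flag

    def push(self, frame):
        if self.frozen:
            return          # hold: stop overwriting after an incident
        self.buf[self.i] = frame
        self.i = (self.i + 1) % len(self.buf)

    def snapshot(self):
        # Oldest-to-newest view of the archived window.
        return np.roll(self.buf, -self.i, axis=0)

arch = CircularArchive()
for _ in range(70_000):            # 70 s of 1 kHz frames; oldest 10 s drop off
    arch.push(np.random.randn(64))
arch.frozen = True                 # e.g. on a model failure/incident
data = arch.snapshot()             # last 60 s preserved for analysis
```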
Data archiving and serving system implementation in CLEP's GRAS Core System
NASA Astrophysics Data System (ADS)
Zuo, Wei; Zeng, Xingguo; Zhang, Zhoubin; Geng, Liang; Li, Chunlai
2017-04-01
The Ground Research & Applications System (GRAS) is one of the five systems of China's Lunar Exploration Project (CLEP). It is responsible for data acquisition, processing, management and application, and it is also the operation control center during satellite in-orbit and payload operation management. Chang'E-1, Chang'E-2 and Chang'E-3 have collected abundant lunar exploration data. The aim of this work is to present the implementation of data archiving and serving in CLEP's GRAS Core System software. This first approach provides a client-side API and server-side software allowing the creation of a simplified version of the CLEPDB data archiving software, and implements all elements required to complete the data archiving flow from data acquisition to persistent storage. The client side includes all necessary components that run on devices that acquire or produce data, distributing and streaming it to configured remote archiving servers. The server side comprises an archiving service that stores all received data into PDS files. The archiving solution aims at storing data coming from the Data Acquisition Subsystem, the Operation Management Subsystem, the Data Preprocessing Subsystem and the Scientific Application & Research Subsystem. The serving solution aims at serving data to the various business systems, scientific researchers and public users. Data-driven and component-clustering methods were adopted in this system; the former is used to solve real-time data archiving and data persistence services, while the latter is used to keep archiving and serving support continuous as new data arrive from the Chang'E missions. Meanwhile, it saves software development cost as well.
A MYSQL-BASED DATA ARCHIVER: PRELIMINARY RESULTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew Bickley; Christopher Slominski
2008-01-23
Following an evaluation of the archival requirements of the Jefferson Laboratory accelerator's user community, a prototyping effort was executed to determine if an archiver based on MySQL had sufficient functionality to meet those requirements. This approach was chosen because an archiver based on a relational database enables the development effort to focus on data acquisition and management, letting the database take care of storage, indexing and data consistency. It was clear from the prototype effort that there were no performance impediments to successful implementation of a final system. With our performance concerns addressed, the lab undertook the design and development of an operational system. The system is in its operational testing phase now. This paper discusses the archiver system requirements, some of the design choices and their rationale, and presents the acquisition, storage and retrieval performance.
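A relational archiver of this kind typically separates channel metadata from time-indexed samples so the database handles storage, indexing and consistency. The schema below is a minimal sketch with assumed table and column names (not Jefferson Lab's design), using Python's sqlite3 for self-containedness in place of MySQL.

```python
import sqlite3

# Minimal archiver schema sketch: channel metadata in one table, time-indexed
# samples in another. Table/column names are illustrative assumptions.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE channel (
    id   INTEGER PRIMARY KEY,
    name TEXT UNIQUE NOT NULL
);
CREATE TABLE sample (
    channel_id INTEGER NOT NULL REFERENCES channel(id),
    t_ns       INTEGER NOT NULL,   -- acquisition timestamp (ns since epoch)
    value      REAL    NOT NULL
);
CREATE INDEX idx_sample ON sample(channel_id, t_ns);  -- fast range retrieval
""")

db.execute("INSERT INTO channel(name) VALUES ('MQA1:current')")
db.executemany("INSERT INTO sample VALUES (1, ?, ?)",
               [(1_000_000 * i, 42.0 + i) for i in range(5)])

# Retrieval: all samples for one channel within a time window.
rows = db.execute("""SELECT t_ns, value FROM sample
                     WHERE channel_id = 1 AND t_ns BETWEEN ? AND ?
                     ORDER BY t_ns""", (0, 3_000_000)).fetchall()
```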
Earth observation archive activities at DRA Farnborough
NASA Technical Reports Server (NTRS)
Palmer, M. D.; Williams, J. M.
1993-01-01
Space Sector, Defence Research Agency (DRA), Farnborough have been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.
NASA Technical Reports Server (NTRS)
Danielsen, Edwin F.; Pfister, Leonhard; Hipskind, R. Stephen; Gaines, Steven E.
1990-01-01
The purpose of this task is the acquisition, distribution, archival, and analysis of data collected during and in support of the Upper Atmospheric Research Program (UARP) field experiments. Meteorological and U2 data from the 1984 Stratosphere-Troposphere Exchange Project (STEP) were analyzed to determine characteristics of internal atmospheric waves. CD-ROMs containing data from the 1987 STEP, the 1987 Airborne Antarctic Ozone Expedition (AAOE), and the 1989 Airborne Arctic Stratospheric Expedition (AASE) were produced for archival and distribution of those data sets. The AASE CD-ROM contains preliminary data and a final release is planned for February 1990. Comparisons of data from the NASA ER-2 Meteorological Measurement System (MMS) with radar tracking and radiosonde data show good agreement. Planning for a Meteorological Support Facility continues. We are investigating existing and proposed hardware and software to receive, manipulate, and display satellite imagery and standard meteorological analyses, forecasts, and radiosonde data.
Implementation of the Boston University Space Physics Acquisition Center
NASA Technical Reports Server (NTRS)
Spence, Harlan E.
1998-01-01
The tasks carried out during this grant achieved the goals as set forth in the initial proposal. The Boston University Space Physics Acquisition CEnter (BUSPACE) now provides World Wide Web access to data from a large suite of both space-based and ground-based instruments, archived from different missions, experiments, or campaigns in which researchers associated with the Center for Space Physics (CSP) at Boston University have been involved. These archival data sets are in digital form and are valuable for retrospective data analysis studies of magnetospheric as well as ionospheric, thermospheric, and mesospheric physics. We have leveraged our grass-roots effort with the NASA seed money to establish dedicated hardware (computer and hard disk augmentation) and student support to grow and maintain the system. This leveraging of effort now permits easy access by the space physics community to many underutilized, yet important data sets, one example being that of the SCATHA satellite.
ERIC Educational Resources Information Center
Devarrewaere, Anthony; Roelly, Aude
2005-01-01
The Archives Departementales de la Cote-d'Or chose as a priority for its automation plan the acquisition of a search engine, to publish online archival descriptions and the library catalogue. The Archives deliberately opted for a practical approach, using for the encoding of the finding aids an automatic data export from an archival management…
The History of Archives and the History of Science: Comment.
James, Kathryn
2016-03-01
Drawing on Terry Cook's famous challenge to the relationship of historians to the archive, this comment responds to the four preceding Focus essays, offering an examination of the roles, in particular, of acquisition and appraisal, canon formation, and place or location in the relationship that historians of science have with the archive.
The Hydrologic Cycle Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Hardin, Danny M.; Goodman, H. Michael
1995-01-01
The Marshall Space Flight Center Distributed Active Archive Center in Huntsville, Alabama supports the acquisition, production, archival and dissemination of data relevant to the study of the global hydrologic cycle. This paper describes the Hydrologic Cycle DAAC, surveys its principal data holdings, addresses future growth, and gives information for accessing the data sets.
A comprehensive cost model for NASA data archiving
NASA Technical Reports Server (NTRS)
Green, J. L.; Klenk, K. F.; Treinish, L. A.
1990-01-01
A simple archive cost model has been developed to help predict NASA's archiving costs. The model covers data management activities from the beginning of the mission through launch, acquisition, and support of retrospective users by the long-term archive; it is capable of determining the life cycle costs for archived data depending on how the data need to be managed to meet user requirements. The model, which currently contains 48 equations with a menu-driven user interface, is available for use on an IBM PC or AT.
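The 48-equation model itself is not reproduced in the abstract; as a purely illustrative stand-in, the sketch below shows the general shape of a life-cycle archive cost estimate, summing per-phase fixed costs with volume-driven storage and user-driven access costs. All coefficients and terms are assumptions, not NASA's model.

```python
# Illustrative life-cycle archive cost shape (NOT the 48-equation NASA model):
# fixed setup costs plus volume-driven storage and user-driven access costs.
def lifecycle_cost(years, tb_per_year, requests_per_year,
                   setup_cost=2e6,          # pre-launch/ingest development ($), assumed
                   store_per_tb_yr=100.0,   # media + migration ($/TB/year), assumed
                   cost_per_request=5.0):   # retrospective user support ($), assumed
    cost = setup_cost
    holdings_tb = 0.0
    for _ in range(years):
        holdings_tb += tb_per_year
        cost += holdings_tb * store_per_tb_yr      # storage grows with the archive
        cost += requests_per_year * cost_per_request
    return cost

print(f"${lifecycle_cost(10, 50, 20_000):,.0f}")   # 10-year estimate
```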
Planning applications in image analysis
NASA Technical Reports Server (NTRS)
Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.
1994-01-01
We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.
Monitoring the CMS strip tracker readout system
NASA Astrophysics Data System (ADS)
Mersi, S.; Bainbridge, R.; Baulieu, G.; Bel, S.; Cole, J.; Cripps, N.; Delaere, C.; Drouhin, F.; Fulcher, J.; Giassi, A.; Gross, L.; Hahn, K.; Mirabito, L.; Nikolic, M.; Tkaczyk, S.; Wingham, M.
2008-07-01
The CMS Silicon Strip Tracker at the LHC comprises a sensitive area of approximately 200 m2 and 10 million readout channels. Its data acquisition system is based around a custom analogue front-end chip. Both the control and the readout of the front-end electronics are performed by off-detector VME boards in the counting room, which digitise the raw event data and perform zero-suppression and formatting. The data acquisition system uses the CMS online software framework to configure, control and monitor the hardware components and steer the data acquisition. The first data analysis is performed online within the official CMS reconstruction framework, which provides many services, such as distributed analysis, access to geometry and conditions data, and a Data Quality Monitoring tool based on the online physics reconstruction. The data acquisition monitoring of the Strip Tracker uses both the data acquisition and the reconstruction software frameworks in order to provide real-time feedback to shifters on the operational state of the detector, to archive data for later analysis, and possibly to trigger automatic recovery actions in case of errors. Here we review the proposed architecture of the monitoring system and describe its software components, which are already in place, the various monitoring streams available, and our experiences of operating and monitoring a large-scale system.
Data acquisition and processing system for the HT-6M tokamak fusion experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Y.T.; Liu, G.C.; Pang, J.Q.
1987-08-01
This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both the data acquisition software RSX-DAS and the data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.
Development and implementation of ultrasound picture archiving and communication system
NASA Astrophysics Data System (ADS)
Weinberg, Wolfram S.; Tessler, Franklin N.; Grant, Edward G.; Kangarloo, Hooshang; Huang, H. K.
1990-08-01
The Department of Radiological Sciences at the UCLA School of Medicine is developing an archiving and communication system (PACS) for digitized ultrasound images. In its final stage the system will involve the acquisition and archiving of ultrasound studies from four different locations, including the Center for Health Sciences, the Department for Mental Health, and the Outpatient Radiology and Endoscopy Departments, with a total of 200-250 patient studies per week. The concept comprises two stages of image manipulation for each ultrasound work area. The first station is located close to the examination site and accommodates the acquisition of digital images from up to five ultrasound devices and provides for instantaneous display and primary viewing and image selection. Completed patient studies are transferred to a main workstation for secondary review, further analysis and comparison studies. The review station has an on-line storage capacity of 10,000 images with a resolution of 512x512 8-bit data to allow for immediate retrieval of active patient studies of up to two weeks. The main workstations are connected through the general network and use one central archive for long-term storage and a film printer for hardcopy output. First-phase development efforts concentrate on the implementation and testing of a system at one location consisting of a number of ultrasound units with video digitizers and network interfaces and a microcomputer workstation as host for the display station with two color monitors, each allowing simultaneous display of four 512x512 images. The discussion emphasizes functionality, performance and acceptance of the system in the clinical environment.
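A quick capacity check of the review-station figures above: 10,000 uncompressed 512x512 8-bit images amount to roughly 2.6 GB of on-line storage. The arithmetic is sketched below (illustrative only, using the numbers from the abstract).

```python
# Capacity arithmetic for the review station's on-line image store.
images = 10_000
bytes_per_image = 512 * 512 * 1   # 512x512 pixels, 8 bits (1 byte) each

total = images * bytes_per_image
print(f"{total / 1e9:.2f} GB")     # ~2.62 GB for two weeks of active studies
```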
Acquisition, use, and archiving of real-time data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leach, M.J.; Bernstein, H.J.; Tichler, J.L.
Meteorological information is needed by scientific personnel at Brookhaven National Laboratory (BNL) for various purposes. An automated system, used to acquire, archive, and provide users with weather data, is described. Hardware, software, and some of the examples of the uses of the system are detailed.
The global Landsat archive: Status, consolidation, and direction
Wulder, Michael A.; White, Joanne C.; Loveland, Thomas; Woodcock, Curtis; Belward, Alan; Cohen, Warren B.; Fosnight, Eugene A.; Shaw, Jerad; Masek, Jeffery G.; Roy, David P.
2016-01-01
New and previously unimaginable Landsat applications have been fostered by a policy change in 2008 that made analysis-ready Landsat data free and open access. Since 1972, Landsat has been collecting images of the Earth, with the early years of the program constrained by onboard satellite and ground systems, as well as limitations across the range of required computing, networking, and storage capabilities. Rather than robust on-satellite storage for transmission via high-bandwidth downlink to a centralized storage and distribution facility as with Landsat-8, a network of receiving stations, one operated by the U.S. government, the other operated by a community of International Cooperators (ICs), was utilized. ICs paid a fee for the right to receive and distribute Landsat data, and over time more Landsat data was held outside the archive of the United States Geological Survey (USGS) than was held inside, much of it unique. Recognizing the critical value of these data, the USGS began a Landsat Global Archive Consolidation (LGAC) initiative in 2010 to bring these data into a single, universally accessible, centralized global archive, housed at the Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. The primary LGAC goals are to inventory the data held by ICs, acquire the data, and ingest and apply standard ground station processing to generate an L1T analysis-ready product. As of January 1, 2015 there were 5,532,454 images in the USGS archive. LGAC has contributed approximately 3.2 million of those images, more than doubling the original USGS archive holdings. Moreover, an additional 2.3 million images have been identified to date through the LGAC initiative and are in the process of being added to the archive. The impact of LGAC is significant and, in terms of images in the collection, analogous to that of having had two additional Landsat-5 missions. As a result of LGAC, there are regions of the globe that now have markedly improved Landsat data coverage, resulting in an enhanced capacity for mapping, monitoring change, and capturing historic conditions. Although future missions can be planned and implemented, the past cannot be revisited, underscoring the value and enhanced significance of historical Landsat data and the LGAC initiative. The aim of this paper is to report the current status of the global USGS Landsat archive, document the existing and anticipated contributions of LGAC to the archive, and characterize the current acquisitions of Landsat-7 and Landsat-8. Landsat-8 is adding data to the archive at an unprecedented rate as nearly all terrestrial images are now collected. We also offer key lessons learned so far from the LGAC initiative, plus insights regarding other critical elements of the Landsat program looking forward, such as acquisition, continuity, temporal revisit, and the importance of continuing to operationalize the Landsat program.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1989-01-01
Archival reports on developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data Acquisition are provided. Space communications, radio navigation, radio science, and ground based radio and radio astronomy are discussed. Deep Space Network projects are also discussed.
ERIC Educational Resources Information Center
Griffiths, Jose-Marie; And Others
This document contains validated activities and competencies needed by information professionals working in an archive or museum. The activities and competencies are organized according to the functions which information professionals in archives or museums perform: acquisitions; cataloging/indexing; reference; exhibit management; and…
A Client/Server Architecture for Supporting Science Data Using EPICS Version 4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dalesio, Leo
2015-04-21
The Phase 1 grant that serves as a precursor to this proposal prototyped complex storage techniques for the high-speed structured data that is being produced in accelerator diagnostics and beam line experiments. It demonstrates the technologies that can be used to archive and retrieve complex data structures and provide the performance required by our new accelerators, instrumentation, and detectors. Phase 2 is proposed to develop a high-performance platform for data acquisition and analysis to provide physicists and operators a better understanding of the beam dynamics. This proposal includes developing a platform for reading 109 MHz data at 10 KHz rates through a multicore front-end processor, archiving the data to an archive repository that is then indexed for fast retrieval. The data is then retrieved from this data archive and integrated with the scalar data to provide data sets to client applications for analysis, for use in feedback, and to aid in identifying problems with the instrumentation, plant, beam steering, or model. This development is built on EPICS version 4, which is being successfully deployed to implement physics applications. Through prior SBIR grants, EPICS version 4 has a solid communication protocol for middle-layer services (PVAccess), structured data representation and methods for efficient transportation and access (PVData), an operational hierarchical record environment (JAVA IOC), and prototypes for standard structured data (Normative Types). This work was further developed through project funding to successfully deploy the first service-based physics application environment, with demonstrated services that provide arbitrary object views, save sets, model, lattice, and unit conversion. Thin-client physics applications have been developed in Python that implement quad centering, orbit display, bump control, and slow orbit feedback. This service-based architecture has provided a very modular and robust environment that enables commissioning teams to rapidly develop and deploy small scripts that build on powerful services. The work proposed herein builds on these previous successes to provide data acquisition of high-speed data for online analysis clients.
Proteomic analysis of formalin-fixed paraffin embedded tissue by MALDI imaging mass spectrometry
Casadonte, Rita; Caprioli, Richard M
2012-01-01
Archived formalin-fixed paraffin-embedded (FFPE) tissue collections represent a valuable informational resource for proteomic studies. Multiple FFPE core biopsies can be assembled in a single block to form tissue microarrays (TMAs). We describe a protocol for analyzing protein in FFPE-TMAs using matrix-assisted laser desorption/ionization (MALDI) imaging mass spectrometry (IMS). The workflow incorporates an antigen retrieval step following deparaffinization, in situ trypsin digestion, matrix application and then mass spectrometry signal acquisition. The direct analysis of FFPE-TMA tissue using IMS allows direct analysis of multiple tissue samples in a single experiment without extraction and purification of proteins. The advantages of high speed and throughput, easy sample handling and excellent reproducibility make this technology a favorable approach for the proteomic analysis of clinical research cohorts with large sample numbers. For example, TMA analysis of 300 FFPE cores would typically require 6 h of total time through data acquisition, not including data analysis. PMID:22011652
NASA Astrophysics Data System (ADS)
Avolio, G.; D'Ascanio, M.; Lehmann-Miotto, G.; Soloviev, I.
2017-10-01
The Trigger and Data Acquisition (TDAQ) system of the ATLAS detector at the Large Hadron Collider at CERN is composed of a large number of distributed hardware and software components (about 3000 computers and more than 25000 applications) which, in a coordinated manner, provide the data-taking functionality of the overall system. During data-taking runs, a huge flow of operational data is produced in order to constantly monitor the system and allow proper detection of anomalies or misbehaviours. In the ATLAS trigger and data acquisition system, operational data are archived and made available to applications by the P-BEAST (Persistent Back-End for the Atlas Information System of TDAQ) service, implementing a custom time-series database. The possibility to efficiently visualize both real-time and historical operational data is a great asset facilitating both online identification of problems and post-mortem analysis. This paper will present a web-based solution developed to achieve such a goal: the solution leverages the flexibility of the P-BEAST archiver to retrieve data, and exploits the versatility of the Grafana dashboard builder to offer a very rich user experience. Additionally, particular attention will be given to the way some technical challenges (like the efficient visualization of a huge amount of data and the integration of the P-BEAST data source in Grafana) have been faced and solved.
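P-BEAST's own interfaces are not detailed in the abstract; as a generic illustration of wiring a custom time-series store into Grafana, the sketch below exposes the two endpoints used by Grafana's SimpleJson data-source convention (/search for metric discovery, /query for data). The payload shapes follow that plugin's public contract; the metric name and in-memory store are invented placeholders, and this is not the ATLAS integration.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Invented placeholder store: metric name -> list of (value, epoch_ms) points.
STORE = {"daq.event_rate": [[1500.0, 1600000000000], [1480.0, 1600000060000]]}

@app.route("/search", methods=["POST"])
def search():
    # Grafana SimpleJson convention: return the list of queryable metric names.
    return jsonify(list(STORE))

@app.route("/query", methods=["POST"])
def query():
    # Grafana SimpleJson convention: return [value, epoch_ms] datapoints
    # for each requested target.
    targets = [t["target"] for t in request.json.get("targets", [])]
    return jsonify([{"target": t, "datapoints": STORE.get(t, [])}
                    for t in targets])

if __name__ == "__main__":
    app.run(port=3003)
```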
ERIC Educational Resources Information Center
Organization of American States, Washington, DC.
The resolutions of the 15th Seminar on the Acquisition of Latin American Library Materials (SALALM) cover the following topics: (1) Acquisitions Matters; (2) Reproduction of Library Materials and Computer Technology; (3) Archives and Manuscripts; (4) Bibliographic Matters; (5) Library Organization, Personnel and Research; (6) SALALM Organizational…
Between OAIS and Agile: A Dynamic Data Management Approach
NASA Astrophysics Data System (ADS)
Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.
2015-12-01
In this paper we describe an approach to the integration of existing archival activities which lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions, in the form of risk management/review procedures that produce actionable plans which are responsive to change. We then describe the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and provide discussion of the following core model entities and their relationships:
• Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives.
• Risk entities, which act as drivers for change within the data lifecycle. These include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks.
• Plan entities, which detail the actions to bring about change within an archive. These include Acquisition Plans, Preservation Plans and Monitoring Plans, which support responsive interactions with the community.
• Result entities, which describe the outcomes of the plans, including Acquisitions, Mitigations and Accepted Risks, with risk acceptance permitting imperfect but functional solutions that can be realistically supported within an archive's resource levels.
USGS Releases Landsat Orthorectified State Mosaics
2005-01-01
The U.S. Geological Survey (USGS) National Remote Sensing Data Archive, located at the USGS Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota, maintains the Landsat orthorectified data archive. Within the archive are Landsat Enhanced Thematic Mapper Plus (ETM+) data that have been pansharpened and orthorectified by the Earth Satellite Corporation. This imagery has acquisition dates ranging from 1999 to 2001 and was created to provide users with access to quality-screened, high-resolution satellite images with global coverage over the Earth's landmasses.
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with spatial resolution of tens of micrometers. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
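The core reduction step behind any 206Pb/238U date is the decay equation t = ln(1 + 206Pb*/238U)/λ238. The sketch below applies it with the standard Jaffey et al. (1971) decay constant; it is an illustrative one-liner, not ET_Redux code.

```python
import math

LAMBDA_238 = 1.55125e-10   # 238U decay constant (1/yr), Jaffey et al. (1971)

def pb206_u238_age(ratio):
    """Date (years) from a radiogenic 206Pb*/238U ratio: t = ln(1 + R)/lambda."""
    return math.log(1.0 + ratio) / LAMBDA_238

# Example: a ratio of 0.1 corresponds to roughly 615 Ma.
print(f"{pb206_u238_age(0.1) / 1e6:.0f} Ma")
```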
The Acquisition and Management of Electronic Resources: Can Use Justify Cost?
ERIC Educational Resources Information Center
Koehn, Shona L.; Hawamdeh, Suliman
2010-01-01
As library collections increasingly become digital, libraries are faced with many challenges regarding the acquisition and management of electronic resources. Some of these challenges include copyright and fair use, the first-sale doctrine, licensing versus ownership, digital preservation, long-term archiving, and, most important, the issue of…
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1992-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information. In the search for extraterrestrial intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. Topics covered include tracking and ground-based navigation; communications, spacecraft-ground; station control and system technology; capabilities for new projects; network upgrade and sustaining; network operations and operations support; and TDA program management and analysis.
VETA x ray data acquisition and control system
NASA Technical Reports Server (NTRS)
Brissenden, Roger J. V.; Jones, Mark T.; Ljungberg, Malin; Nguyen, Dan T.; Roll, John B., Jr.
1992-01-01
We describe the X-ray Data Acquisition and Control System (XDACS) used together with the X-ray Detection System (XDS) to characterize the X-ray image during testing of the AXAF P1/H1 mirror pair at the MSFC X-ray Calibration Facility. A variety of X-ray data were acquired, analyzed and archived during the testing including: mirror alignment, encircled energy, effective area, point spread function, system housekeeping and proportional counter window uniformity data. The system architecture is presented with emphasis placed on key features that include a layered UNIX tool approach, dedicated subsystem controllers, real-time X-window displays, flexibility in combining tools, network connectivity and system extensibility. The VETA test data archive is also described.
LDCM Ground System. Network Lesson Learned
NASA Technical Reports Server (NTRS)
Gal-Edd, Jonathan
2010-01-01
This slide presentation reviews the Landsat Data Continuity Mission (LDCM) and the lessons learned in implementing the network that was assembled to allow for the acquisition, archiving and distribution of the data from the Landsat mission. The objective of the LDCM is to continue the acquisition, archiving, and distribution of moderate-resolution multispectral imagery affording global, synoptic, and repetitive coverage of the earth's land surface at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. It includes a review of the ground network, including a block diagram of the ground network elements (GNE) and a review of the RF design and testing. Also included is a listing of the lessons learned.
MECDAS: A distributed data acquisition system for experiments at MAMI
NASA Astrophysics Data System (ADS)
Krygier, K. W.; Merle, K.
1994-02-01
For the coincidence experiments with the three-spectrometer setup at MAMI, an experiment control and data acquisition system has been built and was successfully put into final operation in 1992. MECDAS is designed as a distributed system using communication via Ethernet and optical links. At the front end, VMEbus systems are used for real-time purposes and direct hardware access via CAMAC, Fastbus or VMEbus. RISC workstations running UNIX are used for monitoring, data archiving, and online and offline analysis of the experiment. MECDAS consists of several fixed programs and libraries, but large parts of the readout and analysis can be configured by the user. Experiment-specific configuration files are used to generate efficient and powerful code well adapted to special problems without additional programming. The experiment description is added to the raw collection of partially analyzed data to get self-descriptive data files.
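The configuration-driven approach described above can be illustrated, in spirit, by a small dispatcher that builds an event readout from an experiment-specific configuration; the config format, module names and placeholder readers are assumptions for illustration, not MECDAS's actual syntax.

```python
# Spirit-of-MECDAS sketch: an experiment-specific configuration drives which
# readout routines run, without changing the framework code itself.
# Config format and module names are illustrative assumptions.
CONFIG = [
    {"module": "camac_adc", "crate": 1, "slot": 4, "channels": 8},
    {"module": "fastbus_tdc", "crate": 2, "slot": 10, "channels": 64},
]

def read_camac_adc(crate, slot, channels):
    return [0] * channels            # placeholder for real CAMAC access

def read_fastbus_tdc(crate, slot, channels):
    return [0] * channels            # placeholder for real Fastbus access

READERS = {"camac_adc": read_camac_adc, "fastbus_tdc": read_fastbus_tdc}

def build_event():
    """Assemble one raw event by dispatching on the configuration."""
    return {f'{c["module"]}@{c["crate"]}/{c["slot"]}':
            READERS[c["module"]](c["crate"], c["slot"], c["channels"])
            for c in CONFIG}

event = build_event()
```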
A new archival infrastructure for highly-structured astronomical data
NASA Astrophysics Data System (ADS)
Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo
2018-03-01
With the advent of the 2020 Radio Astronomy Telescopes era, the amount and format of radioastronomical data are becoming a massive, performance-critical challenge. Such an evolution of data models and data formats requires new data archiving techniques that allow massive and fast storage of data that can at the same time be efficiently processed. Useful expertise in efficient archiving has been obtained through the data archiving of the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats, such as FITS, MBFITS, and VLBI's XML, including description and ancillary files. The modeling and architecture of the archive fulfill all the requirements of both data persistence and easy data discovery and exploitation. The presented archive already complies with the Virtual Observatory directives; therefore, future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.
NASA Technical Reports Server (NTRS)
2008-01-01
The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for radiometer external calibration. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
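The file flow described above (GSEs drop periodic text files, an archiver mirrors them to FTP, a parser loads a database) can be illustrated generically. The sketch below uses Python's standard ftplib and sqlite3 as stand-ins for the external FTP server and the MySQL/PHP pipeline; all paths, credentials and the line format are assumptions, not the ISDS design.

```python
import sqlite3
from ftplib import FTP
from pathlib import Path

# Generic archive-and-ingest sketch (paths, file format and credentials are
# illustrative assumptions, not the Aquarius ISDS implementation).
ARCHIVE_DIR = Path("/data/archive")

def upload_new_files(host, user, password):
    """Mirror archived GSE text files to the external FTP server."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        for f in ARCHIVE_DIR.glob("*.txt"):
            with open(f, "rb") as fh:
                ftp.storbinary(f"STOR {f.name}", fh)

def ingest(db_path="isds.db"):
    """Parse 'timestamp,sensor,value' lines from each file into the database."""
    db = sqlite3.connect(db_path)
    db.execute("""CREATE TABLE IF NOT EXISTS telemetry
                  (ts TEXT, sensor TEXT, value REAL, src TEXT)""")
    for f in ARCHIVE_DIR.glob("*.txt"):
        for line in f.read_text().splitlines():
            if not line.strip():
                continue
            ts, sensor, value = line.split(",")
            db.execute("INSERT INTO telemetry VALUES (?,?,?,?)",
                       (ts, sensor, float(value), f.name))
    db.commit()
```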
Larsen, Dana M.
1993-01-01
The EROS Data Center has managed the National Satellite Land Remote Sensing Data Archive's (NSLRSDA) Landsat data since 1972. The NSLRSDA includes Landsat MSS data from 1972 through 1991 and TM data from 1982 through 1993. In response to many requests from multi-disciplined users for enhanced insight into the availability and volume of Landsat data over specific worldwide land areas, numerous world plots and corresponding statistical overviews have been prepared. These presentations include information related to image quality, cloud cover, various types of data coverage (i.e., regions, countries, paths, rows), acquisition station coverage areas, various archive media formats (i.e., wide-band video tapes, computer compatible tapes, high density tapes, etc.) and acquisition time periods (i.e., years, seasons). Plans are to publish this information in a paper sample booklet at the Pecora 12 Symposium, in a USGS circular, and on a Landsat CD-ROM; the data will also be incorporated into GLIS.
COLLABORATIVE RESEARCH AND DEVELOPMENT (CR&D) III, Task Order 0090: Image Processing Framework: From… (AFRL-RX-WP-TR-2013-0210)
Air Force Research Laboratory
2013-05-01
Codes developed under a research contract or for a PhD dissertation are typically a "proof-of-concept" code base that can only read a single set of inputs and is not designed… Approved for public release; distribution unlimited.
Weld analysis and control system
NASA Technical Reports Server (NTRS)
Kennedy, Larry Z. (Inventor); Rodgers, Michael H. (Inventor); Powell, Bradley W. (Inventor); Burroughs, Ivan A. (Inventor); Goode, K. Wayne (Inventor)
1994-01-01
The invention is a Weld Analysis and Control System developed for active weld system control through real time weld data acquisition. Closed-loop control is based on analysis of weld system parameters and weld geometry. The system is adapted for use with automated welding apparatus having a weld controller which is capable of active electronic control of all aspects of a welding operation. Enhanced graphics and data displays are provided for post-weld analysis. The system provides parameter acquisition, including seam location which is acquired for active torch cross-seam positioning. Torch stand-off is also monitored for control. Weld bead and parent surface geometrical parameters are acquired as an indication of weld quality. These parameters include mismatch, peaking, undercut, underfill, crown height, weld width, puddle diameter, and other measurable information about the weld puddle regions, such as puddle symmetry, etc. These parameters provide a basis for active control as well as post-weld quality analysis and verification. Weld system parameters, such as voltage, current and wire feed rate, are also monitored and archived for correlation with quality parameters.
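As a generic illustration of the closed-loop idea described above (not the patented system's actual control law), the sketch below applies a simple proportional cross-seam correction from a sensed seam offset while logging weld parameters for post-weld analysis; the gain, step limit and interfaces are all assumptions.

```python
# Generic closed-loop sketch: proportional cross-seam torch correction from
# sensed seam offset, with parameter logging for post-weld quality analysis.
# Gains, limits and the sensor/torch interfaces are illustrative assumptions.
KP = 0.5                 # proportional gain (assumed)
MAX_STEP = 0.2           # mm per cycle, actuator limit (assumed)
log = []

def control_cycle(seam_offset_mm, torch_pos_mm, volts, amps, wire_feed):
    """One control cycle: step the torch toward the seam, log system data."""
    step = max(-MAX_STEP, min(MAX_STEP, KP * seam_offset_mm))
    torch_pos_mm += step
    log.append({"offset": seam_offset_mm, "torch": torch_pos_mm,
                "V": volts, "A": amps, "wire": wire_feed})
    return torch_pos_mm

pos = 0.0
for offset in [1.0, 0.6, 0.3, 0.1]:   # simulated seam offsets (mm)
    pos = control_cycle(offset, pos, 24.0, 180.0, 7.5)
```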
AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users
NASA Astrophysics Data System (ADS)
Maiersperger, T.
2017-12-01
The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.
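As an illustration of the kind of space-time-variable subset request such a service accepts, the sketch below submits a point-sample task over HTTP. The base URL, endpoint paths and payload fields follow the publicly documented AppEEARS API pattern but should be treated as assumptions; consult the current API documentation before use.

```python
import requests

BASE = "https://appeears.earthdatacloud.nasa.gov/api"  # assumed base URL

# Log in with Earthdata credentials to obtain a bearer token (assumed flow).
token = requests.post(f"{BASE}/login",
                      auth=("EARTHDATA_USER", "EARTHDATA_PASS")).json()["token"]

# A point subset: one variable, one site, one year (field names assumed).
task = {
    "task_type": "point",
    "task_name": "lst-demo",
    "params": {
        "dates": [{"startDate": "01-01-2020", "endDate": "12-31-2020"}],
        "layers": [{"product": "MOD11A2.061", "layer": "LST_Day_1km"}],
        "coordinates": [{"latitude": 36.2, "longitude": -112.1}],
    },
}
resp = requests.post(f"{BASE}/task", json=task,
                     headers={"Authorization": f"Bearer {token}"})
print(resp.json())   # returns a task id to poll for completion
```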
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1989-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are presented. Activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) related to DSN advanced systems, systems implementation, and DSN operations are addressed. In addition, recent developments in the NASA SETI (Search for Extraterrestrial Intelligence) sky survey are summarized.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1983-01-01
Archival reports on developments in programs managed by JPL's office of Telecommunications and Data Acquisition (TDA) are presented. In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations.
Using and Distributing Spaceflight Data: The Johnson Space Center Life Sciences Data Archive
NASA Technical Reports Server (NTRS)
Cardenas, J. A.; Buckey, J. C.; Turner, J. N.; White, T. S.; Havelka,J. A.
1995-01-01
Life sciences data collected before, during and after spaceflight are valuable and often irreplaceable. The Johnson Space Center Life Sciences Data Archive has been designed to provide researchers, engineers, managers and educators interactive access to information about and data from human spaceflight experiments. The archive system consists of a Data Acquisition System, Database Management System, CD-ROM Mastering System and Catalog Information System (CIS). The catalog information system is the heart of the archive. The CIS provides detailed experiment descriptions (both written and as QuickTime movies), hardware descriptions, hardware images, documents, and data. An initial evaluation of the archive at a scientific meeting showed that 88% of those who evaluated the catalog want to use the system when completed. The majority of the evaluators found the archive flexible, satisfying and easy to use. We conclude that the data archive effectively provides key life sciences data to interested users.
Internet based ECG medical information system.
James, D A; Rowlands, D; Mahnovetski, R; Channells, J; Cutmore, T
2003-03-01
Physiological monitoring of humans for medical applications is well established and ready to be adapted to the Internet. This paper describes the implementation of a Medical Information System (MIS-ECG system) incorporating an Internet-based ECG acquisition device. Traditionally, clinical monitoring of ECG has been a largely labour-intensive process, with data typically stored on paper. Until recently, ECG monitoring applications were also constrained by the size of the equipment required. Today's technology enables large, fixed hospital monitoring systems to be replaced by small portable devices. With an increasing emphasis on health management, a truly integrated information system for acquisition, analysis, patient particulars and archiving is now a realistic possibility. This paper describes recent Internet and technological advances and presents the design and testing of the MIS-ECG system that utilises those advances.
Visual analytics for semantic queries of TerraSAR-X image content
NASA Astrophysics Data System (ADS)
Espinoza-Molina, Daniela; Alonso, Kevin; Datcu, Mihai
2015-10-01
With the continuous image product acquisition of satellite missions, the size of image archives is increasing considerably every day, as is the variety and complexity of their content, surpassing the end-user capacity to analyse and exploit them. Advances in the image retrieval field have contributed to the development of tools for interactive exploration and extraction of images from huge archives using different parameters such as metadata, keywords, and basic image descriptors. Even though we count on more powerful tools for automated image retrieval and data analysis, we still face the problem of understanding and analyzing the results. Thus, a systematic computational analysis of these results is required in order to provide the end-user with a summary of the archive content in comprehensible terms. In this context, visual analytics combines automated analysis with interactive visualization techniques for effective understanding, reasoning and decision making on the basis of very large and complex datasets. Moreover, several current research efforts focus on associating the content of images with semantic definitions that describe the data in a format easily understood by the end-user. In this paper, we present our approach for computing visual analytics and semantically querying the TerraSAR-X archive. Our approach is composed of four main steps: 1) the generation of a data model that explains the information contained in a TerraSAR-X product, formed by primitive descriptors and metadata entries; 2) the storage of this model in a database system; 3) the semantic definition of the image content based on machine learning algorithms and relevance feedback; and 4) querying the image archive using semantic descriptors as query parameters and computing the statistical analysis of the query results. The experimental results show that, with the help of visual analytics and semantic definitions, we are able to explain the image content using semantic terms and the relations between them, answering questions such as "What is the percentage of urban area in a region?" or "What is the distribution of water bodies in a city?"
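Step 4 is the easiest to make concrete: once image patches carry semantic labels in the database (step 2), an archive-level question reduces to an aggregate query. A minimal sketch follows, with a table schema and labels invented for the example; the paper's actual schema differs.

```python
# Toy illustration of semantic querying: percentage of "urban" patches
# in one scene. Table and label names are invented for the example.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patch (scene_id TEXT, label TEXT)")
db.executemany("INSERT INTO patch VALUES (?, ?)",
               [("s1", "urban"), ("s1", "urban"),
                ("s1", "water"), ("s1", "forest")])

# SQLite evaluates (label = 'urban') as 0/1, so SUM counts matches.
(pct,) = db.execute(
    "SELECT 100.0 * SUM(label = 'urban') / COUNT(*) FROM patch "
    "WHERE scene_id = 's1'").fetchone()
print(f"urban area: {pct:.1f}% of labelled patches")  # 50.0%
```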
Getting from then to now: Sustaining the Lesbian Herstory Archives as a lesbian organization.
Smith-Cruz, Shawn(ta); Rando, Flavia; Corbman, Rachel; Edel, Deborah; Gwenwald, Morgan; Nestle, Joan; Thistlethwaite, Polly
2016-01-01
This article is a compilation of six narratives written by collective members of the volunteer-run Lesbian Herstory Archives, the oldest and largest collection of lesbian material in the world. Narratives draw on a yearlong series of conversations, which culminated in a panel discussion at the 40th Anniversary celebration. Authors' narratives detail the significance of the Lesbian Herstory Archives as a successful and sustainable lesbian organization. Topics covered span four decades and include: the organization's history and practice, founding and activism, the acquisition of the current space, community engagement, and processing of special collections.
Cardio-PACs: a new opportunity
NASA Astrophysics Data System (ADS)
Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary
2000-05-01
It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below. Image display: (1) image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based display with color monitor, and physician-friendly imaging software; (2) performance specifications: acquire 30 frames/sec; replay 15 frames/sec; access to file server within 5 seconds, and to archive within 5 minutes; (3) compatibility of image file, transmission, and processing formats; (4) image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) user-friendly control of image review. Image distribution: (1) standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) non-proprietary formats; (3) bidirectional distribution. Image storage: (1) CD-ROM vs. disk vs. tape; (2) verification of data integrity; (3) user-designated storage capacity for catheterization laboratory, file server, and long-term archive. Costs: (1) image acquisition equipment, file server, long-term archive; (2) network infrastructure; (3) review stations and software; (4) maintenance and administration; (5) future upgrades and expansion; (6) personnel.
Middleware for Plug and Play Integration of Heterogeneous Sensor Resources into the Sensor Web
Toma, Daniel M.; Jirka, Simon; Del Río, Joaquín
2017-01-01
The study of global phenomena requires the combination of a considerable amount of data coming from different sources, acquired by different observation platforms and managed by institutions working in different scientific fields. Merging these data to provide extensive and complete data sets to monitor the long-term, global changes of our oceans is a major challenge. Data acquisition and data archival procedures usually vary significantly depending on the acquisition platform. This lack of standardization ultimately leads to information silos, preventing the data from being effectively shared across different scientific communities. In the past years, important steps have been taken to improve both standardization and interoperability, such as the Open Geospatial Consortium's Sensor Web Enablement (SWE) framework. Within this framework, standardized models and interfaces to archive, access and visualize the data from heterogeneous sensor resources have been proposed. However, due to the wide variety of software and hardware architectures presented by marine sensors and marine observation platforms, there is still a lack of uniform procedures for integrating sensors into existing SWE-based data infrastructures. In this work, a framework aimed at enabling plug-and-play sensor integration into existing SWE-based data infrastructures is presented. First, the operations required to automatically identify, configure and operate a sensor are analysed. Then, the metadata required for these operations is structured in a standard way. Afterwards, a modular, plug-and-play, SWE-based acquisition chain is proposed. Finally, different use cases for this framework are presented. PMID:29244732
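For readers unfamiliar with SWE, the standardized access it buys can be shown in a few lines: a Sensor Observation Service (SOS) exposes sensor observations through simple key-value HTTP requests defined by the OGC standard. A minimal client sketch, with a placeholder service host:

```python
# Minimal client-side sketch of standardized SWE access: an OGC Sensor
# Observation Service (SOS) GetCapabilities request over HTTP key-value
# pairs. The host below is a placeholder, not a real service.
import requests

params = {
    "service": "SOS",
    "request": "GetCapabilities",
    "AcceptVersions": "2.0.0",
}
resp = requests.get("https://sos.example.org/service", params=params)
resp.raise_for_status()
print(resp.text[:200])  # XML capabilities document listing offerings
```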
NASA Technical Reports Server (NTRS)
2003-01-01
In order to rapidly and efficiently grow crystals, tools were needed to automatically identify and analyze the growing process of protein crystals. To meet this need, Diversified Scientific, Inc. (DSI), with the support of a Small Business Innovation Research (SBIR) contract from NASA's Marshall Space Flight Center, developed CrystalScore(trademark), the first automated image acquisition, analysis, and archiving system designed specifically for the macromolecular crystal growing community. It offers automated hardware control, image and data archiving, image processing, a searchable database, and surface plotting of experimental data. CrystalScore is currently being used by numerous pharmaceutical companies and academic and nonprofit research centers. DSI, located in Birmingham, Alabama, was awarded the patent "Method for acquiring, storing, and analyzing crystal images" on March 4, 2003. Another DSI product made possible by Marshall SBIR funding is VaporPro(trademark), a unique, comprehensive system that allows for the automated control of vapor diffusion for crystallization experiments.
BOREAS Level 3-b AVHRR-LAC Imagery: Scaled At-sensor Radiance in LGSOWG Format
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Nickeson, Jaime; Newcomer, Jeffrey A.; Cihlar, Josef
2000-01-01
The BOREAS Staff Science Satellite Data Acquisition Program focused on providing the research teams with the remotely sensed satellite data products they needed to compare and spatially extend point results. Data acquired from the AVHRR instrument on the NOAA-9, -11, -12, and -14 satellites were processed and archived for the BOREAS region by the MRSC and BORIS. The data were acquired by CCRS and were provided for use by BOREAS researchers. A few winter acquisitions are available, but the archive contains primarily growing season imagery. These gridded, at-sensor radiance image data cover the period of 30-Jan-1994 to 18-Sep-1996. Geographically, the data cover the entire 1,000-km x 1,000-km BOREAS region. The data are stored in binary image format files.
NASA Astrophysics Data System (ADS)
Mattioli, G. S.; Linde, A. T.; Sacks, I. S.; Malin, P. E.; Shalev, E.; Elsworth, D.; Hidayat, D.; Voight, B.; Young, S. R.; Dunkley, P. N.; Herd, R.; Norton, G.
2003-12-01
The CALIPSO Project (Caribbean Andesite Lava Island-volcano Precision Seismo-geodetic Observatory) has greatly enhanced the monitoring and scientific infrastructure at the Soufriere Hills Volcano, Montserrat, with the recent installation of an integrated array of borehole and surface geophysical instrumentation at four sites. Each site was designed to be sufficiently hardened to withstand extreme meteorological events (e.g., hurricanes) and to require only minimal routine maintenance over an expected observatory lifespan of >30 y. The sensor package at each site includes: a single-component, very broadband, Sacks-Evertson strainmeter; a three-component seismometer (~Hz to 1 kHz); a Pinnacle Technologies series 5000 tiltmeter; and a surface Ashtech u-Z CGPS station with choke ring antenna, SCIGN mount and radome. This instrument package is similar to that envisioned by the Plate Boundary Observatory for deployment on EarthScope target volcanoes in western North America, and thus the CALIPSO Project may be considered a prototype PBO installation with real field testing on a very active and dangerous volcano. Borehole sites were installed in series and data acquisition began immediately after the sensors were grouted into position at 200 m depth, with the first completed at Trants (5.8 km from the dome) in December 2002, followed by Air Studios (5.2 km), Geralds (9.4 km), and Olveston (7.0 km) in March 2003. Analog data from the strainmeter (50 Hz sync) and seismometer (200 Hz) were initially digitized and locally archived using RefTek 72A-07 data acquisition systems (DAS) on loan from the PASSCAL instrument pool. Data were downloaded manually to a laptop approximately every month from initial installation until August 2003, when new systems were installed. Approximately 0.2 TB of raw data in SEGY format have already been acquired and are currently archived at UARK for analysis by the CALIPSO science team. The July 12, 2003, dome collapse and vulcanian explosion events were recorded at 3 of the 4 sites. Steel-reinforced, poured-concrete crypts were constructed to house the surface instruments, data acquisition and telemetry components, and a backup battery array with sufficient power to last 10 d without recharging. The central, cross-braced column of the crypt also functions as the monument for the CGPS antenna, which is coupled to a bedrock-grouted 1.5" steel pipe using a precision SCIGN level. In August 2003, the original temporary DAS were replaced with Quanterra Q330 six-channel, 24-bit systems equipped with PB14 digital packet balers, which can locally buffer up to 20 Gb of strain and seismic data in MSEED packets. All instruments are linked together via a Cat 5 IP LAN, and data are telemetered from each remote site to the Montserrat Volcano Observatory using a single FreeWave FGR-115RE Ethernet radio bridge and, where necessary, a repeater. There they are cached prior to transmission via a VPN to UARK for final archival. Detailed schematics and a photo archive are available online.
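Since the upgraded stations buffer data as MSEED packets, a natural first step for anyone working with such records is something like the following ObsPy sketch; the filename is a placeholder, and ObsPy itself is not mentioned in the abstract.

```python
# Sketch of reading an MSEED file of the kind the Q330/baler systems
# produce. The filename is a placeholder.
from obspy import read

st = read("trants_strain.mseed")      # returns a Stream of Trace objects
for tr in st:
    print(tr.id, tr.stats.sampling_rate, tr.stats.npts)
st.merge(method=1)                    # stitch packet-sized segments
st.plot()                             # quick-look of the record
```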
NASA Astrophysics Data System (ADS)
Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Harbeck, Daniel R.; Boroson, Todd; Liu, Wilson; Kotulla, Ralf; Shaw, Richard; Henschel, Robert; Rajagopal, Jayadev; Stobie, Elizabeth; Knezek, Patricia; Martin, R. Pierre; Archbold, Kevin
2014-07-01
The One Degree Imager-Portal, Pipeline, and Archive (ODI-PPA) is a web science gateway that provides astronomers a modern web interface acting as a single point of access to their data, along with rich computational and visualization capabilities. Its goal is to support scientists in handling complex data sets and to enhance the WIYN Observatory's scientific productivity beyond data acquisition on its 3.5m telescope. ODI-PPA is designed, with periodic user feedback, to be a compute archive with built-in frameworks including: (1) Collections, which allow an astronomer to create logical collations of data products intended for publication, further research, instructional purposes, or data processing tasks; (2) Image Explorer and Source Explorer, which together enable real-time interactive visual analysis of massive astronomical data products within an HTML5-capable web browser, with overlaid standard catalog and Source Extractor-generated source markers; (3) a Workflow framework, which enables rapid integration of data processing pipelines on an associated compute cluster and lets users request such pipelines be executed on their data via custom user interfaces. ODI-PPA is made up of several lightweight services connected by a message bus; the web portal is built using the Twitter/Bootstrap, AngularJS and jQuery JavaScript libraries, with backend services written in PHP (using the Zend framework) and Python; it leverages supercomputing and storage resources at Indiana University. ODI-PPA is designed to be reconfigurable for use in other science domains with large and complex datasets, including an ongoing offshoot project for electron microscopy data.
Acoustic inspection of concrete bridge decks
NASA Astrophysics Data System (ADS)
Henderson, Mark E.; Dion, Gary N.; Costley, R. Daniel
1999-02-01
The determination of concrete integrity, especially in concrete bridge decks, is of extreme importance. Current systems for testing concrete structures are expensive, slow, or tedious. State-of-the-art systems use ground-penetrating radar, but they have inherent problems, especially with ghosting and signal signature overlap. The older method of locating delaminations in bridge decks involves either tapping on the surface with a hammer or metal rod, or dragging a chain-bar across the bridge deck. Both methods require a `calibrated' ear to determine the difference between good and bad sections of concrete. As a consequence, the method is highly subjective, varying from person to person and even day to day for a given person. In addition, archival of such data is impractical, or at least improbable, in most situations. The Diagnostic Instrumentation and Analysis Laboratory has constructed an instrument that implements the chain-drag method of concrete inspection. The system is capable of real-time analysis of recorded signals, archival of processed data, and high-speed data acquisition, so that post-processing of the data is possible either for research purposes or for listening to the recorded signals.
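One plausible way to automate the "calibrated ear" is a band-energy ratio on each recorded impact, since delaminated concrete responds with a duller, lower-frequency signature than solid concrete. The sketch below is illustrative only; the laboratory's actual signal processing, band edges, and thresholds are not described in the abstract.

```python
# Illustrative delamination flag: compare low- vs. high-frequency energy
# of each recorded impact. Band split and threshold are invented values.
import numpy as np

def flag_delamination(signal, fs, split_hz=1000.0, threshold=2.0):
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    low = spec[freqs < split_hz].sum()
    high = spec[freqs >= split_hz].sum() + 1e-12
    return (low / high) > threshold     # True -> likely delaminated

fs = 20000.0
t = np.arange(0, 0.05, 1 / fs)
solid = np.sin(2 * np.pi * 3000 * t) * np.exp(-60 * t)   # bright "ring"
hollow = np.sin(2 * np.pi * 400 * t) * np.exp(-60 * t)   # dull "thunk"
print(flag_delamination(solid, fs), flag_delamination(hollow, fs))  # False True
```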
Picture archiving and communication systems (PACS).
Gamsu, Gordon; Perez, Enrico
2003-07-01
Over the past 2 decades, groups of computer scientists, electronic design engineers, and physicians, in universities and industry, have worked to achieve an electronic environment for the practice of medicine and radiology. The radiology component of this revolution is often called PACS (picture archiving and communication systems). More recently it has become evident that the efficiencies and cost savings of PACS are realized when they are part of an enterprise-wide electronic medical record. The installation of PACS requires careful planning by all the various stakeholders over many months prior to installation. All of the users must be aware of the initial disruption that will occur as they become familiar with the systems. Modern fourth-generation PACS are linked to radiology and hospital information systems. A PACS consists of electronic acquisition sites, a robust network intelligently managed by a server, multiple viewing sites, and an archive. The details of how these are linked, and their workflow analysis, determine the success of PACS. PACS evolves over time and components are frequently replaced, so users must expect continuous learning about new updates and improved functionality. The digital medical revolution is rapidly being adopted in many medical centers, improving patient care and the success of the institution.
Missile Defense Acquisition: Failure Is Not An Option
2016-01-26
Missile Defense Acquisition: Failure is Not an Option 8 capabilities. Retired Marine General James Mattis' renowned quote rings true, "The enemy...american-missile-defense-why-failure-is-an-option. 18 Vago Muradian, "Interview: Gen. James Mattis, Commander, U.S. Joint Forces Command," 23 May...2010, http://archive.defensenews.com/article/20100523/DEFFEAT03/5230301/Gen-James-Mattis. 19 Institute for Defense Analyses, p. II-3. 20 Missile
Status of worldwide Landsat archive
Warriner, Howard W.
1987-01-01
In cooperation with the international Landsat community, and through the Landsat Technical Working Group (LTWG), NOAA is assembling information about the status of the worldwide Landsat archive. During LTWG 9, member nations agreed to participate in a survey of international Landsat data holdings and of their archive experiences with Landsat data. The goal of the effort was two-fold: one, to document the Landsat archive to date, and two, to ensure that specific nations' experience with long-term Landsat archival problems was available to others. The survey requested details such as the amount of data held; the format of the archive holdings by spacecraft/sensor and acquisition years; the estimated costs to accumulate, process, and replace the data (if necessary); the storage space required; and any member nation's plans that would establish the assurance of continuing quality. As a group, the LTWG nations are concerned about the characteristics and reliability of long-term magnetic media storage. Each nation's experience with older data retrieval is solicited in the survey. This information will allow nations to anticipate and plan for required changes to their archival holdings. Also solicited were reports of any upgrades to a nation's archival system that are currently planned, and all results of attempts to reduce archive holdings, including methodology, current status, and the planned access rates and product support anticipated for responding to future archival usage.
Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.
1991-01-01
The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development, and archiving; data set preservation, maintenance, and conversion to more durable and accessible archive media; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA.
Up the Beanstalk: An Evolutionary Organizational Structure for Libraries.
ERIC Educational Resources Information Center
Hoadley, Irene B.; Corbin, John
1990-01-01
Presents a functional organizational model for research libraries consisting of six major divisions and subunits: acquisition (buying, borrowing, leasing); organization (records creation, records maintenance); collections (collections management, selection, preservation, special collections and archives); interpretation (reference, instructional…
Fault tolerance techniques to assure data integrity in high-volume PACS image archives
NASA Astrophysics Data System (ADS)
He, Yutao; Huang, Lu J.; Valentino, Daniel J.; Wingate, W. Keith; Avizienis, Algirdas
1995-05-01
Picture archiving and communication systems (PACS) perform the systematic acquisition, archiving, and presentation of large quantities of radiological image and text data. In the UCLA Radiology PACS, for example, the volume of image data archived currently exceeds 2500 gigabytes. Furthermore, the distributed, heterogeneous PACS is expected to have near real-time response, be continuously available, and assure the integrity and privacy of patient data. The off-the-shelf subsystems that compose the current PACS cannot meet these expectations; therefore, fault tolerance techniques had to be incorporated into the system. This paper reports our first-step efforts toward that goal and is organized as follows: first we discuss data integrity and identify fault classes in the PACS operational environment; then we describe auditing and accounting schemes developed for error detection and analyze the operational data collected. Finally, we outline plans for future research.
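A simple concrete instance of such an auditing scheme is digest-on-ingest, re-verify-on-read, which catches silent corruption in an archive. The sketch below is a generic illustration of the technique, not the UCLA system's actual mechanism.

```python
# Generic integrity audit: record a digest at ingest, re-verify on read.
import hashlib

def digest(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB at a time
            h.update(chunk)
    return h.hexdigest()

ledger = {}                          # ingest-time record, e.g. a DB table

def ingest(path: str) -> None:
    ledger[path] = digest(path)

def audit(path: str) -> bool:
    return digest(path) == ledger.get(path)  # False -> flag for recovery
```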
THE REMOTE SENSING DATA GATEWAY
The EPA Remote Sensing Data Gateway (RSDG) is a pilot project in the National Exposure Research Laboratory (NERL) to develop a comprehensive data search, acquisition, delivery and archive mechanism for internal, national and international sources of remote sensing data for the co...
Real-Time Data Streaming and Storing Structure for the LHD's Fusion Plasma Experiments
NASA Astrophysics Data System (ADS)
Nakanishi, Hideya; Ohsuna, Masaki; Kojima, Mamoru; Imazu, Setsuo; Nonomura, Miki; Emoto, Masahiko; Yoshida, Masanobu; Iwata, Chie; Ida, Katsumi
2016-02-01
The LHD data acquisition and archiving system, i.e., LABCOM system, has been fully equipped with high-speed real-time acquisition, streaming, and storage capabilities. To deal with more than 100 MB/s continuously generated data at each data acquisition (DAQ) node, DAQ tasks have been implemented as multitasking and multithreaded ones in which the shared memory plays the most important role for inter-process fast and massive data handling. By introducing a 10-second time chunk named “subshot,” endless data streams can be stored into a consecutive series of fixed length data blocks so that they will soon become readable by other processes even while the write process is continuing. Real-time device and environmental monitoring are also implemented in the same way with further sparse resampling. The central data storage has been separated into two layers to be capable of receiving multiple 100 MB/s inflows in parallel. For the frontend layer, high-speed SSD arrays are used as the GlusterFS distributed filesystem which can provide max. 2 GB/s throughput. Those design optimizations would be informative for implementing the next-generation data archiving system in big physics, such as ITER.
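The subshot idea is easy to sketch: cut the endless stream into fixed-length time chunks and close each file as soon as its chunk ends, so consumers can read completed chunks while the writer continues. A minimal single-threaded illustration follows; the real system is multitasking and shared-memory based, and the naming scheme here is invented.

```python
# Minimal "subshot" writer: an endless sample stream is stored as a
# consecutive series of fixed-length chunks. Chunk naming is invented.
import time

CHUNK_SECONDS = 10                      # the paper's subshot length

def stream_writer(sample_source, base="shot42"):
    index, t0, buf = 0, time.monotonic(), []
    for sample in sample_source:        # sample: bytes from the DAQ
        buf.append(sample)
        if time.monotonic() - t0 >= CHUNK_SECONDS:
            with open(f"{base}.{index:06d}.bin", "wb") as f:
                f.write(b"".join(buf))  # closed file is now readable
            index, t0, buf = index + 1, time.monotonic(), []
```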
FBIS: A regional DNA barcode archival & analysis system for Indian fishes.
Nagpure, Naresh Sahebrao; Rashid, Iliyas; Pathak, Ajey Kumar; Singh, Mahender; Singh, Shri Prakash; Sarkar, Uttam Kumar
2012-01-01
DNA barcoding is a new tool for taxon recognition and classification of biological organisms, based on the sequence of a fragment of the mitochondrial gene cytochrome c oxidase I (COI). In view of the growing importance of fish DNA barcoding for species identification, molecular taxonomy and fish diversity conservation, we developed a Fish Barcode Information System (FBIS) for Indian fishes, which will serve as a regional DNA barcode archival and analysis system. The database presently contains 2334 sequence records of the COI gene for 472 aquatic species belonging to 39 orders and 136 families, collected from available published data sources. Additionally, it contains information on the phenotype, distribution and IUCN Red List status of fishes. The web version of FBIS was designed using MySQL, Perl and PHP under the Linux operating platform to (a) store and manage the acquisition of records, (b) analyze and explore DNA barcode records, and (c) identify species and estimate genetic divergence. FBIS has also been integrated with appropriate tools for retrieving and viewing information about the database statistics and taxonomy. It is expected that FBIS will be useful as a potent information system in fish molecular taxonomy, phylogeny and genomics. The database is available for free at http://mail.nbfgr.res.in/fbis/
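As an illustration of the kind of divergence estimate such a system can provide, the uncorrected pairwise p-distance between two aligned COI fragments takes only a few lines; FBIS's actual analysis tools are not specified at this level of detail.

```python
# Uncorrected p-distance between two aligned barcode fragments.
def p_distance(seq1: str, seq2: str) -> float:
    """Fraction of differing sites between aligned, equal-length sequences."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    # ignore gap positions in either sequence
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    return sum(a != b for a, b in pairs) / len(pairs)

print(p_distance("ACCTGGTA", "ACTTGGCA"))  # 0.25: 2 mismatches over 8 sites
```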
IDC Re-Engineering Phase 2 System Requirements Document Version 1.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Satpathi, Meara Allena
This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.
IDC Re-Engineering Phase 2 System Requirements Document V1.3.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Satpathi, Meara Allena
2015-12-01
This System Requirements Document (SRD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The IDC applies, on a routine basis, automatic processing methods and interactive analysis to raw International Monitoring System (IMS) data in order to produce, archive, and distribute standard IDC products on behalf of all States Parties. The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data, but includes requirements for the dissemination of radionuclide data and products.
Fifteen Years of ASTER Data on NASA's Terra Platform
NASA Astrophysics Data System (ADS)
Abrams, M.; Tsu, H.
2014-12-01
The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) is one of five instruments operating on NASA's Terra platform. Launched in 1999, ASTER has been acquiring data for 15 years. ASTER is a joint project between Japan's Ministry of Economy, Trade and Industry and the US NASA. Data processing and distribution are done by both organizations; a joint science team helps to define mission priorities. ASTER acquires ~550 images per day, with a 60 km swath width. A daytime acquisition comprises three visible bands and a backward-looking stereo band with 15 m resolution, six SWIR bands with 30 m resolution, and 5 TIR bands with 90 m resolution. Nighttime TIR-only data are routinely collected. The stereo capability has allowed the ASTER project to produce a global Digital Elevation Model (GDEM) data set covering the Earth's land surfaces from 83 degrees north to 83 degrees south, with 30 m data postings. This is the only (near-) global DEM available to all users at no charge; to date, over 28 million 1-by-1 degree DEM tiles have been distributed. As a general-purpose imaging instrument, ASTER acquires data used in numerous scientific disciplines, including land use/land cover, urban monitoring, urban heat island studies, wetlands studies, agriculture monitoring, and forestry. Of particular emphasis has been the acquisition and analysis of data for natural hazard and disaster applications. We have been systematically acquiring images of 15,000 valley glaciers through the USGS Global Land Ice Monitoring from Space (GLIMS) Project. The recently published Randolph Glacier Inventory and the GLIMS book both relied heavily on ASTER data as the basis for glaciological and climatological studies. The ASTER Volcano Archive is a unique on-line archive of thousands of daytime and nighttime ASTER images of ~1500 active volcanoes, along with a growing archive of Landsat images. ASTER was scheduled to target active volcanoes at least 4 times per year, and more frequently for select volcanoes (like Mt. Etna and Hawaii). A separate processing and distribution system is operational in the US to allow rapid scheduling, acquisition, and distribution of ASTER data for natural hazards and disasters, such as forest fires, tornadoes, tsunamis, earthquakes, and floods. We work closely with other government agencies to provide this service.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1991-01-01
This quarterly publication provides archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL's) Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on the activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data, information systems, and reimbursable DSN work performed for other space agencies through NASA.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1990-01-01
Archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL) Office of Telecommunications and Data Acquisition (TDA) are given. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, the activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations are reported. Also included is TDA-funded activity at JPL on data and information systems and reimbursable Deep Space Network (DSN) work performed for other space agencies through NASA.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
Life Sciences Data Archive (LSDA)
NASA Technical Reports Server (NTRS)
Fitts, M.; Johnson-Throop, Kathy; Thomas, D.; Shackelford, K.
2008-01-01
In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types and geographical locations. While serving the needs of individual research teams, these data were largely unknown or unavailable to the scientific community at large. As a result, the Space Act of 1958 and the Science Data Management Policy mandated that research data collected by the National Aeronautics and Space Administration be made available to the science community at large. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This program constitutes a formal system for the acquisition, archival and distribution of data for Life Sciences-sponsored experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data using a variety of media which are accessible and responsive to inquiries from the science communities.
A sensor fusion field experiment in forest ecosystem dynamics
NASA Technical Reports Server (NTRS)
Smith, James A.; Ranson, K. Jon; Williams, Darrel L.; Levine, Elissa R.; Goltz, Stewart M.
1990-01-01
The background of the Forest Ecosystem Dynamics field campaign is presented, a progress report on the analysis of the collected data and related modeling activities is provided, and plans for future experiments at different points in the phenological cycle are outlined. The ecological overview of the study site is presented, and attention is focused on forest stands, needles, and atmospheric measurements. Sensor deployment and thermal and microwave observations are discussed, along with two examples of the optical radiation measurements obtained during the experiment in support of radiative transfer modeling. Future activities pertaining to an archival system, synthetic aperture radar, carbon acquisition modeling, and upcoming field experiments are considered.
The Italian National Seismic Network
NASA Astrophysics Data System (ADS)
Michelini, Alberto
2016-04-01
The Italian National Seismic Network is composed of about 400 stations, mainly broadband, installed in the country and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center), CNT, has installed and operates most of these stations, although a considerable number of stations contributing to INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite links, wireless connections and wired lines. The SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near real time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi real time, hypocenter parameters, which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution and specialised products.
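Since the INGV archive is exposed through EIDA/FDSN web services, quality-checked waveforms can be retrieved programmatically; a hedged sketch using ObsPy's FDSN client follows, where the network/station/channel selection is a placeholder.

```python
# Hedged sketch of retrieving waveforms from the INGV FDSN node with
# ObsPy. The station selection below is a placeholder wildcard.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("INGV")                  # registered FDSN data centre
t0 = UTCDateTime("2016-01-01T00:00:00")
st = client.get_waveforms(network="IV", station="*", location="*",
                          channel="HHZ", starttime=t0, endtime=t0 + 60)
print(st)                                # one minute of vertical records
```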
Diagnostic report acquisition unit for the Mayo/IBM PACS project
NASA Astrophysics Data System (ADS)
Brooks, Everett G.; Rothman, Melvyn L.
1991-07-01
The Mayo Clinic and IBM Rochester have jointly developed a picture archive and control system (PACS) for use with Mayo's MRI and Neuro-CT imaging modalities. One of the challenges of developing a useful PACS involves integrating the diagnostic reports with the electronic images so they can be displayed simultaneously. By the time a diagnostic report is generated for a particular case, its images have already been captured and archived by the PACS. To integrate the report with the images, the authors have developed an IBM Personal System/2 (PS/2)-based diagnostic report acquisition unit (RAU). A typed copy of the report is transmitted via facsimile to the RAU, where it is stacked electronically with other reports that have been sent previously but not yet processed. By processing these reports at the RAU, the information they contain is integrated with the image database, and a copy of the report is archived electronically on an IBM Application System/400 computer (AS/400). When a user requests a set of images for viewing, the report is automatically integrated with the image data. Using a hot key, the user can toggle the report on and off the display screen. This report describes the process, hardware, and software employed to integrate the diagnostic report information into the PACS, including how the report images are captured, transmitted, and entered into the AS/400 database. Also described is how the archived reports and their associated medical images are located and merged for retrieval and display. The methods used to detect and process error conditions are also discussed.
A pathologist-designed imaging system for anatomic pathology signout, teaching, and research.
Schubert, E; Gross, W; Siderits, R H; Deckenbaugh, L; He, F; Becich, M J
1994-11-01
Pathology images are derived from gross surgical specimens, light microscopy, immunofluorescence, electron microscopy, molecular diagnostic gels, flow cytometry, image analysis data, and clinical laboratory data in graphic form. We have implemented a network of desktop personal computers (PCs) that allows us to easily capture, store, and retrieve gross and microscopic, anatomic, and research pathology images. The system architecture involves multiple image acquisition and retrieval sites and a central file server for storage. The digitized images are conveyed via a local area network to and from image capture or display stations. Acquisition sites consist of a high-resolution camera connected to a frame grabber card in a 486-type personal computer equipped with 16 MB of RAM, a 1.05-gigabyte hard drive, and a 32-bit Ethernet card for access to our anatomic pathology reporting system. We have designed a push-button workstation for acquiring and indexing images that does not significantly interfere with surgical pathology sign-out. Advantages of the system include the following: (1) improving patient care: the availability of gross images at the time of microscopic sign-out, verification of recurrence of malignancy from archived images, monitoring of bone marrow engraftment and immunosuppressive intervention after bone marrow/solid organ transplantation on repeat biopsies, and the ability to seek instantaneous consultation with any pathologist on the network; (2) enhancing the teaching environment: building a digital surgical pathology atlas, improving the availability of images for conference support, and sharing cases across the network; (3) enhancing research: case study compilation, metastudy analysis, and availability of digitized images for quantitative analysis and permanent/reusable image records for archival study; and (4) other practical and economic considerations: storing case requisition images and hand-drawn diagrams deters the spread of gross room contaminants and results in considerable cost savings in photographic media for conferences, improved quality assurance by porting control stains across the network, and a multiplicity of other advantages that enhance image and information management in pathology.
Eclipse Megamovie 2017: How did we do?
NASA Astrophysics Data System (ADS)
Hudson, H. S.; Bender, M.; Collier, B. L.; Johnson, C.; Koh, J.; Konerding, D.; Martinez Oliveros, J. C.; Peticolas, L. M.; White, V.; Zevin, D.
2017-12-01
The Eclipse Megamovie program, as set up for the Great American Eclipse of 21 August 2017, will have completed its first phase, data acquisition, on that day or shortly thereafter. Our objective was to create (with Google's help) a vast public archive of amateur and other photography, down to the smartphone level, of the corona itself and of Baily's Beads at the 2nd and 3rd contacts. The archive, and the consumer electronics enabling it, open a large new domain of parameter space for eclipse science. At whatever level we have succeeded, the archive is a historical first, and we hope that it has already been a springboard for citizen-science projects. We will discuss the execution of the program and some of its science plans and results.
MIRAGE: The data acquisition, analysis, and display system
NASA Technical Reports Server (NTRS)
Rosser, Robert S.; Rahman, Hasan H.
1993-01-01
Developed for the NASA Johnson Space Center and Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definitions, experiment parameter formats, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. Its generic capabilities, adaptability and ease of use make MIRAGE a cost-effective solution to many experimental data processing requirements.
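The meta-parameter mechanism can be pictured as a table of operator definitions applied to decoded packet parameters. The sketch below invents parameter names and conversions purely for illustration; MIRAGE's actual spreadsheet-driven configuration is not reproduced here.

```python
# Illustrative meta-parameters: derived values built by applying
# mathematical/logical operators to packet parameters. All names and
# conversion factors here are invented.
packet = {"hr_raw": 612, "temp_c": 36.9}        # decoded packet parameters

meta_defs = {
    # meta-parameter: (operator function, source parameters)
    "hr_bpm": (lambda hr: hr / 10.2,        ["hr_raw"]),
    "temp_f": (lambda c: c * 9 / 5 + 32,    ["temp_c"]),
    "fever":  (lambda c: c > 38.0,          ["temp_c"]),
}

meta = {name: fn(*(packet[p] for p in params))
        for name, (fn, params) in meta_defs.items()}
print(meta)   # {'hr_bpm': 60.0, 'temp_f': 98.42, 'fever': False}
```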
Event-synchronized data acquisition system for the SPring-8 linac beam position monitors
NASA Astrophysics Data System (ADS)
Masuda, T.; Fukui, T.; Tanaka, R.; Taniuchi, T.; Yamashita, A.; Yanagida, K.
2005-05-01
By the summer of 2003, we had completed the installation of a new non-destructive beam position monitor (BPM) system to facilitate beam trajectory and energy correction for the SPring-8 linac. In all, 47 BPM sets were installed on the 1-GeV linac and three beam-transport lines. The entire BPM data acquisition system was required to operate synchronously with the electron beam acceleration cycle. We have developed an event-synchronized data acquisition system for the BPM data readout, and have succeeded in continuously acquiring data from all the BPMs through six VME computers synchronized with the 10 pps operation of the linac. For each beam shot, the data points are indexed by event number and stored in a database. Using the real-time features of the Solaris operating system and distributed database technology, we currently achieve about 99.9% efficiency in capturing and archiving all of the 10 Hz data. The linac BPM data are available not only for off-line analysis of the beam trajectory, but also for real-time control and automatic correction of the beam trajectory and energy.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 705.4 Parks, Forests, and Public Property LIBRARY OF CONGRESS REPRODUCTION, COMPILATION, AND....4 Reproduction. (a) Library of Congress staff acting under the general authority of the Librarian of... American Television and Radio Archives Act at 2 U.S.C. 170(a), and on Library of Congress acquisition...
An Outbreak of Respiratory Tularemia Caused by Diverse Clones of Francisella tularensis
Johansson, Anders; Lärkeryd, Adrian; Widerström, Micael; Mörtberg, Sara; Myrtännäs, Kerstin; Öhrman, Caroline; Birdsell, Dawn; Keim, Paul; Wagner, David M.; Forsman, Mats; Larsson, Pär
2014-01-01
Background. The bacterium Francisella tularensis is recognized for its virulence, infectivity, genetic homogeneity, and potential as a bioterrorism agent. Outbreaks of respiratory tularemia, caused by inhalation of this bacterium, are poorly understood. Such outbreaks are exceedingly rare, and F. tularensis is seldom recovered from clinical specimens. Methods. A localized outbreak of tularemia in Sweden was investigated. Sixty-seven humans contracted laboratory-verified respiratory tularemia. F. tularensis subspecies holarctica was isolated from the blood or pleural fluid of 10 individuals from July to September 2010. Using whole-genome sequencing and analysis of single-nucleotide polymorphisms (SNPs), outbreak isolates were compared with 110 archived global isolates. Results. There were 757 SNPs among the genomes of the 10 outbreak isolates and the 25 most closely related archival isolates (all from Sweden/Finland). Whole genomes of outbreak isolates were >99.9% similar at the nucleotide level and clustered into 3 distinct genetic clades. Unexpectedly, high-sequence similarity grouped some outbreak and archival isolates that originated from patients from different geographic regions and up to 10 years apart. Outbreak and archival genomes frequently differed by only 1–3 of 1 585 229 examined nucleotides. Conclusions. The outbreak was caused by diverse clones of F. tularensis that occurred concomitantly, were widespread, and apparently persisted in the environment. Multiple independent acquisitions of F. tularensis from the environment over a short time period suggest that natural outbreaks of respiratory tularemia are triggered by environmental cues. The findings additionally caution against interpreting genome sequence identity for this pathogen as proof of a direct epidemiological link. PMID:25097081
Alygizakis, Nikiforos A; Samanipour, Saer; Hollender, Juliane; Ibáñez, María; Kaserzon, Sarit; Kokkali, Varvara; van Leerdam, Jan A; Mueller, Jochen F; Pijnappels, Martijn; Reid, Malcolm J; Schymanski, Emma L; Slobodnik, Jaroslav; Thomaidis, Nikolaos S; Thomas, Kevin V
2018-05-01
A key challenge in the environmental and exposure sciences is to establish experimental evidence of the role of chemical exposure in human and environmental systems. High resolution and accurate tandem mass spectrometry (HRMS) is increasingly being used for the analysis of environmental samples. One lauded benefit of HRMS is the possibility to retrospectively process data for (previously omitted) compounds, which has led to the archiving of HRMS data. Archived HRMS data affords the possibility of exploiting historical data to rapidly and effectively establish the temporal and spatial occurrence of newly identified contaminants through retrospective suspect screening. We propose to establish a global emerging contaminant early warning network to rapidly assess the spatial and temporal distribution of contaminants of emerging concern in environmental samples through performing retrospective analysis on HRMS data. The effectiveness of such a network is demonstrated through a pilot study, where eight reference laboratories with available archived HRMS data retrospectively screened data acquired from aqueous environmental samples collected in 14 countries on 3 different continents. The widespread spatial occurrence of several surfactants (e.g., polyethylene glycols (PEGs) and C12AEO-PEGs), transformation products of selected drugs (e.g., gabapentin-lactam, metoprolol-acid, carbamazepine-10-hydroxy, omeprazole-4-hydroxy-sulfide, and 2-benzothiazole-sulfonic-acid), and industrial chemicals (3-nitrobenzenesulfonate and bisphenol-S) was revealed. Obtaining identifications of increased reliability through retrospective suspect screening is challenging, and recommendations for dealing with issues such as broad chromatographic peaks, data acquisition, and sensitivity are provided.
Molecular Hydrogen Fluorescence in IC 63
NASA Technical Reports Server (NTRS)
Andersson, B-G
2005-01-01
This grant has supported the acquisition, reduction and analysis of data targeting the structure and excitation of molecular hydrogen in the reflection nebula IC 63, and in particular the fluorescent emission seen in the UV. In addition to manpower for analyzing the FUSE data, the grant supported the (attempted) acquisition of supporting ground-based data. We proposed for and received observing time for two sets of ground-based data: narrow-band imaging ([S II], [O III]) at KPNO (July 2002; observer: Burgh) and imaging spectro-photometry of several of the near-infrared rotation-vibration lines of H2 at the IRTF (October 2003; observer: Andersson). Unfortunately, both of these runs were failures, primarily because of bad weather, and did not result in any useful data. We combined the FUSE observations with rocket-borne observations of the star responsible for exciting the H2 fluorescence in IC 63, gamma Cas, and with archival HUT observations of IC 63 covering the long-wavelength part of the molecular hydrogen fluorescence.
Video and LAN solutions for a digital OR: the Varese experience
NASA Astrophysics Data System (ADS)
Nocco, Umberto; Cocozza, Eugenio; Sivo, Monica; Peta, Giancarlo
2007-03-01
Purpose: build 20 ORs equipped with independent video acquisition and broadcasting systems and powerful LAN connectivity. Methods: a digital, PC-controlled video matrix was installed in each OR. The LAN connectivity was developed to bring data into the OR and to provide high-speed connectivity to a server and to broadcasting devices. Video signals are broadcast within the OR. Fixed inputs and five additional video inputs were placed in each OR. Images can be stored locally on a high-capacity HDD and a DVD recorder. Images can also be stored in a central archive for future retrieval and reference. Ethernet plugs were placed within the OR to acquire images and data from the hospital LAN; the OR is connected to the server/archive using a dedicated optical fiber. Results: 20 independent digital ORs have been built. Each OR is "self-contained", and images can be digitally managed and broadcast. Security requirements concerning both image visualization and electrical safety have been fulfilled, and each OR is fully integrated into the hospital LAN. Conclusions: the digital ORs were fully implemented; they fulfil surgeons' needs in terms of video acquisition and distribution and provide high-quality video for every kind of surgery in a major hospital.
Smith, E M; Wandtke, J; Robinson, A
1999-05-01
The Medical Information, Communication and Archive System (MICAS) is a multivendor, incremental approach to a picture archiving and communications system (PACS). It is a multimodality image management system seamlessly integrated with the radiology information system (RIS). Phase II enhancements of MICAS include a permanent archive, automated workflow, study caches, and Microsoft (Redmond, WA) Windows NT diagnostic workstations, with all components adhering to Digital Imaging and Communications in Medicine (DICOM) standards. MICAS is designed as an enterprise-wide PACS to provide images and reports throughout the Strong Health healthcare network. Phase II includes the addition of a Cemax-Icon (Fremont, CA) archive, a PACS broker (Mitra, Waterloo, Canada), and an interface (IDX PACSlink, Burlington, VT) to the RIS (IDXrad), plus the conversion of the UNIX-based redundant array of inexpensive disks (RAID) 5 temporary archives of phase I to NT-based RAID 0 DICOM modality-specific study caches (ImageLabs, Bedford, MA). The phase I acquisition engines and workflow management software were uninstalled, and the Cemax archive manager (AM) assumed these functions. The existing ImageLabs UNIX-based viewing software was enhanced and converted to an NT-based DICOM viewer. Installation of phase II hardware and software and integration with existing components began in July 1998. Phase II of MICAS demonstrates that a multivendor, open-system, incremental approach to PACS is feasible and cost-effective, and has significant advantages over a single-vendor implementation.
42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...
42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...
Henri, C J; Cox, R D; Bret, P M
1997-08-01
This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co., Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server, where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has now been filmless for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet); the same quantity of data is now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1987-01-01
Archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) are provided. Activities of the Deep Space Network (DSN) in space communications, radio navigation, radio science, and ground-based radio astronomy are reported. Also included are the plans, supporting research and technology, implementation and operations for the Ground Communications Facility (GCF). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the search for extraterrestrial intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Data were collected at the Shelly Ridge Girl Scout Center using an Aeolian Kinetics PDL-24 data acquisition system. Instantaneous readings were recorded by the microprocessor every 15 seconds. These channel readings were averaged to produce hourly values, which were stored on an audio cassette and later transcribed to the AIAF archive. The Girl Scout Center features an 861-square-foot unvented Trombe wall, a direct-gain sunspace, and two rooftop collectors for heating domestic water.
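The reduction step described here, collapsing 15-second instantaneous readings into hourly averages, is straightforward to reproduce with modern tools. A minimal sketch using pandas; the channel name and synthetic data are assumptions for illustration:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for one channel of 15-second instantaneous readings
index = pd.date_range("1984-01-01", periods=4 * 60 * 24, freq="15s")  # one day
channel = pd.Series(np.random.default_rng(0).normal(20.0, 2.0, len(index)),
                    index=index, name="collector_temp_C")

# Average the 15-second samples into the hourly values kept in the archive
hourly = channel.resample("1h").mean()
print(hourly.head())
```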
PDF-ECG in clinical practice: A model for long-term preservation of digital 12-lead ECG data.
Sassi, Roberto; Bond, Raymond R; Cairns, Andrew; Finlay, Dewar D; Guldenring, Daniel; Libretti, Guido; Isola, Lamberto; Vaglio, Martino; Poeta, Roberto; Campana, Marco; Cuccia, Claudio; Badilini, Fabio
In clinical practice, archiving of resting 12-lead electrocardiograms (ECGs) is mainly achieved by storing a PDF report in the hospital electronic health record (EHR). When available, digital ECG source data (raw samples) are retained only within the ECG management system. Widespread availability of the ECG source data would undoubtedly permit subsequent analysis and facilitate longitudinal studies, with both scientific and diagnostic benefits. PDF-ECG is a hybrid archival format that stores in the same file both the standard graphical report of an ECG and its source data (waveforms). Using PDF-ECG as a model to address the challenges of ECG data portability, long-term archiving, and documentation, a real-world proof-of-concept test was conducted in a northern Italy hospital. A set of volunteers underwent a basic ECG using routine hospital equipment, and the source data were captured. Using dedicated web services, PDF-ECG documents were then generated and seamlessly uploaded into the hospital EHR, replacing the standard PDF reports automatically generated at the time of acquisition. Finally, the PDF-ECG files could be successfully retrieved and re-analyzed. Adding PDF-ECG to an existing EHR had a minimal impact on the hospital's workflow while preserving the ECG digital data.
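The hybrid report-plus-waveform idea can be approximated with any PDF library that supports embedded file attachments. A minimal sketch assuming pypdf and an XML file of raw waveform samples; the file names are placeholders, and the actual PDF-ECG specification defines its own embedding details:

```python
from pypdf import PdfReader, PdfWriter

# Start from the ordinary graphical ECG report already produced at acquisition
writer = PdfWriter()
for page in PdfReader("ecg_report.pdf").pages:
    writer.add_page(page)

# Embed the raw waveform samples so one file carries both report and source data
with open("ecg_waveforms.xml", "rb") as f:
    writer.add_attachment("ecg_waveforms.xml", f.read())

with open("pdf_ecg.pdf", "wb") as f:
    writer.write(f)
```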
Kanungo, Jyotshnabala; Lantz, Susan; Paule, Merle G
2011-01-01
We describe an imaging procedure to measure axon length in zebrafish embryos in vivo. Automated fluorescent image acquisition was performed with the ImageXpress Micro high-content screening reader, and further analysis of axon lengths was performed on archived images using AcuityXpress software. We utilized the Neurite Outgrowth Application module with a customized protocol (journal) to measure the axons. Since high doses of ethanol (2-2.5%, v/v) have been shown to deform motor neurons and axons during development, we used ethanol to treat transgenic [hb9:GFP (green fluorescent protein)] zebrafish embryos at 28 hpf (hours post-fertilization). These embryos express GFP in the motor neurons and their axons. After ethanol treatment, embryos were arrayed in 384-well plates for automated fluorescent image acquisition in vivo. Average axon lengths of embryos treated with high-dose ethanol were significantly lower than those of controls. Another experiment showed no significant difference in axon lengths between embryos grown for 24 h at 22°C and at 28.5°C. These test experiments demonstrate that, using axon development as an end-point, compound screening can be performed in a time-efficient manner.
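Axon length in a fluorescence image is commonly estimated by thresholding the GFP signal and measuring the resulting skeleton. The authors used the proprietary Neurite Outgrowth module; the sketch below is an open-source analogue with scikit-image, an assumption for illustration rather than their pipeline:

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def axon_length_pixels(gfp: np.ndarray) -> int:
    """Rough axon length: pixels in the skeleton of the thresholded GFP mask."""
    mask = gfp > threshold_otsu(gfp)     # separate axon signal from background
    return int(skeletonize(mask).sum())  # 1-pixel-wide centerline length

# Example with a synthetic image standing in for an archived well image
rng = np.random.default_rng(1)
image = rng.random((512, 512)).astype(np.float32)
image[250:260, 50:450] += 2.0            # a bright horizontal "axon"
print(axon_length_pixels(image))
```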
NASA Astrophysics Data System (ADS)
Detrick, R. S.; Clark, D.; Gaylord, A.; Goldsmith, R.; Helly, J.; Lemmond, P.; Lerner, S.; Maffei, A.; Miller, S. P.; Norton, C.; Walden, B.
2005-12-01
The Scripps Institution of Oceanography (SIO) and the Woods Hole Oceanographic Institution (WHOI) have joined forces with the San Diego Supercomputer Center to build a testbed for multi-institutional archiving of shipboard and deep submergence vehicle data. Support has been provided by the Digital Archiving and Preservation program funded by NSF/CISE and the Library of Congress. In addition to the more than 92,000 objects stored in the SIOExplorer Digital Library, the testbed will provide access to data, photographs, video images, and documents from WHOI ships, Alvin submersible and Jason ROV dives, and deep-towed vehicle surveys. An interactive digital library interface will allow combinations of distributed collections to be browsed, metadata inspected, and objects displayed or selected for download. The digital library architecture and the search and display tools of the SIOExplorer project are being combined with WHOI tools, such as the Alvin Framegrabber and the Jason Virtual Control Van, that have been designed using WHOI's GeoBrowser to handle the vast volumes of digital video and camera data generated by Alvin, Jason, and other deep submergence vehicles. Notions of scalability will be tested as data volumes range from 3 CDs per cruise to 200 DVDs per cruise. Much of the scalability of this proposal comes from the ability to attach digital library data and metadata acquisition processes to diverse sensor systems. We are able to run an entire digital library from a laptop computer as well as from supercomputer-center-size resources. It can be used in the field, laboratory, or classroom, covering data from acquisition to archive using a single coherent methodology. The design is an open architecture, supporting applications through well-defined external interfaces maintained as an open-source effort for community inclusion and enhancement.
Life Sciences Data Archive (LSDA) in the Post-Shuttle Era
NASA Technical Reports Server (NTRS)
Fitts, Mary A.; Johnson-Throop, Kathy; Havelka, Jacque; Thomas, Diedre
2009-01-01
Now, more than ever before, NASA is realizing the value and importance of its intellectual assets. Principles of knowledge management, the systematic use and reuse of information, experience, and expertise to achieve a specific goal, are being applied throughout the agency. LSDA is also applying these solutions, which rely on a combination of content and collaboration technologies, to enable research teams to create, capture, share, and harness knowledge to do the things they do well, even better. In the early days of spaceflight, space life sciences data were collected and stored in numerous databases, formats, media types, and geographical locations. These data were largely unknown or unavailable to the research community. The Biomedical Informatics and Health Care Systems Branch of the Space Life Sciences Directorate at JSC and the Data Archive Project at ARC, with funding from the Human Research Program through the Exploration Medical Capability Element, are fulfilling these requirements through the systematic population of the Life Sciences Data Archive. This project constitutes a formal system for the acquisition, archival, and distribution of data from HRP-related experiments and investigations. The general goal of the archive is to acquire, preserve, and distribute these data and to be responsive to inquiries from the science communities.
Research Resources for the Study of African-American and Jewish Relations.
ERIC Educational Resources Information Center
Gubert, Betty Kaplan
1994-01-01
Discusses New York City library resources for the study of African American and Jewish American relations. Highlights include library collections, access to materials, audio and visual materials, international newspapers, clippings, archives, children's books, and acquisitions. A list of the major libraries for the study of African American and…
Milne "en Masse": A Case Study in Digitizing Large Image Collections
ERIC Educational Resources Information Center
Harkema, Craig; Avery, Cheryl
2015-01-01
In December 2012, the University of Saskatchewan Library's University Archives and Special Collections acquired the complete image collection of Courtney Milne, a professional photographer whose work encompassed documentary, abstract, and fine art photographs. From acquisition to digital curation, the authors identify, outline, and discuss the…
Lagone, Elizabeth; Mathur, Sanyukta; Nakyanjo, Neema; Nalugoda, Fred; Santelli, John
2014-01-01
Uganda is recognised as an early success story in the HIV epidemic, at least in part due to an open and vigorous national dialogue about HIV prevention. This study examined the national discourse about HIV, AIDS, and young people in New Vision, Uganda's leading national newspaper, between 1996 and 2011, building from a previous archival analysis of New Vision reporting by Kirby (1986-1995). We examined the continuing evolution of the public discourse in Uganda, focusing on reporting about young people. An increase in reporting on HIV and AIDS occurred after 2003, as antiretroviral treatment was becoming available. While the emphasis in newspaper reporting about adults and the population at large evolved to reflect the development of new HIV treatment and prevention methods, the majority of the articles focused on young people did not change. Articles about young people continued to emphasise HIV acquisition due to early and premarital sexual activity and the need for social support services for children affected by HIV and AIDS. Articles often did not report on the complex social conditions that shape HIV-related risk among young people, or address young people who are sexually active, married, and/or HIV infected. With HIV prevalence now increasing among young people and adults in Uganda, greater attention to HIV prevention is needed.
FBIS: A regional DNA barcode archival & analysis system for Indian fishes
Nagpure, Naresh Sahebrao; Rashid, Iliyas; Pathak, Ajey Kumar; Singh, Mahender; Singh, Shri Prakash; Sarkar, Uttam Kumar
2012-01-01
DNA barcoding is a new tool for taxon recognition and classification of biological organisms based on the sequence of a fragment of the mitochondrial gene cytochrome c oxidase I (COI). In view of the growing importance of fish DNA barcoding for species identification, molecular taxonomy, and fish diversity conservation, we developed a Fish Barcode Information System (FBIS) for Indian fishes, which will serve as a regional DNA barcode archival and analysis system. The database presently contains 2334 sequence records of the COI gene for 472 aquatic species belonging to 39 orders and 136 families, collected from available published data sources. Additionally, it contains information on the phenotype, distribution, and IUCN Red List status of fishes. The web version of FBIS was designed using MySQL, Perl, and PHP on a Linux platform to (a) store and manage acquisitions, (b) analyze and explore DNA barcode records, and (c) identify species and estimate genetic divergence. FBIS has also been integrated with appropriate tools for retrieving and viewing information about database statistics and taxonomy. It is expected that FBIS will be useful as a potent information system in fish molecular taxonomy, phylogeny, and genomics. Availability: the database is freely available at http://mail.nbfgr.res.in/fbis/ PMID:22715304
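The core of such an archive is a table keyed by species and COI sequence. A minimal sketch of the idea using Python's built-in sqlite3; FBIS itself is MySQL/Perl/PHP, and the schema below is an illustrative assumption, not the actual FBIS schema:

```python
import sqlite3

conn = sqlite3.connect("fbis_demo.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS barcode (
    accession   TEXT PRIMARY KEY,   -- source sequence record ID
    species     TEXT NOT NULL,
    taxon_order TEXT,
    family      TEXT,
    iucn_status TEXT,               -- IUCN Red List category
    coi_seq     TEXT NOT NULL       -- COI fragment, A/C/G/T
);
CREATE INDEX IF NOT EXISTS idx_species ON barcode (species);
""")
conn.execute(
    "INSERT OR REPLACE INTO barcode VALUES (?, ?, ?, ?, ?, ?)",
    ("DEMO0001", "Labeo rohita", "Cypriniformes", "Cyprinidae", "LC", "ACGTACGT"),
)
conn.commit()

# Species lookup by exact name; real systems match query sequences instead
for row in conn.execute("SELECT accession, family FROM barcode WHERE species = ?",
                        ("Labeo rohita",)):
    print(row)
```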
EOS Data Products Latency and Reprocessing Evaluation
NASA Astrophysics Data System (ADS)
Ramapriyan, H. K.; Wanchoo, L.
2012-12-01
NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) program has been processing, archiving, and distributing EOS data since the launch of the Terra platform in 1999. The EOSDIS Distributed Active Archive Centers (DAACs) and Science-Investigator-led Processing Systems (SIPSs) are generating over 5000 unique products with a daily average volume of 1.7 petabytes. Initially, EOSDIS had requirements to process data products within 24 hours of receiving all inputs needed to generate them. Thus, latencies would generally be slightly over 24 and 48 hours after satellite data acquisition for Level 1 and Level 2 products, respectively. Due to budgetary constraints these requirements were relaxed, the requirement becoming to avoid a growing backlog of unprocessed data. However, the data providers have been generating these products in as timely a manner as possible, helped considerably by the falling cost of computing hardware. It is of interest to analyze the actual latencies achieved over the past several years in processing and inserting the data products into the EOSDIS archives for users, in support of scientific studies in land processes, oceanography, hydrology, atmospheric science, cryospheric science, etc. The instrument science teams have continuously evaluated the data products since the launches of the EOS satellites and improved the science algorithms to provide high-quality products. Data providers have periodically reprocessed the previously acquired data with these improved algorithms. Reprocessing campaigns run for an extended period in parallel with forward processing, since all data from the beginning of the mission need to be reprocessed, and each reprocessing activity involves more data than the previous one. The historical record of reprocessing times would be of interest to future missions, especially those involving large data volumes and/or computational loads due to algorithm complexity. Evaluation of latency and reprocessing times requires some of the product metadata, such as the beginning and ending times of data acquisition, the processing date, and the version number. This information for each product is made available by data providers to the ESDIS Metrics System (EMS), which replaced the earlier ESDIS Data Gathering and Reporting System (EDGRS) in FY2005 and has since collected information about data product ingest, archive, and distribution. The analysis of latencies and reprocessing times will provide insight into the data provider process and identify potential areas of weakness in providing timely data to the user community. Delays may be caused by events such as system unavailability, disk failures, delays in Level 0 data delivery, availability of input data, network problems, and power failures. Analysis of the metrics will highlight areas for focused examination of root causes of delays. The purposes of this study are to: 1) perform a detailed analysis of the latency of selected instrument products over the last 6 years; 2) analyze the reprocessed data from various data providers to determine the times taken for reprocessing campaigns; and 3) identify potential reasons for any anomalies in these metrics.
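Given per-granule metadata of the kind EMS collects, product latency is simply the archive-insert time minus the acquisition-end time. A minimal sketch with pandas; the column names and values are assumptions for illustration, not the EMS schema:

```python
import pandas as pd

# Illustrative granule metadata: acquisition end and archive insert times
df = pd.DataFrame({
    "product": ["MOD021KM", "MOD021KM", "MYD04_L2"],
    "acq_end": pd.to_datetime(["2011-06-01 00:05", "2011-06-01 00:10",
                               "2011-06-01 00:05"]),
    "insert_time": pd.to_datetime(["2011-06-02 03:00", "2011-06-02 02:40",
                                   "2011-06-03 01:15"]),
})
df["latency_hours"] = (df["insert_time"] - df["acq_end"]).dt.total_seconds() / 3600

# Per-product latency statistics of the sort used to spot anomalies
print(df.groupby("product")["latency_hours"].describe())
```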
NASA Astrophysics Data System (ADS)
Barillere, R.; Cabel, H.; Chan, B.; Goulas, I.; Le Goff, J. M.; Vinot, L.; Willmott, C.; Milcent, H.; Huuskonen, P.
1994-12-01
The Cortex control information system framework is being developed at CERN. It offers basic functions for sharing information, control, and analysis functions; it presents a uniform human interface to such information and functions; it permits upgrades and additions without code modification; and it is sufficiently generic to allow its use by most existing or future control systems at CERN. Services will include standard interfaces to user-supplied functions, analysis, archiving, and event management. Cortex does not attempt to carry out direct data acquisition or control of devices; these activities are highly specific to the application and are best done by commercial systems or user-written programs. Instead, Cortex integrates these application-specific pieces and supports them by supplying other commonly needed facilities such as collaboration, analysis, diagnosis, and user assistance.
Advancing data management and analysis in different scientific disciplines
NASA Astrophysics Data System (ADS)
Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.
2017-10-01
Over the past several years, rapid growth of data has affected many fields of science. This has often resulted in the need for overhauling or exchanging the tools and approaches in the disciplines’ data life cycles. However, this allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from the fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases even all stages of the data life cycle, i.e. from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.
An outbreak of respiratory tularemia caused by diverse clones of Francisella tularensis.
Johansson, Anders; Lärkeryd, Adrian; Widerström, Micael; Mörtberg, Sara; Myrtännäs, Kerstin; Ohrman, Caroline; Birdsell, Dawn; Keim, Paul; Wagner, David M; Forsman, Mats; Larsson, Pär
2014-12-01
The bacterium Francisella tularensis is recognized for its virulence, infectivity, genetic homogeneity, and potential as a bioterrorism agent. Outbreaks of respiratory tularemia, caused by inhalation of this bacterium, are poorly understood; such outbreaks are exceedingly rare, and F. tularensis is seldom recovered from clinical specimens. A localized outbreak of tularemia in Sweden was investigated. Sixty-seven humans contracted laboratory-verified respiratory tularemia. F. tularensis subspecies holarctica was isolated from the blood or pleural fluid of 10 individuals from July to September 2010. Using whole-genome sequencing and analysis of single-nucleotide polymorphisms (SNPs), outbreak isolates were compared with 110 archived global isolates. There were 757 SNPs among the genomes of the 10 outbreak isolates and the 25 most closely related archival isolates (all from Sweden/Finland). Whole genomes of outbreak isolates were >99.9% similar at the nucleotide level and clustered into 3 distinct genetic clades. Unexpectedly, high sequence similarity grouped some outbreak and archival isolates that originated from patients from different geographic regions and up to 10 years apart. Outbreak and archival genomes frequently differed by only 1-3 of the 1,585,229 examined nucleotides. The outbreak was caused by diverse clones of F. tularensis that occurred concomitantly, were widespread, and apparently persisted in the environment. Multiple independent acquisitions of F. tularensis from the environment over a short time period suggest that natural outbreaks of respiratory tularemia are triggered by environmental cues. The findings additionally caution against interpreting genome sequence identity for this pathogen as proof of a direct epidemiological link.
The Cambridge Structural Database: a quarter of a million crystal structures and rising.
Allen, Frank H
2002-06-01
The Cambridge Structural Database (CSD) now contains data for more than a quarter of a million small-molecule crystal structures. The information content of the CSD, together with methods for data acquisition, processing, and validation, is summarized, with particular emphasis on the chemical information added by CSD editors. Nearly 80% of new structural data arrive electronically, mostly in CIF format, and the CCDC acts as the official crystal structure data depository for 51 major journals. The CCDC now maintains both a CIF archive (more than 73,000 CIFs dating from 1996) and the distributed binary CSD archive; the availability of data in both archives is discussed. A statistical survey of the CSD is also presented, and projections concerning future accession rates indicate that the CSD will contain at least 500,000 crystal structures by the year 2010.
Radvany, M G; Chacko, A K; Richardson, R R; Grazdan, G W
1999-05-01
In a time of decreasing resources, managers need a tool to manage their resources effectively, support clinical requirements, and replace aging equipment in order to ensure adequate clinical care. To do this successfully, one must be able to perform technology assessment and capital equipment asset management. The lack of a commercial system that adequately performed technology needs assessment and addressed the unique needs of the military led to the development of an in-house Technology Assessment and Requirements Analysis (TARA) program. The TARA is a tool that provides an unbiased review of clinical operations and the resulting capital equipment requirements for military hospitals. The TARA report allows for the development of acquisition strategies for new equipment, enhances personnel management, and improves and streamlines clinical operations and processes.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The papers included in this document cover satellite tracking and ground-based navigation, spacecraft-ground communications, and optical communication systems for the Deep Space Network.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1995-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC).
ACE: A distributed system to manage large data archives
NASA Technical Reports Server (NTRS)
Daily, Mike I.; Allen, Frank W.
1993-01-01
Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must account for the significant numbers of large, complex data objects that the industry generates. The life cycle of these data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. To implement a cost-effective system that provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements of all four phases. A set of technical issues, which may differ in each phase, must be addressed to ensure an overall successful development strategy. These issues include standardized data formats and media for data acquisition; data management during processing; plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution that can give Mobil a competitive edge as well as a viable technical solution.
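The four-phase life cycle is naturally modeled as a small state machine whose transitions gate data-management actions. A minimal sketch of that framing; the class and transition names are assumptions for illustration, not part of ACE:

```python
from enum import Enum, auto

class Phase(Enum):
    ACQUISITION = auto()
    PROCESSING = auto()
    INTERPRETATION = auto()
    ARCHIVAL = auto()

# Legal forward progression of a data object through its life cycle
TRANSITIONS = {
    Phase.ACQUISITION: Phase.PROCESSING,
    Phase.PROCESSING: Phase.INTERPRETATION,
    Phase.INTERPRETATION: Phase.ARCHIVAL,
}

class DataObject:
    def __init__(self, name: str):
        self.name = name
        self.phase = Phase.ACQUISITION

    def advance(self) -> None:
        if self.phase not in TRANSITIONS:
            raise ValueError(f"{self.name} is already archived")
        self.phase = TRANSITIONS[self.phase]

survey = DataObject("seismic-survey-042")  # hypothetical data object
survey.advance()            # ACQUISITION -> PROCESSING
print(survey.phase.name)    # PROCESSING
```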
Towne, Tyler J; Boot, Walter R; Ericsson, K Anders
2016-09-01
In this paper we describe a novel approach to the study of individual differences in acquired skilled performance in complex laboratory tasks, based on an extension of the methodology of the expert-performance approach (Ericsson & Smith, 1991) to shorter periods of training and practice. In contrast to more traditional approaches that study the average performance of groups of participants, we explored detailed behavioral changes for individual participants across their development on the Space Fortress game. We focused on dramatic individual differences in learning and skill acquisition at the individual level by analyzing the archival game data of several interesting players to uncover the specific structure of their acquired skill. Our analysis revealed that even after maximal values for game-generated subscores were reached, the most skilled participant's behaviors, such as his flight path, missile firing, and mine handling, continued to be refined and improved (Participant 17 from Boot et al., 2010). We contrasted this participant's behavior with that of several other participants and found striking differences in the structure of their performance, which calls into question the appropriateness of averaging their data. For example, some participants engaged in different control strategies, such as "world wrapping" or maintaining a finely tuned circular flight path around the fortress (in contrast to Participant 17's angular flight path). In light of these differences, we raise fundamental questions about how skill acquisition for individual participants should be studied and described. Our data suggest that a detailed analysis of individuals' data is an essential step toward a general theory of skill acquisition that explains improvement at both the group and individual levels.
NASA Technical Reports Server (NTRS)
Carsey, F. D.; Weeks, W.
1988-01-01
The Alaska SAR Facility (ASF) program for the acquisition and processing of data from the ESA ERS-1, the NASDA ERS-1, and Radarsat, and for carrying out a program of science investigations using the data, is introduced. Agreements for data acquisition and analysis are in place, except for the agreement between NASA and Radarsat, which is in negotiation. The ASF baseline system, consisting of the Receiving Ground System, the SAR Processor System, and the Archive and Operations System, passed critical design review and is fully in the implementation phase. Augmentations to the baseline system, for geophysical processing and for processing of J-ERS-1 optical data, are in the design and implementation phase. The ASF provides a very effective vehicle with which to prepare for the Earth Observing System (EOS), in that it will aid the development of systems and technologies for handling the data volumes produced by the systems of the next decades, and it will also supply some of the data types that will be produced by EOS.
Multimedia content analysis and indexing: evaluation of a distributed and scalable architecture
NASA Astrophysics Data System (ADS)
Mandviwala, Hasnain; Blackwell, Scott; Weikart, Chris; Van Thong, Jean-Manuel
2003-11-01
Multimedia search engines facilitate the retrieval of documents from large media content archives now available via intranets and the Internet. Over the past several years, many research projects have focused on algorithms for analyzing and indexing media content efficiently. However, special system architectures are required to process large amounts of content from real-time feeds or existing archives. Possible solutions include dedicated distributed architectures for analyzing content rapidly and for making it searchable. The system architecture we propose implements such an approach: a highly distributed and reconfigurable batch media content analyzer that can process media streams and static media repositories. Our distributed media analysis application handles media acquisition, content processing, and document indexing. This collection of modules is orchestrated by a task flow management component, exploiting data and pipeline parallelism in the application. A scheduler manages load balancing and prioritizes the different tasks. Workers implement application-specific modules that can be deployed on an arbitrary number of nodes running different operating systems. Each application module is exposed as a web service, implemented with industry-standard interoperable middleware components such as Microsoft ASP.NET and Sun J2EE. Our system architecture is the next generation system for the multimedia indexing application demonstrated by www.speechbot.com. It can process large volumes of audio recordings with minimal support and maintenance, while running on low-cost commodity hardware. The system has been evaluated on a server farm running concurrent content analysis processes.
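The task-flow component described here, a scheduler dispatching prioritized analysis tasks to workers to exploit pipeline parallelism, can be illustrated with Python's standard library. This is a toy sketch with assumed stage names; the actual system deployed web-service workers across distributed nodes:

```python
import queue
import threading

# Lower number = higher priority, as with the scheduler described above
tasks: "queue.PriorityQueue[tuple[int, str, str]]" = queue.PriorityQueue()
tasks.put((0, "acquire", "news_feed_001.mp3"))
tasks.put((2, "index", "lecture_017.mp3"))
tasks.put((1, "transcribe", "news_feed_001.mp3"))

def worker(worker_id: int) -> None:
    """Drain prioritized tasks; real workers would run analysis modules."""
    while True:
        try:
            priority, stage, item = tasks.get(timeout=1)
        except queue.Empty:
            return
        print(f"worker {worker_id}: {stage} {item} (priority {priority})")
        tasks.task_done()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```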
Geodetic VLBI observations at Simeiz station
NASA Astrophysics Data System (ADS)
Volvach, A.; Petrov, L.; Nesterov, N.
Very long baseline interferometry (VLBI) observations under international geodetic programs have been carried out at Simeiz station since June 1994. The 22-m radio telescope is equipped with dual-band S/X receivers, a CH-70 hydrogen maser, and a Mark-IIIA data acquisition terminal. Observations are conducted in 24-hour sessions scheduled 6-15 times per year. The observational programs are part of a common effort for the maintenance of the terrestrial reference frame and the celestial reference frame and for monitoring Earth orientation parameters, carried out by the international community under the auspices of the International VLBI Service (IVS). Data are recorded on magnetic tapes, which are shipped to correlator centers for correlation and fringing. Fringed data are archived and are freely available via the Internet for scientific analysis 1-2 months after the observations.
NASA Technical Reports Server (NTRS)
Noll, Carey E.; Torrence, Mark H.; Pollack, Nathan H.; Tyahla, Lori J.
2013-01-01
The ILRS website, http://ilrs.gsfc.nasa.gov, is the central source of information for all aspects of the service. The website provides information on the organization and operation of the ILRS and descriptions of ILRS components, data, and products. Furthermore, the website provides an entry point to the archive of these data products, available through the data centers. Links are provided to extensive information on the ILRS network stations, including performance assessments and data quality evaluations. Descriptions of supported satellite missions (current, future, and past) are provided to aid in station acquisition and data analysis. The website was recently redesigned. Content was reviewed during the update process, ensuring information is current and useful. This poster will provide specific examples of key sections, applications, and webpages.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.
ERIC Educational Resources Information Center
Hendriks, Klaus B.
Intended for use by archivists, curators, and others responsible for the acquisition and preservation of documentary materials in photographic form, this publication describes the nature of photographic media and recommended conservation measures. It is noted that the major emphasis is on black-and-white photographic materials, with some…
Web-based data delivery services in support of disaster-relief applications
Jones, Brenda K.; Risty, Ron R.; Buswell, M.
2003-01-01
The U.S. Geological Survey Earth Resources Observation Systems Data Center responds to emergencies in support of various government agencies for human-induced and natural disasters. This response consists of satellite tasking and acquisitions, satellite image registration, disaster-extent map analysis and creation, base image provision and support, Web-based mapping services for product delivery, and predisaster and postdisaster data archiving. The emergency response staff are on call 24 hours a day, 7 days a week, and have access to many commercial and government satellite and aerial photography tasking authorities. They have access to value-added data processing and photographic laboratory services for off-hour emergency requests. They work with various Federal agencies for preparedness planning, which includes providing base imagery. These data may include digital elevation models, hydrographic models, base satellite images, vector data layers such as roads, aerial photographs, and other predisaster data. These layers are incorporated into a Web-based browser and data delivery service that is accessible either to the general public or to select customers. As usage declines, the data are moved to a postdisaster nearline archive that is still accessible, but not in real time.
NASA Astrophysics Data System (ADS)
Georgiou, Mike F.; Sfakianakis, George N.; Johnson, Gary; Douligeris, Christos; Scandar, Silvia; Eisler, E.; Binkley, B.
1994-05-01
In an effort to improve patient care while considering cost-effectiveness, we developed a Picture Archiving and Communication System (PACS) that combines imaging cameras, computers, and other peripheral equipment from multiple nuclear medicine vendors. The PACS provides fully digital clinical operation, which includes acquisition and automatic organization of patient data, distribution of the data to all networked units inside the department and other remote locations, digital analysis and quantitation of images, digital diagnostic reading of image studies, and permanent data archival with the ability for fast retrieval. The PACS enabled us to significantly reduce the amount of film used, and we are currently proceeding with implementing a filmless laboratory. Hard copies are produced on paper or transparent sheets for parts of the hospital without digital connections. The PACS provides fully digital operation that is faster, more reliable, better organized and managed, and overall more efficient than a conventional film-based operation. In this paper, the integration of the various PACS components from multiple vendors is reviewed, and the impact of PACS, with its advantages and limitations, on our clinical operation is analyzed.
Analog-to-digital clinical data collection on networked workstations with graphic user interface.
Lunt, D
1991-02-01
An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Data Acquisition (TDA) Office. In the Search for Extraterrestrial Intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. In solar system radar, it reports on the uses of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets. In radio astronomy, the areas of support include spectroscopy, very long baseline interferometry, and astrometry. These three programs are performed for NASA's Office of Space Science and Applications (OSSA), with the Office of Space Operations funding DSN operational support.
NASA Astrophysics Data System (ADS)
Cusma, Jack T.; Spero, Laurence A.; Groshong, Bennett R.; Cho, Teddy; Bashore, Thomas M.
1993-09-01
An economical and practical digital solution for the replacement of 35 mm cine film as the archive media in the cardiac x-ray imaging environment has remained lacking to date due to the demanding requirements of high capacity, high acquisition rate, high transfer rate, and a need for application in a distributed environment. A clinical digital image library and network based on the D2 digital video format has been installed in the Duke University Cardiac Catheterization Laboratory. The system architecture includes a central image library with digital video recorders and robotic tape retrieval, three acquisition stations, and remote review stations connected via a serial image network. The library has a capacity for over 20,000 Gigabytes of uncompressed image data, equivalent to records for approximately 20,000 patients. Image acquisition in the clinical laboratories is via a real-time digital interface between the digital angiography system and a local digital recorder. Images are transferred to the library over the serial network at a rate of 14.3 Mbytes/sec and permanently stored for later review. The image library and network are currently undergoing a clinical comparison with cine film for visual and quantitative assessment of coronary artery disease. At the conclusion of the evaluation, the configuration will be expanded to include four additional catheterization laboratories and remote review stations throughout the hospital.
The challenges of archiving networked-based multimedia performances (Performance cryogenics)
NASA Astrophysics Data System (ADS)
Cohen, Elizabeth; Cooperstock, Jeremy; Kyriakakis, Chris
2002-11-01
Music archives and libraries have cultural preservation at the core of their charters. New forms of art often race ahead of the preservation infrastructure. The ability to stream multiple synchronized, ultra-low-latency streams of audio and video across a continent for a distributed interactive performance, such as music and dance with high-definition video and multichannel audio, raises a series of challenges for the architects of digital libraries and those responsible for cultural preservation. The archiving of such performances presents numerous challenges that go beyond simply recording each stream. Case studies of storage and subsequent retrieval issues for Internet2 collaborative performances are discussed. The development of shared reality and immersive environments raises questions such as: What constitutes an archived performance that occurs across a network (in multiple spaces over time)? What are the families of metadata necessary to reconstruct this virtual world in another venue or era? For example, if the network exhibited changes in latency, the performers most likely adapted; in a future recreation, the latency will most likely be completely different. We discuss the parameters of immersive environment acquisition and rendering, network architectures, software architecture, musical/choreographic scores, and environmental acoustics that must be considered to address this problem.
Acquisition plan for Digital Document Storage (DDS) prototype system
NASA Technical Reports Server (NTRS)
1990-01-01
NASA Headquarters maintains a continuing interest in and commitment to exploring the use of new technology to support productivity improvements in meeting service requirements tasked to the NASA Scientific and Technical Information (STI) Facility, and to support cost effective approaches to the development and delivery of enhanced levels of service provided by the STI Facility. The DDS project has been pursued with this interest and commitment in mind. It is believed that DDS will provide improved archival blowback quality and service for ad hoc requests for paper copies of documents archived and serviced centrally at the STI Facility. It will also develop an operating capability to scan, digitize, store, and reproduce paper copies of 5000 NASA technical reports archived annually at the STI Facility and serviced to the user community. Additionally, it will provide NASA Headquarters and field installations with on-demand, remote, electronic retrieval of digitized, bilevel, bit mapped report images along with branched, nonsequential retrieval of report subparts.
Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Bodden, Lee; Pease, Phil; Bedet, Jean-Jacques; Rosen, Wayne
1993-01-01
The Goddard Space Flight Center Version 0 Distributed Active Archive Center (GSFC V0 DAAC) is being developed to enhance and improve scientific research and productivity by consolidating access to remote sensor earth science data in the pre-EOS time frame. In cooperation with scientists from the science labs at GSFC, other NASA facilities, universities, and other government agencies, the DAAC will support data acquisition, validation, archive and distribution. The DAAC is being developed in response to EOSDIS Project Functional Requirements as well as from requirements originating from individual science projects such as SeaWiFS, Meteor3/TOMS2, AVHRR Pathfinder, TOVS Pathfinder, and UARS. The GSFC V0 DAAC has begun operational support for the AVHRR Pathfinder (as of April, 1993), TOVS Pathfinder (as of July, 1993) and the UARS (September, 1993) Projects, and is preparing to provide operational support for SeaWiFS (August, 1994) data. The GSFC V0 DAAC has also incorporated the existing data, services, and functionality of the DAAC/Climate, DAAC/Land, and the Coastal Zone Color Scanner (CZCS) Systems.
Immunity: Insect Immune Memory Goes Viral.
Ligoxygakis, Petros
2017-11-20
Adaptive memory in insect immunity has been controversial. In this issue, Andino and co-workers propose that acquisition of viral sequences into the host genome gives rise to anti-sense, anti-viral piRNAs. Such sequences can be regarded both as a genomic archive of past infections and as an armour of potentially heritable memory.
Landsat Data Continuity Mission
2012-01-01
The Landsat Data Continuity Mission (LDCM) is a partnership formed between the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) to place the next Landsat satellite in orbit in January 2013. The Landsat era that began in 1972 will become a nearly 41-year global land record with the successful launch and operation of the LDCM. The LDCM will continue the acquisition, archiving, and distribution of multispectral imagery affording global, synoptic, and repetitive coverage of the Earth's land surfaces at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. The mission objectives of the LDCM are to (1) collect and archive medium-resolution (30-meter spatial resolution) multispectral image data affording seasonal coverage of the global landmasses for a period of no less than 5 years; (2) ensure that LDCM data are sufficiently consistent with data from the earlier Landsat missions in terms of acquisition geometry, calibration, coverage characteristics, spectral characteristics, output product quality, and data availability to permit studies of land-cover and land-use change over time; and (3) distribute LDCM data products to the general public on a nondiscriminatory basis at no cost to the user.
Scalable Data Mining and Archiving for the Square Kilometre Array
NASA Astrophysics Data System (ADS)
Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.
2011-12-01
As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks, like cataloging both the data itself and contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research, present significant software engineering challenges when considered at the scale of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges in a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO) and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata, including temporal bounds and job-relevant processing information. A remote content acquisition (pushpull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as continuum imaging and spectral line cube generation, executes the algorithm, and ingests its output (along with relevant extracted metadata). In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7 and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open source software, represent a concerted effort to seek a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, while ensuring that the data, with its richly annotated catalog of metadata, remains a viable resource for research into the future.
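The crawler pattern described above, walking a staging directory, sniffing each file's type, and extracting metadata before ingest, is easy to sketch. The following is an illustrative Python analogue; Apache OODT itself is Java, and the metadata fields and staging path are assumptions for the example:

```python
import mimetypes
from datetime import datetime, timezone
from pathlib import Path

def extract_metadata(path: Path) -> dict:
    """Minimal per-file metadata of the kind a crawler hands to the archive."""
    mime, _ = mimetypes.guess_type(path.name)
    stat = path.stat()
    return {
        "name": path.name,
        "mime_type": mime or "application/octet-stream",
        "size_bytes": stat.st_size,
        # Stand-in for temporal bounds; real systems parse instrument headers
        "modified_utc": datetime.fromtimestamp(stat.st_mtime, timezone.utc),
    }

def crawl(staging_dir: str) -> list[dict]:
    """Walk the staging area and collect metadata for every regular file."""
    return [extract_metadata(p) for p in Path(staging_dir).rglob("*") if p.is_file()]

for record in crawl("/data/staging"):  # hypothetical staging area
    print(record)  # a real crawler would ingest this into the file manager
```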
PCB Analysis Plan for Tank Archive Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
NGUYEN, D.M.
2001-03-22
This analysis plan specifies laboratory analysis, quality assurance/quality control (QA/QC), and data reporting requirements for analyzing polychlorinated biphenyls (PCB) concentrations in archive samples. Tank waste archive samples that are planned for PCB analysis are identified in Nguyen 2001. The tanks and samples are summarized in Table 1-1. The analytical data will be used to establish a PCB baseline inventory in Hanford tanks.
Waltermire, Robert G.; Emmerich, Christopher U.; Mendenhall, Laura C.; Bohrer, Gil; Weinzierl, Rolf P.; McGann, Andrew J.; Lineback, Pat K.; Kern, Tim J.; Douglas, David C.
2016-05-03
U.S. Fish and Wildlife Service (USFWS) staff in the Pacific Southwest Region and at the Hopper Mountain National Wildlife Refuge Complex requested technical assistance to improve their global positioning system (GPS) data acquisition, management, and archiving in support of the California Condor Recovery Program. The USFWS deployed and maintained GPS units on individual Gymnogyps californianus (California condor) in support of long-term research and daily operational monitoring and management of California condors. The U.S. Geological Survey (USGS) obtained funding through the Science Support Program to provide coordination among project participants, provide GPS Global System for Mobile Communications (GSM) transmitters for testing, and compare GSM/GPS with existing Argos satellite GPS technology. The USFWS staff worked with private companies to design, develop, and fit condors with GSM/GPS transmitters. The Movebank organization, an online database of animal tracking data, coordinated with each of these companies to automatically stream their GPS data into Movebank servers and coordinated with the USFWS to improve Movebank software for managing transmitter data, including proofing and error checking of incoming GPS data. The USGS arranged to pull raw GPS data from Movebank into the USGS California Condor Management and Analysis Portal (CCMAP) (https://my.usgs.gov/ccmap) for production and dissemination of a daily map of condor movements, including various automated alerts. Further, the USGS developed an automatic archiving system for pulling raw and proofed Movebank data into USGS ScienceBase to comply with the Federal Information Security Management Act of 2002. This improved data management system requires minimal manual intervention, resulting in more efficient data flow from GPS data capture to archive status. As a result of the project's success, the Pinnacles National Park and Ventana Wildlife Society California condor programs became partners and adopted the same workflow, tracking, and data archive system. This GPS tracking data management model and workflow should be applicable and beneficial to other wildlife tracking programs.
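Automated pulls from Movebank of the kind described are typically done against its HTTP interface. A rough sketch with the requests library; the endpoint, parameters, study ID, and credentials below are assumptions for illustration and should be checked against current Movebank API documentation:

```python
import requests

# Hypothetical study ID and credentials; Movebank access is per-study
STUDY_ID = 123456789
AUTH = ("movebank_user", "movebank_password")

resp = requests.get(
    "https://www.movebank.org/movebank/service/direct-read",
    params={"entity_type": "event", "study_id": STUDY_ID},
    auth=AUTH,
    timeout=60,
)
resp.raise_for_status()

# Persist the returned tracking events for downstream proofing and archiving
with open(f"study_{STUDY_ID}_events.csv", "wb") as f:
    f.write(resp.content)
```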
Microbially assisted recording of the Earth's magnetic field in sediment.
Zhao, Xiangyu; Egli, Ramon; Gilder, Stuart A; Müller, Sebastian
2016-02-11
Sediments continuously record variations of the Earth's magnetic field and thus provide an important archive for studying the geodynamo. The recording process occurs as magnetic grains partially align with the geomagnetic field during and after sediment deposition, generating a depositional remanent magnetization (DRM) or post-DRM (PDRM). (P)DRM acquisition mechanisms have been investigated for over 50 years, yet many aspects remain unclear. A key issue concerns the controversial role of bioturbation, that is, the mechanical disturbance of sediment by benthic organisms, during PDRM acquisition. A recent theory on bioturbation-driven PDRM appears to solve many inconsistencies between laboratory experiments and palaeomagnetic records, yet it lacks experimental proof. Here we fill this gap by documenting the important role of bioturbation-induced rotational diffusion for (P)DRM acquisition, including the control exerted on the recorded inclination and intensity, as determined by the equilibrium between aligning and perturbing torques acting on magnetic particles.
Incorporating Trust into Department of Defense Acquisition Risk Management
2014-09-01
Ram K. Deo; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall; Michael J. Falkowski; Warren B. Cohen
2017-01-01
The publicly accessible archive of Landsat imagery and increasing regional-scale LiDAR acquisitions offer an opportunity to periodically estimate aboveground forest biomass (AGB) from 1990 to the present to align with the reporting needs of National Greenhouse Gas Inventories (NGHGIs). This study integrated Landsat time-series data, a state-wide LiDAR dataset, and a recent...
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1990-01-01
Archival reports are given on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA), including space communications, radio navigation, radio science, ground-based radio and radar astronomy, and the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, supporting research and technology, implementation, and operations. Also included is TDA-funded activity at JPL on data and information systems and reimbursable DSN work performed for other space agencies through NASA. In the search for extraterrestrial intelligence (SETI), implementation and operations for searching the microwave spectrum are reported. Use of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets are discussed.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
Archival reports on developments in programs managed by the Jet Propulsion Laboratory's (JPL's) Office of Telecommunications and Data Acquisition (TDA) are published in the TDA Progress Report. In the search for extraterrestrial intelligence (SETI), the TDA Progress Report reports on implementation and operations for searching the microwave spectrum. In solar system radar, it reports on the uses of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets. In radio astronomy, the areas of support include spectroscopy, very long baseline interferometry, and astrometry. These three programs are performed for NASA's Office of Space Science and Applications (OSSA), with the Office of Space Operations funding DSN operational support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christopher Slominski
2009-10-01
Archiving a large fraction of the EPICS signals within the Jefferson Lab (JLAB) Accelerator control system is vital for postmortem and real-time analysis of the accelerator performance. This analysis is performed on a daily basis by scientists, operators, engineers, technicians, and software developers. Archiving poses unique challenges due to the magnitude of the control system. A MySQL Archiving system (Mya) was developed to scale to the needs of the control system; it currently archives 58,000 EPICS variables, updating at a rate of 11,000 events per second. In addition to the large collection rate, retrieval of the archived data must also be fast and robust. Archived data retrieval clients obtain data at a rate over 100,000 data points per second. Managing the data in a relational database provides a number of benefits. This paper describes an archiving solution that uses an open source database and standard off-the-shelf hardware to meet high-performance archiving needs. Mya has been in production at Jefferson Lab since February of 2007.
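As a rough illustration of the monitor-and-insert pattern such an archiver implies (this is not the Mya implementation), the sketch below subscribes to two hypothetical EPICS process variables with pyepics and batch-inserts the events into MySQL once per second; PV names, table layout, and credentials are invented:

```python
# Sketch only: archive EPICS monitor events into a MySQL table.
# Assumes pyepics and mysql-connector-python are installed.
import queue
import time
import epics
import mysql.connector

events = queue.Queue()

def on_update(pvname=None, value=None, timestamp=None, **kw):
    # Channel Access callbacks fire on a background thread; hand the
    # event to the main thread rather than touching the DB here.
    events.put((pvname, timestamp, float(value)))

pvs = [epics.PV(name, callback=on_update)
       for name in ("DEMO:CURRENT", "DEMO:ENERGY")]   # hypothetical PVs

conn = mysql.connector.connect(host="localhost", user="archiver",
                               password="secret", database="mya_demo")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS events "
            "(pv VARCHAR(64), ts DOUBLE, value DOUBLE)")

while True:   # flush whatever arrived, once per second
    rows = []
    while not events.empty():
        rows.append(events.get())
    if rows:
        cur.executemany(
            "INSERT INTO events (pv, ts, value) VALUES (%s, %s, %s)", rows)
        conn.commit()
    time.sleep(1.0)
```

A production archiver at Mya's scale would add connection pooling, per-PV tables or partitioning, and deadband filtering, but the subscribe/batch/commit loop is the core of the pattern.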
Mubemba, B; Thompson, P N; Odendaal, L; Coetzee, P; Venter, E H
2017-05-01
Rift Valley fever (RVF), caused by an arthropod-borne Phlebovirus in the family Bunyaviridae, is a haemorrhagic disease that affects ruminants and humans. Due to the zoonotic nature of the virus, a biosafety level 3 laboratory is required for isolation of the virus. Fresh and frozen samples are the preferred sample type for isolation and acquisition of sequence data. However, these samples are scarce, in addition to posing a health risk to laboratory personnel. Archived formalin-fixed, paraffin-embedded (FFPE) tissue samples are safe and readily available; however, FFPE-derived RNA is in most cases degraded and cross-linked in peptide bonds, and it is unknown whether the sample type would be suitable as reference material for retrospective phylogenetic studies. An RT-PCR assay targeting a 490 nt portion of the structural Gn glycoprotein-encoding gene of the RVFV M-segment was applied to total RNA extracted from archived RVFV-positive FFPE samples. Several attempts to obtain target amplicons were unsuccessful. FFPE samples were then analysed using next generation sequencing (NGS), i.e. TruSeq (Illumina), and sequenced on the MiSeq genome analyser (Illumina). Using reference mapping, gapped virus sequence data of varying degrees of shallow depth was aligned to a reference sequence. However, the NGS did not yield contigs long enough to consistently cover the same genome regions in all samples to allow phylogenetic analysis.
Image standards in tissue-based diagnosis (diagnostic surgical pathology).
Kayser, Klaus; Görtler, Jürgen; Goldmann, Torsten; Vollmer, Ekkehard; Hufnagl, Peter; Kayser, Gian
2008-04-18
Progress in automated image analysis, virtual microscopy, hospital information systems, and interdisciplinary data exchange requires image standards to be applied in tissue-based diagnosis. The aim is to describe the theoretical background, practical experiences, and comparable solutions in other medical fields in order to promote image standards applicable to diagnostic pathology. THEORY AND EXPERIENCES: Images used in tissue-based diagnosis present with pathology-specific characteristics. It seems appropriate to discuss their characteristics and potential standardization in relation to the levels of hierarchy in which they appear. All levels can be divided into legal, medical, and technological properties. Standards applied to the first level include regulations or aims to be fulfilled. In legal properties, they have to regulate features of privacy, image documentation, transmission, and presentation; in medical properties, features of disease-image combination, human diagnostics, automated information extraction, and archive retrieval and access; and in technological properties, features of image acquisition, display, formats, transfer speed, safety, and system dynamics. The next lower, second level has to implement the prescriptions of the upper one, i.e. describe how they are implemented. Legal aspects should demand secure encryption for privacy of all patient-related data; image archives that include all images used for diagnostics for a period of at least 10 years; accurate annotations of dates and viewing; and precise hardware and software information. Medical aspects should demand standardized patients' files such as DICOM 3 or HL7, including history and previous examinations; information on image display hardware and software, on image resolution and fields of view, and on the relation between the sizes of biological objects and image sizes; and access to archives and retrieval. Technological aspects should deal with image acquisition systems (resolution, colour temperature, focus, brightness, and quality evaluation procedures), display resolution data, implemented image formats, storage, cycle frequency, backup procedures, operation system, and external system accessibility. The lowest, third level describes the permitted limits and thresholds in detail. At present, an applicable standard including all mentioned features does not exist to our knowledge; some aspects can be taken from radiological standards (PACS, DICOM 3); others require specific solutions or are not covered yet. The progress in virtual microscopy and the application of artificial intelligence (AI) in tissue-based diagnosis demand fast preparation and implementation of an internationally acceptable standard. The described hierarchic order, as well as analytic investigation of all potentially necessary aspects and details, offers an appropriate tool to specifically determine standardized requirements.
ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures
NASA Astrophysics Data System (ADS)
Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.
2008-08-01
This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to a lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by the SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.
Development of autonomous gamma dose logger for environmental monitoring
NASA Astrophysics Data System (ADS)
Jisha, N. V.; Krishnakumar, D. N.; Surya Prakash, G.; Kumari, Anju; Baskaran, R.; Venkatraman, B.
2012-03-01
Continuous monitoring and archiving of background radiation levels in and around a nuclear installation is essential, and the data would be of immense use during analysis of any untoward incidents. A portable Geiger-Muller detector based autonomous gamma dose logger (AGDL) for environmental monitoring was indigenously designed and developed. The system operations are controlled by a microcontroller (AT89S52), and the main features of the system are software data acquisition, real-time LCD display of the radiation level, and data archiving on a removable compact flash card. The complete system operates on a 12 V battery backed up by a solar panel, and hence the system is totally portable and ideal for field use. The system has been calibrated with a Co-60 source (8.1 MBq) at various source-detector distances. The system was field tested and a performance evaluation was carried out. This paper covers the design considerations of the hardware and the software architecture of the system, along with details of the front-end operation of the autonomous gamma dose logger and the data file formats. The data gathered during field testing and intercomparison with GammaTRACER are also presented in the paper. AGDL has shown excellent correlation with an energy fluence monitor tuned to identify 41Ar, proving its utility for real-time plume tracking and source term estimation.
NASA Astrophysics Data System (ADS)
Saleh, T.; Rico, H.; Solanki, K.; Hauksson, E.; Friberg, P.
2005-12-01
The Southern California Seismic Network (SCSN) handles more than 2500 high-data-rate channels from more than 380 seismic stations distributed across southern California. These data are imported in real time from dataloggers, earthworm hubs, and partner networks. The SCSN also exports data to eight different partner networks. Both the imported and exported data are critical for emergency response and scientific research. Previous data acquisition systems were complex and difficult to operate, because they grew in an ad hoc fashion to meet the increasing needs for distributing real-time waveform data. To maximize reliability and redundancy, we apply best-practice methods from computer science when implementing the software and hardware configurations for import, export, and acquisition of real-time seismic data. Our approach makes use of failover software designs, methods for dividing labor diligently among the network nodes, and state-of-the-art networking redundancy technologies. To facilitate maintenance and daily operations we seek to provide some separation between major functions such as data import, export, acquisition, archiving, real-time processing, and alarming. As an example, we make waveform import and export functions independent by operating them on separate servers. Similarly, two independent servers provide waveform export, allowing data recipients to implement their own redundancy. Data import is handled differently, using one primary server and a live backup server. These data import servers run failover software that allows automatic role switching from primary to shadow in case of failure. Similar to the classic earthworm design, all the acquired waveform data are broadcast onto a private network, which allows multiple machines to acquire and process the data. As we separate data import and export away from acquisition, we are also working on new approaches to separate real-time processing and rapid, reliable archiving of real-time data. Further, improved network security is an integral part of the new design. Redundant firewalls will provide secure data imports, exports, and acquisition, as well as DMZ zones for web servers and other publicly available servers. We will present the detailed design of this new configuration, which is currently being implemented by the SCSN at Caltech. The design principles are general enough to be of use to most regional seismic networks.
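A minimal sketch of the primary/shadow role-switching idea follows; the hostname, port, and timing constants are hypothetical, and real failover packages handle many more failure modes (split brain, fencing, shared state):

```python
# Sketch: a shadow import server promotes itself when the primary's
# heartbeat goes stale, and steps down when the primary returns.
import socket
import time

PRIMARY = ("import-primary.example.net", 7000)   # hypothetical host/port
STALE_S = 5.0                                    # heartbeat staleness threshold

def primary_alive():
    """One heartbeat probe: can we open a TCP connection to the primary?"""
    try:
        with socket.create_connection(PRIMARY, timeout=1.0):
            return True
    except OSError:
        return False

role, last_seen = "shadow", time.time()
while True:
    if primary_alive():
        last_seen = time.time()
        if role == "primary":
            role = "shadow"       # primary is back; relinquish the role
    elif role == "shadow" and time.time() - last_seen > STALE_S:
        role = "primary"          # take over: start import, broadcast waveforms
    time.sleep(1.0)
```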
Teaching Electronic Records Management in the Archival Curriculum
ERIC Educational Resources Information Center
Zhang, Jane
2016-01-01
Electronic records management has been incorporated into the archival curriculum in North America since the 1990s. The study reported in this paper provides a systematic analysis of the content of electronic records management (ERM) courses currently taught in archival education programs. Through the analysis of course combinations and their…
DeWitt, Nancy T.; Flocks, James G.; Pendleton, Elizabeth A.; Hansen, Mark E.; Reynolds, B.J.; Kelso, Kyle W.; Wiese, Dana S.; Worley, Charles R.
2012-01-01
See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets are stored digitally at the USGS St. Petersburg Coastal and Marine Science Center and processed systematically using Novatel's GrafNav version 7.6, SANDS version 3.7, SEA SWATHplus version 3.06.04.03, CARIS HIPS AND SIPS version 3.6, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during these surveys and are archived separately.
Clegg, G; Roebuck, S; Steedman, D
2001-01-01
Objectives—To develop a computer based storage system for clinical images—radiographs, photographs, ECGs, text—for use in teaching, training, reference and research within an accident and emergency (A&E) department. Exploration of methods to access and utilise the data stored in the archive. Methods—Implementation of a digital image archive using flatbed scanner and digital camera as capture devices. A sophisticated coding system based on ICD 10. Storage via an "intelligent" custom interface. Results—A practical solution to the problems of clinical image storage for teaching purposes. Conclusions—We have successfully developed a digital image capture and storage system, which provides an excellent teaching facility for a busy A&E department. We have revolutionised the practice of the "hand-over meeting". PMID:11435357
PACS quality control and automatic problem notifier
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.
1997-05-01
One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected of other equipment used in the diagnostic process.
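The check-then-page pattern might look like the following sketch; the paths, thresholds, diagnostic codes, and pager gateway address are all hypothetical:

```python
# Sketch: run a few self-checks and page a diagnostic code on failure.
import os
import shutil
import smtplib
import time
from email.message import EmailMessage

def disk_ok():
    # Archive volume should keep more than 5 GiB free.
    return shutil.disk_usage("/archive").free > 5 * 2**30

def log_ok():
    # The acquisition log should have been written within 10 minutes.
    return time.time() - os.path.getmtime("/var/log/pacs/acq.log") < 600

def page(code):
    # The diagnostic code goes out through an email-to-pager gateway;
    # on-call staff look the code up on the intranet documentation server.
    msg = EmailMessage()
    msg["From"] = "pacs-qc@example.org"
    msg["To"] = "pager-gateway@example.org"
    msg["Subject"] = f"PACS alert {code}"
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

for code, check in {"E01": disk_ok, "E02": log_ok}.items():
    if not check():
        page(code)
```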
NASA Astrophysics Data System (ADS)
Kiebuzinski, A. B.; Bories, C. M.; Kalluri, S.
2002-12-01
As part of its Earth Observing System (EOS), NASA supports operations for several satellites including Landsat 7, Terra, and Aqua. ECS (EOSDIS Core System) is a vast archival and distribution system and includes several Distributed Active Archive Centers (DAACs) located around the United States. EOSDIS reached a milestone in February when its data holdings exceeded one petabyte (1,000 terabytes) in size. It has been operational since 1999 and was originally intended to serve a large community of Earth Science researchers studying global climate change. The Synergy Program was initiated in 2000 with the purpose of exploring and expanding the use of remote sensing data beyond the traditional research community to the applications community, including natural resource managers, disaster/emergency managers, urban planners, and others. This included facilitating data access at the DAACs to enable non-researchers to exploit the data for their specific applications. The combined volume of data archived daily across the DAACs is of the order of three terabytes. These archived data are made available to the research community and to general users of ECS data. Currently, the average data volume distributed daily is two terabytes, which, combined with an ever-increasing need for timely access to these data, taxes the ECS processing and archival resources for more real-time use than was previously intended for research purposes. As a result, the delivery of data sets to users was in many cases being delayed to unacceptable limits. Raytheon, under the auspices of the Synergy Program, investigated methods of making data more accessible at a lower cost in resources (processing and archival) at the DAACs. Large on-line caches (as big as 70 terabytes) of data were determined to be a solution that would allow users who require contemporary data to access them without having to pull them from the archive. These on-line caches are referred to as "Data Pools." In the Data Pool concept, data is inserted via subscriptions based on ECS events, for example, the arrival of data matching a specific spatial context. Upon acquisition, these data are written to the Data Pools as well as to the permanent archive. The data is then accessed via a public Web interface, which provides a drilldown search using data group, spatial, temporal, and other flags. The result set is displayed as a list of ftp links to the data, which the user can click to download directly. Data Pool holdings are continuously renewed as data is allowed to expire and is replaced by more current insertions. In addition, the Data Pool may also house data sets that, though not contemporary, receive significant user attention, e.g., a Chernobyl-type incident, a flood, or a forest fire. The benefits are that users who require contemporary data can access the data immediately (within 24 hours of acquisition) under a much improved access technique. Users not requiring contemporary data benefit from the Data Pools by having greater archival and processing resources (and a shorter processing queue) made available to them. All users now benefit from the capability to have standing data orders for data matching a geographic context (spatial subscription), a capability also developed under the Synergy program. The Data Pools are currently being installed and checked at each of the DAACs.
Additionally, several improvements to the search capabilities, data manipulation tools and overall storage capacity are being developed and will be installed in the First Quarter of 2003.
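A toy sketch of the subscription-and-expiry mechanics described above follows; the granule fields, the bounding box, and the retention window are all hypothetical:

```python
# Sketch: event-driven insertion of new granules into an on-line cache
# ("Data Pool"), plus expiry of stale entries.
import time

POOL = {}                    # granule_id -> (insert_time, metadata)
RETENTION_S = 14 * 86400     # hypothetical 14-day retention window

def matches_subscription(meta):
    # Example of a spatial subscription: granules over a bounding box.
    return 24.0 <= meta["lat"] <= 50.0 and -125.0 <= meta["lon"] <= -66.0

def on_insert_event(granule_id, meta):
    """Called when the archive ingests a new granule; mirror it to the pool."""
    if matches_subscription(meta):
        POOL[granule_id] = (time.time(), meta)

def expire():
    """Drop granules older than the retention window."""
    now = time.time()
    for gid in [g for g, (t, _) in POOL.items() if now - t > RETENTION_S]:
        del POOL[gid]
```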
NASA Astrophysics Data System (ADS)
Zhang, F.; Barriot, J. P.; Maamaatuaiahutapu, K.; Sichoix, L.; Xu, G., Sr.
2017-12-01
In order to better understand and predict the complex meteorological context of French Polynesia, we focus on the time evolution of integrated precipitable water (PW) using radiosounding (RS) data from 1974 to 2017. In a first step, we make a comparison over selected months between the PW estimate reconstructed from the raw two-second acquisition and the PW estimate reconstructed from the highly compressed and undersampled Integrated Global Radiosonde Archive (IGRA). In a second step, we make a comparison with other techniques of PW acquisition (radio delays, sky temperature, infrared band absorption) in order to assess the intrinsic biases of RS acquisition. In a last step, we analyze the PW time series in our area, validated in the light of the first and second steps, with respect to seasonality (dry season and wet season) and spatial location. During the wet season (November to April), the PW values are higher than the corresponding values observed during the dry season (May to October). PW values decrease with increasing latitude, yet PW values are higher in Tahiti than in the other islands because of the presence of the South Pacific Convergence Zone (SPCZ) around Tahiti. All the PW time series show the same uptrend in French Polynesia in recent years. This study provides further evidence that PW time series derived from RS can be assimilated in weather forecasting and climate warming models.
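The PW reconstruction alluded to can be illustrated with the standard column integral; the sketch below uses an invented humidity profile, not IGRA or Tahiti data:

```python
# Worked sketch of precipitable water from a sounding profile via
#   PW = (1 / (rho_w * g)) * integral(q dp),
# with q the specific humidity and p the pressure.
import numpy as np

g, rho_w = 9.81, 1000.0                                       # m s^-2, kg m^-3
p = np.array([1010., 925., 850., 700., 500., 300.]) * 100.0   # levels in Pa
q = np.array([18.0, 14.0, 10.0, 6.0, 2.0, 0.3]) / 1000.0      # kg/kg

# Trapezoidal integration over the column; |dp| because p decreases upward.
column = np.sum(0.5 * (q[1:] + q[:-1]) * np.abs(np.diff(p)))
pw_mm = column / (rho_w * g) * 1000.0                         # metres -> mm
print(f"PW = {pw_mm:.1f} mm")   # ~46 mm, a plausibly wet tropical value
```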
Staff Radiation Doses in a Real-Time Display Inside the Angiography Room
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Roberto, E-mail: rmsanchez.hcsc@salud.madrid.org; Vano, E.; Fernandez, J. M.
Methods: The evaluation of a new occupational Dose Aware System (DAS) showing staff radiation doses in real time has been carried out in several angiography rooms in our hospital. The system uses electronic solid-state detectors with high-capacity memory storage. Every second, it archives the dose and dose rate measured, and it is wirelessly linked to a base-station screen mounted close to the diagnostic monitors. An easy transfer of the values to a data sheet permits further analysis of the scatter dose profile measured during the procedure, compares it with patient doses, and seeks to find the most effective actions to reduce operator exposure to radiation. Results: The cumulative occupational doses measured per procedure (shoulder, over lead apron) ranged from 0.6 to 350 µSv when the ceiling-suspended screen was used, and DSA (Digital Subtraction Acquisition) runs were acquired while the personnel left the angiography room. When the suspended screen was not used and radiologists remained inside the angiography room during DSA acquisitions, the dose rates registered at the operator's position reached up to 1-5 mSv/h during fluoroscopy and 12-235 mSv/h during DSA acquisitions. In such cases, the cumulative scatter dose could be more than 3 mSv per procedure. Conclusion: Real-time display of doses to staff members warns interventionists whenever the scatter dose rates are too high or the radiation protection tools are not being properly used, providing an opportunity to improve personal protection accordingly.
Landsat Data Continuity Mission
2007-01-01
The Landsat Data Continuity Mission (LDCM) is a partnership between the National Aeronautics and Space Administration (NASA) and the U.S. Geological Survey (USGS) to place the next Landsat satellite in orbit by late 2012. The Landsat era that began in 1972 will become a nearly 45-year global land record with the successful launch and operation of the LDCM. The LDCM will continue the acquisition, archival, and distribution of multispectral imagery affording global, synoptic, and repetitive coverage of the Earth's land surfaces at a scale where natural and human-induced changes can be detected, differentiated, characterized, and monitored over time. The mission objectives of the LDCM are to (1) collect and archive medium resolution (circa 30-m spatial resolution) multispectral image data affording seasonal coverage of the global landmasses for a period of no less than 5 years; (2) ensure that LDCM data are sufficiently consistent with data from the earlier Landsat missions, in terms of acquisition geometry, calibration, coverage characteristics, spectral characteristics, output product quality, and data availability to permit studies of land-cover and land-use change over time; and (3) distribute LDCM data products to the general public on a nondiscriminatory basis and at a price no greater than the incremental cost of fulfilling a user request. Distribution of LDCM data over the Internet at no cost to the user is currently planned.
PACS 2000: quality control using the task allocation chart
NASA Astrophysics Data System (ADS)
Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.
2000-05-01
Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, quality control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.
Distributed digital music archives and libraries
NASA Astrophysics Data System (ADS)
Fujinaga, Ichiro
2005-09-01
The main goal of this research program is to develop and evaluate practices, frameworks, and tools for the design and construction of worldwide distributed digital music archives and libraries. Over the last few millennia, humans have amassed an enormous amount of musical information that is scattered around the world. It is becoming abundantly clear that the optimal path for acquisition is to distribute the task of digitizing the wealth of historical and cultural heritage material that exists in analogue formats, which may include books and manuscripts related to music, music scores, photographs, videos, audio tapes, and phonograph records. In order to achieve this goal, libraries, museums, and archives throughout the world, large or small, need well-researched policies, proper guidance, and efficient tools to digitize their collections and to make them available economically. The research conducted within the program addresses unique and imminent challenges posed by the digitization and dissemination of music media. There are four major research projects in progress: development and evaluation of digitization methods for preservation of analogue recordings; optical music recognition using microfilms; design of a workflow management system with automatic metadata extraction; and formulation of interlibrary communication strategies.
Lessons learned from planetary science archiving
NASA Astrophysics Data System (ADS)
Zender, J.; Grayzeck, E.
2006-01-01
The need for scientific archiving of past, current, and future planetary scientific missions, laboratory data, and modeling efforts is indisputable. To quote the message by G. Santayana carved over the entrance of the US National Archives in Washington DC: “Those who cannot remember the past are doomed to repeat it.” The design, implementation, maintenance, and validation of planetary science archives are, however, disputed by the involved parties. The inclusion of the archives into the scientific heritage is problematic. For example, there is an imbalance between space agency requirements and institutional and national interests. The disparity between long-term archive requirements and immediate data analysis requests is significant. The discrepancy between a space mission's archive budget and the effort required to design and build the data archive is large. An imbalance exists between new instrument development and existing, well-proven archive standards. The authors present their view of the problems and risk areas in the archiving concepts based on their experience acquired within NASA's Planetary Data System (PDS) and ESA's Planetary Science Archive (PSA). Individual risks and potential problem areas are discussed based on a model derived from a system analysis done upfront. The major risk for a planetary mission science archive is seen in the combination of minimal involvement by Mission Scientists and inadequate funding. The authors outline how the risks can be reduced. The paper ends with the authors' view on future planetary archive implementations, including the archive interoperability aspect.
Jet stream winds - Enhanced aircraft data acquisition and analysis over Southwest Asia
NASA Technical Reports Server (NTRS)
Tenenbaum, J.
1989-01-01
A project is described for providing accurate initial and verification analyses for the jet stream in regions where general circulation models are known to have large systematic errors, due either to the extreme sparsity of data or to incorrect physical parameterizations. For this purpose, finely spaced aircraft-based meteorological data for the Southwest Asian region, collected for three 10-day periods in the winter of 1988-1989, will be used, together with corresponding data for the North American region used as a control, to rerun the assimilation cycles and forecast models of the NMC and the NASA Goddard Laboratory for Atmospheres. Data for Southwest Asia will be collected by three carriers with extensive wide-body routes crossing the total region, while data for the North American region will be obtained from the archives of ACARS and GTS.
NASA Technical Reports Server (NTRS)
Iverson, L. R.; Olson, J. S.; Risser, P. G.; Treworgy, C.; Frank, T.; Cook, E.; Ke, Y.
1986-01-01
Data acquisition, initial site characterization, available image and geographic information methods, and brief evaluations of first-year results for NASA's Thematic Mapper (TM) working group are presented. The TM and other spectral data are examined in order to relate local, intensive ecosystem research findings to estimates of carbon cycling rates over wide geographic regions. The effort is to span environments ranging from dry to moist climates and from good to poor site quality using the TM capability, with and without the inclusion of geographic information system (GIS) data, and thus to interpret the local spatial pattern of factors conditioning biomass or productivity. Twenty-eight TM data sets were acquired, archived, and evaluated. The ERDAS image processing and GIS system was installed on a microcomputer (PC-AT) and its capabilities are being investigated. The TM coverage of seven study areas was exported via ELAS software on the Prime to the ERDAS system. Statistical analysis procedures to be used on the spectral data are being identified.
Astro-WISE: Chaining to the Universe
NASA Astrophysics Data System (ADS)
Valentijn, E. A.; McFarland, J. P.; Snigula, J.; Begeman, K. G.; Boxhoorn, D. R.; Rengelink, R.; Helmich, E.; Heraudeau, P.; Verdoes Kleijn, G.; Vermeij, R.; Vriend, W.-J.; Tempelaar, M. J.; Deul, E.; Kuijken, K.; Capaccioli, M.; Silvotti, R.; Bender, R.; Neeser, M.; Saglia, R.; Bertin, E.; Mellier, Y.
2007-10-01
The recent explosion of recorded digital data and its processed derivatives threatens to overwhelm researchers when analysing their experimental data or looking up data items in archives and file systems. While current hardware developments allow the acquisition, processing and storage of hundreds of terabytes of data at the cost of a modern sports car, the software systems to handle these data are lagging behind. This problem is very general and is well recognized by various scientific communities; several large projects have been initiated, e.g., DATAGRID/EGEE {http://www.eu-egee.org/} federates compute and storage power over the high-energy physics community, while the international astronomical community is building an Internet-geared Virtual Observatory {http://www.euro-vo.org/pub/} (Padovani 2006) connecting archival data. These large projects either focus on a specific distribution aspect or aim to connect many sub-communities and have a relatively long trajectory for setting standards and a common layer. Here, we report first light of a very different solution (Valentijn & Kuijken 2004) to the problem, initiated by a smaller astronomical IT community. It provides an abstract scientific information layer which integrates distributed scientific analysis with distributed processing and federated archiving and publishing. By designing new abstractions and mixing in old ones, a Science Information System with fully scalable cornerstones has been achieved, transforming data systems into knowledge systems. This breakthrough is facilitated by the full end-to-end linking of all dependent data items, which allows full backward chaining from the observer/researcher to the experiment. Key is the notion that information is intrinsic in nature, and thus so are the data acquired by a scientific experiment. The new abstraction is that software systems guide the user to that intrinsic information by enforcing full backward and forward chaining in the data modelling.
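The backward-chaining idea can be illustrated with a small provenance sketch; the class and item names are invented, and the real Astro-WISE system persists such links in a federated database rather than in memory:

```python
# Sketch: every data item records its direct dependencies, so any derived
# product can be traced back to the raw observation that produced it.
class DataItem:
    def __init__(self, name, *dependencies):
        self.name = name
        self.dependencies = list(dependencies)

    def lineage(self, depth=0):
        """Walk backwards from a product to the experiment that made it."""
        print("  " * depth + self.name)
        for dep in self.dependencies:
            dep.lineage(depth + 1)

raw = DataItem("raw_exposure_20070101")
bias = DataItem("master_bias")
flat = DataItem("master_flat")
reduced = DataItem("reduced_frame", raw, bias, flat)
catalog = DataItem("source_catalog", reduced)

catalog.lineage()   # prints the full chain back to the raw exposure
```

Forward chaining is the same graph walked in the other direction: given a recalibrated master flat, the system can find and regenerate every product that depends on it.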
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1987-01-01
This quarterly publication (July-September 1987) provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations. This work is performed for NASA's Office of Space Tracking and Data Systems (OSTDS). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the Search for Extraterrestrial Intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum. The latter two programs are performed for NASA's Office of Space Science and Applications (OSSA).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, E. C. (Editor)
1986-01-01
This quarterly publication (July-Sept. 1986) provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio astronomy, it reports on activities of the Deep Space Network (DSN) and its associated Ground Communications Facility (GCF) in planning, in supporting research and technology, in implementation, and in operations. This work is performed for NASA's Office of Space Tracking and Data Systems (OSTDS). In geodynamics, the publication reports on the application of radio interferometry at microwave frequencies for geodynamic measurements. In the search for extraterrestrial intelligence (SETI), it reports on implementation and operations for searching the microwave spectrum. The latter two programs are performed for NASA's Office of Space Science and Applications (OSSA).
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1992-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, in supporting research and technology, in implementation, and in operations. Also included is standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Operations (OSO). The TDA Office also performs work funded by two other NASA program offices through and with the cooperation of the OSO. These are the Orbital Debris Radar Program and 21st Century Communication Studies.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1993-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA) in the following areas: space communications, radio navigation, radio science, and ground-based radio and radar astronomy. This document also reports on the activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC). The TDA Office also performs work funded by another NASA program office through and with the cooperation of OSC. This is the Orbital Debris Radar Program with the Office of Space Systems Development.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1995-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Telecommunications and Mission Operations Directorate (TMOD), which now includes the former Telecommunications and Data Acquisition (TDA) Office. In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The Orbital Debris Radar Program, funded by the Office of Space Systems Development, makes use of the planetary radar capability when the antennas are configured as science instruments making direct observations of the planets, their satellites, and asteroids of our solar system.
Tsaparina, Diana; Bonin, Patrick; Méot, Alain
2011-12-01
The aim of the present study was to provide Russian normative data for the Snodgrass and Vanderwart (Behavior Research Methods, Instruments, & Computers, 28, 516-536, 1980) colorized pictures (Rossion & Pourtois, Perception, 33, 217-236, 2004). The pictures were standardized on name agreement, image agreement, conceptual familiarity, imageability, and age of acquisition. Objective word frequency and objective visual complexity measures are also provided for the most common names associated with the pictures. Comparative analyses between our results and the norms obtained in other, similar studies are reported. The Russian norms may be downloaded from the Psychonomic Society supplemental archive.
The GONG Data Reduction and Analysis System. [solar oscillations
NASA Technical Reports Server (NTRS)
Pintar, James A.; Andersen, Bo Nyborg; Andersen, Edwin R.; Armet, David B.; Brown, Timothy M.; Hathaway, David H.; Hill, Frank; Jones, Harrison P.
1988-01-01
Each of the six GONG observing stations will produce three 16-bit, 256×256 images of the Sun every 60 sec of sunlight. These data will be transferred from the observing sites to the GONG Data Management and Analysis Center (DMAC), in Tucson, on high-density tapes at a combined rate of over 1 gigabyte per day. The contemporaneous processing of these data will produce several standard data products and will require a sustained throughput in excess of 7 megaflops. Peak rates may exceed 50 megaflops. Archives will accumulate at the rate of approximately 1 terabyte per year, reaching nearly 3 terabytes in 3 yr of observing. Researchers will access the data products with a machine-independent GONG Reduction and Analysis Software Package (GRASP). Based on the Image Reduction and Analysis Facility, this package will include database facilities and helioseismic analysis tools. Users may access the data as visitors in Tucson, or may access DMAC remotely through networks, or may process subsets of the data at their local institutions using GRASP or other systems of their choice. Elements of the system will reach the prototype stage by the end of 1988. Full operation is expected in 1992 when data acquisition begins.
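The quoted volume can be sanity-checked with simple arithmetic; the sketch below assumes roughly 11 hours of usable sunlight per station per day, a figure not stated in the abstract:

```python
# Back-of-the-envelope check of the quoted GONG data rate.
bytes_per_image = 256 * 256 * 2                 # one 16-bit 256x256 image
per_station_minute = 3 * bytes_per_image        # three images every 60 s
combined_daily = 6 * per_station_minute * 60 * 11   # six stations, ~11 h sun
print(combined_daily / 1e9)                     # ~1.6 GB/day: "over 1 gigabyte"
```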
C-130 Automated Digital Data System (CADDS)
NASA Technical Reports Server (NTRS)
Scofield, C. P.; Nguyen, Chien
1991-01-01
Real time airborne data acquisition, archiving and distribution on the NASA/Ames Research Center (ARC) C-130 has been improved over the past three years due to the implementation of the C-130 Automated Digital Data System (CADDS). CADDS is a real time, multitasking, multiprocessing ROM-based system. CADDS acquires data from both avionics and environmental sensors inflight for all C-130 data lines. The system also displays the data on video monitors throughout the aircraft.
Karsten, Stanislav L.; Van Deerlin, Vivianna M. D.; Sabatti, Chiara; Gill, Lisa H.; Geschwind, Daniel H.
2002-01-01
Archival formalin-fixed, paraffin-embedded and ethanol-fixed tissues represent a potentially invaluable resource for gene expression analysis, as they are the most widely available material for studies of human disease. Little data are available evaluating whether RNA obtained from fixed (archival) tissues could produce reliable and reproducible microarray expression data. Here we compare the use of RNA isolated from human archival tissues fixed in ethanol and formalin to frozen tissue in cDNA microarray experiments. Since an additional factor that can limit the utility of archival tissue is the often small quantities available, we also evaluate the use of the tyramide signal amplification method (TSA), which allows the use of small amounts of RNA. Detailed analysis indicates that TSA provides a consistent and reproducible signal amplification method for cDNA microarray analysis, across both arrays and the genes tested. Analysis of this method also highlights the importance of performing non-linear channel normalization and dye switching. Furthermore, archived, fixed specimens can perform well, but not surprisingly, produce more variable results than frozen tissues. Consistent results are more easily obtainable using ethanol-fixed tissues, whereas formalin-fixed tissue does not typically provide a useful substrate for cDNA synthesis and labeling. PMID:11788730
Prototyping Control and Data Acquisition for the ITER Neutral Beam Test Facility
NASA Astrophysics Data System (ADS)
Luchetta, Adriano; Manduchi, Gabriele; Taliercio, Cesare; Soppelsa, Anton; Paolucci, Francesco; Sartori, Filippo; Barbato, Paolo; Breda, Mauro; Capobianco, Roberto; Molon, Federico; Moressa, Modesto; Polato, Sandro; Simionato, Paola; Zampiva, Enrico
2013-10-01
The ITER Neutral Beam Test Facility will be the project's R&D facility for heating neutral beam injectors (HNB) for fusion research operating with H/D negative ions. Its mission is to develop technology to build the HNB prototype injector meeting the stringent HNB requirements (16.5 MW injection power, -1 MeV acceleration energy, 40 A ion current and one hour continuous operation). Two test-beds will be built in sequence in the facility: first, SPIDER, the ion source test-bed, to optimize the negative ion source performance; and second, MITICA, the actual prototype injector, to optimize ion beam acceleration and neutralization. The SPIDER control and data acquisition system is under design. To validate the main architectural choices, a system prototype has been assembled and performance tests have been executed to assess the prototype's capability to meet the control and data acquisition system requirements. The prototype is based on open-source software frameworks running under Linux. EPICS is the slow control engine, MDSplus is the data handler and MARTe is the fast control manager. The prototype addresses low and high-frequency data acquisition, 10 kS/s and 10 MS/s respectively, camera image acquisition, data archiving, data streaming, data retrieval and visualization, real time fast control with 100 μs control cycle and supervisory control.
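As a rough illustration of the MDSplus role named above (not the actual SPIDER code), the sketch below writes one acquired waveform into an MDSplus tree; the tree name, node path, shot number, and waveform are invented placeholders:

```python
# Hedged sketch: store one acquired waveform in MDSplus.
import numpy as np
from MDSplus import Tree, Signal, Float64Array

shot = 123
tree = Tree("spider", shot)                    # open an existing pulse file
times = np.arange(0.0, 1.0, 1e-4)              # 1 s at 10 kS/s ("slow" rate)
values = np.sin(2 * np.pi * 5.0 * times)       # placeholder beam-current trace

node = tree.getNode("\\TOP.ACQ:BEAM_CURRENT")  # hypothetical node
node.putData(Signal(Float64Array(values), None, Float64Array(times)))
tree.close()
```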
SPASE: The Connection Among Solar and Space Physics Data Centers
NASA Technical Reports Server (NTRS)
Thieman, James R.; King, Todd A.; Roberts, D. Aaron
2011-01-01
The Space Physics Archive Search and Extract (SPASE) project is an international collaboration among Heliophysics (solar and space physics) groups concerned with data acquisition and archiving. Within this community there are a variety of old and new data centers, resident archives, "virtual observatories", etc. acquiring, holding, and distributing data. A researcher interested in finding data of value for his or her study faces a complex data environment. The SPASE group has simplified the search for data through the development of the SPASE Data Model as a common method to describe data sets in the various archives. The data model is an XML-based schema and is now in operational use. There are both positives and negatives to this approach. The advantage is the common metadata language enabling wide-ranging searches across the archives, but it is difficult to inspire the data holders to spend the time necessary to describe their data using the Model. Software tools have helped, but the main motivational factor is wide-ranging use of the standard by the community. The use is expanding, but there are still other groups who could benefit from adopting SPASE. The SPASE Data Model is also being expanded in the sense of providing the means for more detailed description of data sets with the aim of enabling more automated ingestion and use of the data through detailed format descriptions. We will discuss the present state of SPASE usage and how we foresee development in the future. The evolution is based on a number of lessons learned - some unique to Heliophysics, but many common to the various data disciplines.
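To make the idea of an XML-based data description concrete, here is a minimal sketch built with the Python standard library; the element names follow the public SPASE schema as commonly documented, but the ResourceID and all text values are invented:

```python
# Sketch: build a minimal SPASE-style resource description.
import xml.etree.ElementTree as ET

spase = ET.Element("Spase")
data = ET.SubElement(spase, "NumericalData")
ET.SubElement(data, "ResourceID").text = "spase://Example/NumericalData/Demo"
header = ET.SubElement(data, "ResourceHeader")
ET.SubElement(header, "ResourceName").text = "Demo magnetometer dataset"
ET.SubElement(header, "Description").text = "1-min magnetic field vectors."

print(ET.tostring(spase, encoding="unicode"))
```

A registry that harvests such descriptions can then answer cross-archive queries ("all magnetometer data in this time range") without knowing anything about each archive's internal storage.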
Wilmot, Michael P; Kostal, Jack W; Stillwell, David; Kosinski, Michal
2017-07-01
For the past 40 years, the conventional univariate model of self-monitoring has reigned as the dominant interpretative paradigm in the literature. However, recent findings associated with an alternative bivariate model challenge the conventional paradigm. In this study, item response theory is used to develop measures of the bivariate model of acquisitive and protective self-monitoring using original Self-Monitoring Scale (SMS) items, and data from two large, nonstudent samples (Ns = 13,563 and 709). Results indicate that the new acquisitive (six-item) and protective (seven-item) self-monitoring scales are reliable, unbiased in terms of gender and age, and demonstrate theoretically consistent relations to measures of personality traits and cognitive ability. Additionally, by virtue of using original SMS items, previously collected responses can be reanalyzed in accordance with the alternative bivariate model. Recommendations for the reanalysis of archival SMS data, as well as directions for future research, are provided.
The challenge of archiving and preserving remotely sensed data
Faundeen, John L.
2003-01-01
Few would question the need to archive the scientific and technical (S&T) data generated by researchers. At a minimum, the data are needed for change analysis. Likewise, most people would value efforts to ensure the preservation of the archived S&T data. Future generations will use analysis techniques not even considered today. Until recently, archiving and preserving these data were usually accomplished within existing infrastructures and budgets. As the volume of archived data increases, however, organizations charged with archiving S&T data will be increasingly challenged (U.S. General Accounting Office, 2002). The U.S. Geological Survey has had experience in this area and has developed strategies to deal with the mountain of land remote sensing data currently being managed and the tidal wave of expected new data. The Agency has dealt with archiving issues, such as selection criteria, purging, advisory panels, and data access, and has met with preservation challenges involving photographic and digital media. That experience has allowed the USGS to develop management approaches, which this paper outlines.
Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications
NASA Astrophysics Data System (ADS)
Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.
2016-12-01
The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to the present, recently became available on Amazon cloud S3. This provides a new opportunity to rebuild the Hydro-NEXRAD software system, which enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The cloud-based new system can eliminate challenges previously faced by Hydro-NEXRAD data acquisition and processing: (1) temporal and spatial limitations arising from limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flows for past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.
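For example, the public bucket can be browsed anonymously with boto3; the sketch below lists a few archived Level II volumes for one Iowa-area radar (the bucket name and YYYY/MM/DD/SITE key layout follow the AWS open-data convention; the date and site are arbitrary):

```python
# Sketch: anonymous listing of NEXRAD Level II volumes on Amazon S3.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="noaa-nexrad-level2",
                          Prefix="2016/06/01/KDVN/")   # Davenport, IA radar
for obj in resp.get("Contents", [])[:5]:
    print(obj["Key"])
    # To fetch a volume for processing:
    # s3.download_file("noaa-nexrad-level2", obj["Key"],
    #                  obj["Key"].split("/")[-1])
```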
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Yuen, Joseph H. (Editor)
1994-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. The preceding work is all performed for NASA's Office of Space Communications (OSC). The TDA Office also performs work funded by other NASA program offices through and with the cooperation of OSC. Finally, tasks funded under the JPL Director's Discretionary Fund and the Caltech President's Fund that involve the TDA Office are included.
The Telecommunications and Data Acquisition Report
NASA Technical Reports Server (NTRS)
Posner, Edward C. (Editor)
1991-01-01
This quarterly publication provides archival reports on developments in programs managed by JPL's Office of Telecommunications and Data Acquisition (TDA). In space communications, radio navigation, radio science, and ground-based radio and radar astronomy, it reports on activities of the Deep Space Network (DSN). Also included is standards activity at JPL for space data and information systems and reimbursable DSN work performed for other space agencies through NASA. In the search for extraterrestrial intelligence (SETI), 'The TDA Progress Report' reports on implementation and operations for searching the microwave spectrum. In solar system radar, it reports on the uses of the Goldstone Solar System Radar for scientific exploration of the planets, their rings and satellites, asteroids, and comets. In radio astronomy, the areas of support include spectroscopy, very long baseline interferometry, and astrometry.
Satellite Analysis of Ocean Biogeochemistry and Mesoscale Variability in the Sargasso Sea
NASA Technical Reports Server (NTRS)
Siegel, D. A.; Micheals, A. F.; Nelson, N. B.
1997-01-01
The objective of this study was to analyze the impact of spatial variability on the time series of biogeochemical measurements made at the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) site. Originally the study was planned to use SeaWiFS as well as AVHRR high-resolution data. Despite the SeaWiFS delays, we were able to make progress on the following fronts: (1) operational acquisition, processing, and archiving of HRPT data from a ground station located in Bermuda; (2) validation of AVHRR SST data using BATS time-series and spatial validation cruise CTD data; (3) use of AVHRR sea surface temperature imagery and ancillary data to assess the impact of mesoscale spatial variability on pCO2 and carbon flux in the Sargasso Sea; (4) spatial and temporal extent of tropical-cyclone-induced surface modifications; and (5) assessment of eddy variability using TOPEX/Poseidon data.
NASA Astrophysics Data System (ADS)
Saha, A.; Monet, D.
2005-12-01
Continued acquisition and analysis of short-exposure observations support the preliminary conclusion presented by Monet et al. (BAAS v36, p1531, 2004) that a 10-second exposure in 1.0-arcsecond seeing can provide a differential astrometric accuracy of about 10 milliarcseconds. A single solution for mapping coefficients appears to be valid over spatial scales of up to 10 arcminutes, and this suggests that numerical processing can proceed on a per-sensor basis without the need to further divide the individual fields of view into several astrometric patches. Data from the Subaru public archive, as well as from the LSST Cerro Pachon 2005 observing campaign and various CTIO and NOAO 4-meter engineering runs, have been considered. Should these results be confirmed, the expected astrometric accuracy after 10 years of LSST observations should be around 1.0 milliarcseconds for parallax and 0.2 milliarcseconds/year for proper motions.
A Data Quality Filter for PMU Measurements: Description, Experience, and Examples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follum, James D.; Amidan, Brett G.
Networks of phasor measurement units (PMUs) continue to grow, and along with them, the amount of data available for analysis. With so much data, it is impractical to identify and remove poor quality data manually. The data quality filter described in this paper was developed for use with the Data Integrity and Situation Awareness Tool (DISAT), which analyzes PMU data to identify anomalous system behavior. The filter operates based only on the information included in the data files, without supervisory control and data acquisition (SCADA) data, state estimator values, or system topology information. Measurements are compared to preselected thresholds to determine if they are reliable. Along with the filter's description, examples of data quality issues from application of the filter to nine months of archived PMU data are provided. The paper is intended to aid the reader in recognizing and properly addressing data quality issues in PMU data.
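A minimal sketch of this kind of threshold screening is shown below; the limits and signal names are hypothetical stand-ins, not the thresholds actually used by the DISAT filter.

```python
import numpy as np

# Hypothetical PMU data-quality thresholds (illustrative, not DISAT's values).
FREQ_LIMITS = (59.0, 61.0)   # Hz, for a nominal 60-Hz system
VMAG_LIMITS = (0.8, 1.2)     # voltage magnitude, per unit

def flag_unreliable(freq, vmag):
    """Return a boolean mask marking samples that fail the threshold checks."""
    freq, vmag = np.asarray(freq, float), np.asarray(vmag, float)
    bad = (freq < FREQ_LIMITS[0]) | (freq > FREQ_LIMITS[1])
    bad |= (vmag < VMAG_LIMITS[0]) | (vmag > VMAG_LIMITS[1])
    bad |= ~np.isfinite(freq) | ~np.isfinite(vmag)  # dropouts recorded as NaN
    return bad
```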
Site characterization of the national seismic network of Italy
NASA Astrophysics Data System (ADS)
Bordoni, Paola; Pacor, Francesca; Cultrera, Giovanna; Casale, Paolo; Cara, Fabrizio; Di Giulio, Giuseppe; Famiani, Daniela; Ladina, Chiara; Pischiutta, Marta; Quintiliani, Matteo
2017-04-01
The national seismic network of Italy (Rete Sismica Nazionale, RSN), run by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), consists of more than 400 seismic stations connected in real time to the institute's data center in order to locate earthquakes for civil defense purposes. A critical issue in the performance of a network is the characterization of site conditions at the recording stations. Recently, INGV has started addressing this subject through the revision of all available geological and geophysical data, the acquisition of new information by means of ad-hoc field measurements, and the analysis of seismic waveforms. The main effort is towards building a database, integrated with the other INGV infrastructures, designed to archive homogeneous parameters across the seismic network that are useful for complete site characterization, including housing, geological, seismological, and geotechnical features, as well as the site class according to the European and Italian building codes. Here we present the ongoing INGV activities.
NASA Johnson Space Center Life Sciences Data System
NASA Technical Reports Server (NTRS)
Rahman, Hasan; Cardenas, Jeffery
1994-01-01
The Life Sciences Project Division (LSPD) at JSC, which manages human life sciences flight experiments for the NASA Life Sciences Division, augmented its Life Sciences Data System (LSDS) in support of the Spacelab Life Sciences-2 (SLS-2) mission in October 1993. The LSDS is a portable ground system supporting Shuttle-, Spacelab-, and Mir-based life sciences experiments. The LSDS supports acquisition, processing, display, and storage of real-time experiment telemetry in a workstation environment. The system may acquire digital or analog data, storing the data in experiment packet format. Data packets from any acquisition source are archived, and meta-parameters are derived through the application of mathematical and logical operators. Parameters may be displayed in text and/or graphical form, or output to analog devices. Experiment data packets may be retransmitted through the network interface, and database applications may be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control, and the LSDS system can be integrated with other workstations to perform a variety of functions. The generic capabilities, adaptability, and ease of use make the LSDS a cost-effective solution to many experiment data processing requirements. The same system is used for experiment systems functional and integration tests, flight crew training sessions, and mission simulations. In addition, the system has provided the infrastructure for the development of the JSC Life Sciences Data Archive System, scheduled for completion in December 1994.
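As a toy illustration of the meta-parameter mechanism described above (deriving new parameters by applying mathematical operators to acquired ones), consider the sketch below; the parameter names and formula are hypothetical examples, not LSDS interfaces.

```python
# Hypothetical derived parameter: mean arterial pressure computed from
# systolic and diastolic values in a decoded experiment packet.
def mean_arterial_pressure(systolic_mmHg, diastolic_mmHg):
    return diastolic_mmHg + (systolic_mmHg - diastolic_mmHg) / 3.0

packet = {"BP_SYS": 120.0, "BP_DIA": 80.0}   # decoded experiment packet (stand-in)
packet["MAP"] = mean_arterial_pressure(packet["BP_SYS"], packet["BP_DIA"])
print(packet["MAP"])   # ~93.3
```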
ERIC Educational Resources Information Center
Huvila, Isto
2016-01-01
Introduction: This paper analyses the work practices and perspectives of professionals working with archaeological archives and the social organization of archaeological archiving and information management in Sweden. Method: The paper is based on an interview study of Swedish actors in the field of archaeological archiving (N = 16). Analysis: The…
ERIC Educational Resources Information Center
Rauber, Andreas; Bruckner, Robert M.; Aschenbrenner, Andreas; Witvoet, Oliver; Kaiser, Max; Masanes, Julien; Marchionini, Gary; Geisler, Gary; King, Donald W.; Montgomery, Carol Hansen; Rudner, Lawrence M.; Gellmann, Jennifer S.; Miller-Whitehead, Marie; Iverson, Lee
2002-01-01
These six articles discuss Web archives and Web analysis building on data warehouses; international efforts at continuous Web archiving; the Open Video Digital Library; electronic journal collections in academic libraries; online education journals; and an electronic library symposium at the University of British Columbia. (LRW)
NASA Astrophysics Data System (ADS)
Pesaresi, D.; Busby, R.
2013-08-01
The number and quality of seismic stations and networks in Europe continually improve; nevertheless, there is always scope to optimize their performance. In this session we welcomed contributions from all aspects of seismic network installation, operation, and management. Topics include site selection; equipment testing and installation; planning and implementing communication paths; policies for redundancy in data acquisition, processing, and archiving; and the integration of different datasets, including GPS and OBS.
A PC-controlled microwave tomographic scanner for breast imaging
NASA Astrophysics Data System (ADS)
Padhi, Shantanu; Howard, John; Fhager, A.; Bengtsson, Sebastian
2011-01-01
This article presents the design and development of a personal-computer-based controller for a microwave tomographic system for breast cancer detection. The system uses motorized, dual-polarized antennas and a custom GUI to control the stepper motors and a wideband vector network analyzer (VNA), and to coordinate data acquisition and archival in a local MDSPlus database. Both copolar and cross-polar scattered field components can be measured directly. Experimental results are presented to validate the various functionalities of the scanner.
Radiometer Calibration and Characterization (RCC) User's Manual: Windows Version 4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreas, Afshin M.; Wilcox, Stephen M.
2016-02-29
The Radiometer Calibration and Characterization (RCC) software is a data acquisition and data archival system for performing Broadband Outdoor Radiometer Calibrations (BORCAL). RCC provides a unique method of calibrating broadband atmospheric longwave and solar shortwave radiometers using techniques that reduce measurement uncertainty and better characterize a radiometer's response profile. The RCC software automatically monitors and controls many of the components that contribute to uncertainty in an instrument's responsivity. This is a user's manual and guide to the RCC software.
The Role of Data Archives in Synoptic Solar Physics
NASA Astrophysics Data System (ADS)
Reardon, Kevin
The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations; that is, it requires a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple, independently maintained databases to be seamlessly joined, enabling complex searches that span multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.
New Developments At The Science Archives Of The NASA Exoplanet Science Institute
NASA Astrophysics Data System (ADS)
Berriman, G. Bruce
2018-06-01
The NASA Exoplanet Science Institute (NExScI) at Caltech/IPAC is the science center for NASA's Exoplanet Exploration Program and, as such, operates three scientific archives: the NASA Exoplanet Archive (NEA), the Exoplanet Follow-up Observation Program website (ExoFOP), and the Keck Observatory Archive (KOA). The NASA Exoplanet Archive supports research and mission planning by the exoplanet community by operating a service that provides confirmed and candidate planets, numerous project and contributed data sets, and integrated analysis tools. The ExoFOP provides an environment for exoplanet observers to share and exchange data, observing notes, and information regarding the Kepler, K2, and TESS candidates. KOA serves all raw science and calibration observations acquired by all active and decommissioned instruments at the W. M. Keck Observatory, as well as reduced data sets contributed by Keck observers. In the coming years, the NExScI archives will support a series of major endeavors allowing flexible, interactive analysis of the data available at the archives. These endeavors exploit a common infrastructure based upon modern interfaces such as JupyterLab and Python. The first service will enable reduction and analysis of precision radial velocity (PRV) data from the Keck HIRES instrument. The Exoplanet Archive is developing a JupyterLab environment based on the HIRES PRV interactive environment. Additionally, KOA is supporting an Observatory initiative to develop modern, Python-based pipelines, and as part of this work it has delivered a NIRSPEC reduction pipeline. The ensemble of pipelines will be accessible through the same environments.
Headway Deviation Effects on Bus Passenger Loads : Analysis of Tri-Met's Archived AVL-APC Data
DOT National Transportation Integrated Search
2003-01-01
In this paper we empirically analyze the relationship between transit service headway deviations and passenger loads, using archived data from Tri-Met's automatic vehicle location and automatic passenger counter systems. The analysis employs a two-stage...
Education Policy Analysis Archives, 2001: Numbers 46-51.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2001-01-01
This document consists of articles 46 through 51 published in the electronic journal Education Policy Analysis Archives for the year 2001: (46) Second Year Analysis of a Hybrid Schedule High School (James B. Shreiber, William R. Veal, David J. Flinders, and Sherry Churchill); (47) Knowledge Management for Educational Information Systems: What Is…
Kernel-Phase Interferometry for Super-Resolution Detection of Faint Companions
NASA Astrophysics Data System (ADS)
Factor, Samuel M.; Kraus, Adam L.
2017-06-01
Direct detection of close-in companions (exoplanets or binary systems) is notoriously difficult. While coronagraphs and point spread function (PSF) subtraction can be used to reduce contrast and dig out signals of companions under the PSF, there are still significant limitations in separation and contrast near λ/D. Non-redundant aperture masking (NRM) interferometry can be used to detect companions well inside the PSF of a diffraction-limited image, though the mask discards ~95% of the light gathered by the telescope, and thus the technique is severely flux limited. Kernel-phase analysis applies interferometric techniques similar to NRM to a diffraction-limited image utilizing the full aperture. Instead of non-redundant closure phases, kernel-phases are constructed from a grid of points on the full aperture, simulating a redundant interferometer. I have developed a new, easy-to-use faint-companion detection pipeline which analyzes kernel-phases utilizing Bayesian model comparison. I demonstrate this pipeline on archival images from HST/NICMOS, searching for new companions in order to constrain binary formation models at separations inaccessible to previous techniques. Using this method, it is possible to detect a companion well within the classical λ/D Rayleigh diffraction limit using a fraction of the telescope time required by NRM. Since the James Webb Space Telescope (JWST) will be able to perform NRM observations, further development and characterization of kernel-phase analysis will allow efficient use of highly competitive JWST telescope time. As no mask is needed, this technique can easily be applied to archival data and even target acquisition images (e.g., from JWST), making the detection of close-in companions cheap and simple, as no additional observations are needed.
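Conceptually, kernel-phases are linear combinations of measured Fourier phases that lie in the left null space of the instrument's phase-transfer matrix, so pupil-plane phase errors cancel to first order. The sketch below illustrates that construction with a random stand-in matrix; it is not the author's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(40, 25))   # stand-in phase-transfer matrix: phi = A @ theta

# Rows of K span the left null space of A, so K @ A ~ 0.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
K = U[:, rank:].T

theta = rng.normal(size=25)                   # instrumental pupil phases
phi = A @ theta + 0.01 * rng.normal(size=40)  # measured phases plus noise
kernel_phases = K @ phi                       # observables immune to theta

print(np.allclose(K @ A, 0.0, atol=1e-8), kernel_phases.shape)
```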
Imaging and Data Acquisition in Clinical Trials for Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
FitzGerald, Thomas J., E-mail: Thomas.Fitzgerald@umassmed.edu; Bishop-Jodoin, Maryann; Followill, David S.
2016-02-01
Cancer treatment evolves through oncology clinical trials. Cancer trials are multimodal and complex. Assuring that high-quality data are available to answer not only study objectives but also questions not anticipated at study initiation is the role of quality assurance. The National Cancer Institute reorganized its cancer clinical trials program in 2014. The National Clinical Trials Network (NCTN) was formed, and within it was established a Diagnostic Imaging and Radiation Therapy Quality Assurance Organization. This organization is the Imaging and Radiation Oncology Core (IROC) Group, consisting of 6 quality assurance centers that provide imaging and radiation therapy quality assurance for the NCTN. Sophisticated imaging is used for cancer diagnosis, treatment, and management, as well as for image-driven technologies to plan and execute radiation treatment. Integration of imaging and radiation oncology data acquisition, review, management, and archive strategies is essential for trial compliance and future research. Lessons learned from previous trials provide evidence to support diagnostic imaging and radiation therapy data acquisition in NCTN trials.
Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle
NASA Technical Reports Server (NTRS)
Grosvenor, Sandy; Jones, Jeremy; Koratkar, Anuradha; Li, Connie; Mackey, Jennifer; Neher, Ken; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)
2001-01-01
A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations more efficiently. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper examines the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what have been its successes and challenges.
Rejected Manuscripts in Publishers' Archives: Legal Rights and Access
ERIC Educational Resources Information Center
Hamburger, Susan
2011-01-01
This article focuses on an analysis of how various archival repositories deal with rejected manuscripts in publishers' archives as part of existing collections and as potential donations, and includes suggestions for ways to provide access while maintaining the author's legal rights. Viewpoints from the journal editor, author, archivist, and…
Use of archive aerial photography for monitoring black mangrove populations
USDA-ARS?s Scientific Manuscript database
A study was conducted on the south Texas Gulf Coast to evaluate archive aerial color-infrared (CIR) photography combined with supervised image analysis techniques to quantify changes in black mangrove [Avicennia germinans (L.) L.] populations over a 26-year period. Archive CIR film from two study si...
State activism and the hidden incentives behind bank acquisitions.
Marquis, Christopher; Guthrie, Doug; Almandoz, Juan
2012-01-01
A number of studies have shown that, as a result of the ambiguity of US legal mandates, organizations have considerable latitude in how they comply with regulations. In this paper, we address how the different agendas of the federal and state governments increase ambiguities in state-firm relations and how states are interested actors in creating opportunities for firms to navigate the federal legislation. We analyze the institutional forces behind bank acquisitions within and across state lines in order to illuminate the ways that US states take advantage of federal ambiguity and are able to shape corporate practices to their benefit. We specifically examine how patterns of bank acquisitions are shaped by the crucial relationship between the federal Community Reinvestment Act (CRA) and a little-understood provision in the federal tax code that is implemented at the state level, the Low-Income Housing Tax Credit (LIHTC). The relationship is complex because, while the federal government uses the CRA to control bank acquisition activity, states promote use of the LIHTC, through which banks can address federal CRA concerns, and thereby promote bank acquisitions in their jurisdictions. Thus, our findings suggest that the implementation of social legislation at one level in a federal regulatory system undermines the mechanisms of social legislation at another level. We use archival research and in-depth interviews to examine the interaction between these institutional processes and formulate hypotheses that predict the ways in which bank acquisitions are constrained by banks' CRA ratings and the way states in turn help banks overcome their CRA constraints. Quantitative analyses of all bank acquisitions in the United States from 1990-2000 largely support these hypotheses.
A National Solar Digital Observatory
NASA Astrophysics Data System (ADS)
Hill, F.
2000-05-01
The continuing development of the Internet as a research tool, combined with an improving funding climate, has sparked new interest in the development of Internet-linked astronomical databases and analysis tools. Here I outline a concept for a National Solar Digital Observatory (NSDO), a set of data archives and analysis tools distributed in physical location among sites that already host such systems. A central web site would be implemented from which a user could search all of the component archives, select and download data, and perform analyses. Example components include NSO's Digital Library, containing its synoptic and GONG data, and the forthcoming SOLIS archive. Several other archives, in various stages of development, also exist. Potential analysis tools include content-based searches, visual programming tools, and graphics routines. The existence of an NSDO would greatly facilitate solar physics research, as a user would no longer need detailed knowledge of all solar archive sites. It would also improve public outreach efforts. The National Solar Observatory is operated by AURA, Inc. under a cooperative agreement with the National Science Foundation.
Acquisition and Post-Processing of Immunohistochemical Images.
Sedgewick, Jerry
2017-01-01
Augmentation of digital images is almost always a necessity in order to obtain a reproduction that matches the appearance of the original. However, that augmentation can mislead if it is done incorrectly and not within reasonable limits. When procedures are in place for ensuring that originals are archived, and image manipulation steps are reported, scientists not only follow good laboratory practices but also avoid ethical issues associated with post-processing and protect their labs from any future allegations of scientific misconduct. Also, when procedures are in place for correct acquisition of images, the extent of post-processing is minimized or eliminated. These procedures include white balancing (for brightfield images), keeping tonal values within the dynamic range of the detector, frame averaging to eliminate noise (typically in fluorescence imaging), use of the highest bit depth when a choice is available, flatfield correction, and archiving of the image in a non-lossy format (not JPEG). When post-processing is necessary, the commonly used applications for correction include Photoshop and ImageJ, but a free program (GIMP) can also be used. Corrections to images include scaling the bit depth to higher and lower ranges, removing color casts from brightfield images, setting brightness and contrast, reducing color noise, reducing "grainy" noise, conversion of pure colors to grayscale, conversion of grayscale to colors typically used in fluorescence imaging, correction of uneven illumination (flatfield correction), merging color images (fluorescence), and extending the depth of focus. These corrections are explained in step-by-step procedures in the chapter that follows.
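As a concrete example of one of the corrections listed above, a standard flatfield (uneven illumination) correction can be sketched as follows; the normalization to the flat's mean is one common convention among several.

```python
import numpy as np

def flatfield_correct(raw, flat, dark):
    """corrected = (raw - dark) / (flat - dark), rescaled to the flat's mean level."""
    num = raw.astype(float) - dark
    den = np.clip(flat.astype(float) - dark, 1e-6, None)  # guard against divide-by-zero
    return num / den * den.mean()
```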
Education Policy Analysis Archives, 1996.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
1997-01-01
This document consists of the 19 articles published in the Electronic Journal "Education Policy Analysis Archives" for the year 1996: (1) "The Achievement Crisis Is Real: A Review of 'The Manufactured Crisis'" (Lawrence C. Stedman); (2) "Staff Development Policy: Fuzzy Choices in an Imperfect Market" (Robert T.…
Education Policy Analysis Archives, 2000.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2000-01-01
This document consists of the 2000 edition of the "Education Policy Analysis Archives." The papers include: (1) "Teacher Quality and Student Achievement: A Review of State Policy Evidence" (Linda Darling-Hammond); (2) "America Y2K: The Obsolescence of Educational Reforms" (Sherman Dorn); (3) "Forces for Change in…
Education Policy Analysis Archives, 1998.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
1998-01-01
This document consists of the 21 articles published in the electronic journal "Education Policy Analysis Archives" for the year 1998. The articles are: (1) "The Political Legacy of School Accountability Systems" (Sherman Dorn); (2) "Review of Stephen Arons's 'Short Route to Chaos'" (Charles L. Glenn); (3)…
Loudig, Olivier; Wang, Tao; Ye, Kenny; Lin, Juan; Wang, Yihong; Ramnauth, Andrew; Liu, Christina; Stark, Azadeh; Chitale, Dhananjay; Greenlee, Robert; Multerer, Deborah; Honda, Stacey; Daida, Yihe; Spencer Feigelson, Heather; Glass, Andrew; Couch, Fergus J.; Rohan, Thomas; Ben-Dov, Iddo Z.
2017-01-01
Formalin-fixed paraffin-embedded (FFPE) specimens, when used in conjunction with patient clinical data history, represent an invaluable resource for molecular studies of cancer. Even though nucleic acids extracted from archived FFPE tissues are degraded, their molecular analysis has become possible. In this study, we optimized a laboratory-based next-generation sequencing barcoded cDNA library preparation protocol for analysis of small RNAs recovered from archived FFPE tissues. Using matched fresh and FFPE specimens, we evaluated the robustness and reproducibility of our optimized approach, as well as its applicability to archived clinical specimens stored for up to 35 years. We then evaluated this cDNA library preparation protocol by performing a miRNA expression analysis of archived breast ductal carcinoma in situ (DCIS) specimens, selected for their relation to the risk of subsequent breast cancer development and obtained from six different institutions. Our analyses identified six miRNAs (miR-29a, miR-221, miR-375, miR-184, miR-363, miR-455-5p) differentially expressed between DCIS lesions from women who subsequently developed an invasive breast cancer (cases) and women who did not develop invasive breast cancer within the same time interval (control). Our thorough evaluation and application of this laboratory-based miRNA sequencing analysis indicates that the preparation of small RNA cDNA libraries can reliably be performed on older, archived, clinically-classified specimens. PMID:28335433
Analysis of the request patterns to the NSSDC on-line archive
NASA Technical Reports Server (NTRS)
Johnson, Theodore
1994-01-01
NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data are gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth in the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large worldwide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of the service, log files have been maintained which record all accesses to the archive. In this report, we present an analysis of the NDADS log files. We analyze the log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
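To make the caching question concrete, the sketch below computes one simple reference-pattern statistic from such logs: the fraction of requests that re-reference a previously requested file, an upper bound on what an ideal cache could serve. The one-record-per-line log format here is hypothetical, not the actual NDADS format.

```python
from collections import Counter

def rereference_fraction(log_path):
    """Fraction of requests for files that were already requested earlier."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            timestamp, user, filename = line.split()[:3]  # assumed record layout
            counts[filename] += 1
    total = sum(counts.values())
    repeats = sum(c - 1 for c in counts.values())
    return repeats / total

# e.g. rereference_fraction("ndads_requests.log")
```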
Archival Services and Technologies for Scientific Data
NASA Astrophysics Data System (ADS)
Meyer, Jörg; Hardt, Marcus; Streit, Achim; van Wezel, Jos
2014-06-01
After analysis and publication, there is no need to keep experimental data online on spinning disks. For reliability and cost reasons, inactive data are moved to tape and put into a data archive. The data archive must provide reliable access for at least ten years, following a recommendation of the German Science Foundation (DFG), but many scientific communities wish to keep data available much longer. Data archival is, on the one hand, purely a bit-preservation activity that ensures the bits read are the same as those written years before. On the other hand, enough information must be archived to be able to use and interpret the content of the data. The latter depends on many factors, some community-specific, and remains an area of much debate among archival specialists. The paper describes the current practice of archival and bit preservation for different science communities at KIT, for which a combination of organizational services and technical tools is required. The special monitoring to detect tape-related errors, the software infrastructure in use, as well as the service certification are discussed. Plans and developments at KIT, also in the context of the Large Scale Data Management and Analysis (LSDMA) project, are presented. The technical advantages of the T10 SCSI Stream Commands (SSC-4) and the Linear Tape File System (LTFS) will have a profound impact on future long-term archival of large data sets.
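A minimal sketch of the bit-preservation half of this task: record fixity (checksum) information at ingest and re-verify it later to detect silent, for example tape-related, corruption. The JSON manifest layout is an assumption for illustration.

```python
import hashlib
import json
import pathlib

def checksum(path, algo="sha256", chunk=1 << 20):
    """Stream a file through a hash so arbitrarily large archives can be checked."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(manifest_path):
    """manifest: {file path: hex digest} recorded at ingest; returns a pass/fail map."""
    manifest = json.loads(pathlib.Path(manifest_path).read_text())
    return {p: checksum(p) == digest for p, digest in manifest.items()}
```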
Visual Systems for Interactive Exploration and Mining of Large-Scale Neuroimaging Data Archives
Bowman, Ian; Joshi, Shantanu H.; Van Horn, John D.
2012-01-01
While technological advancements in neuroimaging scanner engineering have improved the efficiency of data acquisition, electronic data capture methods will likewise significantly expedite the populating of large-scale neuroimaging databases. As they do, and as these archives grow in size, a particular challenge lies in examining and interacting with the information that these resources contain through the development of compelling, user-driven approaches for data exploration and mining. In this article, we introduce the informatics visualization for neuroimaging (INVIZIAN) framework for the graphical rendering of, and dynamic interaction with, the contents of large-scale neuroimaging data sets. We describe the rationale behind INVIZIAN, detail its development, and demonstrate its usage in examining a collection of over 900 T1-anatomical magnetic resonance imaging (MRI) image volumes from across a diverse set of clinical neuroimaging studies drawn from a leading neuroimaging database. Using a collection of cortical surface metrics and means for examining brain similarity, INVIZIAN graphically displays brain surfaces as points in a coordinate space and enables classification of clusters of neuroanatomically similar MRI images and data mining. As an initial step toward addressing the need for such user-friendly tools, INVIZIAN provides a highly unique means to interact with large quantities of electronic brain imaging archives in ways suitable for hypothesis generation and data mining. PMID:22536181
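The underlying idea, treating each scan's vector of cortical surface metrics as a point, embedding the collection in a low-dimensional coordinate space, and grouping neuroanatomically similar scans, can be sketched as follows with stand-in data; this is not the INVIZIAN implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
metrics = rng.normal(size=(900, 12))   # 900 scans x 12 surface metrics (stand-in)

coords = PCA(n_components=3).fit_transform(metrics)           # display coordinates
labels = KMeans(n_clusters=4, n_init=10).fit_predict(coords)  # similar-brain groups
print(coords.shape, np.bincount(labels))
```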
On detecting variables using ROTSE-IIId archival data
NASA Astrophysics Data System (ADS)
Yesilyaprak, C.; Yerli, S. K.; Aksaker, N.; Gucsav, B. B.; Kiziloglu, U.; Dikicioglu, E.; Coker, D.; Aydin, E.; Ozeren, F. F.
ROTSE (Robotic Optical Transient Search Experiment) telescopes can also be used for variable star detection. As explained in the system description (2003PASP..115..132A), they have good sky coverage and allow fast data acquisition. The optical magnitude range varies between 7 and 19 mag. Thirty percent of the telescope time of the north-eastern leg of the network, namely ROTSE-IIId (located at TUBITAK National Observatory, Bakirlitepe, Turkey, http://www.tug.tubitak.gov.tr/), is owned by Turkish researchers. Since its first light (May 2004), a considerably large amount of data (around 2 TB) has been collected from the Turkish time, and roughly one million objects have been identified from the reduced data. A robust pipeline has been constructed to discover new variables, transients, and planetary nebulae from this archival data. In the detection process, different statistical methods were applied to the archive. We have detected thousands of variable stars by applying roughly four different tests to the light curve of each star. In this work a summary of the pipeline is presented. It uses a high performance computing (HPC) algorithm which performs inhomogeneous ensemble photometry of the data on a 36-core cluster. This study is supported by TUBITAK (Scientific and Technological Research Council of Turkey) under grant number TBAG-108T475.
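One of the simplest light-curve tests such a battery could include is a reduced chi-squared against a constant-brightness model; the sketch below is illustrative and not necessarily one of the four tests the pipeline actually applies.

```python
import numpy as np

def reduced_chi2(mag, mag_err):
    """Reduced chi-squared of a light curve against its weighted mean magnitude."""
    mag, mag_err = np.asarray(mag, float), np.asarray(mag_err, float)
    mean = np.average(mag, weights=1.0 / mag_err**2)
    return float(np.sum(((mag - mean) / mag_err) ** 2) / (mag.size - 1))

# Stars whose light curves score well above 1 on several independent tests
# would be promoted to variable-star candidates.
```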
NASA Astrophysics Data System (ADS)
Saurel, Jean-Marie; Randriamora, Frédéric; Bosson, Alexis; Kitou, Thierry; Vidal, Cyril; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie
2010-05-01
Lesser Antilles observatories are in charge of monitoring the volcanoes and earthquakes in the Eastern Caribbean region. During the past two years, our seismic networks have evolved toward fully digital technology. These changes, which include modern three-component sensors, high-dynamic-range digitizers, and high-speed terrestrial and satellite telemetry, improve data quality but also increase the data flows to process and to store. Moreover, the generalization of data exchange to build a wide virtual seismic network around the Caribbean domain requires great flexibility to provide and receive data flows in various formats. Like many observatories, we have decided to use the most popular and robust open-source data acquisition systems in use in today's observatory community: EarthWorm and SeisComP. The former is renowned for its ability to process real-time seismic data flows, with a high number of tunable modules (filters, triggers, automatic pickers, locators). The latter is renowned for its ability to exchange seismic data using the international SEED standard (Standard for Exchange of Earthquake Data), either by producing archive files or by managing output and input SEEDLink flows. The French Antilles Seismological and Volcanological Observatories have chosen to take advantage of the best features of each software package to design a new data flow scheme and to integrate it into our global observatory data management system, WebObs [Beauducel et al., 2004]; see the companion paper (Part 2). We assigned tasks to the different packages according to their main abilities: - EarthWorm first performs the integration of data from different heterogeneous sources; - SeisComP takes this homogeneous EarthWorm data flow, adds other sources, and produces SEED archives and a SEED data flow; - EarthWorm is then used again to process this clean and complete SEEDLink data flow, mainly producing triggers, automatic locations, and alarms; - WebObs provides a friendly human interface, both to the administrator for station management and to the regular user for real-time everyday analysis of the seismic data (event classification database, location scripts, automatic shakemaps, and a regional catalog with associated hypocenter maps).
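Downstream, the SEED/miniSEED products of such a scheme can be consumed with standard community tools, for example a minimal ObsPy read; the archive file name here is hypothetical.

```python
from obspy import read

# Read a miniSEED file produced by the archiving chain and list its traces.
stream = read("archive/2010.05.17.MQ.mseed")
for trace in stream:
    print(trace.id, trace.stats.sampling_rate, trace.stats.npts)
```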
Education Policy Analysis Archives, 2001: Numbers 12-22.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2001-01-01
This document consists of articles 12-22 published in the electronic journal "Education Policy Analysis Archives" for the year 2001: (12) "Affirmative Action at Work: Performance Audit of Two Minority Graduate Fellowship Programs, Illinois IMGIP and ICEOP" (Jack McKillip); (13) "School Reform Initiatives as Balancing…
Education Policy Analysis Archives, 1999: Numbers 21-31.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
1999-01-01
This document consists of articles 21-31 published in the electronic journal "Education Policy Analysis Archives" for the year 1999: (21) "Facing the Consequences: Identifying Limitations of How We Categorize People in Research and Policy" (Cynthia Wallat and Carolyn Steele); (22) "Teachers in Charter Schools and…
The Potential of "Function" as an Archival Descriptor
ERIC Educational Resources Information Center
Chaudron, Gerald
2008-01-01
Functional analysis has been incorporated widely into appraisal methods for decades. These methods, from documentation strategy to macroappraisal, are discussed, and the usefulness and limitations of functional analysis in appraisal are examined. Yet, while archival thinkers have focused on function in appraisal, little has been written on…
NASA Astrophysics Data System (ADS)
Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.
2017-12-01
The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE-sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) focus on implementation of data standards in collaboration with the community; (4) support for community-built tools for data search, interpretation, analysis, and visualization; (5) a data fusion database to support search of the data extracted from submitted packages and of data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility, with replicas at DataONE nodes.
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc.), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs. proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system, which would benefit archive users and suppliers alike, is proposed.
NASA Astrophysics Data System (ADS)
Astrand, Par-Johan; Wirnhardt, Csaba; Biagini, Bruno; Weber, Michaela; Hellerman, Rani
2004-11-01
Since 1993, the EC DG Agriculture has promoted the use of "Controls with Remote Sensing" (CwRS) as an appropriate control system within the Common Agricultural Policy (CAP). CwRS is considered suitable to check whether agricultural area-based subsidies (yearly more than 25 billion euro of EC expenditure) are correctly granted. On the basis of Council Regulation (EC) 165/94 and Commission Regulation (EC) 601/94, the Commission Services are required to centralize the satellite image acquisition. This task has been managed since 1999 by the MARS Project at the JRC, where the whole controls activity is coordinated. The activity also includes the setting up of specifications and recommendations, the performance of quality controls (QC) and audits of the selected contractors, and the evaluation of new methods. Satellite image acquisition involves the control site definition within each Member State and the subsequent chain of image acquisition over the defined sites, including feasibility checks with image providers, acquisition, validation, ordering, delivery, and final archiving of the imagery. In summary, the 2004 campaign involved a budget of approximately 3.2 M euro to cover some 150 High Resolution (HR) sites and 71 Very High Resolution (VHR) sites. The objective of this paper is to describe CwRS image acquisition for the 2004 CwRS campaign, with emphasis on the Ikonos, QuickBird, and EROS A satellites, and to give preliminary results, recommendations, and future trends.
Life Testing and Diagnostics of a Planar Out-of-Core Thermionic Converter
NASA Astrophysics Data System (ADS)
Thayer, Kevin L.; Ramalingam, Mysore L.; Young, Timothy J.; Lamp, Thomas R.
1994-07-01
This paper details the design and performance of an automated computer data acquisition system for a planar, out-of-core thermionic converter with CVD rhenium electrodes. The output characteristics of this converter have been mapped for emitter temperatures ranging from approximately 1700 K to 2000 K, and life testing of the converter is presently being performed at the design point of operation. An automated data acquisition system has been constructed to facilitate the collection of current density versus output voltage (J-V) and temperature data from the converter throughout the life test. This system minimizes the amount of human interaction necessary during the life test to measure and archive the data and present it in a usable form. The task was accomplished using a Macintosh IIcx computer, two multiple-purpose interface boards, a digital oscilloscope, a sweep generator, and National Instruments' LabVIEW application software package.
Education Policy Analysis Archives, 1999: Numbers 1-20.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
1999-01-01
This document consists of articles 1-20 published in the electronic journal "Education Policy Analysis Archives" for the year 1999: (1) "Ethnic Segregation in Arizona Charter Schools" (Casey D. Cobb and Gene V. Glass); (2) "Educational Research in Latin America: A Response to Akkari and Perez" (Mariano Narodowski);…
Education Policy Analysis Archives, 2002: Numbers 26-50.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
This document consists of articles 26 through 50 published in the electronic journal "Education Policy Analysis Archives" for the year 2002: (26) "Home Schooling in the United States: Trends and Characteristics" (Kurt J. Bauman); (27) "Mentoring Narratives ON-LINE: Teaching the Principalship" (Alison I. Griffith and Svitlana Taraban); (28) "Elm…
Education Policy Analysis Archives, 2002: Numbers 1-25.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2002-01-01
This document consists of articles 1 through 25 published in the electronic journal Education Policy Analysis Archives for the year 2002: (1) Testing and Diversity in Postsecondary Education: The Case of California (Daniel Koretz, Michael Russell, Chingwei David Shin, Cathy Horn, and Kelly Shasby); (2) State-Mandated Testing and Teachers' Beliefs…
Education Policy Analysis Archives, 2001: Numbers 23-45.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2001-01-01
This document consists of articles 23-45 published in the electronic journal "Education Policy Analysis Archives" for the year 2001: (23) "La Participacion de las Minorias Nacionales dentro de Sistemas Educativas Pre-Modernos: El Caso de los Garifunas de Guatemala" (Carlos R. Ruano); (24) "'Alexander v. Sandoval': A…
The Semantic Mapping of Archival Metadata to the CIDOC CRM Ontology
ERIC Educational Resources Information Center
Bountouri, Lina; Gergatsoulis, Manolis
2011-01-01
In this article we analyze the main semantics of archival description, expressed through Encoded Archival Description (EAD). Our main target is to map the semantics of EAD to the CIDOC Conceptual Reference Model (CIDOC CRM) ontology as part of a wider integration architecture of cultural heritage metadata. Through this analysis, it is concluded…
NASA Astrophysics Data System (ADS)
Petitjean, Gilles; de Hauteclocque, Bertrand
2004-06-01
EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, French MOD, US DOD, etc.), especially in the Earth Observation and Meteorology fields. The concern of owners of valuable data is not only its long-term preservation but also the integration of the archive into their information system, with, in particular, efficient access to archived data for their user community. The system integrator answers this requirement with a methodology combining an understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability. The system integrator completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on the analysis of the user needs. The system integrator will recommend the best compromise for implementing an archive management facility, thanks to its knowledge and independence of the storage market and through the analysis of the user requirements. It will provide a solution that is able to evolve to take advantage of storage technology progress. But preserving data for the long term is not only a question of storage technology. Some functions are required to secure the archive management system against contingency situations: multiple data set copies using operational procedures, active quality control of the archived data, and a migration policy optimising the cost of ownership.
Subino, Janice A.; Morgan, Karen L.M.; Krohn, M. Dennis; Miller, Gregory K.; Dadisman, Shawn V.; Forde, Arnell S.
2012-01-01
To view the survey maps and navigation files, and for more information about these items, see the Navigation page. Figure 1 displays the acquisition geometry. The tables provide detailed information about the assigned location, name, date, and time each photograph was taken, along with links to the photo and the corresponding 5-min contact sheet. Refer to table 1 and table 2 for details of the northern and southern county photographs, respectively.
BOREAS Level-0 ER-2 Navigation Data
NASA Technical Reports Server (NTRS)
Strub, Richard; Dominguez, Roseanne; Newcomer, Jeffrey A.; Hall, Forrest G. (Editor)
2000-01-01
The BOREAS Staff Science effort covered those activities that were BOREAS community-level activities or required uniform data collection procedures across sites and time. These activities included the acquisition, processing, and archiving of aircraft navigation/attitude data to complement the digital image data. The level-0 ER-2 navigation data files contain aircraft attitude and position information acquired during the digital image and photographic data collection missions. Temporally, the data were acquired from April to September 1994. Data were recorded at intervals of 5 seconds. The data are stored in tabular ASCII files.
NASA Technical Reports Server (NTRS)
Noll, C.; Lee, L.; Torrence, M.
2011-01-01
The International Laser Ranging Service (ILRS) website, http://ilrs.gsfc.nasa.gov, is the central source of information for all aspects of the service. The website provides information on the organization and operation of ILRS and descriptions of ILRS components, data, and products. Furthermore, the website provides an entry point to the archive of these data and products available through the data centers. Links are provided to extensive information on the ILRS network stations, including performance assessments and data quality evaluations. Descriptions of supported satellite missions (current, future, and past) are provided to aid in station acquisition and data analysis. The current format of the ILRS website has been in use since the early years of the service. Starting in 2010, the ILRS Central Bureau began efforts to redesign the look and feel of the website. The update will allow for a review of the contents, ensuring information is current and useful. This poster will detail the proposed design, including specific examples of key sections and webpages.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ermi, A.M.
1997-05-01
Description of the Proposed Activity/REPORTABLE OCCURRENCE or PIAB: This ECN changes the computer system design description support document describing the computer system used to control, monitor, and archive the processes and outputs associated with the Hydrogen Mitigation Test Pump installed in SY-101. There is no new activity or procedure associated with the updating of this reference document. The updating of this computer system design description maintains an agreed-upon documentation program initiated within the test program and carried into operations at the time of turnover, to maintain configuration control as outlined by design authority practicing guidelines. There are no new credible failure modes associated with the updating of information in a support description document. The failure analysis of each change was reviewed at the time of implementation of the Systems Change Request for all the processes changed. This document simply provides a history of implementation and current system status.
The LBT real-time based control software to mitigate and compensate vibrations
NASA Astrophysics Data System (ADS)
Borelli, J.; Trowitzsch, J.; Brix, M.; Kürster, M.; Gässler, W.; Bertram, T.; Briegel, F.
2010-07-01
The Large Binocular Telescope (LBT) uses two 8.4-meter active primary mirrors and two adaptive secondary mirrors on the same mounting to take advantage of its interferometric capabilities. Both applications, interferometry and AO, are sensitive to vibrations. Several measurement campaigns have been carried out at the LBT, and their results strongly indicate that a vibration monitoring system is required to improve the performance of LINC-NIRVANA, LBTI, and ARGOS, the laser-guided ground-layer adaptive optics system. Currently, control software for the mitigation and compensation of the vibrations is being designed. A complex set of algorithms collects real-time vibration data, archiving it for further analysis and, in parallel, generating the tip-tilt and optical path difference (OPD) data for the control loop of the instruments. A real-time data acquisition device equipped with embedded real-time Linux is used in our systems. A set of quick-look tools is currently under development in order to verify whether the conditions at the telescope are suitable for interferometric/adaptive observations.
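One common way to turn an accelerometer channel from such a system into an OPD-like displacement signal is double integration with detrending; the sketch below illustrates the idea and is not the LBT algorithm.

```python
import numpy as np

def accel_to_displacement(accel, fs):
    """Twice-integrate an acceleration record (m/s^2, sampled at fs Hz) to meters."""
    dt = 1.0 / fs
    accel = np.asarray(accel, float) - np.mean(accel)  # crude detrend to limit drift
    vel = np.cumsum(accel) * dt
    vel -= vel.mean()
    return np.cumsum(vel) * dt
```

In practice such pipelines high-pass filter the signal before integrating, since any residual bias in the accelerometer grows quadratically after double integration.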
USGS remote sensing coordination for the 2010 Haiti earthquake
Duda, Kenneth A.; Jones, Brenda
2011-01-01
In response to the devastating 12 January 2010 earthquake in Haiti, the US Geological Survey (USGS) provided essential coordinating services for remote sensing activities. Communication was rapidly established between the widely distributed response teams and data providers to define imaging requirements and sensor tasking opportunities. Data acquired from a variety of sources were received and archived by the USGS, and these products were subsequently distributed using the Hazards Data Distribution System (HDDS) and other mechanisms. Within six weeks after the earthquake, over 600,000 files representing 54 terabytes of data were provided to the response community. The USGS directly supported a wide variety of groups in their use of these data to characterize post-earthquake conditions and to make comparisons with pre-event imagery. The rapid and continuing response achieved was enabled by existing imaging and ground systems, and skilled personnel adept in all aspects of satellite data acquisition, processing, distribution, and analysis. The information derived from image interpretation assisted senior planners and on-site teams to direct assistance where it was most needed.
The Queued Service Observing Project at CFHT
NASA Astrophysics Data System (ADS)
Martin, Pierre; Savalle, Renaud; Vermeulen, Tom; Shapiro, Joshua N.
2002-12-01
In order to maximize the scientific productivity of the CFH12K mosaic wide-field imager (and soon MegaCam), the Queued Service Observing (QSO) mode was implemented at the Canada-France-Hawaii Telescope at the beginning of 2001. The QSO system consists of an ensemble of software components allowing for the submission of programs, the preparation of queues, and finally the execution and evaluation of observations. The QSO project is part of a broader system known as the New Observing Process (NOP). This system includes data acquisition, data reduction and analysis through a pipeline named Elixir, and a data archiving and distribution component (DADS). In this paper, we review several technical and operational aspects of the QSO project. In particular, we present our strategy, technical architecture, program submission system, and the tools developed for the preparation and execution of the queues. Our successful experience of over 150 nights of QSO operations is also discussed along with the future plans for queue observing with MegaCam and other instruments at CFHT.
Software used with the flux mapper at the solar parabolic dish test site
NASA Technical Reports Server (NTRS)
Miyazono, C.
1984-01-01
Software for data archiving and data display was developed for use on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high-radiant-energy scanning device designed to measure radiant flux energies expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site, with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.
2010-01-01
In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 tells us the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study and the data were collected during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high-resolution, shallow-penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov.
The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
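The SEG-Y header convention described above can be checked programmatically. The following Python sketch reads the 3,200-byte card image (textual) header of a SEG-Y file and decodes it as ASCII (rev 1, as used in this archive), falling back to EBCDIC (rev 0); the file name is a placeholder, not an actual file from the archive.

```python
# Minimal sketch: inspect the 3,200-byte SEG-Y card image (textual) header.
# SEG-Y rev 1 permits ASCII here; rev 0 used EBCDIC (code page 'cp500').

def read_textual_header(path):
    with open(path, "rb") as f:
        raw = f.read(3200)
    # Try ASCII first (rev 1, as in this archive); fall back to EBCDIC (rev 0).
    try:
        text = raw.decode("ascii")
    except UnicodeDecodeError:
        text = raw.decode("cp500")  # EBCDIC code page
    # The header is conventionally 40 "card images" of 80 characters each.
    return [text[i:i + 80] for i in range(0, 3200, 80)]

if __name__ == "__main__":
    for card in read_textual_header("07scc01_line.sgy"):  # hypothetical file
        print(card.rstrip())
```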
The Alaska Arctic Vegetation Archive (AVA-AK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Donald; Breen, Amy; Druckenmiller, Lisa
2016-05-17
The Alaska Arctic Vegetation Archive (AVA-AK, GIVD-ID: NA-US-014) is a free, publicly available database archive of vegetation-plot data from the Arctic tundra region of northern Alaska. The archive currently contains 24 datasets with 3,026 non-overlapping plots. Of these, 74% have geolocation data with 25-m or better precision. Species cover data and header data are stored in a Turboveg database. A standardized Pan Arctic Species List provides a consistent nomenclature for vascular plants, bryophytes, and lichens in the archive. A web-based online Alaska Arctic Geoecological Atlas (AGA-AK) allows viewing and downloading the species data in a variety of formats, and provides access to a wide variety of ancillary data. We conducted a preliminary cluster analysis of the first 16 datasets (1,613 plots) to examine how the spectrum of derived clusters is related to the suite of datasets, habitat types, and environmental gradients. Here, we present the contents of the archive, assess its strengths and weaknesses, and provide three supplementary files that include the data dictionary, a list of habitat types, an overview of the datasets, and details of the cluster analysis.
On-the-fly Data Reprocessing and Analysis Capabilities from the XMM-Newton Archive
NASA Astrophysics Data System (ADS)
Ibarra, A.; Sarmiento, M.; Colomo, E.; Loiseau, N.; Salgado, J.; Gabriel, C.
2017-10-01
Since its most recent release, the XMM-Newton Science Archive (XSA) offers on-the-fly data processing with SAS through the Remote Interface for Science Analysis (RISA) server. It enables scientists to analyse data without downloading or installing any data or software. The analysis options presently available include extraction of spectra and light curves of user-defined EPIC source regions, as well as full reprocessing of data for which the currently archived pipeline products were processed with older SAS versions or calibration files. The current pipeline is fully aligned with the most recent SAS and calibration, while the last full reprocessing of the archive was performed in 2013. The on-the-fly data processing functionality in this release is an experimental version, and we invite the community to test it and let us know their results. Known issues and workarounds are described in the 'Watchouts' section of the XSA web page. Feedback on how this functionality should evolve will be highly appreciated.
NASA Technical Reports Server (NTRS)
Noll, Carey
2006-01-01
The IGS analysis centers and user community in general need to be assured that the data centers archive a consistent set of files. Changes to the archives can occur because of the re-publishing of data, the transmission of historic data, and the resulting re-distribution (or lack thereof) of these data from data center to data center. To ensure the quality of the archives, a defined data flow and method of archive population needs to be established. This poster will diagram and review the current IGS data flow, discuss problems that have occurred, and provide recommendations for improvement.
NASA Astrophysics Data System (ADS)
Haley, M.
The purpose of this study was to investigate whether there have been successful applications of lean manufacturing principles in highly variable defense IT environments. Specifically, the study assessed whether implementation of lean philosophies by a defense organization yielded repeatable, predictable results in software release schedule reductions. Additionally, the study set out to determine what potential critical success factors (CSFs) were documented in the secondary data captured for each release, and extracted the variables used in the decision making for acceptability of fielding. In evaluating lean applicability to the high-variability environment of USAF IT acquisitions, the research was conducted using non-experimental quantitative methods on archival secondary data. The sample for this case study was compiled from a USAF office that had implemented these techniques in the pre-development, development and testing, and fielding phases. Based on the research data, acquisitionists and lean practitioners are inherently interconnected. Therefore, an understanding that critical success factors (CSFs) are integral to successful lean application in DoD IT acquisitions is crucial. Through a combination of synergistic alignments, plyometric CSFs were discovered to maximize the effects of each single CSF to produce rapid results in defense IT acquisitions. These include: (1) Enterprise Incorporation, (2) Team Trust, (3) Transformational Leadership, (4) Recursive Improvement, (5) Integrated Synergy, (6) Customer-Centric Culture, and (7) Heuristic Communication.
Data Acquisition Backbone Core DABC release v1.0
NASA Astrophysics Data System (ADS)
Adamczewski-Musch, J.; Essel, H. G.; Kurz, N.; Linev, S.
2010-04-01
The Data Acquisition Backbone Core (DABC) is a general-purpose software framework designed for the implementation of a wide range of data acquisition systems - from various small detector test beds to high-performance systems. DABC consists of a compact data-flow kernel and a number of plug-ins for various functional components like data inputs, device drivers, user functional modules, and applications. DABC provides configurable components for implementing event building over fast networks like InfiniBand or Gigabit Ethernet. A generic Java GUI provides dynamic control and visualization of control parameters and commands, provided by DIM servers. A first set of application plug-ins has been implemented to use DABC as event builder for the front-end components of the GSI standard DAQ system MBS (Multi Branch System). Another application covers the connection of DAQ readout chains from detector front-end boards (N-XYTER) linked to read-out controller boards (ROC) over UDP into DABC for event building, archiving, and data serving. This was applied for data taking in the September 2008 test beamtime for the CBM experiment at GSI. DABC version 1.0 is released and available from the website.
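As an illustration of the kernel-plus-plug-in pattern the abstract describes, here is a minimal Python sketch; the class names and interfaces are hypothetical and do not reproduce the actual DABC C++ API.

```python
# Hypothetical sketch of a data-flow kernel with pluggable functional
# components (input, event builder, archiver), mirroring the architecture
# described above. Not the real DABC interface.
import random

class Module:
    """Base class for plug-in components with a single process() hook."""
    def process(self, event):
        raise NotImplementedError

class RandomInput:
    """Stands in for a detector front-end delivering raw events."""
    def __init__(self, n):
        self.n = n
    def events(self):
        for i in range(self.n):
            yield {"id": i, "adc": [random.randint(0, 4095) for _ in range(4)]}

class EventBuilder(Module):
    """Combines raw data into a built event (trivial here)."""
    def process(self, event):
        event["sum"] = sum(event["adc"])
        return event

class Archiver(Module):
    """Stands in for the archiving/data-serving stage."""
    def process(self, event):
        print("archived event", event["id"], "sum =", event["sum"])
        return event

def run(source, pipeline):
    # The "kernel": pushes each event through the configured plug-ins.
    for ev in source.events():
        for mod in pipeline:
            ev = mod.process(ev)

run(RandomInput(3), [EventBuilder(), Archiver()])
```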
The Transformation of Federal Education Policy: The Kennedy and Johnson Years.
ERIC Educational Resources Information Center
Graham, Hugh Davis
Archive-based historical analysis brings a perspective to policy studies that is lacking in individual case studies. The recently opened Kennedy and Johnson archives facilitate an internal analysis of the evolution of education policy formulation in the 1960s from the perspective of the executive branch. The central thread of continuity for such…
Education Policy Analysis Archives, 2001: Numbers 1-11.
ERIC Educational Resources Information Center
Glass, Gene V., Ed.
2001-01-01
This document consists of articles 1-11 published in the electronic journal "Education Policy Analysis Archives" for the year 2001: (1) "School Segregation of Children Who Migrate to the United States from Puerto Rico" (Luis M. Laosa); (2) "Testing Times: A School Case Study" (Ivor Goodson and Martha Foote); (3) "Impact of U.S. Overseas Schools in…
The MATISSE analysis of large spectral datasets from the ESO Archive
NASA Astrophysics Data System (ADS)
Worley, C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Vernisse, Y.; Ordenovic, C.; Bijaoui, A.
2010-12-01
The automated stellar classification algorithm MATISSE has been developed at the Observatoire de la Côte d'Azur (OCA) in order to determine stellar temperatures, gravities, and chemical abundances for large datasets of stellar spectra. The Gaia Data Processing and Analysis Consortium (DPAC) has selected MATISSE as one of the key programmes to be used in the analysis of the Gaia Radial Velocity Spectrometer (RVS) spectra. MATISSE is currently being used to analyse large datasets of spectra from the ESO archive with the primary goal of producing advanced data products to be made available in the ESO database via the Virtual Observatory. This is also an invaluable opportunity to identify and address issues that can be encountered in the analysis of large samples of real spectra prior to the launch of Gaia in 2012. The analysis of the archived spectra of the FEROS spectrograph is currently underway, and preliminary results are presented.
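MATISSE derives each stellar parameter by projecting an observed spectrum onto basis vectors built from a grid of synthetic spectra. The sketch below illustrates that projection idea in Python with synthetic stand-in data; it is a schematic simplification, not the MATISSE code itself (which constructs its basis functions by local multi-linear regression and iterates).

```python
# Schematic illustration of a MATISSE-style projection step: a basis
# vector B is fit so that spectrum . B ~ Teff, then an observed spectrum
# is parameterised by a single dot product. All arrays are stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_grid = 500, 50

# Synthetic training grid: spectra S (n_grid x n_pix) with known Teff values.
teff = rng.uniform(4000, 6500, n_grid)
S = rng.normal(1.0, 0.01, (n_grid, n_pix)) - 1e-5 * np.outer(teff, np.ones(n_pix))

# Least-squares basis vector B such that S @ B ~ teff.
B, *_ = np.linalg.lstsq(S, teff, rcond=None)

# "Observed" spectrum: a grid spectrum plus noise; the parameter estimate
# is its projection onto B.
obs = S[10] + rng.normal(0, 0.001, n_pix)
print("true Teff:", round(teff[10], 1), "estimated:", round(obs @ B, 1))
```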
High-performance mass storage system for workstations
NASA Technical Reports Server (NTRS)
Chiang, T.; Tang, Y.; Gupta, L.; Cooperman, S.
1993-01-01
Reduced Instruction Set Computer (RISC) workstations and Personal Computers (PCs) are very popular tools for office automation, command and control, scientific analysis, database management, and many other applications. However, when running Input/Output (I/O) intensive applications, RISC workstations and PCs are often overburdened with the tasks of collecting, staging, storing, and distributing data. Also, even with standard high-performance peripherals and storage devices, the I/O function can still be a common bottleneck. Therefore, the high-performance mass storage system, developed by Loral AeroSys' Independent Research and Development (IR&D) engineers, can offload a RISC workstation of I/O-related functions and provide high-performance I/O functions and external interfaces. The high-performance mass storage system has the capabilities to ingest high-speed real-time data, perform signal or image processing, and stage, archive, and distribute the data. This mass storage system uses a hierarchical storage structure, thus reducing the total data storage cost while maintaining high I/O performance. The high-performance mass storage system is a network of low-cost parallel processors and storage devices. The nodes in the network have special I/O functions such as: SCSI controller, Ethernet controller, gateway controller, RS232 controller, IEEE488 controller, and digital/analog converter. The nodes are interconnected through high-speed direct memory access links to form a network. The topology of the network is easily reconfigurable to maximize system throughput for various applications. This high-performance mass storage system takes advantage of a 'busless' architecture for maximum expandability. The mass storage system consists of magnetic disks, a WORM optical disk jukebox, and an 8mm helical scan tape to form a hierarchical storage structure. Commonly used files are kept on magnetic disk for fast retrieval. The optical disks are used as archive media, and the tapes are used as backup media. The storage system is managed by the IEEE mass storage reference model-based UniTree software package. UniTree software keeps track of all files in the system, automatically migrates lesser-used files to archive media, and stages the files when needed by the system. The user can access the files without knowledge of their physical location. The high-performance mass storage system developed by Loral AeroSys will significantly boost system I/O performance and reduce the overall data storage cost. This storage system provides a highly flexible and cost-effective architecture for a variety of applications (e.g., real-time data acquisition with a signal and image processing requirement, long-term data archiving and distribution, and image analysis and enhancement).
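The migrate-on-age behaviour attributed to UniTree above can be illustrated with a short sketch. This Python fragment implements a toy policy that moves files unused for a set period from a fast tier to an archive tier; the paths and threshold are invented for illustration, and a real hierarchical storage manager would also record the new location so files remain transparently accessible.

```python
# Toy hierarchical-storage migration policy: files not accessed recently
# are moved from fast (disk) storage to an archive tier. Illustrative only.
import os, shutil, time

FAST_TIER = "/data/disk"        # hypothetical magnetic-disk cache
ARCHIVE_TIER = "/data/archive"  # hypothetical optical/tape-backed store
MAX_AGE_DAYS = 30               # migrate files unused for this long

def migrate_cold_files():
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for name in os.listdir(FAST_TIER):
        src = os.path.join(FAST_TIER, name)
        # st_atime is the last access time recorded by the filesystem.
        if os.path.isfile(src) and os.stat(src).st_atime < cutoff:
            shutil.move(src, os.path.join(ARCHIVE_TIER, name))
            print("migrated", name)

if __name__ == "__main__":
    migrate_cold_files()
```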
Online discussion boards in dental education: potential and challenges.
Linjawi, A I; Walmsley, A D; Hill, K B
2012-02-01
Online discussion boards may enhance critical analysis and reflection and promote the acquisition of knowledge. This study assessed the effectiveness of an online discussion board as a pedagogical tool for augmenting face-to-face teaching in dental education. Data were collected from a discussion archive offered through the E-course website of the School of Dentistry, University of Birmingham, UK, in 2008. A multi-component metric included participation, social learning, cognitive processing, the role of instructors, and quality of discussion. Messages were coded for 14 variables to evaluate these dimensions. Data were analyzed using content analysis methodology, and a complete message was used as the unit of analysis. There was no significant difference in participation between students and instructors (P<0.05). Social interaction with peers appeared only through students posting messages with open questions (27/135 messages). The discussion board was mainly used by students to understand concepts (27/102 messages) and apply procedural knowledge (17/102 messages). Instructors were mainly replying to students' messages with (49/120 messages) or without (54/120 messages) proposing another action. Online discussion boards were found to be successful pedagogical tools in dental education. Further development of an instructor-led discussion approach is needed to promote higher-level learning and collaborative thinking. © 2011 John Wiley & Sons A/S.
Detecting Suspended Sediments from Remote Sensed Data in the Northern Gulf of Mexico
NASA Astrophysics Data System (ADS)
Hardin, D. M.; Graves, S. J.; Hawkins, L.; He, M.; Smith, T.; Drewry, M.; Ebersole, S.; Travis, A.; Thorn, J.; Brown, B.
2012-12-01
The Sediment Analysis Network for Decision Support (SANDS) project utilized remotely sensed data from Landsat and MODIS, acquired both prior to and following hurricane landfall, to investigate suspended sediment and sediment redistribution. The satellite imagery was enhanced by applying a combination of cluster-busting and classification techniques to color and infrared bands. Results from the process show patterns associated with sediment transport and deposition related to coastal processes, storm-related sediment transport, post-storm pollutant transport, and sediment-current interactions. Imagery prior to and following landfall is shown to the left for Landsat and to the right for MODIS. Scientific analysis and production of enhanced imagery were conducted by the Geological Survey of Alabama. The Information Technology and Systems Center at the University of Alabama in Huntsville was responsible for data acquisition, development of the SANDS data portal, and the archive and distribution through the Global Hydrology Resource Center, one of NASA's Earth Science Data Centers. SANDS data may be obtained from the GHRC at ghrc.nsstc.nasa.gov and from the SANDS data portal at sands.itsc.uah.edu. This project was funded by the NASA Applied Sciences Division.
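For readers unfamiliar with unsupervised classification of the kind mentioned above, the following sketch applies k-means clustering to multiband pixels with scikit-learn; it is a generic stand-in run on fake data, not the actual SANDS enhancement procedure.

```python
# Illustrative only: unsupervised k-means classification of multiband
# pixels, a generic analogue of the classification enhancement described.
import numpy as np
from sklearn.cluster import KMeans

# Fake scene: 100x100 pixels x 3 bands (e.g., red, green, near-infrared).
rng = np.random.default_rng(1)
scene = rng.random((100, 100, 3))

pixels = scene.reshape(-1, 3)                 # one row per pixel
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
classified = labels.reshape(100, 100)         # thematic class map
print(np.bincount(labels))                    # pixel count per class
```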
Initial Experience With A Prototype Storage System At The University Of North Carolina
NASA Astrophysics Data System (ADS)
Creasy, J. L.; Loendorf, D. D.; Hemminger, B. M.
1986-06-01
A prototype archiving system manufactured by the 3M Corporation has been in place at the University of North Carolina for approximately 12 months. The system was installed as a result of a collaboration between 3M and UNC, with 3M seeking testing of their system and UNC realizing the need for an archiving system as an essential part of their PACS test-bed facilities. System hardware includes appropriate network and disk interface devices as well as media for both short- and long-term storage of images and their associated information. The system software includes those procedures necessary to communicate with the network interface elements (NIEs) as well as those necessary to interpret the ACR-NEMA header blocks and to store the images. A subset of the total ACR-NEMA header is parsed and stored in a relational database system. The entire header is stored on disk with the completed study. Interactive programs have been developed that allow radiologists to easily retrieve information about the archived images and to send the full images to a viewing console. Initial experience with the system has consisted primarily of hardware and software debugging. Although the system is ACR-NEMA compatible, further objective and subjective assessments of system performance are awaiting the connection of compatible consoles and acquisition devices to the network.
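The header-subset-to-relational-database pattern described here is easy to sketch. The Python fragment below stores a few parsed header fields in SQLite while pointing back to the full header on disk; the field names and file paths are hypothetical, and the ACR-NEMA parsing itself is not shown.

```python
# Sketch: index a parsed subset of image-header fields in a relational
# database while keeping the full header on disk. Fields are invented.
import sqlite3

def store_header_subset(db, header):
    db.execute("""CREATE TABLE IF NOT EXISTS studies
                  (patient_id TEXT, study_date TEXT, modality TEXT,
                   header_path TEXT)""")
    db.execute("INSERT INTO studies VALUES (?, ?, ?, ?)",
               (header["patient_id"], header["study_date"],
                header["modality"], header["header_path"]))
    db.commit()

db = sqlite3.connect("archive_index.db")
store_header_subset(db, {"patient_id": "P0001", "study_date": "1986-06-01",
                         "modality": "CR", "header_path": "/archive/P0001.hdr"})
print(db.execute("SELECT * FROM studies").fetchall())
```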
NASA Technical Reports Server (NTRS)
Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)
2001-01-01
The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize the responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows-oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC-based data acquisition systems provide. Two PC-based data acquisition systems were assembled to acquire, process, distribute, and provide real-time processing for several types of transducers used in the SA3 DVT. A high-sample-rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.
Maloney, Stephen; Haas, Romi; Keating, Jenny L; Molloy, Elizabeth; Jolly, Brian; Sims, Jane; Morgan, Prue; Haines, Terry
2012-04-02
The introduction of Web-based education and open universities has seen an increase in access to professional development within the health professional education marketplace. Economic efficiencies of Web-based education and traditional face-to-face educational approaches have not been compared under randomized controlled trial conditions. To compare costs and effects of Web-based and face-to-face short courses in falls prevention education for health professionals, we designed two short courses to improve the clinical performance of health professionals in exercise prescription for falls prevention. One was developed for delivery in face-to-face mode and the other for online learning. Data were collected on learning outcomes including participation, satisfaction, knowledge acquisition, and change in practice, and combined with costs, savings, and benefits, to enable a break-even analysis from the perspective of the provider, a cost-effectiveness analysis from the perspective of the health service, and a cost-benefit analysis from the perspective of the participant. Face-to-face and Web-based delivery modalities produced comparable outcomes for participation, satisfaction, knowledge acquisition, and change in practice. Break-even analysis identified the Web-based educational approach to be robustly superior to face-to-face education, requiring a lower number of enrollments for the program to reach its break-even point. Cost-effectiveness analyses from the perspective of the health service and cost-benefit analysis from the perspective of the participant favored face-to-face education, although the outcomes were contingent on the sensitivity analysis applied (eg, the fee structure used). The Web-based educational approach was clearly more efficient from the perspective of the education provider. In the presence of relatively equivocal results for comparisons from other stakeholder perspectives, it is likely that providers would prefer to deliver education via a Web-based medium. Trial registration: Australian New Zealand Clinical Trials Registry (ACTRN): 12610000135011; http://www.anzctr.org.au/trial_view.aspx?id=335135 (Archived by WebCite at http://www.webcitation.org/668POww4L). PMID:22469659
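The break-even comparison reported above reduces to a simple calculation: the provider breaks even when enrollment revenue covers fixed plus per-enrollee costs. A minimal Python sketch, with entirely hypothetical figures rather than the study's data:

```python
# Generic break-even sketch for a course offering. All figures invented.

def break_even_enrollments(fixed_cost, fee, variable_cost_per_student):
    margin = fee - variable_cost_per_student
    if margin <= 0:
        raise ValueError("fee must exceed per-student cost")
    # Round up: you cannot enrol a fraction of a student.
    return -(-fixed_cost // margin)

print("face-to-face:", break_even_enrollments(10000, 300, 150))  # venue, staff travel
print("web-based:  ", break_even_enrollments(4000, 300, 20))     # hosting only
```

With these made-up numbers the Web-based course breaks even at far fewer enrollments, mirroring the direction of the study's finding from the provider's perspective.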
The European Radiobiology Archives (ERA)--content, structure and use illustrated by an example.
Gerber, G B; Wick, R R; Kellerer, A M; Hopewell, J W; Di Majo, V; Dudoignon, N; Gössner, W; Stather, J
2006-01-01
The European Radiobiology Archives (ERA), supported by the European Commission and the European Late Effect Project Group (EULEP), together with the US National Radiobiology Archives (NRA) and the Japanese Radiobiology Archives (JRA), have collected all information still available on long-term animal experiments, including some selected human studies. The archives consist of a database in Microsoft Access, a website, databases of references, and information on the use of the database. At present, the archives contain a description of the exposure conditions, animal strains, etc. from approximately 350,000 individuals; data on survival and pathology are available for approximately 200,000 individuals. Care has been taken to render pathological diagnoses compatible among different studies and to allow the lumping of pathological diagnoses into more general classes. 'Forms' in Access with an underlying computer code facilitate the use of the database. This paper describes the structure and content of the archives and illustrates, with an example, a possible analysis of such data.
Solomon, April D; Hytinen, Madison E; McClain, Aryn M; Miller, Marilyn T; Dawson Cruz, Tracey
2018-01-01
DNA profiles have been obtained from fingerprints, but there is limited knowledge regarding DNA analysis from archived latent fingerprints: touch DNA "sandwiched" between adhesive and paper. Thus, this study sought to comparatively analyze a variety of collection and analytical methods in an effort to establish an optimized workflow for this specific sample type. Untreated and treated archived latent fingerprints were utilized to compare different biological sampling techniques, swab diluents, DNA extraction systems, DNA concentration practices, and post-amplification purification methods. Archived latent fingerprints disassembled and sampled via direct cutting, followed by DNA extraction using the QIAamp® DNA Investigator Kit and concentration with Centri-Sep™ columns, increased the odds of obtaining an STR profile. Using the recommended DNA workflow, 9 of the 10 samples provided STR profiles, which included 7-100% of the expected STR alleles and two full profiles. Thus, with carefully selected procedures, archived latent fingerprints can be a viable DNA source for criminal investigations, including cold/postconviction cases. © 2017 American Academy of Forensic Sciences.
ASF archive issues: Current status, past history, and questions for the future
NASA Technical Reports Server (NTRS)
Goula, Crystal A.; Wales, Carl
1994-01-01
The Alaska SAR Facility (ASF) collects, processes, archives, and distributes data from synthetic aperture radar (SAR) satellites in support of scientific research. ASF has been in operation since 1991 and presently has an archive of over 100 terabytes of data. ASF is performing an analysis of its magnetic tape storage system to ensure long-term preservation of this archive. Future satellite missions have the possibility of doubling to tripling the amounts of data that ASF acquires. ASF is examining the current data systems and the high volume storage, and exploring future concerns and solutions.
Harrison, Arnell S.; Dadisman, Shawn V.; McBride, W. Scott; Flocks, James G.; Wiese, Dana S.
2009-01-01
In May of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys in Lake Panasoffkee, located in central Florida, as part of the USGS Lakes and Coastal Aquifers (LCA) study. This report serves as an archive of unprocessed digital boomer and Compressed High Intensity Radar Pulse (CHIRP)* seismic reflection data, trackline maps, navigation files, Field Activity Collection System (FACS) logs, Geographic Information System (GIS) files, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles and geospatially corrected interactive profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. *Due to poor data acquisition conditions associated with the lake bottom sediments, only two CHIRP tracklines were collected during this field activity. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are provided. The USGS Florida Integrated Science Center (FISC) - St. Petersburg assigns a unique identifier to each cruise or field activity. For example, 08LCA03 tells us the data were collected in 2008 for the Lakes and Coastal Aquifers (LCA) study and the data were collected during the third field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer and c for CHIRP), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and, when discharged, emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the boomer acquisition geometry. The EdgeTech SB-424 CHIRP system used for this survey has a vertical resolution of 4 - 8 cm, a penetration depth that is usually less than 2 m beneath the seafloor, and uses a signal of continuously varying frequency. The towfish is a sound source and receiver, which is typically towed 2 - 5 m above the seafloor. 
The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms); the resulting profile is a two-dimensional vertical image of the shallow geologic structure beneath the ship track. Figure 2 displays the acquisition geometry for the CHIRP system. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics.
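The naming conventions spelled out above (field-activity IDs such as 08LCA03 and line names of the form yye##a) lend themselves to simple parsing. A Python sketch follows, assuming all two-digit years fall in the 2000s as in these surveys:

```python
# Parse the report's naming conventions: field-activity IDs ("08LCA03")
# and seismic line names ("08b03a": year, equipment code, track, section).
import re

ACTIVITY = re.compile(r"^(\d{2})([A-Z]+)(\d{2})$")
LINE = re.compile(r"^(\d{2})([a-z])(\d{2})([a-z]?)$")

def parse_activity(s):
    year, study, number = ACTIVITY.match(s).groups()
    return {"year": 2000 + int(year), "study": study, "activity": int(number)}

def parse_line(s):
    year, equip, track, section = LINE.match(s).groups()
    kinds = {"b": "boomer", "c": "CHIRP"}
    return {"year": 2000 + int(year), "equipment": kinds.get(equip, equip),
            "track": int(track), "section": section or None}

print(parse_activity("08LCA03"))
print(parse_line("08b03a"))
```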
Mergers, Acquisitions, and Access: STM Publishing Today
NASA Astrophysics Data System (ADS)
Robertson, Kathleen
Electronic publishing is changing the fundamentals of the entire printing/delivery/archive system that has served as the distribution mechanism for scientific research over the last century and a half. The merger-mania of the last 20 years, preprint pools, and publishers' licensing and journals-bundling plans are among the phenomena impacting the scientific information field. Science-Technology-Medical (STM) publishing is experiencing a period of intense consolidation and reorganization. This paper gives an overview of the economic factors fueling these trends, the major STM publishers, and the government regulatory bodies that referee this industry in Europe, Canada, and the USA.
NASA Astrophysics Data System (ADS)
Elias-Rosa, N.; Pastorello, A.; Benetti, S.; Cappellaro, E.; Pignata, G.; Takats, K.; Valenti, S.; Harutyunyan, A.
2013-06-01
We report the identification of potential candidates for the progenitor star of the Type Ic PSN J12015272-1852183 discovered by the CHASE survey (CBAT TOCP) from archival Hubble Space Telescope (HST) WFPC2 images in F555W (~V) and F814W (~I). These images were obtained from 2008-12-02 to 2009-02-09 UT for HST proposal 11962, PI: A. Riess. We used a good-quality acquisition image of the SN obtained at the TNG Telescope (+Dolores) on June 23.88 UT to geometrically match the HST mosaics.
Storm Prediction Center - Sounding Analysis Archive
The SPC generates sounding analyses each hour and will also re-run old hours to fill in late data. NOTE: This data is experimental and may not…
ERIC Educational Resources Information Center
Lagone, Elizabeth; Mathur, Sanyukta; Nakyanjo, Neema; Nalugoda, Fred; Santelli, John
2014-01-01
Uganda is recognised as an early success story in the HIV epidemic at least in part due to an open and vigorous national dialogue about HIV prevention. This study examined the national discourse about HIV, AIDS, and young people in New Vision, Uganda's leading national newspaper between 1996 and 2011, building from a previous archival analysis of…
Perinatal acquisition of drug-resistant HIV-1 infection: mechanisms and long-term outcome
Delaugerre, Constance; Chaix, Marie-Laure; Blanche, Stephane; Warszawski, Josiane; Cornet, Dorine; Dollfus, Catherine; Schneider, Veronique; Burgard, Marianne; Faye, Albert; Mandelbrot, Laurent; Tubiana, Roland; Rouzioux, Christine
2009-01-01
Background Primary HIV-1 infection in newborns that occurs under antiretroviral prophylaxis carries a high risk of drug-resistance acquisition. We examine the frequency and the mechanisms of resistance acquisition at the time of infection in newborns. Patients and Methods We studied HIV-1-infected infants born between 01 January 1997 and 31 December 2004 and enrolled in the ANRS-EPF cohort. HIV-1-RNA and HIV-1-DNA samples obtained perinatally from the newborn and mother were subjected to population-based and clonal analyses of drug resistance. If positive, serial samples were obtained from the child for resistance testing. Results Ninety-two HIV-1-infected infants were born during the study period. Samples were obtained from 32 mother-child pairs and from another 28 newborns. Drug resistance was detected in 12 newborns (20%): drug resistance to nucleoside reverse transcriptase inhibitors was seen in 10 cases, non-nucleoside reverse transcriptase inhibitors in two cases, and protease inhibitors in one case. For 9 children, the detection of the same resistance mutations in mothers' samples (6 among 10 available) and in newborn lymphocytes (6/8) suggests that the newborn was initially infected by a drug-resistant strain. Resistance variants were either transmitted from mother to child or selected during subsequent temporal exposure under suboptimal perinatal prophylaxis. Follow-up studies of the infants showed that the resistance pattern remained stable over time, regardless of antiretroviral therapy, suggesting the early cellular archiving of resistant viruses. The absence of resistance in the mothers of the other three children (3/10) and in neonatal lymphocytes (2/8) suggests that these newborns were infected by a wild-type strain, without long-term persistence of resistance when suboptimal prophylaxis was stopped. Conclusion This study confirms the importance of early resistance genotyping of HIV-1-infected newborns. In most cases (75%), drug resistance was archived in the cellular reservoir and persisted during infancy, with or without antiretroviral treatment. This finding stresses the need for effective antiretroviral treatment of pregnant women. PMID:19765313
48 CFR 215.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 48 (Federal Acquisition Regulations System), Vol. 3, 2011-10-01: Section 215.404, Proposal analysis, under the Defense Acquisition Regulations System, Department of Defense.
48 CFR 215.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 48 (Federal Acquisition Regulations System), Vol. 3, 2010-10-01: Section 215.404, Proposal analysis, under the Defense Acquisition Regulations System, Department of Defense.
NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.
2011-12-01
Data Stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, there is additional information from the product generation and science teams needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIP Federation meeting, NASA is developing a set of Earth science data and information content requirements for long-term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, useable, and reproducible. Our experience suggests data centers know what to preserve in most cases, i.e., the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe application of information content requirements, guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and Science Investigator-Led Processing System to the Distributed Active Archive Center.
NASA Astrophysics Data System (ADS)
Hay, D. Robert; Brassard, Michel; Matthews, James R.; Garneau, Stephane; Morchat, Richard
1995-06-01
The convergence of a number of contemporary technologies with increasing demands for improvements in inspection capabilities in maritime applications has created new opportunities for ultrasonic inspection. An automated ultrasonic inspection and data collection system, APHIUS (automated pressure hull intelligent ultrasonic system), incorporates hardware and software developments to meet specific requirements for maritime vessels, in particular submarines in the Canadian Navy. Housed within a hardened portable computer chassis, instrumentation for digital ultrasonic data acquisition and transducer position measurement provides new capabilities that meet the more demanding requirements for inspection of the aging submarine fleet. Digital data acquisition enables a number of important new capabilities, including archiving of the complete inspection session, interpretation assistance through imaging, and automated interpretation using artificial intelligence methods. With this new reliable inspection system, in conjunction with a complementary study of the significance of real defect type and location, comprehensive new criteria can be generated which will eliminate unnecessary defect removal. As a consequence, cost savings will be realized through shortened submarine refit schedules.
GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.
2012-12-01
Data Stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However there is additional information from the product generation and science teams needed to ensure the observations will be useful for long term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation, preparing for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to ensure future researchers that the scientific results are accurate, understandable, and useable. Our experience suggests data centers know what to preserve in most cases. That is, the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information gathering with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the NSIDC Distributed Active Archive Center. This presentation will also cover how we envision user support through the years of the Long-Term Archive.
The Ties That Bind: Materiality, Identity, and the Life Course in the "Things" Families Keep.
Gloyn, Liz; Crewe, Vicky; King, Laura; Woodham, Anna
2018-04-01
Using an interdisciplinary research methodology across three archaeological and historical case studies, this article explores "family archives." Four themes illustrate how objects held in family archives, curation practices, and intergenerational narratives reinforce a family's sense of itself: people-object interactions, gender, socialization and identity formation, and the "life course." These themes provide a framework for professional archivists to assist communities and individuals working with their own family archives. We argue that the family archive, broadly defined, encourages a more egalitarian approach to history. We suggest a multiperiod analysis draws attention to historical forms of knowledge and meaning-making practices over time. PMID:29593371
DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Gibson, James N.; Wiese, Dana S.
2012-01-01
Data were collected aboard the U.S. Army Corps of Engineers (USACE) SV Irvington, a 56-foot (ft) Kvichak Marine Industries, Inc., catamaran (fig. 2). Side scan sonar and multibeam bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the starboard side just slightly behind the vessel, close to the seafloor. The multibeam transducer was attached to a retractable strut-arm lowered between the catamaran hulls. Navigation was acquired with an Applanix POS MV and differentially corrected using the broadcast signal from a local National Geodetic Survey (NGS) Continuously Operating Reference Station (CORS) beacon. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed using HYPACK Inc., HYSWEEP software at the USACE Mobile, Ala., District office. For more information on processing refer to the Equipment and Processing page. Chirp seismic data were also collected during this survey and are archived separately.
Cawthon, M A
1999-05-01
The Department of Defense (DoD) undertook a major systems specification, acquisition, and implementation project of multivendor picture archiving and communications system (PACS) and teleradiology systems during 1997 with deployment of the first systems in 1998. These systems differ from their DoD predecessor system in being multivendor in origin, specifying adherence to the developing Digital Imaging and Communications in Medicine (DICOM) 3.0 standard and all of its service classes, emphasizing open architecture, using personal computer (PC) and web-based image viewing access, having radiologic telepresence over large geographic areas as a primary focus of implementation, and requiring bidirectional interfacing with the DoD hospital information system (HIS). The benefits and advantages to the military health-care system accrue through the enabling of a seamless implementation of a virtual radiology operational environment throughout this vast healthcare organization providing efficient general and subspecialty radiologic interpretive and consultative services for our medical beneficiaries to any healthcare provider, anywhere and at any time of the night or day.
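For a sense of what DICOM 3.0 adherence buys downstream users, the snippet below reads standard attributes from a DICOM file with the open-source pydicom library; this is a modern illustrative tool, not the DoD software described above, and the file path is hypothetical.

```python
# Illustrative DICOM 3.0 read using the open-source pydicom library.
import pydicom

ds = pydicom.dcmread("chest_cr.dcm")  # hypothetical file
# Standard DICOM attributes shared across conforming vendors.
print("Patient ID:", ds.PatientID)
print("Modality:  ", ds.Modality)
print("Study date:", ds.StudyDate)
pixels = ds.pixel_array  # requires pixel data and a suitable image handler
print("Image size:", pixels.shape)
```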
Architecture for a PACS primary diagnosis workstation
NASA Astrophysics Data System (ADS)
Shastri, Kaushal; Moran, Byron
1990-08-01
A major factor in determining the overall utility of a medical Picture Archiving and Communications System (PACS) is the functionality of the diagnostic workstation. Meyer-Ebrecht and Wendler [1] have proposed a modular picture computer architecture with high throughput, and Perry et al. [2] have defined performance requirements for radiology workstations. In order to be clinically useful, a primary diagnosis workstation must not only provide the functions of current viewing systems (e.g., mechanical alternators [3,4]) such as acceptable image quality, simultaneous viewing of multiple images, and rapid switching of image banks, but must also provide a diagnostic advantage over the current systems. This includes window-level functions on any image, simultaneous display of multi-modality images, rapid image manipulation, image processing, dynamic image display (cine), electronic image archival, hardcopy generation, image acquisition, network support, and an easy user interface. Implementation of such a workstation requires an underlying hardware architecture which provides high-speed image transfer channels, local storage facilities, and image processing functions. This paper describes the hardware architecture of the Siemens Diagnostic Reporting Console (DRC), which meets these requirements.
Oceans 2.0: Interactive tools for the Visualization of Multi-dimensional Ocean Sensor Data
NASA Astrophysics Data System (ADS)
Biffard, B.; Valenzuela, M.; Conley, P.; MacArthur, M.; Tredger, S.; Guillemot, E.; Pirenne, B.
2016-12-01
Ocean Networks Canada (ONC) operates ocean observatories on all three of Canada's coasts. The instruments produce 280 gigabytes of data per day, with half a petabyte archived so far. In 2015, 13 terabytes were downloaded by over 500 users from across the world. ONC's data management system is referred to as "Oceans 2.0" owing to its interactive, participative features. A key element of Oceans 2.0 is real-time data acquisition and processing: custom device drivers implement the input-output protocol of each instrument. Automatic parsing and calibration take place on the fly, followed by event detection and quality control. All raw data are stored in a file archive, while the processed data are copied to fast databases. Interactive access to processed data is provided through data download and visualization/quick-look features that are adapted to diverse data types (scalar, acoustic, video, multi-dimensional, etc.). Data may be post- or re-processed to add features or analysis, correct errors, update calibrations, and so on. A robust storage structure has been developed, consisting of an extensive file system and a NoSQL database (Cassandra). Cassandra is a node-based, open-source, distributed database management system. It is scalable and offers improved performance for big data. A key feature is data summarization. The system has also been integrated with web services and an ERDDAP OPeNDAP server, capable of serving scalar and multidimensional data from Cassandra for fixed or mobile devices. A complex data viewer has been developed making use of the big-data capability to interactively display live or historic echo sounder and acoustic Doppler current profiler data, where users can scroll, apply processing filters, and zoom through gigabytes of data with simple interactions. This new technology brings scientists one step closer to a comprehensive, web-based data analysis environment in which visual assessment, filtering, event detection, and annotation can be integrated.
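The scalar-data path into Cassandra might look like the following sketch, written with the DataStax Python driver; the keyspace, table, and schema are invented for illustration and are not ONC's actual data model.

```python
# Sketch: write processed scalar sensor readings to Cassandra.
# Keyspace, table, and schema are hypothetical.
from datetime import datetime, timezone
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect()
session.execute("""CREATE KEYSPACE IF NOT EXISTS oceans
                   WITH replication = {'class': 'SimpleStrategy',
                                       'replication_factor': 1}""")
session.execute("""CREATE TABLE IF NOT EXISTS oceans.scalar_data
                   (device_id text, ts timestamp, value double, qc int,
                    PRIMARY KEY (device_id, ts))""")
# Partitioning by device_id keeps each instrument's time series together,
# which suits time-range queries like the quick-look plots described above.
insert = session.prepare(
    "INSERT INTO oceans.scalar_data (device_id, ts, value, qc) VALUES (?, ?, ?, ?)")
session.execute(insert, ("CTD-001", datetime.now(timezone.utc), 7.82, 1))
```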
NASA Astrophysics Data System (ADS)
Smale, Alan P.
2018-06-01
The High Energy Astrophysics Science Archive Research Center (HEASARC) is NASA's primary archive for high energy astrophysics and cosmic microwave background (CMB) data, supporting the broad science goals of NASA's Physics of the Cosmos theme. It provides vital scientific infrastructure to the community by standardizing science data formats and analysis programs, providing open access to NASA resources, and implementing powerful archive interfaces. These enable multimission studies of key astronomical targets, and deliver a major cost savings to NASA and proposing mission teams in terms of a reusable science infrastructure, as well as a time savings to the astronomical community through not having to learn a new analysis system for each new mission. The HEASARC archive holdings are currently in excess of 100 TB, supporting seven active missions (Chandra, Fermi, INTEGRAL, NICER, NuSTAR, Swift, and XMM-Newton), and providing continuing access to data from over 40 missions that are no longer in operation. HEASARC scientists are also engaged with the upcoming IXPE and XARM missions, and with many other Probe, Explorer, SmallSat, and CubeSat proposing teams. Within the HEASARC, the LAMBDA CMB thematic archive provides a permanent archive for NASA mission data from WMAP, COBE, IRAS, SWAS, and a wide selection of suborbital missions and experiments, and hosts many other CMB-related datasets, tools, and resources. In this talk I will summarize the current activities of the HEASARC and our plans for the coming decade. In addition to mission support, we will expand our software and user interfaces to provide astronomers with new capabilities to access and analyze HEASARC data, and continue to work with our Virtual Observatory partners to develop and implement standards to enable improved interrogation and analysis of data regardless of wavelength regime, mission, or archive boundaries. The future looks bright for high energy astrophysics, and the HEASARC looks forward to continuing its central role in the community.
The Legacy Archive for Microwave Background Data Analysis (LAMBDA)
NASA Astrophysics Data System (ADS)
Miller, Nathan; LAMBDA
2018-01-01
The Legacy Archive for Microwave Background Data Analysis (LAMBDA) provides CMB researchers with archival data for cosmology missions, software tools, and links to other sites of interest. LAMBDA is one-stop shopping for CMB researchers. It hosts data from WMAP along with many suborbital experiments. Over the past year, LAMBDA has acquired new data from SPTpol, SPIDER and ACTPol. In addition to the primary CMB, LAMBDA also provides foreground data. LAMBDA has several ongoing efforts to provide tools for CMB researchers. These tools include a web interface for CAMB and a web interface for a CMB survey footprint database and plotting tool. Additionally, we have recently developed a Docker container with standard CMB analysis tools and demonstrations in the form of Jupyter notebooks. These containers will be publicly available through Docker's container repository and the source will be available on GitHub.
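The CAMB web interface mentioned above wraps the kind of computation shown in this short sketch with the camb Python package; the parameter values are illustrative, not LAMBDA defaults.

```python
# Sketch of a lensed CMB temperature power spectrum with the camb
# package; cosmological parameter values are illustrative only.
import camb

pars = camb.CAMBparams()
pars.set_cosmology(H0=67.5, ombh2=0.022, omch2=0.122)
pars.InitPower.set_params(As=2e-9, ns=0.965)
pars.set_for_lmax(2500, lens_potential_accuracy=1)

results = camb.get_results(pars)
powers = results.get_cmb_power_spectra(pars, CMB_unit="muK")
# powers["total"][:, 0] holds lensed TT D_ell values indexed by ell.
print(powers["total"][2:10, 0])
```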
The AMBRE Project: Stellar parameterisation of the ESO:UVES archived spectra
NASA Astrophysics Data System (ADS)
Worley, C. C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Bijaoui, A.
2016-06-01
Context. The AMBRE Project is a collaboration between the European Southern Observatory (ESO) and the Observatoire de la Côte d'Azur (OCA) that has been established to determine the stellar atmospheric parameters for the archived spectra of four ESO spectrographs. Aims: The analysis of the UVES archived spectra for their stellar parameters was completed in the third phase of the AMBRE Project. From the complete ESO:UVES archive dataset that was received, covering the period 2000 to 2010, 51 921 spectra for the six standard setups were analysed. These correspond to approximately 8014 distinct targets (comprising stellar and non-stellar objects), identified by radial coordinate search. Methods: The AMBRE analysis pipeline integrates spectral normalisation, cleaning and radial velocity correction procedures so that the UVES spectra can be analysed automatically with the stellar parameterisation algorithm MATISSE to obtain the stellar atmospheric parameters. The synthetic grid against which the MATISSE analysis is carried out is currently constrained to parameters of FGKM stars only. Results: Stellar atmospheric parameters are reported for 12 403 of the 51 921 UVES archived spectra analysed in AMBRE:UVES. This equates to ~23.9% of the sample and ~3708 stars. Effective temperature, surface gravity, metallicity, and alpha element to iron ratio abundances are provided for 10 212 spectra (~19.7%), while effective temperature at least is provided for the remaining 2191 spectra. Radial velocities are reported for 36 881 (~71.0%) of the analysed archive spectra. While parameters were determined for 32 306 (62.2%) spectra, these parameters were not considered reliable (and thus not reported to ESO) for reasons such as very low S/N, poor radial velocity determination, spectral features too broad for analysis, and technical issues in the reduction. Similarly, the parameters of a further 7212 spectra (13.9%) were not reported to ESO based on quality criteria and error analysis determined within the automated parameterisation process. These tests lead us to expect that multi-component stellar systems will return high errors in radial velocity and in fitting to the synthetic spectra, and therefore will not have parameters reported to ESO. Typical external errors of σTeff ~ 110 K, σlog g ~ 0.18 dex, σ[M/H] ~ 0.13 dex, and σ[α/Fe] ~ 0.05 dex, with some variation between giants and dwarfs and between setups, are reported. Conclusions: UVES is used to observe an extensive collection of stellar and non-stellar objects, all of which have been included in the archived dataset provided to OCA by ESO. The AMBRE analysis extracts those objects that lie within the FGKM parameter space of the AMBRE slow-rotating synthetic spectra grid. Thus, by homogeneous blind analysis, AMBRE has successfully extracted and parameterised the targeted FGK stars (23.9% of the analysed sample) from within the ESO:UVES archive.
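The radial velocity correction step can be illustrated with a toy cross-correlation sketch (a generic illustration, not the AMBRE pipeline code): shift the observed spectrum across trial velocities and keep the velocity that best matches a synthetic template.

```python
# Toy cross-correlation radial velocity estimate with numpy.
import numpy as np

C_KMS = 299792.458  # speed of light, km/s

def radial_velocity(wave, flux, template_flux, v_grid_kms):
    """Return the trial velocity maximizing correlation with a template
    sampled on the same rest-frame wavelength grid."""
    scores = []
    for v in v_grid_kms:
        # Doppler-shift the observed wavelengths back to the rest frame.
        shifted = wave / (1.0 + v / C_KMS)
        resampled = np.interp(wave, shifted, flux)
        scores.append(np.dot(resampled - resampled.mean(),
                             template_flux - template_flux.mean()))
    return v_grid_kms[int(np.argmax(scores))]

wave = np.linspace(500.0, 501.0, 2000)  # nm, toy grid
template = 1.0 - 0.8 * np.exp(-0.5 * ((wave - 500.5) / 0.01) ** 2)
# Fake an observed spectrum redshifted by +30 km/s.
observed = np.interp(wave, wave * (1.0 + 30.0 / C_KMS), template)
print(radial_velocity(wave, observed, template, np.arange(-100, 100, 1.0)))
```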
Anderson, Bradley W; Suh, Yun-Suhk; Choi, Boram; Lee, Hyuk-Joon; Yab, Tracy C; Taylor, William; Dukek, Brian A; Berger, Calise K; Cao, Xiaoming; Foote, Patrick H; Devens, Mary E; Boardman, Lisa A; Kisiel, John B; Mahoney, Douglas W; Slettedahl, Seth W; Allawi, Hatim T; Lidgard, Graham P; Smyrk, Thomas C; Yang, Han-Kwang; Ahlquist, David A
2018-05-29
Gastric adenocarcinoma (GAC) is the third most common cause of cancer mortality worldwide. Accurate and affordable non-invasive detection methods have potential value for screening and surveillance. Herein, we identify novel methylated DNA markers (MDMs) for GAC, validate their discrimination for GAC in tissues from geographically separate cohorts, explore marker acquisition through the oncogenic cascade, and describe distributions of candidate MDMs in plasma from GAC cases and normal controls. Following discovery by unbiased whole methylome sequencing, candidate MDMs were validated by blinded methylation-specific PCR in archival case-control tissues from U.S. and South Korean patients. Top MDMs were then assayed by an analytically sensitive method (quantitative real-time allele-specific target and signal amplification) in a blinded pilot study on archival plasma from GAC cases and normal controls. Whole methylome discovery yielded novel and highly discriminant candidate MDMs. In tissue, a panel of candidate MDMs detected GAC in 92-100% of U.S. and S. Korean cohorts at 100% specificity. Levels of most MDMs increased progressively from normal mucosa through metaplasia, adenoma, and GAC with variation in points of greatest marker acquisition. In plasma, a 3 marker panel ( ELMO1 , ZNF569 , C13orf18) detected 86% (95% CI 71-95%) of GACs at 95% specificity. Novel MDMs appear to accurately discriminate GAC from normal controls in both tissue and plasma. The point of aberrant methylation during oncogenesis varies by MDM, which may have relevance to marker selection in clinical applications. Further exploration of these MDMs for GAC screening and surveillance is warranted. Copyright ©2018, American Association for Cancer Research.
A prototype for automation of land-cover products from Landsat Surface Reflectance Data Records
NASA Astrophysics Data System (ADS)
Rover, J.; Goldhaber, M. B.; Steinwand, D.; Nelson, K.; Coan, M.; Wylie, B. K.; Dahal, D.; Wika, S.; Quenzer, R.
2014-12-01
Landsat data records of surface reflectance provide a three-decade history of land surface processes. Due to the vast number of these archived records, development of innovative approaches for automated data mining and information retrieval was necessary. Recently, we created a prototype utilizing open source software libraries for automatically generating annual Anderson Level 1 land cover maps and information products from data acquired by the Landsat Mission for the years 1984 to 2013. The automated prototype was applied to two target areas in northwestern and east-central North Dakota, USA. The approach required the National Land Cover Database (NLCD) and two user-input target acquisition year-days. The Landsat archive was mined for scenes acquired within a 100-day window surrounding these target dates, and then cloud-free pixels were chosen closest to the specified target acquisition dates. The selected pixels were then composited before completing an unsupervised classification using the NLCD. Pixels unchanged in pairs of the NLCD were used for training decision tree models in an iterative process refined with model confidence measures. The decision tree models were applied to the Landsat composites to generate a yearly land cover map and related information products. Results for the target areas captured changes associated with the recent expansion of oil shale production and agriculture driven by economics and policy, such as the increase in biofuel production and the reduction in the Conservation Reserve Program. Changes in agriculture, grasslands, and surface water reflect the local hydrological conditions that occurred during the 29-year span. Future enhancements considered for this prototype include a web-based client, ancillary spatial datasets, trends and clustering algorithms, and the forecasting of future land cover.
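A minimal sketch of the training step described above, with synthetic stand-ins for the composites and NLCD labels (hypothetical band values and classes; not the prototype's code):

```python
# Fit a decision tree on pixels whose NLCD label is unchanged between
# two epochs, then classify a composited Landsat scene.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_pixels, n_bands = 5000, 6

composite = rng.random((n_pixels, n_bands))   # stacked band values
nlcd_2001 = rng.integers(1, 5, n_pixels)      # toy Anderson classes
nlcd_2006 = nlcd_2001.copy()
churn = rng.random(n_pixels) < 0.1            # ~10% of pixels change
nlcd_2006[churn] = rng.integers(1, 5, churn.sum())

# Train only on pixels unchanged between the two NLCD epochs.
stable = nlcd_2001 == nlcd_2006
model = DecisionTreeClassifier(max_depth=8).fit(
    composite[stable], nlcd_2001[stable])

land_cover = model.predict(composite)             # yearly map, flattened
confidence = model.predict_proba(composite).max(axis=1)
print(land_cover[:10], confidence[:10].round(2))
```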
Data Processing, Visualization and Distribution for Support of Science Programs in the Arctic Ocean
NASA Astrophysics Data System (ADS)
Johnson, P. D.; Edwards, M. H.; Wright, D.
2006-12-01
For the past two years, the Hawaii Mapping Research Group (HMRG) and Oregon State University researchers have been building an on-line archive of geophysical data for the Arctic Basin. This archive is known as AAGRUUK - the Arctic Archive for Geophysical Research: Unlocking Undersea Knowledge (http://www.soest.hawaii.edu/hmrg/Aagruuk). It contains a wide variety of data, including bathymetry, sidescan and subbottom data collected by: 1) U.S. Navy nuclear-powered submarines during the Science Ice Exercises (SCICEX), 2) icebreakers such as the USCGC Healy, R/V Nathaniel B. Palmer, and CCGS Amundsen, and 3) historical depth soundings from the T3 ice camp and pre-1990 nuclear submarine missions. Instead of simply soliciting data, reformatting it, and serving it to the community, we have focused our efforts on producing and serving an integrated dataset. We pursued this path after experimenting with dataset integration and discovering a multitude of problems, including navigational inconsistencies and systemic offsets produced by acquiring data in an ice-covered ocean. Our goal in addressing these problems, integrating the processed datasets, and producing a data compilation was to prevent the myriad researchers interested in these datasets, many of whom have less experience processing geophysical data than HMRG personnel, from having to repeat the same data processing efforts. For investigators interested in pursuing their own data processing approaches, AAGRUUK also serves most of the raw data included in the data compilation, as well as processed versions of individual datasets. The archive also provides downloadable static chart sets for users who desire derived products for inclusion in reports, planning documents, etc. We are currently testing a prototype mapserver that allows maps of the cleaned datasets to be accessed interactively, as well as providing access to the edited files that make up the datasets. Previously, we documented the types of problems encountered in a general way. Over the past year we have integrated two terabytes of data, which allows us to comment on system performance in a much broader context. In this presentation we will show the types of error for each data acquisition system and for different operating conditions (e.g., ice cover, time of year). Our error analysis both illuminates our approach to data processing and serves as a guide for choosing, when possible, the type of instruments and the optimal time to conduct these types of surveys in ice-covered oceans.
A PACS archive architecture supported on cloud services.
Silva, Luís A Bastião; Costa, Carlos; Oliveira, José Luis
2012-05-01
Diagnostic imaging procedures have continuously increased over the last decade and this trend may continue in coming years, creating a great impact on the storage and retrieval capabilities of current PACS. Moreover, many smaller centers do not have the financial resources or requirements to justify the acquisition of a traditional infrastructure. Alternative solutions, such as cloud computing, may help address this emerging need. A tremendous amount of ubiquitous computational power, such as that provided by Google and Amazon, is used every day as a normal commodity. Taking advantage of this new paradigm, an architecture for a cloud-based PACS archive that provides data privacy, integrity, and availability is proposed. The solution is independent of the cloud provider, and the core modules were successfully instantiated on two cloud computing providers. Operational metrics for several medical imaging modalities were tabulated and compared for Google Storage, Amazon S3, and LAN PACS. A PACS-as-a-Service archive that provides storage of medical studies using the cloud was developed. The results show that the solution is robust and that it is possible to store, query, and retrieve all desired studies in a similar way as in a local PACS approach. Cloud computing is an emerging solution that promises high scalability of infrastructures, software, and applications, according to a "pay-as-you-go" business model. The presented architecture uses the cloud to set up medical data repositories and can have a significant impact on healthcare institutions by reducing IT infrastructures.
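A minimal sketch of the storage step under such a cloud-PACS model, using boto3 against Amazon S3 (bucket and key names are hypothetical; a real deployment would add anonymization and client-side encryption to meet the stated privacy goals):

```python
# Push a DICOM object to web object storage with server-side encryption.
# Bucket, key, and file paths below are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="study_001/series_004/img_0042.dcm",
    Bucket="example-pacs-archive",                # hypothetical bucket
    Key="studies/1.2.840.example/img_0042.dcm",   # hypothetical key
    ExtraArgs={"ServerSideEncryption": "AES256"},
)
```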
NASA Astrophysics Data System (ADS)
Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan
2015-09-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We also present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.
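The re-analysis rests on reference-star PSF subtraction; the following generic sketch shows a KLIP-style projection onto principal components of a reference library (an illustration of the technique, not the ALICE code itself).

```python
# KLIP-style PSF subtraction: project a science frame onto the
# principal components of a reference PSF library and subtract.
import numpy as np

def psf_subtract(science, references, n_modes=5):
    """Frames are flattened to 1-D; returns the residual frame."""
    mean = references.mean(axis=0)
    ref = references - mean
    # Principal components of the reference library via SVD.
    _, _, vt = np.linalg.svd(ref, full_matrices=False)
    basis = vt[:n_modes]                    # (n_modes, n_pix)
    centered = science - mean
    model = mean + basis.T @ (basis @ centered)  # PSF estimate
    return science - model

rng = np.random.default_rng(1)
refs = rng.normal(size=(40, 64 * 64))            # 40 reference frames
sci = refs[0] + 0.01 * rng.normal(size=64 * 64)  # toy science frame
residual = psf_subtract(sci, refs, n_modes=10)
print(residual.std())
```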
Ocean Networks Canada's "Big Data" Initiative
NASA Astrophysics Data System (ADS)
Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.
2013-12-01
Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system, which provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: the study of the impact of climate change on the ocean; the exploration and understanding of the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution that delivers simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing; test results guide future tool design and the development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and experience of a diverse user community, each requiring tailored data visualization and integrated products. By doing this we aim to design tools that maximize exploitation of the data.
48 CFR 15.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Proposal analysis. 15.404 Section 15.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404 Proposal analysis. ...
48 CFR 15.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis. 15.404 Section 15.404 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404 Proposal analysis. ...
Using the Stereotype Content Model to examine group depictions in Fascism: An Archival Approach.
Durante, Federica; Volpato, Chiara; Fiske, Susan T
2010-04-01
The Stereotype Content Model (SCM) suggests potentially universal intergroup depictions. If universal, they should apply across history in archival data. Bridging this gap, we examined social groups descriptions during Italy's Fascist era. In Study 1, articles published in a Fascist magazine- La Difesa della Razza -were content analyzed, and results submitted to correspondence analysis. Admiration prejudice depicted ingroups; envious and contemptuous prejudices depicted specific outgroups, generally in line with SCM predictions. No paternalistic prejudice appeared; historical reasons might explain this finding. Results also fit the recently developed BIAS Map of behavioral consequences. In Study 2, ninety-six undergraduates rated the content-analysis traits on warmth and competence, without knowing their origin. They corroborated SCM's interpretations of the archival data.
A statistical analysis of IUE spectra of dwarf novae and nova-like stars
NASA Technical Reports Server (NTRS)
Ladous, Constanze
1990-01-01
First results of a statistical analysis of the International Ultraviolet Explorer (IUE) archive of dwarf novae and nova-like stars are presented. The archive contains approximately 2000 low-resolution spectra of somewhat over 100 dwarf novae and nova-like stars. Many of these have been examined individually, but so far the collective information content of this data set has not been explored. The first results of this work are reported.
Archive Management of NASA Earth Observation Data to Support Cloud Analysis
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.
2017-01-01
NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order-of-magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture, virtualizing data products by computing on demand, and reorganizing data to be more analysis-friendly.
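One building block that makes subsetting from WOS workable is the HTTP range request, sketched here with boto3 (bucket, key, and byte offsets are hypothetical):

```python
# Fetch just the needed bytes of a granule (e.g., one variable's chunk)
# from object storage instead of downloading the whole file.
import boto3

s3 = boto3.client("s3")
resp = s3.get_object(
    Bucket="example-eosdis-archive",            # hypothetical bucket
    Key="granules/MOD021KM.A2024001.hdf",       # hypothetical key
    Range="bytes=1048576-2097151",              # one 1 MiB chunk
)
chunk = resp["Body"].read()
print(len(chunk))
```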
Next-generation sequencing provides unprecedented access to genomic information in archival FFPE tissue samples. However, costs and technical challenges related to RNA isolation and enrichment limit use of whole-genome RNA-sequencing for large-scale studies of FFPE specimens. Rec...
Using archived ITS data for sensitivity analysis in the estimation of mobile source emissions
DOT National Transportation Integrated Search
2000-12-01
The study described in this paper demonstrates the use of archived ITS data from San Antonio's TransGuide traffic management center (TMC) for sensitivity analyses in the estimation of on-road mobile source emissions. Because of the stark comparison b...
The Department of Defense Acquisition Workforce: Background, Analysis, and Questions for Congress
2016-07-29
Moshe Schwartz, Specialist in Defense Acquisition; Kathryn A. Francis, Analyst in Government Organization and Management; Charles V. O'Connor, U.S. Department of Defense Fellow. July 29, 2016. Congressional Research Service, 7-5700, www.crs.gov, R44578.
An Analysis of Turkey’s Defense Systems Acquisition Policy
2009-03-01
Naval Postgraduate School, Monterey, California. MBA Professional Report. The purpose of this MBA thesis is to analyze Turkey's defense systems acquisition policy and...
Applications and challenges of digital pathology and whole slide imaging.
Higgins, C
2015-07-01
Virtual microscopy is a method for digitizing images of tissue on glass slides and using a computer to view, navigate, change magnification, focus and mark areas of interest. Virtual microscope systems (also called digital pathology or whole slide imaging systems) offer several advantages for biological scientists who use slides as part of their general, pharmaceutical, biotechnology or clinical research. The systems usually are based on one of two methodologies: area scanning or line scanning. Virtual microscope systems enable automatic sample detection, virtual-Z acquisition and creation of focal maps. Virtual slides are layered with multiple resolutions at each location, including the highest resolution needed to allow more detailed review of specific regions of interest. Scans may be acquired at 2, 10, 20, 40, 60 and 100 × or a combination of magnifications to highlight important detail. Digital microscopy starts when a slide collection is put into an automated or manual scanning system. The original slides are archived, then a server allows users to review multilayer digital images of the captured slides either by a closed network or by the internet. One challenge for adopting the technology is the lack of a universally accepted file format for virtual slides. Additional challenges include maintaining focus in an uneven sample, detecting specimens accurately, maximizing color fidelity with optimal brightness and contrast, optimizing resolution and keeping the images artifact-free. There are several manufacturers in the field and each has not only its own approach to these issues, but also its own image analysis software, which provides many options for users to enhance the speed, quality and accuracy of their process through virtual microscopy. Virtual microscope systems are widely used and are trusted to provide high quality solutions for teleconsultation, education, quality control, archiving, veterinary medicine, research and other fields.
Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.
Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H
2009-01-01
Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.
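As an illustration of the kind of automated ingest step such a pipeline begins with (a generic sketch, not the Wake Forest code), the following routes newly acquired DICOM files into an archive tree by study using pydicom:

```python
# Pick up newly acquired DICOM files, read key header fields, and file
# them into an archive tree by study; directory names are placeholders.
import shutil
from pathlib import Path
import pydicom

incoming, archive = Path("incoming"), Path("archive")

for dcm_path in sorted(incoming.glob("*.dcm")):
    # Header-only read: skip pixel data for speed during routing.
    ds = pydicom.dcmread(dcm_path, stop_before_pixels=True)
    # Route by StudyInstanceUID so later batch scripts can find the series.
    dest = archive / str(ds.StudyInstanceUID) / str(ds.SeriesNumber)
    dest.mkdir(parents=True, exist_ok=True)
    shutil.move(str(dcm_path), str(dest / dcm_path.name))
```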
A global planktic foraminifer census data set for the Pliocene ocean
Dowsett, Harry J.; Robinson, Marci M.; Foley, Kevin M.
2016-01-01
This article presents data derived by the USGS Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project. PRISM has generated planktic foraminifer census data from core sites and outcrops around the globe since 1988. These data form the basis of a number of paleoceanographic reconstructions focused on the mid-Piacenzian Warm Period (3.264 to 3.025 million years ago). Data are presented as counts of individuals within 64 taxonomic categories for each locality. We describe sample acquisition and processing, age dating, taxonomy and archival storage of material. These data provide a unique, stratigraphically focused opportunity to assess the effects of global warming on marine plankton.
76 FR 14568 - Federal Acquisition Regulation; Use of Commercial Services Item Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-16
...-AL44 Federal Acquisition Regulation; Use of Commercial Services Item Authority AGENCIES: Department of... interim rule amending the Federal Acquisition Regulation (FAR) to implement section 868 of the Duncan... interim rule. II. Discussion/Analysis The analysis of public comments by the Defense Acquisition...
Calderon, Karynna; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.
2012-01-01
In September and October of 2003, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey, conducted geophysical surveys of the Atlantic Ocean offshore northeast Florida from St. Augustine, Florida, to the Florida-Georgia border. This report serves as an archive of unprocessed digital boomer subbottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03FGS01 tells us the data were collected in 2003 as part of cooperative work with the Florida Geological Survey (FGS) and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by hydrophone receivers, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 seconds) and recorded for specific intervals of time (for example, 100 milliseconds). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Refer to the handwritten FACS operation log (PDF, 442 KB) for diagrams and descriptions of acquisition geometry, which varied throughout the cruises. Table 1 displays a summary of acquisition parameters. See the digital FACS equipment logs (PDF, 9-13 KB each) for details about the acquisition equipment used. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y (Barry and others, 1975) format (rev. 0), except for the first 3,200 bytes of the card image header, which are stored in ASCII format instead of the standard EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are Graphics Interchange Format (GIF) images that were filtered and gained using SU software. 
Refer to the Software page for details about the processing and links to example SU processing scripts and USGS software for viewing the SEG Y files (Zihlman, 1992).
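The yye##a naming convention lends itself to mechanical parsing; a small sketch (the equipment-code table is limited to the boomer code documented above):

```python
# Parse the yye##a seismic-line naming convention described above
# (e.g., "03b01a" -> 2003, boomer, track 1, section a).
import re

LINE_RE = re.compile(
    r"^(?P<yy>\d{2})(?P<equip>[a-z])(?P<track>\d{2})(?P<sect>[a-z]?)$")
EQUIPMENT = {"b": "boomer"}  # other codes would be added as needed

def parse_line_name(name: str) -> dict:
    m = LINE_RE.match(name)
    if not m:
        raise ValueError(f"not a yye##a line name: {name!r}")
    return {
        "year": 2000 + int(m["yy"]),  # adequate for post-2000 surveys
        "equipment": EQUIPMENT.get(m["equip"], m["equip"]),
        "track": int(m["track"]),
        "section": m["sect"] or None,
    }

print(parse_line_name("03b01a"))
```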
An enhanced archive facilitating climate impacts analysis
Maurer, E.P.; Brekke, L.; Pruitt, T.; Thrasher, B.; Long, J.; Duffy, P.; Dettinger, M.; Cayan, D.; Arnold, J.
2014-01-01
We describe the expansion of a publicly available archive of downscaled climate and hydrology projections for the United States. Those studying or planning to adapt to future climate impacts demand downscaled climate model output for local or regional use. The archive we describe attempts to fulfill this need by providing data in several formats, selectable to meet user needs. Our archive has served as a resource for climate impacts modelers, water managers, educators, and others. Over 1,400 individuals have transferred more than 50 TB of data from the archive. In response to user demands, the archive has expanded from monthly downscaled data to include daily data to facilitate investigations of phenomena sensitive to daily to monthly temperature and precipitation, including extremes in these quantities. New developments include downscaled output from the new Coupled Model Intercomparison Project phase 5 (CMIP5) climate model simulations at both the monthly and daily time scales, as well as simulations of surface hydrological variables. The web interface allows the extraction of individual projections or ensemble statistics for user-defined regions, promoting the rapid assessment of model consensus and uncertainty for future projections of precipitation, temperature, and hydrology. The archive is accessible online (http://gdo-dcp.ucllnl.org/downscaled_cmip_projections).
High volume data storage architecture analysis
NASA Technical Reports Server (NTRS)
Malik, James M.
1990-01-01
A High Volume Data Storage Architecture Analysis was conducted. The results, presented in this report, will be applied to problems of high volume data requirements such as those anticipated for the Space Station Control Center. High volume data storage systems at several different sites were analyzed for archive capacity, storage hierarchy and migration philosophy, and retrieval capabilities. Proposed architectures were solicited from the sites selected for in-depth analysis. Model architectures for a hypothetical data archiving system, for a high speed file server, and for high volume data storage are attached.
Characterizing the LANDSAT Global Long-Term Data Record
NASA Technical Reports Server (NTRS)
Arvidson, T.; Goward, S. N.; Williams, D. L.
2006-01-01
The effects of global climate change are fast becoming politically, sociologically, and personally important: increasing storm frequency and intensity, lengthening cycles of drought and flood, expanding desertification and soil salinization. A vital asset in the analysis of climate change on a global basis is the 34-year record of Landsat imagery. In recognition of its increasing importance, a detailed analysis of the Landsat observation coverage within the US archive was commissioned. Results to date indicate some unexpected gaps in the US-held archive. Fortunately, throughout the Landsat program, data have been downlinked routinely to International Cooperator (IC) ground stations for archiving, processing, and distribution. These IC data could be combined with the current US holdings to build a nearly global, annual observation record over this 34-year period. Today, we have inadequate information as to which scenes are available from which IC archives. Our best estimate is that there are over four million digital scenes in the IC archives, compared with the nearly two million scenes held in the US archive. This vast pool of Landsat observations needs to be accurately documented, via metadata, to determine the existence of complementary scenes and to characterize the potential scope of the global Landsat observation record. Of course, knowing the extent and completeness of the data record is but the first step. It will also be necessary to ensure that the data record is easy to use, internally consistent in terms of calibration and data format, and fully accessible in order to realize its potential.
Lee, Deanna; Das Gupta, Jaydip; Gaughan, Christina; Steffen, Imke; Tang, Ning; Luk, Ka-Cheung; Qiu, Xiaoxing; Urisman, Anatoly; Fischer, Nicole; Molinaro, Ross; Broz, Miranda; Schochetman, Gerald; Klein, Eric A; Ganem, Don; Derisi, Joseph L; Simmons, Graham; Hackett, John; Silverman, Robert H; Chiu, Charles Y
2012-01-01
XMRV, or xenotropic murine leukemia virus (MLV)-related virus, is a novel gammaretrovirus originally identified in studies that analyzed tissue from prostate cancer patients in 2006 and blood from patients with chronic fatigue syndrome (CFS) in 2009. However, a large number of subsequent studies failed to confirm a link between XMRV infection and CFS or prostate cancer. On the contrary, recent evidence indicates that XMRV is a contaminant originating from the recombination of two mouse endogenous retroviruses during passaging of a prostate tumor xenograft (CWR22) in mice, generating laboratory-derived cell lines that are XMRV-infected. To confirm or refute an association between XMRV and prostate cancer, we analyzed prostate cancer tissues and plasma from a prospectively collected cohort of 39 patients as well as archival RNA and prostate tissue from the original 2006 study. Despite comprehensive microarray, PCR, FISH, and serological testing, XMRV was not detected in any of the newly collected samples or in archival tissue, although archival RNA remained XMRV-positive. Notably, archival VP62 prostate tissue, from which the prototype XMRV strain was derived, tested negative for XMRV on re-analysis. Analysis of viral genomic and human mitochondrial sequences revealed that all previously characterized XMRV strains are identical and that the archival RNA had been contaminated by an XMRV-infected laboratory cell line. These findings reveal no association between XMRV and prostate cancer, and underscore the conclusion that XMRV is not a naturally acquired human infection.
ICI optical data storage tape: An archival mass storage media
NASA Technical Reports Server (NTRS)
Ruddick, Andrew J.
1993-01-01
At the 1991 Conference on Mass Storage Systems and Technologies, ICI Imagedata presented a paper introducing ICI Optical Data Storage Tape. That paper placed specific emphasis on the media characteristics, and initial data were presented illustrating the archival stability of the media. The more exhaustive analysis subsequently carried out on the chemical stability of the media is covered here. Equally importantly, this paper also addresses archive management issues associated with, for example, the benefits of reduced rewind requirements to accommodate tape relaxation effects, which result from careful tribology control in ICI Optical Tape media. ICI Optical Tape media was designed to meet the most demanding requirements of archival mass storage. It is envisaged that the volumetric data capacity, long-term stability and low-maintenance characteristics demonstrated will have major benefits in increasing reliability and reducing the costs associated with archival storage of large data volumes.
Archival samples represent a vast resource for identification of chemical and pharmaceutical targets. Previous use of formalin-fixed paraffin-embedded (FFPE) samples has been limited due to changes in RNA introduced by fixation and embedding procedures. Recent advances in RNA-seq...
Use of archival resources has been limited to date by inconsistent methods for genomic profiling of degraded RNA from formalin-fixed paraffin-embedded (FFPE) samples. RNA-sequencing offers a promising way to address this problem. Here we evaluated transcriptomic dose responses us...
Formalin-fixed paraffin-embedded (FFPE) tissue samples represent a potentially invaluable resource for genomic research into the molecular basis of disease. However, use of FFPE samples in gene expression studies has been limited by technical challenges resulting from degradation...
Natural Learning Case Study Archives
ERIC Educational Resources Information Center
Lawler, Robert W.
2015-01-01
Natural Learning Case Study Archives (NLCSA) is a research facility for those interested in using case study analysis to deepen their understanding of common sense knowledge and natural learning (how the mind interacts with everyday experiences to develop common sense knowledge). The database comprises three case study corpora based on experiences…
The semantic pathfinder: using an authoring metaphor for generic multimedia indexing.
Snoek, Cees G M; Worring, Marcel; Geusebroek, Jan-Mark; Koelma, Dennis C; Seinstra, Frank J; Smeulders, Arnold W M
2006-10-01
This paper presents the semantic pathfinder architecture for generic indexing of multimedia archives. The semantic pathfinder extracts semantic concepts from video by exploring different paths through three consecutive analysis steps, which we derive from the observation that produced video is the result of an authoring-driven process. We exploit this authoring metaphor for machine-driven understanding. The pathfinder starts with the content analysis step. In this analysis step, we follow a data-driven approach of indexing semantics. The style analysis step is the second analysis step. Here, we tackle the indexing problem by viewing a video from the perspective of production. Finally, in the context analysis step, we view semantics in context. The virtue of the semantic pathfinder is its ability to learn the best path of analysis steps on a per-concept basis. To show the generality of this novel indexing approach, we develop detectors for a lexicon of 32 concepts and we evaluate the semantic pathfinder against the 2004 NIST TRECVID video retrieval benchmark, using a news archive of 64 hours. Top ranking performance in the semantic concept detection task indicates the merit of the semantic pathfinder for generic indexing of multimedia archives.
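The per-concept path selection can be illustrated in miniature: train candidate pipelines (standing in for the content, style, and context paths) and keep whichever validates best for each concept. A toy sketch, not the authors' implementation:

```python
# Per-concept selection among candidate analysis paths by validation
# score, using synthetic stand-in features and concepts.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))  # stand-in multimedia features
concepts = {"sports": X[:, 0] > 0, "weather": X[:, 1] + X[:, 2] > 0}

paths = {
    "content": make_pipeline(StandardScaler(), SVC()),
    "content+style": make_pipeline(StandardScaler(), LogisticRegression()),
}

for concept, y in concepts.items():
    # Keep the path with the best mean cross-validation accuracy.
    best = max(paths, key=lambda p: cross_val_score(
        paths[p], X, y.astype(int)).mean())
    print(concept, "->", best)
```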
The AMBRE Project: Stellar parameterisation of the ESO:FEROS archived spectra
NASA Astrophysics Data System (ADS)
Worley, C. C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Bijaoui, A.; Ordenovic, C.
2012-06-01
Context. The AMBRE Project is a collaboration between the European Southern Observatory (ESO) and the Observatoire de la Côte d'Azur (OCA) established to determine the stellar atmospheric parameters for the archived spectra of four ESO spectrographs. Aims: The analysis of the FEROS archived spectra for their stellar parameters (effective temperatures, surface gravities, global metallicities, alpha element to iron ratios and radial velocities) was completed in the first phase of the AMBRE Project. From the complete ESO:FEROS archive dataset that was received, a total of 21 551 scientific spectra were identified, covering the period 2005 to 2010. These spectra correspond to 6285 stars. Methods: The determination of the stellar parameters was carried out using the stellar parameterisation algorithm MATISSE (MATrix Inversion for Spectral SynthEsis), which was developed at OCA for the analysis of large-scale spectroscopic studies in galactic archaeology. An analysis pipeline was constructed that integrates spectral normalisation, cleaning and radial velocity correction procedures so that the FEROS spectra could be analysed automatically with MATISSE to obtain the stellar parameters. The synthetic grid against which the MATISSE analysis is carried out is currently constrained to parameters of FGKM stars only. Results: Stellar atmospheric parameters (effective temperature, surface gravity, metallicity and alpha element abundances) were determined for 6508 (30.2%) of the FEROS archived spectra (~3087 stars). Radial velocities were determined for 11 963 (56%) of the archived spectra. 2370 (11%) spectra could not be analysed within the pipeline due to very low signal-to-noise ratios or missing spectral orders. 12 673 spectra (58.8%) were analysed in the pipeline, but their parameters were discarded based on quality criteria and error analysis determined within the automated process. The majority of these rejected spectra were found to have broad spectral features, as probed both by direct measurement of the features and by cross-correlation function breadths, indicating that they may be hot and/or fast-rotating stars, which are not considered within the adopted reference synthetic spectra grid. The current configuration of the synthetic spectra grid is devoted to slow-rotating FGKM stars; hence, non-standard spectra (binaries, chemically peculiar stars, etc.) that could not be identified may contaminate the analysis.
Digital image archiving: challenges and choices.
Dumery, Barbara
2002-01-01
In the last five years, imaging exam volume has grown rapidly. In addition to increased image acquisition, there is more patient information per study. RIS-PACS integration and information-rich DICOM headers now provide us with more patient information relative to each study. The volume of archived digital images is increasing and will continue to rise at a steeper incline than film-based storage of the past. Many filmless facilities have been caught off guard by this increase, which has been stimulated by many factors. The most significant factor is investment in new digital and DICOM-compliant modalities. A huge volume driver is the increase in images per study from multi-slice technology. Storage requirements also are affected by disaster recovery initiatives and state retention mandates. This burgeoning rate of imaging data volume presents many challenges: cost of ownership, data accessibility, storage media obsolescence, database considerations, physical limitations, reliability and redundancy. There are two basic approaches to archiving--single tier and multi-tier. Each has benefits. With a single-tier approach, all the data is stored on a single media that can be accessed very quickly. A redundant copy of the data is then stored onto another less expensive media. This is usually a removable media. In this approach, the on-line storage is increased incrementally as volume grows. In a multi-tier approach, storage levels are set up based on access speed and cost. In other words, all images are stored at the deepest archiving level, which is also the least expensive. Images are stored on or moved back to the intermediate and on-line levels if they will need to be accessed more quickly. It can be difficult to decide what the best approach is for your organization. The options include RAIDs (redundant array of independent disks), direct attached RAID storage (DAS), network storage using RAIDs (NAS and SAN), removable media such as different types of tape, compact disks (CDs and DVDs) and magneto-optical disks (MODs). As you evaluate the various options for storage, it is important to consider both performance and cost. For most imaging enterprises, a single-tier archiving approach is the best solution. With the cost of hard drives declining, NAS is a very feasible solution today. It is highly reliable, offers immediate access to all exams, and easily scales as imaging volume grows. Best of all, media obsolescence challenges need not be of concern. For back-up storage, removable media can be implemented, with a smaller investment needed as it will only be used for a redundant copy of the data. There is no need to keep it online and available. If further system redundancy is desired, multiple servers should be considered. The multi-tier approach still has its merits for smaller enterprises, but with a detailed long-term cost of ownership analysis, NAS will probably still come out on top as the solution of choice for many imaging facilities.
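The single-tier versus multi-tier trade-off reduces to a simple total-cost calculation; the sketch below uses made-up prices and access fractions purely to show the shape of the comparison:

```python
# Back-of-envelope archive storage cost model. All prices and fractions
# are invented placeholders, not vendor figures.
def archive_cost(tb_total, hot_fraction, hot_per_tb, cold_per_tb):
    """Monthly cost with a given fraction of data on fast (hot) media."""
    hot = tb_total * hot_fraction
    return hot * hot_per_tb + (tb_total - hot) * cold_per_tb

single_tier = archive_cost(100, 1.0, hot_per_tb=30.0, cold_per_tb=5.0)
multi_tier = archive_cost(100, 0.2, hot_per_tb=30.0, cold_per_tb=5.0)
print(f"single-tier: {single_tier:.0f}/mo, multi-tier: {multi_tier:.0f}/mo")
```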
Executable research compendia in geoscience research infrastructures
NASA Astrophysics Data System (ADS)
Nüst, Daniel
2017-04-01
From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format closing the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and for connecting key stakeholders (scientists, publishers, librarians). They are demonstrated using developments from the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf
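What an ERC-style manifest could record can be illustrated as follows; the field names here are invented for illustration and are not the o2r specification:

```python
# Hypothetical ERC-style manifest: entry point, frozen runtime, data,
# and expected outputs that a validation service would check.
import json

erc_manifest = {
    "id": "example-erc-0001",                    # invented identifier
    "main": "analysis.Rmd",                      # workflow entry point
    "runtime_image": "example/erc-runtime:1.0",  # frozen environment
    "data": ["data/observations.csv"],
    "expected_outputs": ["figures/fig1.png"],
    "licenses": {"code": "Apache-2.0", "data": "CC-BY-4.0"},
}
print(json.dumps(erc_manifest, indent=2))
```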
DOT National Transportation Integrated Search
2006-11-01
This report discusses data acquisition and analysis for grade crossing risk analysis at the proposed San Joaquin High-Speed Rail Corridor in San Joaquin, California, and documents the data acquisition and analysis methodologies used to collect and an...
48 CFR 1511.011-76 - Legal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Legal analysis. 1511.011-76 Section 1511.011-76 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY ACQUISITION PLANNING DESCRIBING AGENCY NEEDS 1511.011-76 Legal analysis. Contracting Officers shall insert the clause...
A Photo Album of Earth Scheduling Landsat 7 Mission Daily Activities
NASA Technical Reports Server (NTRS)
Potter, William; Gasch, John; Bauer, Cynthia
1998-01-01
Landsat7 is a member of a new generation of Earth observation satellites. Landsat7 will carry on the mission of the aging Landsat 5 spacecraft by acquiring high resolution, multi-spectral images of the Earth surface for strategic, environmental, commercial, agricultural and civil analysis and research. One of the primary mission goals of Landsat7 is to accumulate and seasonally refresh an archive of global images with full coverage of Earth's landmass, less the central portion of Antarctica. This archive will enable further research into seasonal, annual and long-range trending analysis in such diverse research areas as crop yields, deforestation, population growth, and pollution control, to name just a few. A secondary goal of Landsat7 is to fulfill imaging requests from our international partners in the mission. Landsat7 will transmit raw image data from the spacecraft to 25 ground stations in 20 subscribing countries. Whereas earlier Landsat missions were scheduled manually (as are the majority of current low-orbit satellite missions), the task of manually planning and scheduling Landsat7 mission activities would be overwhelmingly complex when considering the large volume of image requests, the limited resources available, spacecraft instrument limitations, and the limited ground image processing capacity, not to mention avoidance of foul weather systems. The Landsat7 Mission Operation Center (MOC) includes an image scheduler subsystem that is designed to automate the majority of mission planning and scheduling, including selection of the images to be acquired, managing the recording and playback of the images by the spacecraft, scheduling ground station contacts for downlink of images, and generating the spacecraft commands for controlling the imager, recorder, transmitters and antennas. The image scheduler subsystem autonomously generates 90% of the spacecraft commanding with minimal manual intervention. The image scheduler produces a conflict-free schedule for acquiring images of the "best" 250 scenes daily for refreshing the global archive. It then equitably distributes the remaining resources for acquiring up to 430 scenes to satisfy requests by international subscribers. The image scheduler selects candidate scenes based on priority and age of the requests, and predicted cloud cover and sun angle at each scene. It also selects these scenes to avoid instrument constraint violations and maximizes efficiency of resource usage by encouraging acquisition of scenes in clusters. Of particular interest to the mission planners, it produces the resulting schedule in a reasonable time, typically within 15 minutes.
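The scene-selection logic can be caricatured as a greedy, score-and-rank procedure; the weights and budget below are illustrative, not the MOC's actual values:

```python
# Greedy daily scene selection: score candidates by request priority,
# request age, and predicted cloud cover, then fill a fixed budget.
from dataclasses import dataclass

@dataclass
class Scene:
    path_row: str
    priority: int      # higher = more important
    age_days: int      # days since the request was filed
    cloud_pct: float   # predicted cloud cover, 0-100

def score(s: Scene) -> float:
    # Illustrative weights; the real scheduler also handles resource
    # conflicts, sun angle, and instrument constraints.
    return 10.0 * s.priority + 0.5 * s.age_days - 1.0 * s.cloud_pct

def schedule(candidates, budget=250):
    return sorted(candidates, key=score, reverse=True)[:budget]

candidates = [
    Scene("039/037", priority=3, age_days=12, cloud_pct=5.0),
    Scene("040/037", priority=1, age_days=40, cloud_pct=80.0),
    Scene("041/036", priority=2, age_days=3, cloud_pct=20.0),
]
for s in schedule(candidates, budget=2):
    print(s.path_row, round(score(s), 1))
```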
Downs, Jennifer A; Dupnik, Kathryn M; van Dam, Govert J; Urassa, Mark; Lutonja, Peter; Kornelis, Dieuwke; de Dood, Claudia J; Hoekstra, Pytsje; Kanjala, Chifundo; Isingo, Raphael; Peck, Robert N; Lee, Myung Hee; Corstjens, Paul L A M; Todd, Jim; Changalucha, John M; Johnson, Warren D; Fitzgerald, Daniel W
2017-09-01
Schistosomiasis affects 218 million people worldwide, with most infections in Africa. Prevalence studies suggest that people with chronic schistosomiasis may have higher risk of HIV-1 acquisition and impaired ability to control HIV-1 replication once infected. We hypothesized that: (1) pre-existing schistosome infection may increase the odds of HIV-1 acquisition and that the effects may differ between men and women, and (2) individuals with active schistosome infection at the time of HIV-1 acquisition may have impaired immune control of HIV-1, resulting in higher HIV-1 viral loads at HIV-1 seroconversion. We conducted a nested case-control study within a large population-based survey of HIV-1 transmission in Tanzania. A population of adults from seven villages was tested for HIV in 2007, 2010, and 2013 and dried blood spots were archived for future studies with participants' consent. Approximately 40% of this population has Schistosoma mansoni infection, and 2% has S. haematobium. We tested for schistosome antigens in the pre- and post-HIV-1-seroconversion blood spots of people who acquired HIV-1. We also tested blood spots of matched controls who did not acquire HIV-1 and calculated the odds that a person with schistosomiasis would become HIV-1-infected compared to these matched controls. Analysis was stratified by gender. We compared 73 HIV-1 seroconverters with 265 controls. Women with schistosome infections had a higher odds of HIV-1 acquisition than those without (adjusted OR = 2.8 [1.2-6.6], p = 0.019). Schistosome-infected men did not have an increased odds of HIV-1 acquisition (adjusted OR = 0.7 [0.3-1.8], p = 0.42). We additionally compared HIV-1 RNA levels in the post-seroconversion blood spots in HIV-1 seroconverters with schistosomiasis versus those without who became HIV-infected in 2010, before antiretroviral therapy was widely available in the region. The median whole blood HIV-1 RNA level in the 15 HIV-1 seroconverters with schistosome infection was significantly higher than in the 22 without schistosomiasis: 4.4 [3.9-4.6] log10 copies/mL versus 3.7 [3.2-4.3], p = 0.017. We confirm, in an area with endemic S. mansoni, that pre-existing schistosome infection increases odds of HIV-1 acquisition in women and raises HIV-1 viral load at the time of HIV-1 seroconversion. This is the first study to demonstrate the effect of schistosome infection on HIV-1 susceptibility and viral control, and to differentiate effects by gender. Validation studies will be needed at additional sites.
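The unadjusted form of the reported association is a standard 2x2 odds ratio; the sketch below computes one with a Woolf confidence interval using placeholder counts, not the study's data:

```python
# Odds ratio with a Woolf (log) 95% CI from a 2x2 case-control table.
import math
from scipy.stats import fisher_exact, norm

#            exposed  unexposed   (schistosome infection status)
cases    = (30, 43)    # hypothetical HIV-1 seroconverters
controls = (90, 175)   # hypothetical matched controls

a, b = cases
c, d = controls
or_point = (a * d) / (b * c)          # sample odds ratio
_, p = fisher_exact([[a, b], [c, d]])
se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
z = norm.ppf(0.975)
lo = math.exp(math.log(or_point) - z * se)
hi = math.exp(math.log(or_point) + z * se)
print(f"OR={or_point:.2f} (95% CI {lo:.2f}-{hi:.2f}), p={p:.3f}")
```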
NASA Astrophysics Data System (ADS)
Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.
2015-09-01
The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.
Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions
NASA Astrophysics Data System (ADS)
Xie, Jigang; Song, Wenyun
The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis for Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.
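A minimal sketch of the ME step, assuming a discrete outcome grid and a single known-mean constraint (both illustrative), using scipy:

```python
# Among discrete distributions matching a known mean (the available
# pre-acquisition information), find the one maximizing Shannon entropy.
import numpy as np
from scipy.optimize import minimize

outcomes = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # e.g., loss levels
target_mean = -0.3                                 # known constraint

def neg_entropy(p):
    # Minimizing sum(p log p) maximizes entropy; clip to avoid log(0).
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: p @ outcomes - target_mean},
]
res = minimize(neg_entropy, x0=np.full(5, 0.2),
               bounds=[(0, 1)] * 5, constraints=constraints)
print(res.x.round(4))  # ME distribution consistent with the constraints
```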
Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2002
Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sánchez, John; Estes, Steve; McNutt, Stephen R.; Paskievitch, John
2003-01-01
The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Power and others, 1993; Jolly and others, 1996; Jolly and others, 2001; Dixon and others, 2002). The primary objectives of this program are the seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the basic seismic data and changes in the seismic monitoring program for the period January 1, 2002 through December 31, 2002. Appendix G contains a list of publications pertaining to seismicity of Alaskan volcanoes based on these and previously recorded data. The AVO seismic network was used to monitor twenty-four volcanoes in real time in 2002. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, Katmai Volcanic Group (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Great Sitkin Volcano, and Kanaga Volcano (Figure 1). Monitoring highlights in 2002 include an earthquake swarm at Great Sitkin Volcano in May-June; an earthquake swarm near Snowy Mountain in July-September; low frequency (1-3 Hz) tremor and long-period events at Mount Veniaminof in September-October and in December; and continuing volcanogenic seismic swarms at Shishaldin Volcano throughout the year. Instrumentation and data acquisition highlights in 2002 were the installation of a subnetwork on Okmok Volcano, the establishment of telemetry for the Mount Veniaminof subnetwork, and the change in the data acquisition system to an EARTHWORM detection system. AVO located 7430 earthquakes during 2002 in the vicinity of the monitored volcanoes. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2002; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2002.
Bulgarian National Digital Seismological Network
NASA Astrophysics Data System (ADS)
Dimitrova, L.; Solakov, D.; Nikolova, S.; Stoyanov, S.; Simeonova, S.; Zimakov, L. G.; Khaikin, L.
2011-12-01
The Bulgarian National Digital Seismological Network (BNDSN) consists of a National Data Center (NDC), 13 stations equipped with RefTek High Resolution Broadband Seismic Recorders (model DAS 130-01/3), and 1 station equipped with a Quanterra 680 and broadband sensors and accelerometers. Real-time data transfer from the seismic stations to the NDC is realized via a Virtual Private Network of the Bulgarian Telecommunication Company. Communication interruptions do not cause any data loss at the NDC: the data are buffered in the field station recorder's 4 Mb RAM and are retransmitted to the NDC immediately after the communication link is re-established. The recorders are equipped with 2 compact flash disks able to store more than one month of data, which can be downloaded remotely using FTP. Data acquisition and processing hardware redundancy at the NDC is achieved by two clustered SUN servers and two Blade workstations. To secure the acquisition, processing, and data storage processes, a three-layer local network is designed at the NDC. Real-time data acquisition is performed using REFTEK's full-duplex error-correction protocol RTPD. Data from the Quanterra recorder and foreign stations are fed into RTPD in real time via the SeisComP/SeedLink protocol. Using SeisComP/SeedLink software, the NDC transfers real-time data to INGV-Roma, NEIC-USA, and the ORFEUS Data Center; regional real-time data exchange with Romania, Macedonia, Serbia, and Greece is also established at the NDC. Data processing is performed by the Seismic Network Data Processor (SNDP) software package running on both servers. SNDP includes the following subsystems: a real-time subsystem (RTS_SNDP) for signal detection, evaluation of signal parameters, phase identification and association, and source estimation; a seismic analysis subsystem (SAS_SNDP) for interactive data processing; and an early warning subsystem (EWS_SNDP) based on the first-arrived P-phases. The signal detection process is performed by the traditional STA/LTA detection algorithm; the filter parameters of the detectors are defined on the basis of previously evaluated ambient noise at the seismic stations. Extra modules for network command/control, state-of-health network monitoring, and data archiving also run in the National Data Center. Three types of archives are produced at the NDC: two continuous (miniSEED format and RefTek PASSCAL format) and one event-oriented in CSS3.0 scheme format. The modern digital equipment and broadband seismometers installed at the Bulgarian seismic stations, and the careful selection of software packages for automatic and interactive data processing in the data center, have proved to be a suitable choice for the purposes of the BNDSN and NDC: to ensure reliable automatic localization of seismic events and rapid notification of the governmental authorities in case of felt earthquakes on the territory of Bulgaria, and to provide a modern basis for seismological studies in Bulgaria.
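The STA/LTA trigger mentioned above is simple enough to sketch. A minimal Python version follows; the window lengths and trigger threshold are illustrative assumptions, and operational detectors (e.g., ObsPy's classic_sta_lta) use tuned, causal, band-filtered variants:

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """Short-term-average / long-term-average ratio on a 1-D trace (fs in Hz)."""
    energy = np.asarray(trace, dtype=float) ** 2
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    lta = np.maximum(lta, 1e-12)          # avoid division by zero in quiet spans
    return sta / lta

# A detection is declared wherever the ratio crosses a threshold, e.g.:
# triggers = np.flatnonzero(sta_lta(trace, fs=100.0) > 4.0)
```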
48 CFR 215.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...
48 CFR 215.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...
48 CFR 215.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Contract Pricing 215.404-2 Information to support proposal analysis. See PGI 215.404-2 for guidance on... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Information to support proposal analysis. 215.404-2 Section 215.404-2 Federal Acquisition Regulations System DEFENSE ACQUISITION...
Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process
NASA Technical Reports Server (NTRS)
Guiltinan, J.
1976-01-01
Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daley, P F
The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection limit estimates, and trend plotting were performed with spreadsheets and statistics software. Moreover, the analytical method developed was very limited in compound coverage and unable to closely mirror the standard analytical methods promulgated by the EPA. To address these deficiencies, during this award the original equipment was operated at the OU 2-GTS to further evaluate the use of columns, commercial standard blends, and other components to broaden the compound coverage of the chromatography system. A second-generation ASAP was designed and built to replace the original system at the OU 2-GTS, including provision for introduction of internal standard compounds and surrogates into each sample analyzed. An enhanced, LabVIEW-based chromatogram analysis application was written that manages and archives chemical standards information and provides a basis for NIST traceability for all analyses. Within this same package, all compound calibration response curves are managed, and different report formats were incorporated that simplify trend analysis. Test results focus on operation of the original system at the OU 1 Integrated Chemical and Flow Monitoring System, at the OU 1 Fire Drill Area remediation site.
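The peak detection and tabulation described above (retention times and peak heights from a raw chromatogram) can be sketched in a few lines. This is a generic illustration on synthetic data, not the project's LabVIEW implementation:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic chromatogram: two Gaussian peaks on a noisy baseline (hypothetical data).
t = np.linspace(0, 20, 4000)                       # retention time, minutes
signal = (80 * np.exp(-(t - 5.2) ** 2 / 0.01)
          + 45 * np.exp(-(t - 11.7) ** 2 / 0.02)
          + np.random.default_rng(0).normal(0, 0.5, t.size))

# Report retention time and height for peaks above a noise-based threshold.
peaks, _ = find_peaks(signal, height=5.0, prominence=5.0)
for i in peaks:
    print(f"retention time {t[i]:5.2f} min, height {signal[i]:6.1f}")
```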
ERIC Educational Resources Information Center
Boadle, Don
2003-01-01
This analysis of the transformation of the Charles Sturt University Regional Archives from a library special collection to a multi-function regional repository highlights the importance of stakeholder interests in determining institutional configurations and collection development priorities. It also demonstrates the critical importance of…
The purpose of this SOP is to outline the archive/custody guidelines used by the NHEXAS Arizona research project. This procedure was followed to maintain and locate samples, extracts, tracings and hard copy results after laboratory analysis during the Arizona NHEXAS project and ...
Research Capacity Building in Education: The Role of Digital Archives
ERIC Educational Resources Information Center
Carmichael, Patrick
2011-01-01
Accounts of how research capacity in education can be developed often make reference to electronic networks and online resources. This paper presents a theoretically driven analysis of the role of one such resource, an online archive of educational research studies that includes not only digitised collections of original documents but also videos…
Supporting Student Research with Semantic Technologies and Digital Archives
ERIC Educational Resources Information Center
Martinez-Garcia, Agustina; Corti, Louise
2012-01-01
This article discusses how the idea of higher education students as producers of knowledge rather than consumers can be operationalised by means of student research projects, in which processes of research archiving and analysis are enabled through the use of semantic technologies. It discusses how existing digital repository frameworks can be…
Commercial imagery archive product development
NASA Astrophysics Data System (ADS)
Sakkas, Alysa
1999-12-01
The Lockheed Martin (LM) team had garnered over a decade of operational experience in digital imagery management and analysis for the US Government at numerous worldwide sites. Recently, it set out to create a new commercial product to serve the needs of large-scale imagery archiving and analysis markets worldwide. LM decided to provide a turnkey commercial solution to receive, store, retrieve, process, analyze, and disseminate imagery in 'push' or 'pull' modes; it integrated commercial components and adapted and developed its own algorithms to provide added functionality not commercially available elsewhere. The resultant product, the Intelligent Library System, satisfies requirements for (a) a potentially unbounded data archive; (b) automated workflow management for increased user productivity; (c) automatic tracking and management of files stored on shelves; (d) the ability to ingest, process, and disseminate data at bandwidths ranging up to multi-gigabit per second; (e) access through a thin client-to-server network environment; (f) multiple interactive users needing retrieval of files in seconds from either archived or real-time imagery; and (g) scalability that maintains information throughput performance as the size of the digital library grows.
Viking Seismometer PDS Archive Dataset
NASA Astrophysics Data System (ADS)
Lorenz, R. D.
2016-12-01
The Viking Lander 2 seismometer operated successfully for over 500 Sols on the Martian surface, recording at least one likely candidate Marsquake. The Viking mission took place in an era when data handling hardware (both on board and on the ground) was limited in capability and predated modern planetary data archiving; the ad-hoc repositories of the data, and the very low-level record at NSSDC, were neither convenient to process nor well known. In an effort supported by the NASA Mars Data Analysis Program, we have converted the bulk of the Viking dataset (namely the 49,000 and 270,000 records made in High and Event modes at 20 and 1 Hz, respectively) into a simple ASCII table format. Additionally, since wind-generated lander motion is a major component of the signal, contemporaneous meteorological data are included in summary records to facilitate correlation. These datasets are being archived at the PDS Geosciences Node. In addition to brief instrument and dataset descriptions, the archive includes code snippets in the freely available language 'R' to demonstrate plotting and analysis. Further, we present examples of lander-generated noise associated with the sampler arm, instrument dumps, and other mechanical operations.
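The archive's own examples are in R; an equivalent first look in Python is straightforward because the records are plain ASCII tables. The filename and column names below are hypothetical stand-ins, not the archive's actual labels:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical filename and column names; the PDS archive stores plain ASCII tables.
df = pd.read_csv("vl2_event_mode.tab", sep=r"\s+",
                 names=["sol", "seconds", "amplitude"])
ax = df.plot(x="seconds", y="amplitude",
             title="Viking Lander 2 seismometer, Event mode (sketch)")
ax.set_xlabel("seconds")
plt.show()
```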
Nielsen, E E; Morgan, J A T; Maher, S L; Edson, J; Gauthier, M; Pepperell, J; Holmes, B J; Bennett, M B; Ovenden, J R
2017-05-01
Archived specimens are highly valuable sources of DNA for retrospective genetic/genomic analysis. However, often limited effort has been made to evaluate and optimize extraction methods, which may be crucial for downstream applications. Here, we assessed and optimized the usefulness of abundant archived skeletal material from sharks as a source of DNA for temporal genomic studies. Six different methods for DNA extraction, encompassing two different commercial kits and three different protocols, were applied to material, so-called bio-swarf, from contemporary and archived jaws and vertebrae of tiger sharks (Galeocerdo cuvier). Protocols were compared for DNA yield and quality using a qPCR approach. For jaw swarf, all methods provided relatively high DNA yield and quality, while large differences in yield between protocols were observed for vertebrae. Similar results were obtained from samples of white shark (Carcharodon carcharias). Application of the optimized methods to 38 museum and private angler trophy specimens dating back to 1912 yielded sufficient DNA for downstream genomic analysis for 68% of the samples. No clear relationships between age of samples, DNA quality and quantity were observed, likely reflecting different preparation and storage methods for the trophies. Trial sequencing of DNA capture genomic libraries using 20 000 baits revealed that a significant proportion of captured sequences were derived from tiger sharks. This study demonstrates that archived shark jaws and vertebrae are potential high-yield sources of DNA for genomic-scale analysis. It also highlights that even for similar tissue types, a careful evaluation of extraction protocols can vastly improve DNA yield. © 2016 John Wiley & Sons Ltd.
Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys
NASA Astrophysics Data System (ADS)
Hoffmann, Samantha L.; Avila, Roberto J.
2017-06-01
The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.
Application service provider (ASP) financial models for off-site PACS archiving
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Liu, Brent J.; McCoy, J. Michael; Enzmann, Dieter R.
2003-05-01
For the replacement of its legacy Picture Archiving and Communication Systems (approximate annual workload of 300,000 procedures), UCLA Medical Center has evaluated and adopted an off-site data-warehousing solution based on an ASP financial model with a one-time single payment per study archived. Different financial models for long-term data archive services were compared to the traditional capital/operational costs of on-site digital archives. Total cost of ownership (TCO), including direct and indirect expenses and savings, was compared for each model, and logistic/operational advantages and disadvantages of ASP models versus traditional archiving systems were considered alongside the financial parameters. Our initial analysis demonstrated that the traditional linear ASP business model for data storage is unsuitable for large institutions: the overall cost markedly exceeds the TCO of an in-house archive infrastructure (when support and maintenance costs are included). We demonstrated, however, that non-linear ASP pricing models can be cost-effective alternatives for large-scale data storage, particularly if they are based on a scalable off-site data-warehousing service and the prices are adapted to the specific size of a given institution. The added value of ASP is that it does not require iterative data migrations from legacy media to new storage media at regular intervals.
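The linear-versus-tiered distinction is easy to make concrete. In the sketch below every price and tier boundary is an assumption invented for illustration, not a figure from the study; only the 300,000-studies-per-year workload comes from the abstract:

```python
# Illustrative TCO comparison; all prices and tiers are assumptions.
studies_per_year = 300_000
years = 5
total_studies = studies_per_year * years

linear_fee = 5.00                      # assumed flat one-time fee per study ($)
inhouse_capital = 1_500_000            # assumed on-site archive capital cost ($)
inhouse_support = 200_000              # assumed annual support/maintenance ($)

def tiered_cost(n_studies):
    """Non-linear ASP pricing: unit price falls as cumulative volume grows."""
    tiers = [(100_000, 5.00), (500_000, 3.00), (float("inf"), 1.50)]  # assumed tiers
    cost, prev_cap = 0.0, 0
    for cap, price in tiers:
        in_tier = min(n_studies, cap) - prev_cap
        if in_tier <= 0:
            break
        cost += in_tier * price
        prev_cap = cap
    return cost

print("linear ASP :", total_studies * linear_fee)                 # 7,500,000
print("in-house   :", inhouse_capital + inhouse_support * years)  # 2,500,000
print("tiered ASP :", tiered_cost(total_studies))                 # 3,200,000
```

Under these assumed numbers the linear model is far costlier than in-house archiving while the tiered model is competitive, which mirrors the qualitative conclusion of the abstract.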
The Expansion of the Astronomical Photographic Data Archive at PARI
NASA Astrophysics Data System (ADS)
Cline, J. Donald; Barker, Thurburn; Castelaz, Michael
2017-01-01
A diverse set of photometric, astrometric, spectral, and surface-brightness data exists on decades of photographic glass plates. The Astronomical Photographic Data Archive (APDA) at the Pisgah Astronomical Research Institute (PARI) was established in November 2007 and is dedicated to collecting, restoring, preserving, and storing astronomical photographic data; PARI continues to accept collections. APDA is also tasked with scanning each image and establishing a database of images that can be accessed via the Internet by the global community of scientists, researchers, and students. APDA is a new type of astronomical observatory, one that harnesses more than a century of analog data of the night sky and makes those data available in a digital format. In 2016, APDA expanded from 50 collections with about 220,000 plates to more than 55 collections and more than 340,000 plates and films. These account for more than 30% of all astronomical photographic data in the United States. The largest of the new acquisitions are the astronomical photographic plates in the Yale University collection. We present details of the newly added collections and a review of other collections in APDA.
X-ray computed tomography applied to investigate ancient manuscripts
NASA Astrophysics Data System (ADS)
Bettuzzi, Matteo; Albertin, Fauzia; Brancaccio, Rosa; Casali, Franco; Pia Morigi, Maria; Peccenini, Eva
2017-03-01
I describe in this paper the first results of a series of X-ray tomography applications, with different system setups, running on some ancient manuscripts containing iron-gall ink. The purpose is to verify the optimum measurement conditions with laboratory instrumentation (which is in fact also portable) in order to recognize the text from the inside of the documents, without opening them. This becomes possible by exploiting the X-ray absorption contrast of iron-based ink and the three-dimensional reconstruction potential provided by computed tomography, which overcomes problems that appear in simple radiograph practice. This work is part of a larger project of EPFL (Ecole Polytechnique Fédérale de Lausanne, Switzerland), the "Venice Time Machine" project (EPFL, Digital Heritage Venice, http://dhvenice.eu/, 2015), aimed at digitizing, transcribing, and sharing in an open database all the information of the State Archives of Venice, exploiting traditional digitization technologies and innovative methods of acquisition. In this first measurement campaign I investigated a seventeenth-century manuscript made of a folded sheet; a couple of unopened ancient wills kept in the State Archives in Venice; and a handwritten book of several hundred pages of nineteenth-century Physics notes.
Indexing and retrieving DICOM data in disperse and unstructured archives.
Costa, Carlos; Freitas, Filipe; Pereira, Marco; Silva, Augusto; Oliveira, José L
2009-01-01
This paper proposes an indexing and retrieval solution to gather information from distributed DICOM documents by allowing searches and access to the virtual data repository using a Google-like process. Medical imaging modalities are becoming more powerful and less expensive. The result is the proliferation of equipment acquisition by imaging centers, including small ones. With this dispersion of data, it is not easy to take advantage of all the information that can be retrieved from these studies. Furthermore, many of these small centers do not have requirements large enough to justify the acquisition of a traditional PACS. The proposed solution is a peer-to-peer PACS platform that indexes and queries DICOM files over a set of distributed repositories logically viewed as a single federated unit. The solution is based on a public-domain document-indexing engine and extends traditional PACS query and retrieval mechanisms. This proposal deals well with complex searching requirements, from a single desktop environment to distributed scenarios. The solution's performance and robustness were demonstrated in trials. The characteristics of the presented PACS platform make it particularly important for small institutions, including educational and research groups.
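The indexing idea, extracting header fields from DICOM files and feeding them to a free-text index, can be sketched with pydicom. This is a toy inverted index in the spirit of the paper's Google-like search, not the authors' engine; the repository path is hypothetical:

```python
from collections import defaultdict
from pathlib import Path

import pydicom

# Build a minimal inverted index over selected DICOM header fields.
index = defaultdict(set)
for path in Path("dicom_store").rglob("*.dcm"):        # hypothetical repository root
    ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only, no pixels
    for tag in ("PatientName", "StudyDescription", "Modality"):
        value = str(getattr(ds, tag, "")).lower()
        for token in value.split():
            index[token].add(str(path))

print(index.get("mri", set()))                          # files matching a query term
```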
Ultrasound Picture Archiving And Communication Systems
NASA Astrophysics Data System (ADS)
Koestner, Ken; Hottinger, C. F.
1982-01-01
The ideal ultrasonic image communication and storage system must be flexible in order to optimize speed and minimize storage requirements. Various ultrasonic imaging modalities are quite different in data volume and speed requirements. Static imaging, for example B-Scanning, involves acquisition of a large amount of data that is averaged or accumulated in a desired manner. The image is then frozen in image memory before transfer and storage. Images are commonly a 512 x 512 point array, each point 6 bits deep. Transfer of such an image over a serial line at 9600 baud would require about three minutes. Faster transfer times are possible; for example, we have developed a parallel image transfer system using direct memory access (DMA) that reduces the time to 16 seconds. Data in this format requires 256K bytes for storage. Data compression can be utilized to reduce these requirements. Real-time imaging has much more stringent requirements for speed and storage. The amount of actual data per frame in real-time imaging is reduced due to physical limitations on ultrasound. For example, 100 scan lines (480 points long, 6 bits deep) can be acquired during a frame at a 30 per second rate. In order to transmit and save this data at a real-time rate requires a transfer rate of 8.6 Megabaud. A real-time archiving system would be complicated by the necessity of specialized hardware to interpolate between scan lines and perform desirable greyscale manipulation on recall. Image archiving for cardiology and radiology would require data transfer at this high rate to preserve temporal (cardiology) and spatial (radiology) information.
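The throughput figures in the abstract follow directly from the stated image geometry; a few lines of Python reproduce the arithmetic:

```python
# Reproduce the abstract's throughput arithmetic (all values from the text).
static_bits = 512 * 512 * 6                  # one frozen B-scan image
print(static_bits / 9600 / 60)               # ~2.7 min at 9600 baud ("about three minutes")

realtime_bits_per_s = 100 * 480 * 6 * 30     # 100 lines x 480 points x 6 bits x 30 frames/s
print(realtime_bits_per_s / 1e6)             # ~8.6 Mbit/s, the "8.6 Megabaud" in the text
```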
48 CFR 15.404-2 - Data to support proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Data to support proposal analysis. 15.404-2 Section 15.404-2 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 15.404-2 Data to support proposal analysis. (a) Field pricing...
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2004-01-01
In August and September of 1993 and January of 1994, the U.S. Geological Survey, under a cooperative agreement with the St. Johns River Water Management District (SJRWMD), conducted geophysical surveys of Kingsley Lake, Orange Lake, and Lowry Lake in northeast Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, observer's logbook, Field Activity Collection System (FACS) logs, and formal FGDC metadata. A filtered and gained GIF image of each seismic profile is also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. The data archived here were collected under a cooperative agreement with the St. Johns River Water Management District as part of the USGS Lakes and Coastal Aquifers (LCA) Project. For further information about this study, refer to http://coastal.er.usgs.gov/stjohns, Kindinger and others (1994), and Kindinger and others (2000). The USGS Florida Integrated Science Center (FISC) - Coastal and Watershed Studies in St. Petersburg, Florida, assigns a unique identifier to each cruise or field activity. For example, 93LCA01 tells us the data were collected in 1993 for the Lakes and Coastal Aquifers (LCA) Project and the data were collected during the first field activity for that project in that calendar year. For a detailed description of the method used to assign the field activity ID, see http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html. The boomer is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled at the sea surface and when discharged emits a short acoustic pulse, or shot, that propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (e.g., 0.5 s) and recorded for specific intervals of time (e.g., 100 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Acquisition geometry for 94LCA01 is recorded in the operations logbook. No logbook exists for 93LCA01. Table 1 displays acquisition parameters for both field activities. For more information about the acquisition equipment used, refer to the FACS equipment logs. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for more information about these files. Processed profiles can be viewed as GIF images from the Profiles page. Refer to the Software page for details about the processing and examples of the processing scripts. Detailed information about the navigation systems used for each field activity can be found in Table 1 and the FACS equipment logs.
To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. The original trace files were recorded in nonstandard ELICS format and later converted to standard SEG-Y format. The original trace files for 94LCA01 lines ORJ127_1, ORJ127_3, and ORJ131_1 were divided into two or more trace files (e.g., ORJ127_1 became ORJ127_1a and ORJ127_1b) because the original total number of traces exceeded the maximum allowed by the processing system. Digital data were not recoverable for 93LCA01.
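For readers who prefer a scripting environment to Seismic Unix, the archived SEG-Y traces can also be inspected with ObsPy. A minimal sketch follows; the filename is hypothetical, and the header field shown is one of the standard SEG-Y trace-header attributes ObsPy exposes:

```python
from obspy import read

# Hypothetical filename; ObsPy's SEG-Y plugin returns a Stream with one Trace
# per recorded shot, with the binary trace headers attached to each Trace.
stream = read("93LCA01_line01.sgy", format="SEGY", unpack_trace_headers=True)
print(stream)

shot = stream[0]
print(shot.stats.sampling_rate)
print(shot.stats.segy.trace_header.trace_sequence_number_within_line)
```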
Cosmic Ray Energetics and Mass (CREAM)
NASA Technical Reports Server (NTRS)
Coutu, Stephane
2005-01-01
The CREAM instrument was flown on a Long Duration Balloon in Antarctica in December 2004 and January 2005, achieving a flight duration record of nearly 42 days. It detected and recorded cosmic ray primary particles ranging in type from hydrogen to iron nuclei and in energy from 1 TeV to several hundred TeV. With the data collected we will have the world's best measurement of the energy spectra and mass composition of nuclei in the primary cosmic ray flux at these energies, close to the astrophysical 'knee'. The instrument utilized a thin calorimeter, a transition radiation detector, and a timing charge detector, which also provided time-of-flight information. The responsibilities of our group have been with the timing charge detector (TCD) and with the data acquisition electronics and ground station support equipment. The TCD utilized fast scintillators to measure the charge of the primary cosmic ray before any interactions could take place within the calorimeter. The data acquisition electronics handled the output of the various detectors, in a fashion fully integrated with the payload bus. A space-qualified flight computer controlled the acquisition and was used for preliminary trigger information processing and decision making. Ground support equipment was used to monitor the health of the payload, acquire and archive the data transmitted to the ground, and provide real-time control of the instrument in flight.
CytometryML and other data formats
NASA Astrophysics Data System (ADS)
Leif, Robert C.
2006-02-01
Cytology automation and research will be enhanced by the creation of a common data format. This data format would provide the pathology and research communities with a uniform way for annotating and exchanging images, flow cytometry, and associated data. This specification and/or standard will include descriptions of the acquisition device, staining, the binary representations of the image and list-mode data, the measurements derived from the image and/or the list-mode data, and descriptors for clinical/pathology and research. An international, vendor-supported, non-proprietary specification will allow pathologists, researchers, and companies to develop and use image capture/analysis software, as well as list-mode analysis software, without worrying about incompatibilities between proprietary vendor formats. Presently, efforts to create specifications and/or descriptions of these formats include the Laboratory Digital Imaging Project (LDIP) Data Exchange Specification; extensions to the Digital Imaging and Communications in Medicine (DICOM); Open Microscopy Environment (OME); Flowcyt, an extension to the present Flow Cytometry Standard (FCS); and CytometryML. The feasibility of creating a common data specification for digital microscopy and flow cytometry in a manner consistent with its use for medical devices and interoperability with both hospital information and picture archiving systems has been demonstrated by the creation of the CytometryML schemas. The feasibility of creating a software system for digital microscopy has been demonstrated by the OME. CytometryML consists of schemas that describe instruments and their measurements. These instruments include digital microscopes and flow cytometers. Optical components including the instruments' excitation and emission parts are described. The description of the measurements made by these instruments includes the tagged molecule, data acquisition subsystem, and the format of the list-mode and/or image data. Many of the CytometryML data-types are based on the Digital Imaging and Communications in Medicine (DICOM). Binary files for images and list-mode data have been created and read.
Orbiter Flying Qualities (OFQ) Workstation user's guide
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.
1988-01-01
This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.
Redundant array of independent disks: practical on-line archiving of nuclear medicine image data.
Lear, J L; Pratt, J P; Trujillo, N
1996-02-01
While various methods for long-term archiving of nuclear medicine image data exist, none support rapid on-line search and retrieval of information. We assembled a 90-Gbyte redundant array of independent disks (RAID) system using ten 9-Gbyte disk drives. The system was connected to a personal computer and software was used to partition the array into 4-Gbyte sections. All studies (50,000) acquired over a 7-year period were archived in the system. Based on patient name/number and study date, information could be located within 20 seconds and retrieved for display and analysis in less than 5 seconds. RAID offers a practical, redundant method for long-term archiving of nuclear medicine studies that supports rapid on-line retrieval.
Archive Management of NASA Earth Observation Data to Support Cloud Analysis
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark
2017-01-01
NASA collects, processes, and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments, and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring the archive software to a cloud-native architecture, virtualizing data products by computing on demand, and reorganizing data to be more analysis-friendly.
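Subsetting against WOS typically means byte-range reads over HTTP rather than file seeks. A minimal sketch using the AWS SDK, with hypothetical bucket and key names:

```python
import boto3

# Hypothetical bucket and key; in WOS a subsetting service issues ranged GETs
# over HTTP instead of seeking within a local file.
s3 = boto3.client("s3")
resp = s3.get_object(Bucket="eo-archive-bucket", Key="granules/sample.h5",
                     Range="bytes=0-65535")
chunk = resp["Body"].read()          # only the requested 64 KiB travels
print(len(chunk))
```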
The NASA/IPAC Teacher Archive Research Program (NITARP) at Pierce College
NASA Astrophysics Data System (ADS)
Mallory, Carolyn R.; Feig, M.; Mahmud, N.; Silic, T.; Rebull, L.; Hoette, V.; Johnson, C.; McCarron, K.
2011-01-01
Our team from Pierce Community College, Woodland Hills, CA, participated in the NASA/IPAC Teacher Archive Research Program (NITARP) this past year (2010). (NITARP is described in another poster, Rebull et al.) Our team worked with archival Spitzer, 2MASS, and optical data to look for young stars in CG4, part of the Gum Nebula; our scientific results are described in a companion poster, Johnson et al. In this poster, we describe more about what we learned and how we incorporated our NITARP experiences into the Pierce College environment. Students developed critical thinking skills and an ability to organize their data analysis and develop a mental "big picture" of what is going on in the CG4 region. The NITARP program is one of several "Active Learning" programs going on at Pierce, and the other programs are briefly summarized in this poster as well. This program was made possible through the NASA/IPAC Teacher Archive Research Project (NITARP) and was funded by NASA Astrophysics Data Program and Archive Outreach funds.
Optimisation of solar synoptic observations
NASA Astrophysics Data System (ADS)
Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal
2012-09-01
The development of instrumental and computer technologies is connected with steadily increasing needs for archiving large data volumes. The current trend to meet this requirement includes data compression and the growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by means of an optimisation of the archiving that consists of data selection without losing useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by means of the analysis of changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes, which have a disturbing effect, and real changes, which provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not conserve the uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual capacity saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
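The selection criterion, archive a frame only when it differs enough from the last archived frame, can be sketched in a few lines. The change metric and threshold below are assumptions, and real use would need masking of fictitious changes (clouds, seeing) as the paper describes:

```python
import numpy as np

def should_archive(prev_img, curr_img, threshold=0.02):
    """Keep a frame only if it differs enough from the last archived one."""
    diff = np.abs(curr_img.astype(float) - prev_img.astype(float))
    return diff.mean() / 255.0 > threshold   # assumes 8-bit images
```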
USDA-ARS's Scientific Manuscript database
An often cited advantage of MALDI-MS is the ability to archive and reuse sample plates after the initial analysis is complete. However, experience demonstrates that the peptide ion signals decay rapidly as the number of laser shots becomes large. Thus, the signal level obtainable from an archived sa...
The purpose of this SOP is to outline the archive/custody guidelines used by the Arizona Border Study. This procedure was followed to maintain and locate samples, extracts, tracings and hard copy results after laboratory analysis during the Arizona NHEXAS project and the Border ...
The Impact of Juvenile Diversion: An Assessment Using Multiple Archival Perspectives.
ERIC Educational Resources Information Center
Johnston, Judith E.
Delinquency reduction and reduction of the number of juveniles referred to the justice system were assessed for 14 diversion programs in Los Angeles County. A project versus nonproject comparison and a pre-post analysis with archival and other data gathered from 1972 through 1977 were used. Preliminary results indicated that the diversion projects…
Retrospective Analysis of Technological Literacy of K-12 Students in the USA
ERIC Educational Resources Information Center
Eisenkraft, Arthur
2010-01-01
Assessing technological literacy in the USA will require a large expenditure of resources. While these important initiatives are taking place, it is useful to analyze existing archival data to get a sense of students' understanding of technology. Such archival data exists from the entries submitted to the Toshiba/NSTA ExploraVisions competition…
An experimental investigation of the flow physics of high-lift systems
NASA Technical Reports Server (NTRS)
Thomas, Flint O.; Nelson, R. C.
1995-01-01
This progress report, a series of viewgraphs, outlines experiments on the flow physics of confluent boundary layers for high-lift systems. The design objective is to develop high-lift systems with improved C_Lmax for landing approach and improved take-off L/D, while simultaneously reducing acquisition and maintenance costs; in effect, to achieve improved performance with simpler designs. The research objectives include: establishing the role of confluent boundary layer flow physics in high-lift production; contrasting confluent boundary layer structure for optimum and non-optimum C_L cases; forming a high-quality, detailed archival database for CFD/modeling; and examining the role of relaminarization and streamline curvature.
Third International Symposium on Space Mission Operations and Ground Data Systems, part 1
NASA Technical Reports Server (NTRS)
Rash, James L. (Editor)
1994-01-01
Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The papers focus on improvements in the efficiency, effectiveness, productivity, and quality of data acquisition, ground systems, and mission operations. New technology, techniques, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations.
NASA Astrophysics Data System (ADS)
Holmdahl, P. E.; Ellis, A. B. E.; Moeller-Olsen, P.; Ringgaard, J. P.
1981-12-01
The basic requirements of the SAR ground segment of ERS-1 are discussed. A system configuration for the real time data acquisition station and the processing and archive facility is depicted. The functions of a typical SAR processing unit (SPU) are specified, and inputs required for near real time and full precision, deferred time processing are described. Inputs and the processing required for provision of these inputs to the SPU are dealt with. Data flow through the systems, and normal and nonnormal operational sequence, are outlined. Prerequisites for maintaining overall performance are identified, emphasizing quality control. The most demanding tasks to be performed by the front end are defined in order to determine types of processors and peripherals which comply with throughput requirements.
Everything is Data - Overview of Modular System of Sensors for Museum Environment
NASA Astrophysics Data System (ADS)
Valach, J.; Juliš, K.; Štefcová, P.; Pech, M.; Wolf, B.; Kotyk, M.; Frankl, J.
2015-08-01
The main aim of the project, now nearing completion, was to develop a modular and scalable system of sensors for monitoring the internal environment of museum exhibitions and depositories. The sensors vary according to the parameters being monitored, and also according to required energy autonomy, processing capability, and bandwidth requirements. The sensors developed can be divided into three groups: environmental sensors, biosensors, and sensors of vibrations. Data acquired by the sensors are archived and stored in an open format. Metadata stored alongside the numerical measurement data help ensure future machine readability for data-mining applications. Long continuous data series can provide sufficient data for deriving a dose-response function.
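A sketch of what "open format" archiving with embedded metadata can look like; the field names and values below are assumptions for illustration, not the project's schema:

```python
import datetime
import json

# One self-describing measurement record per line (JSON Lines).
record = {
    "sensor_id": "env-017",
    "quantity": "relative_humidity",
    "unit": "percent",
    "value": 48.2,
    "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "location": "depository hall B",
    "calibration_date": "2015-03-01",
}
with open("museum_archive.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")
```

Keeping the units, location, and calibration date next to each value is what preserves the record's meaning for the long data series the project targets.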
NASA Astrophysics Data System (ADS)
Wilkins, George A.; Stevens-Rayburn, Sarah
This report provides an overview of the presentations and summaries of discussions at IAU Colloquium 110, which was held in Washington, D.C., on 26-30 July 1988 and at the Goddard Space Flight Center on 1 August 1988. The topics included: the publication and acquisition of books and journals; searching for astronomical information; the handling and use of special-format materials; conservation; archiving of unpublished documents; uses of computers in libraries; astronomical databases and various aspects of the administration of astronomy libraries and services. Particular attention was paid to new developments, but the problems of astronomers and institutions in developing countries were also considered.
Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.
2014-01-01
From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. 
The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.
Back to the Future: Long-Term Seismic Archives Revisited
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2007-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the location of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and even in the Earth sciences in general. Yet, these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation based methodologies for high-resolution earthquake location, as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may thus become a routine procedure to continuously improve resolution in existing catalogs. We demonstrate the role of seismic archives in obtaining the precise location of new events in real time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.
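The cross-correlation measurement at the heart of double-differencing is compact. A minimal sketch follows; the normalization and sign convention are illustrative assumptions, and real pipelines interpolate the peak for subsample precision and screen on correlation coefficient:

```python
import numpy as np

def cc_delay(a, b, fs):
    """Differential arrival time between two waveforms by cross-correlation.

    Returns the lag (s) at the correlation peak; positive means `a` arrives
    later than `b`.
    """
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    lag_samples = int(np.argmax(cc)) - (len(b) - 1)
    return lag_samples / fs
```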
Archival storage solutions for PACS
NASA Astrophysics Data System (ADS)
Chunn, Timothy
1997-05-01
While there are many, one of the chief inhibitors to the widespread diffusion of PACS systems has been the lack of robust, cost-effective digital archive storage solutions. Moreover, an automated Nearline solution is key to a central, sharable data repository, enabling many applications such as PACS, telemedicine and teleradiology, and information warehousing and data mining for research such as patient outcome analysis. Selecting the right solution depends on a number of factors: capacity requirements, write and retrieval performance requirements, scalability in capacity and performance, configuration architecture and flexibility, subsystem availability and reliability, security requirements, system cost, achievable benefits and cost savings, investment protection, strategic fit, and more. This paper addresses many of these issues. It compares and positions optical disk and magnetic tape technologies, which are the predominant archive media today. Price and performance comparisons will be made at different archive capacities, and the effect of file size on storage system throughput will be analyzed. The concept of automated migration of images from high-performance, high-cost storage devices to high-capacity, low-cost storage devices will be introduced as a viable way to minimize overall storage costs for an archive. The concept of access density will also be introduced and applied to the selection of the most cost-effective archive solution.
Object classification and outliers analysis in the forthcoming Gaia mission
NASA Astrophysics Data System (ADS)
Ordóñez-Blanco, D.; Arcay, B.; Dafonte, C.; Manteiga, M.; Ulla, A.
2010-12-01
Astrophysics is evolving towards the rational optimization of costly observational material by the intelligent exploitation of large astronomical databases from both terrestrial telescopes and spatial mission archives. However, there has been relatively little advance in the development of highly scalable data exploitation and analysis tools needed to generate the scientific returns from these large and expensively obtained datasets. Among the upcoming projects of astronomical instrumentation, Gaia is the next cornerstone ESA mission. The Gaia survey foresees the creation of a data archive and its future exploitation with automated or semi-automated analysis tools. This work reviews some of the work that is being developed by the Gaia Data Processing and Analysis Consortium for the object classification and analysis of outliers in the forthcoming mission.
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year-long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, forms the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA-based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building the computational framework for double-difference processing of the combined parametric and waveform archives of the ISC, NEIC, and IRIS, with over three million recorded earthquakes worldwide. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of such archives may thus become a routine procedure to continuously improve resolution in existing global earthquake catalogs. Results from subduction zones and aftershock sequences of recent great earthquakes demonstrate the considerable social and economic impact that high-resolution images of active faults, when available in real time, will have in the prompt evaluation and mitigation of seismic hazards. These results also highlight the need for consistent long-term seismic monitoring and archiving of records.
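For context on the detection-threshold comparison, the sketch below implements the textbook STA/LTA trigger of the kind network detectors are based on (the scaled cross-correlation detector in the abstract is a separate, more sensitive scheme). Window lengths, the threshold, and the centered-window variant are illustrative assumptions.

```python
import numpy as np

def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Flag samples where the short-term average of signal energy exceeds
    the long-term average by `threshold` (a centered, textbook variant)."""
    energy = trace ** 2
    box = lambda n: np.ones(n) / n
    sta = np.convolve(energy, box(int(sta_win / dt)), mode="same")
    lta = np.convolve(energy, box(int(lta_win / dt)), mode="same")
    return np.flatnonzero(sta / np.maximum(lta, 1e-20) > threshold)

rng = np.random.default_rng(4)
trace = rng.normal(size=60_000)                      # 10 minutes at 100 Hz
trace[30_000:30_200] += 8.0 * rng.normal(size=200)   # buried synthetic event
picks = sta_lta(trace, dt=0.01)
print(picks.min())                                   # triggers near sample 30000
```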
48 CFR 3046.792 - Cost benefit analysis (USCG).
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 48 Federal Acquisition Regulations System, Section 3046.792 (Department of Homeland Security, Homeland Security Acquisition Regulation (HSAR), Contract Management, Quality Assurance, Warranties): Cost benefit analysis (USCG).
Predictors and Effects of Knowledge Management in U.S. Colleges and Schools of Pharmacy
NASA Astrophysics Data System (ADS)
Watcharadamrongkun, Suntaree
Public demands for accountability in higher education have placed increasing pressure on institutions to document their achievement of critical outcomes. These demands also have had wide-reaching implications for the development and enforcement of accreditation standards, including those governing pharmacy education. The knowledge management (KM) framework provides perspective for understanding how organizations evaluate themselves and guidance for how to improve their performance. In this study, we explore knowledge management processes, how these processes are affected by organizational structure and by information technology resources, and how these processes affect organizational performance. This is done in the context of Accreditation Standards and Guidelines for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree (Standards 2007). Data were collected using an online census survey of 121 U.S. Colleges and Schools of Pharmacy and supplemented with archival data. A key informant method was used with CEO Deans and Assessment leaders serving as respondents. The survey yielded a 76.0% (92/121) response rate. Exploratory factor analysis was used to construct scales describing core KM processes: Knowledge Acquisition, Knowledge Integration, and Institutionalization; all scale reliabilities were found to be acceptable. Analysis showed that, as expected, greater Knowledge Acquisition predicts greater Knowledge Integration, and greater Knowledge Integration predicts greater Institutionalization. Predictive models were constructed using hierarchical multiple regression and path analysis. Overall, information technology resources had stronger effects on KM processes than did characteristics of organizational structure. Greater Institutionalization predicted better outcomes related to direct measures of performance (i.e., NAPLEX pass rates, Accreditation actions), but Institutionalization was unrelated to an indirect measure of performance (i.e., USNWR ratings). Several organizational structure characteristics (i.e., size, age, and being part of an academic health center) were significant predictors of organizational performance; in contrast, IT resources had no direct effects on performance. Findings suggest that knowledge management processes, organizational structures and IT resources are related to better performance for Colleges and Schools of Pharmacy. Further research is needed to understand mechanisms through which specific knowledge management processes translate into better performance and, relatedly, to establish how enhancing KM processes can be used to improve institutional quality.
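As a toy illustration of the reported path logic (Acquisition predicts Integration, which predicts Institutionalization), the sketch below fits the two regressions on fabricated scale scores; nothing here reproduces the study's data, measures, or coefficients.

```python
import numpy as np

# Fabricated scale scores for n = 92 responding schools (matching the
# abstract's response count); effect sizes are arbitrary illustrations.
rng = np.random.default_rng(0)
n = 92
acquisition = rng.normal(size=n)
integration = 0.6 * acquisition + rng.normal(scale=0.8, size=n)
institutionalization = 0.5 * integration + rng.normal(scale=0.8, size=n)

def ols_slope(x, y):
    """Unstandardized slope from a simple OLS regression of y on x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

print("Acquisition -> Integration:", ols_slope(acquisition, integration))
print("Integration -> Institutionalization:",
      ols_slope(integration, institutionalization))
```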
NASA Astrophysics Data System (ADS)
Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.
2010-12-01
Space geodetic science and other disciplines using geodetic products have benefited immensely from open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC and UNAVCO) have been funded to expand the original GSAC capability to multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating it into their daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report of the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example by identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.
The X-Ray Spectra of Blazars: Analysis of the Complete EXOSAT Archive: Erratum
NASA Astrophysics Data System (ADS)
Sambruna, Rita M.; Barr, Paul; Giommi, Paolo; Maraschi, Laura; Tagliaferri, Gianpiero; Treves, Aldo
1995-07-01
In the paper "The X-Ray Spectra of Blazars: Analysis of the Complete EXOSAT Archive" by Rita M. Sambruna, Paul Barr, Paolo Giommi, Laura Maraschi, Gianpiero Tagliaferri, and Aldo Treves (ApJS, 95, 371 [1994]), the section regarding the object PKS 1510-08 (Section 4.4.14) contains an erroneous quotation. K. P. Singh, A. R. Rao, and M. N. Vahia (ApJ, 365, 455 [1990]) in fact detected an emission line only in the 1984 data, and not in the 1985 spectrum as stated.
Monitoring of time and space evolution of glaciers' flow at the scale of the Karakoram and Himalayas
NASA Astrophysics Data System (ADS)
Dehecq, Amaury; Gourmelen, Noel; Trouvé, Emmanuel; Wegmuller, Urs; Cheng, Xiao
2014-05-01
Climate warming over the 20th century has caused drastic changes in mountain glaciers globally, and in the Himalayan glaciers in particular. The stakes are high; glaciers and ice caps are the largest contributor to the increase in the mass of the world's oceans, and the Himalayas play a key role in the hydrology of the region, impacting the economy, food safety and flood risk for a large population. Partial monitoring of the Himalayan glaciers has revealed a mixed picture: while many of the Himalayan glaciers are retreating, locally stable or advancing glaciers have also been observed in this region. Recent controversies have highlighted the need to understand glacier dynamics and their relationship with climate change in this region. Earth Observation provides a means for global and long-term monitoring of mountain glacier dynamics. In the framework of the Dragon program, a partnership between the European Space Agency (ESA) and the Chinese Center for Earth Observation (NRSCC), we began a monitoring program aimed at quantifying multidecadal changes in glacier flow at the scale of the entire Himalayas and Karakoram from a 40-year archive of Earth Observation data. Ultimately, the provision of a global and time-sensitive glacier velocity product will help to understand the evolution of the Himalayan glaciers in light of glaciological conditions (e.g. presence of debris cover, surges, proglacial lakes) and climatic conditions. In this presentation, we focus on the analysis of the Landsat archive spanning the 1972 to 2012 period, which is global and provides multidecadal, continuous observation. We present the processing strategy, including preprocessing of the images, image matching, and merging of the various results obtained from the repetitivity of the acquisitions, in order to obtain more robust, precise and complete glacier velocity fields. We show that the recent archive (Landsat 4, 5 and 7, from 1982 to 2013) allows an estimate of the velocity for most of the Himalayan glaciers, except for parts moving at rates below the sensitivity of the method, about 15 m/year. Geometric inaccuracies for the earlier missions (Landsat 1 to 3, from 1972 to 1993) restrict the analysis to the largest glaciers, but the data are sufficient to derive changes in the dynamics of those glaciers at decadal scales.
An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence
NASA Astrophysics Data System (ADS)
Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras
2014-05-01
We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) Analysis, the Wavelet and Intermittency Analysis, and the Probability Density Functions (PDF) Analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) and the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles such as, for instance, the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
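Although INA itself is written in MATLAB, two of the core quantities named above are easy to sketch in Python: a Welch power spectral density and the flatness (kurtosis of increments) used as an intermittency diagnostic. Our simplified definitions here may differ in detail from the library's actual algorithms.

```python
import numpy as np
from scipy.signal import welch

def flatness(x, scale):
    """Kurtosis of increments at a given scale; values well above 3
    indicate intermittency relative to a Gaussian process."""
    dx = x[scale:] - x[:-scale]
    return np.mean(dx ** 4) / np.mean(dx ** 2) ** 2

rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=100_000))       # Brownian-like test signal
freqs, psd = welch(x, fs=1.0, nperseg=1024)   # Welch PSD estimate
print(flatness(x, scale=10))                  # ~3: Gaussian increments
```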
2016-12-01
...analysis from multiple sources, including the GSBPP exit survey, archived GSBPP capstones, faculty advisement data, faculty interviews, and a new GSBPP student survey, in order to detail the capstone's process, content, and value to multiple stakeholders. The project team also employs the Plan-Do
The Protein Data Bank archive as an open data resource
Berman, Helen M.; Kleywegt, Gerard J.; Nakamura, Haruki; ...
2014-07-26
The Protein Data Bank archive was established in 1971, and recently celebrated its 40th anniversary (Berman et al. in Structure 20:391, 2012). Here, an analysis of interrelationships of the science, technology and community leads to further insights into how this resource evolved into one of the oldest and most widely used open-access data resources in biology.
NASA Astrophysics Data System (ADS)
Leprince, S.; Ayoub, F.; Avouac, J.
2011-12-01
We have developed a suite of algorithms for precise Co-registration of Optically Sensed Images and Correlation (COSI-Corr), implemented in a software package first released to the academic community in 2007. Its capability for accurate surface deformation measurement has proved useful for a wide variety of applications. We present the fundamental principles of COSI-Corr, which are the key ingredients to achieve sub-pixel registration and sub-pixel measurement accuracy, and we show how they can be applied to various types of images to extract 2D, 3D, or even 4D deformation fields of a given surface. Examples are drawn from recent collaborative studies and include: (1) the study of the Icelandic Krafla rifting crisis that occurred from 1975 to 1984, where we used a combination of archived airborne photographs, declassified spy satellite imagery, and modern satellite acquisitions to propose a detailed 2D displacement field of the ground; (2) the estimation of glacial velocities of fast New Zealand glaciers using successive ASTER acquisitions; (3) the derivation of sand dune migration rates; (4) the estimation of ocean swell velocity, taking advantage of the short time delay between the acquisition of different spectral bands on the SPOT 5 satellite; (5) the derivation of the full 3D ground displacement field induced by the 2010 Mw 7.2 El Mayor-Cucapah earthquake, as recorded from pre- and post-event lidar acquisitions; and (6) the estimation of 2D in-plane deformation of mechanical samples under stress in the lab. Finally, we conclude by highlighting the future potential and implications of applying such correlation techniques on a large scale to provide global monitoring of our environment.
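The frequency-domain correlation at the heart of such offset tracking can be sketched with textbook phase correlation, which recovers an integer-pixel translation between two images; COSI-Corr's actual algorithm adds resampling and iterative sub-pixel refinement not shown in this simplified stand-in.

```python
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Estimate the rigid translation (dy, dx) taking img_b onto img_a
    from the peak of the phase-only correlation surface."""
    F = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    F /= np.maximum(np.abs(F), 1e-12)              # keep phase only
    corr = np.fft.ifft2(F).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    if dy > img_a.shape[0] // 2: dy -= img_a.shape[0]   # unwrap to signed
    if dx > img_a.shape[1] // 2: dx -= img_a.shape[1]
    return dy, dx

a = np.zeros((64, 64)); a[20:30, 12:22] = 1.0
b = np.roll(np.roll(a, 3, axis=0), -5, axis=1)     # shift by (+3, -5)
print(phase_correlation_shift(b, a))               # -> (3, -5)
```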
Multi-temporal InSAR Datastacks for Surface Deformation Monitoring: a Review
NASA Astrophysics Data System (ADS)
Ferretti, A.; Novali, F.; Prati, C.; Rocca, F.
2009-04-01
In the last decade, extensive processing of thousands of satellite radar scenes acquired by different sensors (e.g. ERS-1/2, ENVISAT and RADARSAT) has demonstrated how multi-temporal data sets can be successfully exploited for surface deformation monitoring, by identifying objects on the terrain that have a stable, point-like behaviour. These objects, referred to as Permanent or Persistent Scatterers (PS), can be geo-coded and monitored for movement very accurately, acting as a "natural" geodetic network that integrates successfully with continuous GPS data. After a brief analysis of both advantages and drawbacks of InSAR datastacks, the paper presents examples of applications of PS measurements for detecting and monitoring active faults, aquifers and oil/gas reservoirs, drawing on experience in Europe, North America and Japan, and concludes with a discussion on future directions for PSInSAR analysis. Special attention is paid to the possibility of creating deformation maps over wide areas using historical archives of data already available. The second part of the paper briefly discusses the technical features of the new radar sensors recently launched (namely TerraSAR-X, RADARSAT-2, and CosmoSkyMed) and their impact on space geodesy, highlighting the importance of data continuity and standardized acquisition policies for almost all InSAR and PSInSAR applications. Finally, recent advances in the algorithms applied in PS analysis, such as detection of "temporary PS", PS characterization and exploitation of distributed scatterers, are briefly discussed based on the processing of real data.
IRLooK: an advanced mobile infrared signature measurement, data reduction, and analysis system
NASA Astrophysics Data System (ADS)
Cukur, Tamer; Altug, Yelda; Uzunoglu, Cihan; Kilic, Kayhan; Emir, Erdem
2007-04-01
Infrared signature measurement capability has a key role in the development of electronic warfare (EW) self-protection systems. In this article, the IRLooK System and its capabilities are introduced. IRLooK is a truly innovative mobile infrared signature measurement system, with all of its design, manufacturing and integration accomplished by an engineering philosophy peculiar to ASELSAN. IRLooK measures the infrared signatures of military and civil platforms such as fixed/rotary-wing aircraft, tracked/wheeled vehicles and navy vessels. IRLooK provides data acquisition, pre-processing, post-processing, analysis, storage and archiving over the shortwave, mid-wave and long-wave infrared spectrum by means of its high-resolution radiometric sensors and highly sophisticated software analysis tools. The sensor suite of the IRLooK System includes imaging and non-imaging radiometers and a spectroradiometer. Single or simultaneous multiple in-band measurements as well as high radiant intensity measurements can be performed. The system provides detailed information on the spectral, spatial and temporal infrared signature characteristics of targets. It also determines IR decoy characteristics. The system is equipped with a high-quality, field-proven two-axis tracking mount to facilitate target tracking. Manual or automatic tracking is achieved by using a passive imaging tracker. The system also includes a high-quality weather station and field-calibration equipment, including cavity and extended-area blackbodies. The units composing the system are mounted on flat-bed trailers, and the complete system is designed to be transportable by large-body aircraft.
PDS Archive Release of Apollo 11, Apollo 12, and Apollo 17 Lunar Rock Sample Images
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Stefanov, W. L.; Lofgren, G. E.; Todd, N. S.; Gaddis, L. R.
2013-01-01
Scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory, Information Resources Directorate, and Image Science & Analysis Laboratory have been working to digitize (scan) the original film negatives of Apollo Lunar Rock Sample photographs [1, 2]. The rock samples, and associated regolith and lunar core samples, were obtained during the Apollo 11, 12, 14, 15, 16 and 17 missions. The images allow scientists to view the individual rock samples in their original or subdivided state prior to requesting physical samples for their research. In cases where access to the actual physical samples is not practical, the images provide an alternate mechanism for study of the subject samples. As the negatives are being scanned, they have been formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The Astromaterials Research and Exploration Science Directorate (which includes the Lunar Sample Laboratory and Image Science & Analysis Laboratory) at JSC is working collaboratively with the Imaging Node of the PDS on the archiving of these valuable data. The PDS Imaging Node is now pleased to announce the release of the image archives for Apollo missions 11, 12, and 17.
NASA Astrophysics Data System (ADS)
Miller, C. J.; Gasson, D.; Fuentes, E.
2007-10-01
The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).
Li, Xueming; Zheng, Shawn; Agard, David A.; Cheng, Yifan
2015-01-01
Newly developed direct electron detection cameras have a high image output frame rate that enables recording dose-fractionated image stacks of frozen hydrated biological samples by electron cryomicroscopy (cryoEM). Such novel image acquisition schemes provide opportunities to analyze cryoEM data in ways that were previously impossible. The file size of a dose-fractionated image stack is 20 to 60 times larger than that of a single image. Thus, efficient data acquisition and on-the-fly analysis of a large number of dose-fractionated image stacks become a serious challenge to any cryoEM data acquisition system. We have developed a computer-assisted system, named UCSFImage4, for semi-automated cryoEM image acquisition that implements an asynchronous data acquisition scheme. This facilitates efficient acquisition, on-the-fly motion correction, and CTF analysis of dose-fractionated image stacks with a total time of ~60 seconds/exposure. Here we report the technical details and configuration of this system. PMID:26370395
In-database processing of a large collection of remote sensing data: applications and implementation
NASA Astrophysics Data System (ADS)
Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina
2016-04-01
Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and has inspired several information systems. Some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to the Earth sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability and provide a high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or grid cell to complex aggregation over spatial or temporal extents across a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout, and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL - a widely used higher-level declarative query language - simplifies interoperability between desktop GIS, web applications, geographic web services and interactive scientific applications (MATLAB, IPython). The system is also automatically ingesting direct readout data from meteorological and research satellites in near-real time, with distributed acquisition workflows managed by the Taverna workflow engine [2]. The system has demonstrated its utility in performing non-trivial analytic processing such as the computation of the Robust Satellite Technique (RST) indices [3]. It has been useful in different tasks such as studying urban heat islands, analyzing patterns in the distribution of wildfire occurrences, and detecting phenomena related to seismic and earthquake activity. Initial experience has highlighted several limitations of the proposed approach, yet it has demonstrated the ability to facilitate the use of large archives of remote sensing data by geoscientists. 1. J.G. Acker, G. Leptoukh, Online analysis enhances use of NASA Earth science data. EOS Trans. AGU, 2007, 88(2), P. 14-17. 2. D. Hull, K. Wolstencroft, R. Stevens, C. Goble, M.R. Pocock, P. Li and T. Oinn, Taverna: a tool for building and running workflows of services. Nucleic Acids Research, 2006, V. 34, P. W729-W732. 3. V. Tramutoli, G. Di Bello, N. Pergola, S. Piscitelli, Robust satellite techniques for remote sensing of seismically active areas. Annals of Geophysics, 2001, no. 44(2), P. 295-312.
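In this style of system, a decade-long pixel time series reduces to one declarative query executed next to the storage. A minimal sketch, assuming PostgreSQL with the PostGIS extension and hypothetical table, column and connection names:

```python
import psycopg2

# A single query aggregates a decade of per-scene measurements around one
# point without the client downloading any files. Table/column names and
# the connection string are invented for illustration.
QUERY = """
SELECT date_trunc('month', acquired_at) AS month,
       avg(land_surface_temp)           AS mean_lst
FROM   modis_lst
WHERE  ST_Contains(footprint, ST_SetSRID(ST_Point(%s, %s), 4326))
GROUP  BY month
ORDER  BY month;
"""

with psycopg2.connect("dbname=rs_archive") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY, (83.0, 54.9))   # lon, lat of a point of interest
        for month, mean_lst in cur.fetchall():
            print(month, mean_lst)
```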
NASA Astrophysics Data System (ADS)
Neakrase, Lynn; Hornung, Danae; Sweebe, Kathrine; Huber, Lyle; Chanover, Nancy J.; Stevenson, Zena; Berdis, Jodi; Johnson, Joni J.; Beebe, Reta F.
2017-10-01
The Research and Analysis programs within NASA's Planetary Science Division now require archiving of resultant data with the Planetary Data System (PDS) or an equivalent archive. The PDS Atmospheres Node is developing an online environment for assisting data providers with this task. The Educational Labeling System for Atmospheres (ELSA) is being designed in Django/Python to provide an easier environment for facilitating not only communication with the PDS node, but also streamlining the process of learning, developing, submitting, and reviewing archive bundles under the new PDS4 archiving standard. Under the PDS4 standard, data are archived in bundles, collections, and basic products that form an organizational hierarchy of interconnected labels describing the data and the relationships between the data and its documentation. PDS4 labels are implemented using Extensible Markup Language (XML), an international standard for managing metadata. Potential data providers entering the ELSA environment can learn more about PDS4, plan and develop label templates, and build their archive bundles. ELSA provides an interface to tailor label templates, aiding in the creation of required internal Logical Identifiers (URN - Uniform Resource Names) and Context References (missions, instruments, targets, facilities, etc.). The underlying Django/Python structure of ELSA makes maintaining and updating the interface easy for our undergraduate and graduate students. The ELSA environment will soon provide an interface for using the tailored templates in a pipeline to produce entire collections of labeled products, essentially building the user's archive bundle. Once the pieces of the archive bundle are assembled, ELSA provides options for queuing the completed bundle for peer review. The peer review process has also been streamlined for online access and tracking, to help make the archiving process with PDS as transparent as possible. We discuss the current status of ELSA and provide examples of its implementation.
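The template-driven label generation that ELSA automates can be illustrated with a toy XML builder; the element names below are simplified stand-ins, not a schema-valid PDS4 Product_Observational label, and the URN is a made-up example in the PDS logical-identifier style.

```python
import xml.etree.ElementTree as ET

def make_label(lid, title, start, stop):
    """Build a toy observational label from template-like parameters."""
    product = ET.Element("Product_Observational")
    ident = ET.SubElement(product, "Identification_Area")
    ET.SubElement(ident, "logical_identifier").text = lid
    ET.SubElement(ident, "title").text = title
    obs = ET.SubElement(product, "Observation_Area")
    ET.SubElement(obs, "start_date_time").text = start
    ET.SubElement(obs, "stop_date_time").text = stop
    return ET.tostring(product, encoding="unicode")

print(make_label("urn:nasa:pds:example_bundle:data:obs_001",
                 "Example observation",
                 "2017-01-01T00:00:00Z", "2017-01-01T01:00:00Z"))
```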
The HEASARC in 2016: 25 Years and Counting
NASA Astrophysics Data System (ADS)
Drake, Stephen Alan; Smale, Alan P.
2016-04-01
The High Energy Astrophysics Archival Research Center or HEASARC (http://heasarc.gsfc.nasa.gov/) has been the NASA astrophysics discipline archive supporting multi-mission cosmic X-ray and gamma-ray astronomy research for 25 years and, through its LAMBDA (Legacy Archive for Microwave Background Data Analysis: http://lambda.gsfc.nasa.gov/) component, the archive for cosmic microwave background data for the last 8 years. The HEASARC is the designated archive supporting NASA's Physics of the Cosmos theme (http://pcos.gsfc.nasa.gov/). The HEASARC provides a unified archive and software structure aimed at 'legacy' high-energy missions such as Einstein, EXOSAT, ROSAT, RXTE, and Suzaku; contemporary missions such as Fermi, Swift, XMM-Newton, Chandra, NuSTAR, etc.; and upcoming missions such as Astro-H and NICER. The HEASARC's high-energy astronomy archive has grown so that it presently contains more than 80 terabytes (TB) of data from 30 past and present orbital missions. The user community downloaded 160 TB of high-energy data from the HEASARC last year, an amount equivalent to twice the size of the archive. We discuss some of the upcoming new initiatives and developments for the HEASARC, including the arrival of public data from the JAXA/NASA Astro-H mission, launched in February 2016, and the NASA mission of opportunity Neutron Star Interior Composition Explorer (NICER), expected to be deployed in late summer 2016. We also highlight some of the new software and web initiatives of the HEASARC, and discuss our plans for the next 3 years.
HEASARC - The High Energy Astrophysics Science Archive Research Center
NASA Technical Reports Server (NTRS)
Smale, Alan P.
2011-01-01
The High Energy Astrophysics Science Archive Research Center (HEASARC) is NASA's archive for high-energy astrophysics and cosmic microwave background (CMB) data, supporting the broad science goals of NASA's Physics of the Cosmos theme. It provides vital scientific infrastructure to the community by standardizing science data formats and analysis programs, providing open access to NASA resources, and implementing powerful archive interfaces. Over the next five years the HEASARC will ingest observations from up to 12 operating missions, while serving data from these and over 30 archival missions to the community. The HEASARC archive presently contains over 37 TB of data, and will contain over 60 TB by the end of 2014. The HEASARC continues to secure major cost savings for NASA missions, providing a reusable mission-independent framework for reducing, analyzing, and archiving data. This approach was recognized in the NRC Portals to the Universe report (2007) as one of the HEASARC's great strengths. This poster describes the past and current activities of the HEASARC and our anticipated developments in coming years. These include preparations to support upcoming high-energy missions (NuSTAR, Astro-H, GEMS) and ground-based and sub-orbital CMB experiments, as well as continued support of missions currently operating (Chandra, Fermi, RXTE, Suzaku, Swift, XMM-Newton and INTEGRAL). In 2012 the HEASARC (which now includes LAMBDA) will support the final nine-year WMAP data release. The HEASARC is also upgrading its archive querying and retrieval software with the new Xamin system, now in early release, and building on opportunities afforded by the growth of the Virtual Observatory and recent developments in virtual environments and cloud computing.
Raising orphans from a metadata morass: A researcher's guide to re-use of public 'omics data.
Bhandary, Priyanka; Seetharam, Arun S; Arendsee, Zebulun W; Hur, Manhoi; Wurtele, Eve Syrkin
2018-02-01
More than 15 petabases of raw RNAseq data are now accessible through public repositories. Acquisition of other 'omics data types is expanding, though most lack a centralized archival repository. Data reuse provides a tremendous opportunity to extract new knowledge from existing experiments, and offers a unique opportunity for robust, multi-'omics analyses by merging metadata (information about experimental design, biological samples, protocols) and data from multiple experiments. We illustrate how predictive research can be accelerated by meta-analysis with a study of orphan (species-specific) genes. Computational predictions are critical to infer orphan function because their coding sequences provide very few clues. The metadata in public databases is often confusing; a test case with Zea mays mRNA-seq data reveals a high proportion of missing, misleading or incomplete metadata. This metadata morass significantly diminishes the insight that can be extracted from these data. We provide tips for data submitters and users, including specific recommendations to improve metadata quality through greater use of controlled vocabularies and through metadata reviews. Finally, we advocate for a unified, straightforward metadata submission and retrieval system.
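A minimal sketch of the metadata triage the authors recommend: checking sample records for required fields and controlled-vocabulary terms before reuse. The field names and vocabulary below are illustrative only, not a standard the paper defines.

```python
REQUIRED = {"organism", "tissue", "treatment", "replicate"}
CONTROLLED = {"tissue": {"leaf", "root", "seed", "ear"}}   # toy vocabulary

def audit(record):
    """Return a list of metadata problems for one sample record."""
    problems = [f"missing: {f}" for f in sorted(REQUIRED - record.keys())]
    for field, allowed in CONTROLLED.items():
        value = record.get(field)
        if value is not None and value not in allowed:
            problems.append(f"uncontrolled {field}: {value!r}")
    return problems

print(audit({"organism": "Zea mays", "tissue": "leaf tip"}))
# flags the two missing fields and the free-text tissue term
```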
Application of LA-MC-ICP-MS for analysis of Sr isotope ratios in speleothems
NASA Astrophysics Data System (ADS)
Weber, Michael; Scholz, Denis; Wassenburg, Jasper A.; Jochum, Klaus Peter; Breitenbach, Sebastian
2017-04-01
Speleothems are well-established climate archives. In order to reconstruct past climate variability, several geochemical proxies, such as δ13C, δ18O and trace elements, are available. Since several factors influence each individual proxy, robust interpretation is often hampered. This calls for multi-proxy approaches involving additional isotope systems that can help to delineate the role of different sources of water within the epikarst and changes in soil composition. Sr isotope ratios (87Sr/86Sr) have been shown to provide useful information about water residence time and water mixing in the host rock. Furthermore, Sr isotopes are not fractionated during calcite precipitation, implying that the 87Sr/86Sr ratio of the speleothem provides a direct record of the drip water. While most speleothem studies applying Sr isotopes have used the TIMS methodology, LA-MC-ICP-MS has been utilized for several other archives, such as otoliths and teeth. This method provides the advantages of faster data acquisition, higher spatial resolution, larger sample throughput and the absence of chemical treatment prior to analysis. Here we present the first LA-MC-ICP-MS Sr isotope data for speleothems. The analytical uncertainty of our LA-MC-ICP-MS Sr data is in a similar range as for other carbonate materials. The results of different ablation techniques (i.e. line scans and spots) are reproducible within error, implying that the application of this technique to speleothems is possible. In addition, several comparative measurements of different carbonate reference materials (i.e. MACS-3, JCt-1, JCp-1), such as tests with standard bracketing and comparison of the 87Sr/86Sr ratios between a nanosecond laser ablation system and a state-of-the-art femtosecond laser ablation system, show the robustness of the method. We applied the method to samples from Morocco (Grotte de Piste) and India (Mawmluh Cave). Our results show only very small changes in the 87Sr/86Sr ratios of both speleothems. However, one speleothem from Mawmluh Cave shows a slight increase in 87Sr/86Sr within error, which is reproducible with both line scans and spots.
Automated data acquisition technology development: Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this report, the WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
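The voltage-versus-current/arc-length analysis named above amounts to a regression on logged welding parameters. A hedged sketch, assuming (purely for illustration) a model linear in current and arc length, with fabricated data; the report's actual model form and data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
current = np.array([80.0, 100.0, 120.0, 140.0, 160.0])     # amps
arc_len = np.array([2.0, 3.5, 2.5, 4.0, 3.0])              # mm
voltage = (10.0 + 0.05 * current + 2.0 * arc_len
           + rng.normal(scale=0.1, size=5))                # synthetic log

# Least-squares fit of V as a function of I and L
X = np.column_stack([np.ones_like(current), current, arc_len])
coef, *_ = np.linalg.lstsq(X, voltage, rcond=None)
print("V = %.2f + %.3f*I + %.2f*L" % tuple(coef))          # recovers the model
```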
The Chandra X-ray Observatory: An Astronomical Facility Available to the World
NASA Technical Reports Server (NTRS)
Smith, Randall K.
2006-01-01
The Chandra X-ray Observatory, one of NASA's "Great Observatories," provides high angular and spectral resolution X-ray data which are freely available to all. In this review I describe the instruments on Chandra along with their current calibration, as well as the Chandra proposal system, the freely available Chandra analysis software package CIAO, and the Chandra archive. As Chandra is in its 6th year of operation, the archive already contains calibrated observations of a large range of X-ray sources. The Chandra X-ray Center is committed to assisting astronomers from any country who wish to use data from the archive or propose for observations.
48 CFR 215.404-1 - Proposal analysis techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
...reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494...] Title 48 Federal Acquisition Regulations System, Section 215.404-1 (Defense Acquisition Regulations): Proposal analysis techniques.
NASA's Planetary Data System: Support for the Delivery of Derived Data Sets at the Atmospheres Node
NASA Astrophysics Data System (ADS)
Chanover, Nancy J.; Beebe, Reta; Neakrase, Lynn; Huber, Lyle; Rees, Shannon; Hornung, Danae
2015-11-01
NASA’s Planetary Data System is charged with archiving electronic data products from NASA planetary missions that are sponsored by NASA’s Science Mission Directorate. This archive, currently organized by science disciplines, uses standards for describing and storing data that are designed to enable future scientists who are unfamiliar with the original experiments to analyze the data, and to do this using a variety of computer platforms, with no additional support. These standards address the data structure, description contents, and media design. The new requirement in the NASA ROSES-2015 Research Announcement to include a Data Management Plan will result in an increase in the number of derived data sets that are being delivered to the PDS. These data sets may come from the Planetary Data Archiving, Restoration and Tools (PDART) program, other Data Analysis Programs (DAPs) or be volunteered by individuals who are publishing the results of their analysis. In response to this increase, the PDS Atmospheres Node is developing a set of guidelines and user tools to make the process of archiving these derived data products more efficient. Here we provide a description of Atmospheres Node resources, including a letter of support for the proposal stage, a communication schedule for the planned archive effort, product label samples and templates in extensible markup language (XML), documentation templates, and validation tools necessary for producing a PDS4-compliant derived data bundle(s) efficiently and accurately.
Zhou, Ren-Bin; Lu, Hui-Meng; Liu, Jie; Shi, Jian-Yu; Zhu, Jing; Lu, Qin-Qin; Yin, Da-Chuan
2016-01-01
Recombinant expression of proteins has become an indispensable tool in modern research. The large yields of recombinantly expressed proteins accelerate the structural and functional characterization of proteins. Nevertheless, there are reports in the literature that recombinant proteins show some differences in structure and function compared with the native ones. There are now more than 100,000 structures (from both recombinant and native sources) publicly available in the Protein Data Bank (PDB) archive, which makes it possible to investigate whether any proteins in the RCSB PDB archive have identical sequences but differ in structure. In this paper, we present the results of a systematic comparative study of the 3D structures of identical naturally purified versus recombinantly expressed proteins. The structural data and sequence information of the proteins were mined from the RCSB PDB archive. The combinatorial extension (CE), FATCAT-flexible and TM-Align methods were employed to align the protein structures. The root-mean-square distance (RMSD), TM-score, P-value, Z-score, secondary structural elements and hydrogen bonds were used to assess structural similarity. A thorough analysis of the PDB archive yielded 517 pairs of native and recombinant proteins with identical sequences. There were no pairs of proteins that had the same sequence and a significantly different structural fold, which supports the hypothesis that proteins expressed in a heterologous host usually fold correctly into their native forms.
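The simplest of the similarity measures listed, RMSD after optimal superposition, can be sketched with the Kabsch algorithm; this stand-in assumes a residue-for-residue pairing and is much simpler than the CE, FATCAT-flexible and TM-Align comparisons the study actually used.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two N x 3 coordinate sets after optimal rigid
    superposition (Kabsch algorithm); assumes paired residues."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # proper rotation only
    return float(np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1))))

rng = np.random.default_rng(3)
ca = rng.normal(size=(50, 3))                  # fake C-alpha coordinates
rot = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])             # 90-degree rotation about z
print(kabsch_rmsd(ca @ rot.T, ca))             # ~0: same fold, rotated
```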
InSAR data for monitoring land subsidence: time to think big
NASA Astrophysics Data System (ADS)
Ferretti, A.; Colombo, D.; Fumagalli, A.; Novali, F.; Rucci, A.
2015-11-01
Satellite interferometric synthetic aperture radar (InSAR) data have proven effective and valuable in the analysis of urban subsidence phenomena based on multi-temporal radar images. Results obtained by processing data acquired by different radar sensors have shown the potential of InSAR and highlighted the key points for an operational use of this technology, namely: (1) regular acquisition over large areas of interferometric data stacks; (2) use of advanced processing algorithms, capable of estimating and removing atmospheric disturbances; (3) access to significant processing power for a regular update of the information over large areas. In this paper, we show how the operational potential of InSAR has been realized thanks to recent advances in InSAR processing algorithms, the advent of cloud computing and the launch of new satellite platforms specifically designed for InSAR analyses (e.g. Sentinel-1A operated by ESA and ALOS-2 operated by JAXA). The processing of thousands of SAR scenes covering an entire nation has been performed successfully in Italy, in a project financed by the Italian Ministry of the Environment. The challenge for the future is to pass from the historical analysis of SAR scenes already acquired in digital archives to a near-real-time monitoring program where up-to-date deformation data are routinely provided to final users and decision makers.
Schuffenhauer, A; Popov, M; Schopfer, U; Acklin, P; Stanek, J; Jacoby, E
2004-12-01
This publication describes processes for the selection of chemical compounds for building a high-throughput screening (HTS) collection for drug discovery, using the process currently implemented in the Discovery Technologies Unit of the Novartis Institute for Biomedical Research, Basel, Switzerland as reference. More generally, currently existing compound acquisition models and practices are discussed. Our informatics-, chemistry- and biology-driven compound selection consists of two steps: 1) the individual compounds are filtered and grouped into three priority classes on the basis of their individual structural properties. Substructure filters are used to eliminate or penalize compounds with unwanted structural properties. The similarity of the structures to reference ligands of the main proven druggable target families is computed, and drug-similar compounds are prioritized for the following diversity analysis. 2) The compounds are compared to the archive compounds and a diversity analysis is performed. This is done separately for the prioritized, regular and penalized compounds, with increasingly stringent dissimilarity criteria. The process includes collecting vendor catalogues and monitoring the availability of samples, together with the selection and purchase decision points. The development of a corporate vendor catalogue database is described. In addition to the selection methods on a per-single-molecule basis, selection criteria for scaffold and combinatorial chemistry projects in collaboration with compound vendors are discussed.
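The dissimilarity screen in step 2 is commonly implemented with fingerprint Tanimoto similarity. A sketch using RDKit, where the fingerprint type, threshold and toy SMILES are our own illustrative choices rather than the unit's actual parameters:

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fingerprint(smiles):
    """Morgan (circular) bit fingerprint of a molecule given as SMILES."""
    return AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(smiles), 2, nBits=2048)

def diverse_enough(candidate_smiles, archive_fps, max_sim=0.6):
    """Accept a candidate only if it is sufficiently dissimilar to every
    compound already in the archive."""
    fp = fingerprint(candidate_smiles)
    return all(DataStructs.TanimotoSimilarity(fp, a) < max_sim
               for a in archive_fps)

archive = [fingerprint(s) for s in ["c1ccccc1O", "CCN(CC)CC"]]
print(diverse_enough("c1ccccc1N", archive))   # aniline vs phenol + amine
```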
Data management, archiving, visualization and analysis of space physics data
NASA Technical Reports Server (NTRS)
Russell, C. T.
1995-01-01
A series of programs for the visualization and analysis of space physics data has been developed at UCLA. In the course of those developments, a number of lessons have been learned regarding data management and data archiving, as well as data analysis. The issues now facing those wishing to develop such software, as well as the lessons learned, are reviewed. Modern media have eased many of the earlier problems of the physical volume required to store data, the speed of access, and the permanence of the records. However, the ultimate longevity of these media is still a question of debate. Finally, while software development has become easier, cost is still a limiting factor in developing visualization and analysis software.
Practice acquisition: a due diligence checklist. HFMA Principles and Practices Board.
1995-12-01
As healthcare executives act to form integrated healthcare systems that encompass entities such as physician-hospital organizations and medical group practices, they often discover that practical guidance on acquiring physician practices is scarce. To address the need for authoritative guidance on practice acquisition, HFMA's Principles and Practices Board has developed a detailed analysis of physician practice acquisition issues, Issues Analysis 95-1: Acquisition of Physician Practices. This analysis includes a detailed due diligence checklist developed to assist both healthcare financial managers involved in acquiring physician practices and physician owners interested in selling their practices.
Contrast in Terahertz Images of Archival Documents—Part II: Influence of Topographic Features
NASA Astrophysics Data System (ADS)
Bardon, Tiphaine; May, Robert K.; Taday, Philip F.; Strlič, Matija
2017-04-01
We investigate the potential of terahertz time-domain imaging in reflection mode to reveal archival information in documents in a non-invasive way. In particular, this study explores the parameters and signal processing tools that can be used to produce well-contrasted terahertz images of topographic features commonly found in archival documents, such as indentations left by a writing tool, as well as sieve lines. While the amplitude of the waveforms at a specific time delay can provide the most contrasted and legible images of topographic features on flat paper or parchment sheets, this parameter may not be suitable for documents that have a highly irregular surface, such as water- or fire-damaged documents. For analysis of such documents, cross-correlation of the time-domain signals can instead yield images with good contrast. Analysis of the frequency-domain representation of terahertz waveforms can also provide well-contrasted images of topographic features, with improved spatial resolution when utilising high-frequency content. Finally, we point out some of the limitations of these means of analysis for extracting information relating to topographic features of interest from documents.
Better Living Through Metadata: Examining Archive Usage
NASA Astrophysics Data System (ADS)
Becker, G.; Winkelman, S.; Rots, A.
2013-10-01
The primary purpose of an observatory's archive is to provide access to the data through various interfaces. User interactions with the archive are recorded in server logs, which can be used to answer basic questions like: Who has downloaded dataset X? When did she do this? Which tools did she use? The answers to questions like these fill in patterns of data access (e.g., how many times dataset X has been downloaded in the past three years). Analysis of server logs provides metrics of archive usage and provides feedback on interface use which can be used to guide future interface development. The Chandra X-ray Observatory is fortunate in that a database to track data access and downloads has been continuously recording such transactions for years; however, it is overdue for an update. We will detail changes we hope to effect and the differences the changes may make to our usage metadata picture. We plan to gather more information about the geographic location of users without compromising privacy; create improved archive statistics; and track and assess the impact of web “crawlers” and other scripted access methods on the archive. With the improvements to our download tracking we hope to gain a better understanding of the dissemination of Chandra's data; how effectively it is being done; and perhaps discover ideas for new services.
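The kind of download tracking described can be sketched as straightforward log mining; the log format and dataset-ID pattern below are invented for illustration and do not reflect the Chandra archive's actual logs.

```python
import re
from collections import Counter

# Hypothetical access-log format: method, retrieval URL with an obsid
# query parameter, protocol, and a date stamp.
LINE = re.compile(r'GET /archive/retrieve\?obsid=(\d+) .* (\d{4}-\d{2}-\d{2})')

def downloads_per_dataset(log_lines):
    """Count retrievals per dataset ID across raw server-log lines."""
    counts = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = ['GET /archive/retrieve?obsid=3392 HTTP/1.1 2013-04-02',
          'GET /archive/retrieve?obsid=3392 HTTP/1.1 2013-05-11']
print(downloads_per_dataset(sample))   # Counter({'3392': 2})
```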
Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.
2015-01-01
Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to the computer systems of data providers, and away from the systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions, instead of links to large volumes of data to download and process. Exposing data and metadata through web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and subsetting. Only the reduced content required to complete an analysis is then transferred to the user.
NASA Astrophysics Data System (ADS)
Clements, Oliver; Walker, Peter
2014-05-01
The cost of working with extremely large data sets is an increasingly important issue within the Earth Observation community. From global-coverage data at any resolution to small-coverage data at extremely high resolution, the community has always produced big data, and this will only increase as new sensors are deployed and their data made available. Over time, standard workflows have emerged, facilitated by the production and adoption of standard technologies. Groups such as the International Organisation for Standardisation (ISO) and the Open Geospatial Consortium (OGC) have been a driving force in this area for many years. The production of standard protocols and interfaces such as OPeNDAP, Web Coverage Service (WCS), Web Processing Service (WPS) and newer emerging standards such as Web Coverage Processing Service (WCPS) has helped to galvanise these workflows. As an example of a traditional workflow, assume a researcher wants to assess the temporal trend in chlorophyll concentration. This involves a discovery phase, an acquisition phase, a processing phase and finally a derived product or analysis phase. Each element of this workflow has an associated temporal and monetary cost. Firstly, the researcher requires a high-bandwidth connection or the acquisition phase would take too long. Secondly, the researcher must have their own expensive equipment for use in the processing phase. Both of these elements cost money and time, which can make the whole process prohibitive for scientists from the developing world or "citizen scientists" who do not have the necessary processing infrastructure. The use of emerging technologies can improve both the monetary and time costs associated with these existing workflows. By utilising a WPS that is hosted at the same location as the data, a user is able to apply processing to the data without needing their own processing infrastructure. This, however, limits the user to predefined processes made available by the data provider. The emerging OGC WCPS standard combined with big data analytics engines may provide a mechanism to improve this situation. The technology allows users to create their own queries using an SQL-like query language and apply them over the available large data archive, once again at the data provider's end. This not only removes the processing cost whilst still allowing user-defined processes, it also reduces the bandwidth required, as only the final analysis or derived product needs to be downloaded. The maturity of the new technologies is at a stage where their use should be justified by a quantitative assessment rather than simply by the fact that they are new developments. We will present a study of the time and cost requirements for a selection of existing workflows, and then show how new and emerging standards and technologies can help both to reduce the cost to the user by shifting processing to the data and to reduce the bandwidth required for analysing large datasets, making analysis of big-data archives possible for a greater and more diverse audience.
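The "ship the query, not the data" pattern reads naturally in code: a WCPS-style expression averaging a chlorophyll coverage over a region and a year is posted to the server, and only the small result crosses the network. A sketch under stated assumptions: the endpoint URL, coverage name and axis labels are hypothetical, and the request encoding follows one common server convention rather than a guaranteed interface.

```python
import requests

# WCPS-like query: server-side average over a lat/lon box and one year,
# so a single value is returned instead of gigabytes of granules.
WCPS = '''
for $c in (CHL_OC_L4)
return encode(
  avg($c[Lat(45:55), Long(-20:-10), ansi("2013-01-01":"2013-12-31")]),
  "csv")
'''

resp = requests.post("https://example.org/rasdaman/ows",
                     data={"service": "WCS", "version": "2.0.1",
                           "request": "ProcessCoverages", "query": WCPS},
                     timeout=60)
print(resp.text)   # a single averaged value, not the underlying archive
```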
The preservation of microbial DNA in archived soils of various genetic types.
Ivanova, Ekaterina A; Korvigo, Ilia O; Aparin, Boris F; Chirak, Evgenii L; Pershina, Elizaveta V; Romaschenko, Nikolay S; Provorov, Nikolai A; Andronov, Evgeny E
2017-01-01
This study is a comparative analysis of samples of archived (stored for 70-90 years) and modern soils of two different genetic types: chernozem and sod-podzolic soils. We revealed a reduction in the biodiversity of archived soils relative to their modern state. In particular, long-term storage in the museum exerted a greater impact on the microbiomes of sod-podzolic soils, while chernozem samples better preserved the native community. Thus, the persistence of microbial DNA in soil is largely determined by physico-chemical characteristics that differ across soil types. Chernozems create better conditions for long-term DNA preservation than sod-podzolic soils. This results in presumably higher levels of biodiversity conservation in chernozem microbiomes, with preservation of the major microbial taxa dominant in the modern (control) soil samples, which makes archived chernozems a promising object for paleosoil studies.
The LCOGT Science Archive and Data Pipeline
NASA Astrophysics Data System (ADS)
Lister, Tim; Walker, Z.; Ciardi, D.; Gelino, C. R.; Good, J.; Laity, A.; Swain, M.
2013-01-01
Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. In the past year, we have deployed and commissioned four new 1m telescopes at McDonald Observatory, Texas and at CTIO, Chile, with more to come at SAAO, South Africa and Siding Spring Observatory, Australia. To handle these new data sources coming from the growing LCOGT network, and to serve them to end users, we have constructed a new data pipeline and Science Archive. We describe the new LCOGT pipeline, currently under development and testing, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the new Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.
Puncture-proof picture archiving and communication system.
Willis, C E; McCluggage, C W; Orand, M R; Parker, B R
2001-06-01
As we become increasingly dependent on our picture archiving and communications system (PACS) for the clinical practice of medicine, the demand for improved reliability becomes urgent. Borrowing principles from the discipline of Reliability Engineering, we have identified components of our system that constitute single points of failure and have endeavored to eliminate these through redundant components and manual work-around procedures. To assess the adequacy of our preparations, we have identified a set of plausible events that could interfere with the function of one or more of our PACS components. These events could be as simple as the loss of the network connection to a single component or as broad as the loss of our central data center. We have identified the need to continue to operate during adverse conditions, as well as the requirement to recover rapidly from major disruptions in service. This assessment led us to modify the physical locations of central PACS components within our physical plant. We are also taking advantage of actual disruptive events coincident with a major expansion of our facility to test our recovery procedures. Based on our recognition of the vital nature of our electronic images for patient care, we are now recording electronic images in two copies on disparate media. The image database is critical to both continued operations and recovery. Restoration of the database from periodic tape backups with a 24-hour cycle time may not support our clinical scenario: acquisition modalities have a limited local storage capacity, some of which will not contain the daily workload. Restoration of the database from the archived media is an exceedingly slow process that will likely not meet our requirement to restore clinical operations without significant delay. Our PACS vendor is working on concurrent image databases that would be capable of nearly immediate switchover and recovery.
USAID Expands eMODIS Coverage for Famine Early Warning
NASA Astrophysics Data System (ADS)
Jenkerson, C.; Meyer, D. J.; Evenson, K.; Merritt, M.
2011-12-01
Food security in countries at risk is monitored by the U.S. Agency for International Development (USAID) through its Famine Early Warning Systems Network (FEWS NET) using many methods, including Moderate Resolution Imaging Spectroradiometer (MODIS) data processed by the U.S. Geological Survey (USGS) into eMODIS Normalized Difference Vegetation Index (NDVI) products. Near-real-time production is used comparatively with trends derived from the eMODIS archive to operationally monitor vegetation anomalies indicating threatened cropland and rangeland conditions. eMODIS production over Central America and the Caribbean (CAMCAR) began in 2009 and processes 10-day NDVI composites every 5 days from surface reflectance inputs produced using predicted spacecraft and climatology information at the Land and Atmosphere Near real time Capability for Earth Observing Systems (EOS) (LANCE). These expedited eMODIS composites are backed by a parallel archive of precision-based NDVI calculated from surface reflectance data ordered through the Level 1 and Atmosphere Archive and Distribution System (LAADS). Success in the CAMCAR region led to the recent expansion of eMODIS production to include Africa in 2010 and Central Asia in 2011. Near-real-time 250-meter products are available for each region on the last day of an acquisition interval (generally before midnight) from an anonymous file transfer protocol (FTP) distribution site (ftp://emodisftp.cr.usgs.gov/eMODIS). The FTP site concurrently hosts the regional historical collections (2000 to present), which are also searchable using the USGS Earth Explorer (http://edcsns17.cr.usgs.gov/NewEarthExplorer). As eMODIS coverage continues to grow, these geographically gridded, georeferenced tagged image file format (GeoTIFF) NDVI composites become increasingly useful tools for operational monitoring of near-real-time vegetation data against historical trends.
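For readers unfamiliar with the index behind these products, NDVI is the normalized difference of near-infrared and red reflectance; a minimal NumPy sketch follows, where the array values are toy reflectances for illustration only.

```python
# NDVI = (NIR - red) / (NIR + red), masked where the denominator is zero.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    denom = nir + red
    with np.errstate(divide="ignore", invalid="ignore"):
        out = (nir - red) / denom
    return np.where(denom == 0, np.nan, out)

red = np.array([[0.08, 0.10], [0.12, 0.05]])   # toy red reflectance
nir = np.array([[0.40, 0.38], [0.30, 0.45]])   # toy NIR reflectance
print(ndvi(red, nir))  # values near +1 indicate dense green vegetation
```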
Digital Image Support in the ROADNet Real-time Monitoring Platform
NASA Astrophysics Data System (ADS)
Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.
2004-12-01
The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for event analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.
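Software coincidence analysis of the kind such a system performs can be sketched as pairing timestamped events from two channels within a resolving time window; the window length and simulated event streams below are illustrative assumptions, not the FASEA implementation.

```python
# Hedged sketch: count channel-A events that have a channel-B partner
# within a resolving time window (here 100 ns, an assumed value).
import numpy as np

def coincidences(t_a, t_b, window=1e-7):
    t_b = np.sort(t_b)
    idx = np.searchsorted(t_b, t_a)  # insertion points of A-times in B
    count = 0
    for i, t in zip(idx, t_a):
        # examine the B-events on either side of the insertion point
        near = [t_b[j] for j in (i - 1, i) if 0 <= j < len(t_b)]
        if any(abs(t - u) <= window for u in near):
            count += 1
    return count

rng = np.random.default_rng(0)
a = np.sort(rng.uniform(0, 1.0, 500))            # channel A times (s)
true_pairs = a[:400] + rng.normal(0, 2e-8, 400)  # correlated B events
b = np.concatenate([true_pairs, rng.uniform(0, 1.0, 100)])
print(coincidences(a, b))  # close to the 400 genuine coincidences
```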
Lin, Chi-Hung; Krisp, Christoph; Packer, Nicolle H; Molloy, Mark P
2018-02-10
Glycoproteomics investigates glycan moieties in a site-specific manner to reveal the functional roles of protein glycosylation. Identification of glycopeptides from data-dependent acquisition (DDA) relies on high-quality MS/MS spectra of glycopeptide precursors and often requires manual validation to ensure confident assignments. In this study, we investigated pseudo-MRM (MRM-HR) and data-independent acquisition (DIA) as alternative acquisition strategies for glycopeptide analysis. These approaches allow data acquisition over the full MS/MS scan range, allowing data re-analysis post-acquisition without data re-acquisition. The advantage of MRM-HR over DDA for N-glycopeptide detection was demonstrated by targeted analysis of bovine fetuin, where all three N-glycosylation sites were detected, which was not the case with DDA. To overcome the duty-cycle limitation of MRM-HR acquisition needed for analysis of complex samples such as plasma, we trialed DIA. This allowed development of a targeted DIA method to identify N-glycopeptides without pre-defined knowledge of the glycan composition, thus providing the potential to identify N-glycopeptides with unexpected structures. This workflow was demonstrated by detection of 59 N-glycosylation sites from 41 glycoproteins in a HILIC-enriched human plasma tryptic digest. Twenty-one glycoforms of IgG1 glycopeptides were identified, including two truncated structures that are rarely reported. We developed a data-independent mass spectrometry workflow to identify specific glycopeptides from complex biological mixtures. The novelty is that this approach does not require glycan composition to be pre-defined, thereby allowing glycopeptides carrying unexpected glycans to be identified. This is demonstrated through the analysis of immunoglobulins in human plasma, where we detected two IgG1 glycoforms that are rarely observed. Copyright © 2017 Elsevier B.V. All rights reserved.
EBI metagenomics--a new resource for the analysis and archiving of metagenomic data.
Hunter, Sarah; Corbett, Matthew; Denise, Hubert; Fraser, Matthew; Gonzalez-Beltran, Alejandra; Hunter, Christopher; Jones, Philip; Leinonen, Rasko; McAnulla, Craig; Maguire, Eamonn; Maslen, John; Mitchell, Alex; Nuka, Gift; Oisel, Arnaud; Pesseat, Sebastien; Radhakrishnan, Rajesh; Rocca-Serra, Philippe; Scheremetjew, Maxim; Sterk, Peter; Vaughan, Daniel; Cochrane, Guy; Field, Dawn; Sansone, Susanna-Assunta
2014-01-01
Metagenomics is a relatively recently established but rapidly expanding field that uses high-throughput next-generation sequencing technologies to characterize the microbial communities inhabiting different ecosystems (including oceans, lakes, soil, tundra, plants and body sites). Metagenomics brings with it a number of challenges, including the management, analysis, storage and sharing of data. In response to these challenges, we have developed a new metagenomics resource (http://www.ebi.ac.uk/metagenomics/) that allows users to easily submit raw nucleotide reads for functional and taxonomic analysis by a state-of-the-art pipeline, and have them automatically stored (together with descriptive, standards-compliant metadata) in the European Nucleotide Archive.
NASA Technical Reports Server (NTRS)
Herman, J. R.; Hudson, R. D.; Serafino, G.
1990-01-01
Arguments are presented showing that the basic empirical model of the solar backscatter UV (SBUV) instrument degradation used by Cebula et al. (1988) in their analysis of the SBUV data is likely to lead to an incorrect estimate of the ozone trend. A correction factor is given as a function of time and altitude that brings the SBUV data into approximate agreement with the SAGE, SME, and Dobson network ozone trends. It is suggested that the currently archived SBUV ozone data should be used with caution for periods of analysis exceeding 1 yr, since it is likely that the yearly decreases contained in the archived data are too large.
From local to national scale DInSAR analysis for the comprehension of Earth's surface dynamics.
NASA Astrophysics Data System (ADS)
De Luca, Claudio; Casu, Francesco; Manunta, Michele; Zinno, Ivana; Lanari, Riccardo
2017-04-01
Earth Observation techniques can be very helpful for the estimation of several sources of ground deformation due to their large spatial coverage, high resolution and cost effectiveness. In this scenario, Differential Synthetic Aperture Radar Interferometry (DInSAR) is one of the most effective methodologies for its capability to generate spatially dense deformation maps with centimeter to millimeter accuracy. DInSAR exploits the phase difference (interferogram) between SAR image pairs acquired at different times, but with the same illumination geometry and from sufficiently close flight tracks, whose separation is typically referred to as the baseline. Among the available approaches, the SBAS algorithm is one of the most widely used; it is aimed at generating displacement time series at a multi-scale level by exploiting a set of small-baseline interferograms. SBAS, and DInSAR generally, has benefited from the large availability of spaceborne SAR data collected over the years by several satellite systems, in particular the European ERS and ENVISAT sensors, which acquired SAR images worldwide for approximately 20 years. While the application of SBAS to ERS and ENVISAT data at the local scale is widely documented, very few examples involving those archives in analyses at very large spatial scales are available in the literature. This is mainly due to the required processing power (in terms of CPUs, memory and storage) and the limited availability of automatic processing procedures (unsupervised tools), which are mandatory requirements for obtaining displacement results in a time-effective way. Accordingly, in this work we present a methodology for generating the vertical and horizontal (east-west) components of Earth's surface deformation at very large (national/continental) spatial scales. In particular, it relies on the availability of a set of SAR data collected over an Area of Interest (AoI), which may be some hundreds of thousands of square kilometers wide, from ascending and descending orbits. The SAR data are processed, on a local basis, through the Parallel SBAS (P-SBAS) approach, thus generating the displacement time series and the corresponding mean deformation velocity maps. Subsequently, starting from the DInSAR results so generated, the proposed methodology relies on a mosaicking procedure to retrieve the mean velocity maps of the vertical and horizontal (east-west) deformation components for the overall AoI. This technique makes it possible to account for possible regional (tectonic) trends not easily detectable by local-scale DInSAR analyses. We tested the proposed methodology with the ENVISAT ASAR archives acquired, from ascending and descending orbits, over California (US), covering an area of about 100,000 km2. The presented methodology can also be easily applied to other SAR satellite data. Above all, it is particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, which collects data with a global coverage policy and an acquisition mode specifically designed for interferometric applications.
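The decomposition step can be sketched as a per-pixel 2x2 linear system combining ascending and descending LOS velocities, neglecting the north-south component to which SAR is largely insensitive; the incidence angle, sign conventions, and velocities below are illustrative assumptions, not the P-SBAS code.

```python
# Hedged sketch: recover vertical (U) and east-west (E) velocities from
# ascending/descending line-of-sight (LOS) measurements at one pixel.
import numpy as np

theta = np.deg2rad(23.0)        # assumed ENVISAT-like incidence angle

# One common sign convention (processors differ):
#   d_asc  = cos(theta) * U + sin(theta) * E
#   d_desc = cos(theta) * U - sin(theta) * E
A = np.array([[np.cos(theta),  np.sin(theta)],
              [np.cos(theta), -np.sin(theta)]])

d_asc, d_desc = 0.004, 0.001    # toy LOS velocities (m/yr)
up, east = np.linalg.solve(A, [d_asc, d_desc])
print(f"up = {up:+.4f} m/yr, east = {east:+.4f} m/yr")
```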
Global tropospheric experiment at the Hong Kong Atmosphere Chemistry Measurement Station
NASA Technical Reports Server (NTRS)
Carroll, Mary Ann; Wang, Tao
1995-01-01
The major activities of the Global Tropospheric Experiment at the Hong Kong Atmospheric Chemistry Measurement Station are presented for the period 1 January - 31 December 1995. Activities included data analysis, reduction, and archiving of atmospheric measurements and sampling. Sampling included O3, CO, SO2, NO, TSP, RSP, and ozone column density. A data archive was created for the surface meteorological data. Exploratory data analysis was performed, including examination of time series, frequency distributions, diurnal variations and correlation. The major results have been or will be published in scientific journals as well as presented at conferences/workshops. Abstracts are attached.
Astro-H Data Analysis, Processing and Archive
NASA Technical Reports Server (NTRS)
Angelini, Lorella; Terada, Yukikatsu; Loewenstein, Michael; Miller, Eric D.; Yamaguchi, Hiroya; Yaqoob, Tahir; Krimm, Hans; Harrus, Ilana; Takahashi, Hiromitsu; Nobukawa, Masayoshi;
2016-01-01
Astro-H (Hitomi) is an X-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. The payload consists of four different instruments (SXS, SXI, HXI and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive and user support. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and the USA.
A computerized aircraft battery servicing facility
NASA Technical Reports Server (NTRS)
Glover, Richard D.
1992-01-01
The latest upgrade to the Aerospace Energy Systems Laboratory (AESL) is described. The AESL is a distributed digital system consisting of a central system and battery servicing stations connected by a high-speed serial data bus. The entire system is located in two adjoining rooms; the bus length is approximately 100 ft. Each battery station contains a digital processor, data acquisition, floppy diskette data storage, and operator interfaces. The operator initiates a servicing task and thereafter the battery station monitors the progress of the task and terminates it at the appropriate time. The central system provides data archives, manages the data bus, and provides a timeshare interface for multiple users. The system also hosts software production tools for the battery stations and the central system.
Seismic data acquisition at the FACT site for the CASPAR project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Kyle R.; Chael, Eric Paul; Hart, Darren M.
Since May 2010, we have been recording continuous seismic data at Sandia's FACT site. The collected signals provide us with a realistic archive for testing algorithms under development for local monitoring of explosive testing. Numerous small explosive tests are routinely conducted around Kirtland AFB by different organizations. Our goal is to identify effective methods for distinguishing these events from normal daily activity on and near the base, such as vehicles, aircraft, and storms. In this report, we describe the recording system, and present some observations of the varying ambient noise conditions at FACT. We present examples of various common, non-explosive sources. Next we show signals from several small explosions, and discuss their characteristic features.
Real-time micro-modelling of city evacuations
NASA Astrophysics Data System (ADS)
Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio
2018-01-01
A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.
BOREAS Level-1B TIMS Imagery: At-sensor Radiance in BSQ Format
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Strub, Richard; Newcomer, Jeffrey A.; Chernobieff, Sonia
2000-01-01
The Boreal Ecosystem-Atmospheric Study (BOREAS) Staff Science Aircraft Data Acquisition Program focused on providing the research teams with the remotely sensed satellite data products they needed to compare and spatially extend point results. For BOREAS, the Thermal Infrared Multispectral Scanner (TIMS) imagery, along with other aircraft images, was collected to provide spatially extensive information over the primary study areas. The Level-1b TIMS images cover the time periods of 16 to 20 Apr 1994 and 06 to 17 Sep 1994. The system calibrated images are stored in binary image format files. The TIMS images are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
High-Resolution Satellite Data Open for Government Research
NASA Technical Reports Server (NTRS)
Neigh, Christopher S. R.; Masek, Jeffrey G.; Nickeson, Jaime E.
2013-01-01
U.S. satellite commercial imagery (CI) with resolution less than 1 meter is a common geospatial reference used by the public through Web applications, mobile devices, and the news media. However, CI use in the scientific community has not kept pace, even though those who are performing U.S. government research have access to these data at no cost. Previously, studies using multiple CI acquisitions from IKONOS-2, Quickbird-2, GeoEye-1, WorldView-1, and WorldView-2 would have been cost prohibitive. Now, with near-global submeter coverage and online distribution, opportunities abound for future scientific studies. This archive is already quite extensive (examples are shown in Figure 1) and is being used in many novel applications.
NASA Astrophysics Data System (ADS)
Purss, M. B.; Lewis, A.; Ip, A.; Evans, B.
2013-12-01
The next decade promises an exponential increase in volumes of open data from Earth observing satellites. The ESA Sentinels, the Japan Meteorological Agency's Himawari 8/9 geostationary satellites, various NASA missions, and of course the many EO satellites planned from China, will produce petabyte-scale datasets of national and global significance. It is vital that we develop new ways of managing, accessing and using this 'big data' from satellites, to produce value-added information within realistic timeframes. A paradigm shift is required away from traditional 'scene-based' (and labour-intensive) approaches, with data storage and delivery for processing at local sites, to emerging High Performance Data (HPD) models where the data are organised and co-located with High Performance Computing (HPC) infrastructures in a way that enables users to bring themselves, their algorithms and the HPC processing power to the data. Automated workflows that allow the entire archive of data to be rapidly reprocessed from raw data to fully calibrated products are a crucial requirement for the effective stewardship of these datasets. New concepts such as arranging and viewing data as 'data objects', which underpin the delivery of 'information as a service', are also integral to realising the transition to HPD analytics. As Australia's national remote sensing and geoscience agency, Geoscience Australia faces a pressing need to solve the problems of 'big data', in particular around the 25-year archive of calibrated Landsat data. The challenge is to ensure standardised information can be extracted from the entire archive and applied to nationally significant problems in hazards, water management, land management, resource development and the environment. Ultimately, these uses justify government investment in these unique systems. A key challenge was how best to organise the archive of calibrated Landsat data (estimated to grow to almost 1 PB by the end of 2014) in a way that supports HPD applications yet retains the ability to trace each observation (pixel) back to its original satellite acquisition. The approach taken was to develop a multi-dimensional array (a data cube), underpinned by partitioning the data into tiles, without any temporal aggregation. This allows for flexible spatio-temporal queries of the archive whilst minimising the need to perform geospatial processing just to locate the pixels of interest. Equally important is the development and implementation of international data interoperability standards (such as OGC web services and ISO metadata standards) that will provide advanced access for users to interact with and query the data cube without needing to download any data or to go through specialised data portals. This new approach will vastly improve access to, and the impact of, Australia's Landsat archive holdings.
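One way to picture the tiling idea is a key combining a spatial tile index with the untruncated acquisition timestamp, so each pixel remains traceable to its source scene; the one-degree tile size and key layout below are assumptions for illustration, not Geoscience Australia's actual schema.

```python
# Hedged sketch: address observations by (tile_x, tile_y, acquisition
# time) with no temporal aggregation, preserving provenance per pixel.
from datetime import datetime

TILE_DEG = 1.0  # assumed tile footprint in degrees

def tile_key(lon: float, lat: float, acquired: datetime) -> tuple:
    ix = int(lon // TILE_DEG)
    iy = int(lat // TILE_DEG)
    return (ix, iy, acquired.isoformat())

# Two acquisitions over the same ground cell share a spatial tile but
# stay distinct observations along the time dimension.
print(tile_key(149.1, -35.3, datetime(1999, 7, 16, 23, 41)))
print(tile_key(149.1, -35.3, datetime(2002, 3, 2, 23, 44)))
```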
Staley, S; Romlein, J; Chacko, A K; Wider, R
2000-05-01
Picture archiving and communication system (PACS) maintenance on an individual site basis has historically been a complex and costly challenge. With the advent of enterprise-wide PACS projects such as the Virtual Radiology Environment (VRE) project, the challenge of a maintenance program with even more complexities has presented itself. The approach of the project management team for the VRE project is not one of reactive maintenance, but one of highly proactive planning and negotiations, in hopes of capitalizing on the economies of scale of an enterprise-wide PACS maintenance program. A proactive maintenance program is one aspect of life-cycle management. As with any capital acquisition, life-cycle management may be used to manage the specific project aspects related to PACS. The purpose of an enterprise-wide warranty and maintenance life-cycle management approach is to maintain PACS at its maximum operational efficiency and utilization levels through a flexible, shared, yet symbiotic relationship between local, regional, and vendor resources. These goals include providing maximum operational performance levels on a local, regional, and enterprise basis, while maintaining acceptable costs and resource utilization levels. This goal must be achieved without negatively impacting point of care activities, regardless of changes to the clinical business environment.
MODIS Information, Data, and Control System (MIDACS) system specifications and conceptual design
NASA Technical Reports Server (NTRS)
Han, D.; Salomonson, V.; Ormsby, J.; Ardanuy, P.; Mckay, A.; Hoyt, D.; Jaffin, S.; Vallette, B.; Sharts, B.; Folta, D.
1988-01-01
The MODIS Information, Data, and Control System (MIDACS) Specifications and Conceptual Design Document discusses system level requirements, the overall operating environment in which requirements must be met, and a breakdown of MIDACS into component subsystems, which include the Instrument Support Terminal, the Instrument Control Center, the Team Member Computing Facility, the Central Data Handling Facility, and the Data Archive and Distribution System. The specifications include sizing estimates for the processing and storage capacities of each data system element, as well as traffic analyses of data flows between the elements internally, and also externally across the data system interfaces. The specifications for the data system, as well as for the individual planning and scheduling, control and monitoring, data acquisition and processing, calibration and validation, and data archive and distribution components, do not yet fully specify the data system in the complete manner needed to achieve the scientific objectives of the MODIS instruments and science teams. The teams have not yet been formed; however, it was possible to develop the specifications and conceptual design based on the present concept of EosDIS, the Level-1 and Level-2 Functional Requirements Documents, the Operations Concept, and through interviews and meetings with key members of the scientific community.
Cost-effectiveness prospects of picture archiving and communication systems.
Hindel, R; Preger, W
1988-01-01
PAC (picture archiving and communication) systems are widely discussed and promoted as the organizational solution to digital image management in a radiology department. For approximately two decades digital imaging has increasingly been used for such diagnostic modalities as CT, DSA, MRI, DR (digital radiography) and others. PACS are seen as a step toward high-technology integration and more efficient management. Although the acquisition of such technology is investment intensive, there are well-founded projections that prolonged operation will prove cost justified. Such justification can only partly be derived from cost reduction through PAC with respect to present department management; the major justification is preparation for future economic pressures, which could make survival of a department without modern technology difficult. Especially in the United States, the political climate favors 'competitive medicine' and reduced government support. Seen in this context, PACS promises to speed the transition of health care services into a business with tight resource management, cost accounting and marketing. The following paper analyzes cost and revenue in a typical larger radiology department, projects various scenarios of cost reduction by means of digital technology, and concludes with cautious optimism that the investment expenses for a PACS will be justified in the near future by prudent utilization of high technology.
A topic clustering approach to finding similar questions from large question and answer archives.
Zhang, Wei-Nan; Liu, Ting; Yang, Yang; Cao, Liujuan; Zhang, Yu; Ji, Rongrong
2014-01-01
With the blooming of Web 2.0, Community Question Answering (CQA) services such as Yahoo! Answers (http://answers.yahoo.com), WikiAnswer (http://wiki.answers.com), and Baidu Zhidao (http://zhidao.baidu.com) have emerged as alternatives for knowledge and information acquisition. Over time, a large number of high-quality question and answer (Q&A) pairs contributed by human intelligence have been accumulated as a comprehensive knowledge base. Unlike search engines, which return long lists of results, searching the CQA services can obtain correct answers to question queries by automatically finding similar questions that have already been answered by other users. Hence, it greatly improves the efficiency of online information retrieval. However, given a question query, finding similar and well-answered questions is a non-trivial task. The main challenge is the word mismatch between the question query (query) and the candidate question for retrieval (question). To investigate this problem, in this study we capture the word semantic similarity between query and question by introducing a topic modeling approach. We then propose an unsupervised machine-learning approach to finding similar questions in CQA Q&A archives. The experimental results show that our proposed approach significantly outperforms the state-of-the-art methods.
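A minimal sketch of topic-level question matching, assuming an LDA topic space with cosine ranking rather than the authors' specific model; the corpus and parameters are toy placeholders.

```python
# Hedged sketch: embed questions in a topic space, then rank archived
# questions by cosine similarity to a query to bridge word mismatch.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

questions = [
    "how do i reset my wifi router password",
    "best way to cook rice without a rice cooker",
    "forgot the admin password on my wireless router",
    "how long should i boil rice on the stove",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(questions)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
topics = lda.fit_transform(X)             # question-topic mixtures

q = lda.transform(vec.transform(["recover router admin password"]))
scores = cosine_similarity(q, topics)[0]
best = scores.argmax()
print(questions[best], round(float(scores[best]), 3))
```

Ranking in topic space rather than raw word space is what lets semantically similar but differently worded questions match.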
WFIRST: Science from the Guest Investigator and Parallel Observation Programs
NASA Astrophysics Data System (ADS)
Postman, Marc; Nataf, David; Furlanetto, Steve; Milam, Stephanie; Robertson, Brant; Williams, Ben; Teplitz, Harry; Moustakas, Leonidas; Geha, Marla; Gilbert, Karoline; Dickinson, Mark; Scolnic, Daniel; Ravindranath, Swara; Strolger, Louis; Peek, Joshua
2018-01-01
The Wide Field InfraRed Survey Telescope (WFIRST) mission will provide an extremely rich archival dataset that will enable a broad range of scientific investigations beyond the initial objectives of the proposed key survey programs. The scientific impact of WFIRST will thus be significantly expanded by a robust Guest Investigator (GI) archival research program. We will present examples of GI research opportunities, ranging from studies of the properties of a variety of Solar System objects, surveys of the outer Milky Way halo, and comprehensive studies of cluster galaxies, to unique new constraints on the epoch of cosmic re-ionization and the assembly of galaxies in the early universe. WFIRST will also support the acquisition of deep wide-field imaging and slitless spectroscopic data obtained in parallel during campaigns with the coronagraphic instrument (CGI). These parallel wide-field imager (WFI) datasets can provide deep imaging data covering several square degrees at no impact to the scheduling of the CGI program. A competitively selected program of well-designed parallel WFI observation programs will, like the GI science above, maximize the overall scientific impact of WFIRST. We will give two examples of parallel observations that could be conducted during a proposed CGI program centered on a dozen nearby stars.
Setting the standard: 25 years of operating the JCMT
NASA Astrophysics Data System (ADS)
Dempsey, Jessica T.; Bell, Graham S.; Chrysostomou, Antonio; Coulson, Iain M.; Davis, Gary R.; Economou, Frossie; Friberg, Per; Jenness, Timothy; Johnstone, Doug; Tilanus, Remo P. J.; Thomas, Holly S.; Walther, Craig A.
2014-08-01
The James Clerk Maxwell Telescope (JCMT) is the largest single-dish submillimetre telescope in the world, and throughout its lifetime the volume and impact of its science output have steadily increased. A key factor in this continuing productivity is an ever-evolving approach to optimising operations, data acquisition, and science product pipelines and archives. The JCMT was one of the first common-user telescopes to adopt flexible scheduling, in 2003, and its impact over a decade of observing will be presented. The introduction of an advanced data-reduction pipeline played an integral role, both for fast real-time reduction during observing, and for science-grade reduction in support of individual projects, legacy surveys, and the JCMT Science Archive. More recently, these foundations have facilitated the commencement of remote observing in addition to traditional on-site operations to further increase on-sky science time. The contribution of highly trained and engaged operators, support and technical staff to efficient operations will be described. The long-term returns of this evolution are presented here, noting that they were achieved in the face of external pressures for leaner operating budgets and reduced staffing levels. In an era when visiting observers are being phased out of many observatories, we argue that maintaining a critical level of observer participation is vital to improving and maintaining scientific productivity and facility longevity.
New Technology Changing The Face of Mobile Seismic Networks
NASA Astrophysics Data System (ADS)
Brisbourne, A.; Denton, P.; Seis-Uk
SEIS-UK, a seismic equipment pool and data management facility run by a consortium of four UK universities (Leicester, Leeds, Cambridge and Royal Holloway, London), completed its second phase in 2001. To complement the existing broadband equipment pool, which has been deployed to full capacity to date, the consortium undertook a tender evaluation process for low-power, lightweight sensors and recorders, for use on both controlled-source and passive seismic experiments. The preferred option, selected by the consortium, was the Guralp CMG-6TD system, with 150 systems ordered. The CMG-6TD system is a new concept in temporary seismic equipment: a 30s-100Hz force-feedback sensor, integral 24-bit digitiser and 3-4 Gbyte of solid-state memory are all housed in a single unit. Use of the most recent technologies has kept the power consumption below 1 W and the weight to 3.5 kg per unit. The disk-swap procedure for obtaining data from the field has been superseded by a fast data download technique using firewire technology. This allows for rapid station servicing, essential when 150 stations are in use, and also ensures the environmental integrity of the system by removing the requirement for a disk access port and an environmentally exposed data disk. The system therefore meets the criteria for controlled-source and passive seismic experiments: (1) the single-unit concept and low weight are designed for rapid deployment on short-term projects; (2) the low power consumption reduces the power-supply requirements, facilitating deployment; (3) the low self-noise and bandwidth of the sensor make it applicable to passive experiments involving natural sources. Further to this acquisition process, in collaboration with external groups, the SEIS-UK data management procedures have been streamlined with the integration of Guralp GCF format data into the PASSCAL PDB software. This allows for rapid dissemination of field data and the production of archive-ready datasets, reducing the time between field recording and data archiving. The archiving procedure for SEIS-UK datasets has been established, with data from experiments carried out with the broadband equipment already on the permanent continuous data archive at the IRIS DMC.
Moreno-Martínez, F Javier; Montoro, Pedro R; Rodríguez-Rojo, Inmaculada C
2014-12-01
This article presents a new corpus of 820 words pertaining to 14 semantic categories, 7 natural (animals, body parts, insects, flowers, fruits, trees, and vegetables) and 7 man-made (buildings, clothing, furniture, kitchen utensils, musical instruments, tools, and vehicles); each word in the database was collected empirically in a previous exemplar generation study. In the present study, 152 Spanish speakers provided data for four psycholinguistic variables known to affect lexical-semantic processing in both neurologically intact and brain-damaged participants: age of acquisition, familiarity, manipulability, and typicality. Furthermore, we collected lexical frequency data derived from Internet search hits, plus three additional Spanish lexical frequency indexes. Word length, number of syllables, and the proportion of respondents citing the exemplar as a category member (which can be useful as an additional measure of typicality) are also provided. Reliability and validity indexes showed that our items display characteristics similar to those of other corpora. Overall, this new corpus of words provides a useful tool for scientists engaged in cognitive- and neuroscience-based research focused on examining language, memory, and object processing. The full set of norms can be downloaded from www.psychonomic.org/archive.
A Constraint-Based Approach to Acquisition of Word-Final Consonant Clusters in Turkish Children
ERIC Educational Resources Information Center
Gokgoz-Kurt, Burcu
2017-01-01
The current study provides a constraint-based analysis of L1 word-final consonant cluster acquisition in Turkish child language, based on the data originally presented by Topbas and Kopkalli-Yavuz (2008). The present analysis was done using [?]+obstruent consonant cluster acquisition. A comparison of Gradual Learning Algorithm (GLA) under…
ERIC Educational Resources Information Center
Dominguez, Laura; Hicks, Glyn; Song, Hee-Jeong
2012-01-01
This study offers a Minimalist analysis of the L2 acquisition of binding properties whereby cross-linguistic differences arise from the interaction of anaphoric feature specifications and operations of the computational system (Reuland 2001, 2011; Hicks 2009). This analysis attributes difficulties in the L2 acquisition of locality and orientation…
ERIC Educational Resources Information Center
Butler, Cassandra
2012-01-01
This study utilized a quantitative design using archival data from Johnson County Community College (JCCC), located in Johnson County, Kansas, and the Office of Financial Aid for students who graduated, withdrew or dropped out of the college in academic years 2006, 2007 and 2008. Because this study used archival data, we can only show…
Alonso, Wladimir J; Nascimento, Francielle C; Chowell, Gerardo; Schuck-Paim, Cynthia
2018-05-01
The analysis of historical death certificates has enormous potential for understanding how the health of populations was shaped by diseases and epidemics and by the implementation of specific interventions. In Brazil, the systematic archiving of mortality records was initiated only in 1944; hence the analysis of death registers before this time requires searching for these documents in public archives, notaries, parishes, and especially ancient cemeteries, which are often the only remaining source of information about these deaths. This article describes an effort to locate original death certificates in Brazil and document their organization, accessibility, and preservation. To this end, we conducted an exploratory study in 19 of the 27 Brazilian states, focusing on the period surrounding the 1918 influenza pandemic (1913-1921). We included 55 cemeteries, 22 civil archives, and one military archive. Apart from a few exceptions, the results show the absence of a curatorial policy for the organization, access or even physical preservation of this material, frequently leading to unavailability, deterioration, and ultimately complete loss. This study indicates the need to promote the preservation of a historical heritage that is key to understanding historical epidemiological patterns and human responses to global health threats. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kluiving, Sjoerd; De Ridder, Tim; Van Dasselaar, Marcel; Roozen, Stan; Prins, Maarten; Van Mourik, Jan
2016-04-01
In medieval times the city of Vlaardingen (the Netherlands) was strategically located on the confluence of three rivers, the Meuse, the Merwede and the Vlaarding. An early 8th-century church was already located here. In a short period of time Vlaardingen developed into an international trading place, the most important in the former county of Holland. Starting from the 11th century the river Meuse threatened to flood the settlement. These floods were registered in the archives of the fluvisol and were recognised in a multidisciplinary sedimentary analysis of these archives. To secure the future of this vulnerable soil archive, an extensive interdisciplinary research programme (76 mechanical drill holes, grain-size analysis (GSA), thermo-gravimetric analysis (TGA), archaeological remains, soil analysis, dating methods, micromorphology, and microfauna) was started in 2011 to gain knowledge of the sedimentological and pedological subsurface of the mound as well as of the well-preserved nature of the archaeological evidence. Pedogenic features are recorded with soil-descriptive, micromorphological and geochemical (XRF) analysis. The soil sequence, 5 meters thick, exhibits a complex mix of natural and anthropogenic layering and initial soil formation that makes it possible to distinguish relatively stable periods from periods of active sedimentation. In this paper the results of this large-scale project are demonstrated in a number of cross-sections with interrelated geological, pedological and archaeological stratification. The distinction between natural and anthropogenic layering is based on the occurrence of the chemical elements phosphorus and potassium. A series of four stratigraphic/sedimentary units records the period before and after the flooding disaster. Given the many archaeological remnants and features present in the lower units, we assume that the medieval landscape drowned while it was inhabited, in the 12th century AD. After a final drowning phase in the 13th century, the inhabitants responded by raising the surface.
Kimura, Yurika; Kubo, Sachiho; Koda, Hiroko; Shigemoto, Kazuhiro; Sawabe, Motoji; Kitamura, Ken
2013-08-01
Molecular analysis using archival human inner ear specimens is challenging because of the anatomical complexity, long-term fixation, and decalcification. However, this method may provide great benefit for elucidation of otological diseases. Here, we extracted mRNA for RT-PCR from tissues dissected from archival FFPE human inner ears by laser microdissection. Three human temporal bones obtained at autopsy were fixed in formalin, decalcified by EDTA, and embedded in paraffin. The samples were isolated into spiral ligaments, outer hair cells, spiral ganglion cells, and stria vascularis by laser microdissection. RNA was extracted and heat-treated in 10 mM citrate buffer to remove the formalin-derived modification. To identify the sites where COCH and SLC26A5 mRNA were expressed, semi-nested RT-PCR was performed. We also examined how long COCH mRNA could be amplified by semi-nested RT-PCR in archival temporal bone. COCH was expressed in the spiral ligament and stria vascularis. However, SLC26A5 was expressed only in outer hair cells. The maximum base length of COCH mRNA amplified by RT-PCR was 98 bp in 1 case and 123 bp in 2 cases. We detected COCH and SLC26A5 mRNA in specific structures and cells of the inner ear from archival human temporal bone. Our innovative method using laser microdissection and semi-nested RT-PCR should advance future RNA study of human inner ear diseases. Copyright © 2013 Elsevier B.V. All rights reserved.
Graphical user interface for image acquisition and processing
Goldberg, Kenneth A.
2002-01-01
An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.
Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A
2004-11-01
The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
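The core extraction step such a toolkit automates can be sketched in a few lines: R-peak timestamps become interbeat intervals (heart period) and instantaneous heart rate, the series on which phasic analyses operate; the peak times below are illustrative only.

```python
# Hedged sketch: derive heart period and instantaneous heart rate from
# R-peak times, the series on which phasic analyses operate.
import numpy as np

r_peaks = np.array([0.00, 0.82, 1.66, 2.47, 3.31, 4.12])  # seconds

ibi = np.diff(r_peaks)   # interbeat intervals = heart period (s)
hr = 60.0 / ibi          # instantaneous heart rate (beats per minute)

for t, p, h in zip(r_peaks[1:], ibi, hr):
    print(f"t={t:5.2f} s  period={p:.3f} s  HR={h:5.1f} bpm")
```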
48 CFR 915.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 48, Federal Acquisition Regulations System, Vol. 5 (2011-10-01). Section 915.404, Proposal analysis. Department of Energy, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
48 CFR 1215.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 48, Federal Acquisition Regulations System, Vol. 5 (2011-10-01). Section 1215.404, Proposal analysis. Department of Transportation, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
48 CFR 2815.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 48, Federal Acquisition Regulations System, Vol. 6 (2011-10-01). Section 2815.404, Proposal analysis. Department of Justice, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
48 CFR 1215.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 48, Federal Acquisition Regulations System, Vol. 5 (2010-10-01). Section 1215.404, Proposal analysis. Department of Transportation, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
48 CFR 2815.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 48, Federal Acquisition Regulations System, Vol. 6 (2010-10-01). Section 2815.404, Proposal analysis. Department of Justice, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
48 CFR 915.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 48, Federal Acquisition Regulations System, Vol. 5 (2010-10-01). Section 915.404, Proposal analysis. Department of Energy, Contracting Methods and Contract Types, Contracting by Negotiation, Contract Pricing.
Loudig, Olivier; Liu, Christina; Rohan, Thomas; Ben-Dov, Iddo Z
2018-05-05
Archived, clinically classified formalin-fixed paraffin-embedded (FFPE) tissues can provide nucleic acids for retrospective molecular studies of cancer development. By using non-invasive or pre-malignant lesions from patients who later develop invasive disease, gene expression analyses may help identify early molecular alterations that predispose to cancer risk. It has been well described that nucleic acids recovered from FFPE tissues have undergone severe physical damage and chemical modifications, which make their analysis difficult and generally require adapted assays. MicroRNAs (miRNAs), however, which represent a small class of RNA molecules spanning only ~18-24 nucleotides, have been shown to withstand long-term storage and have been successfully analyzed in FFPE samples. Here we present a 3' barcoded complementary DNA (cDNA) library preparation protocol specifically optimized for the analysis of small RNAs extracted from archived tissues, which was recently demonstrated to be robust and highly reproducible when using archived clinical specimens stored for up to 35 years. This library preparation is well adapted to the multiplex analysis of compromised/degraded material, where RNA samples (up to 18) are ligated with individual 3' barcoded adapters and then pooled together for subsequent enzymatic and biochemical preparations prior to analysis. All purifications are performed by polyacrylamide gel electrophoresis (PAGE), which allows size-specific selection and enrichment of barcoded small RNA species. This cDNA library preparation is well adapted to minute RNA inputs, as a pilot polymerase chain reaction (PCR) allows determination of a specific amplification cycle to produce optimal amounts of material for next-generation sequencing (NGS). This approach was optimized for the use of degraded FFPE RNA from specimens archived for up to 35 years and provides highly reproducible NGS data.
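The demultiplexing step implied by 3' barcoded adapters can be sketched as assigning pooled reads back to their source samples by the barcode at each read's 3' end; the barcode sequences and length below are invented for illustration, not the authors' adapters.

```python
# Hedged sketch: split pooled reads by an assumed 4-nt 3' barcode.
BARCODES = {"ACGT": "sample_01", "TGCA": "sample_02"}  # invented
BC_LEN = 4

def demultiplex(read: str):
    """Return (sample, insert) or None if the 3' barcode is unknown."""
    insert, barcode = read[:-BC_LEN], read[-BC_LEN:]
    sample = BARCODES.get(barcode)
    return (sample, insert) if sample else None

# A miRNA-length insert followed by the ACGT barcode:
print(demultiplex("TGAGGTAGTAGGTTGTATAGTTACGT"))
print(demultiplex("AAAACCCCGGGGTGCA"))
```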
Multi-Sensory Approach to Search for Young Stellar Objects in CG4
NASA Astrophysics Data System (ADS)
Hoette, Vivian L.; Rebull, L. M.; McCarron, K.; Johnson, C. H.; Gartner, C.; VanDerMolen, J.; Gamble, L.; Matche, L.; McCartney, A.; Doering, M.; Crump, R.; Laorr, A.; Mork, K.; Steinbergs, E.; Wigley, E.; Caruso, S.; Killingstad, N.; McCanna, T.
2011-01-01
Individuals with disabilities - specifically individuals who are deaf or hard of hearing (DHH) and/or blind and visually impaired (BVI) - have traditionally been underrepresented in the fields of Science, Technology, Engineering, and Math (STEM). The low incidence rate of these populations, coupled with geographic isolation, creates limited opportunities for students to work with and receive mentoring by professionals who not only have specialty knowledge in disability areas but also work in STEM fields. Yerkes Observatory scientists, along with educators from the Wisconsin School for the Deaf, the Wisconsin Center for the Blind and Visually Impaired, Breck School, and Oak Park and River Forest High School, are engaged in active research with a Spitzer Science Center (SSC) scientist. Our ultimate goals are threefold: to engage DHH and BVI students as successfully as their sighted and hearing peers, to share our techniques to make astronomy more accessible to DHH and BVI youth, and to generate a life-long interest which will lead our students to STEM careers. This poster tracks our work with an SSC scientist during the spring, summer, and fall of 2010. The group coauthored another AAS poster on finding Young Stellar Objects (YSO) in the CG4 Nebula in Puppis. During the project, the students, scientists and teachers developed a number of techniques for learning the necessary science as well as doing the required data acquisition and analysis. Collaborations were formed between students with disabilities and their non-disabled peers to create multi-media projects. Ultimately, the projects created for our work with NITARP will be disseminated through our professional connections in order to ignite a passion for astronomy in all students - with and without disabilities. This research was made possible through the NASA/IPAC Teacher Archive Research Project (NITARP) and was funded by NASA Astrophysics Data Program and Archive Outreach funds.
48 CFR 1511.011-76 - Legal analysis.
Code of Federal Regulations, 2012 CFR
2012-10-01
Federal Acquisition Regulations System; Acquisition Planning; Describing Agency Needs; 1511.011-76 Legal analysis: Contracting Officers shall insert the clause at 1552.211-76 when it is determined that the contract involves legal analysis.
48 CFR 1511.011-76 - Legal analysis.
Code of Federal Regulations, 2013 CFR
2013-10-01
Federal Acquisition Regulations System; Acquisition Planning; Describing Agency Needs; 1511.011-76 Legal analysis: Contracting Officers shall insert the clause at 1552.211-76 when it is determined that the contract involves legal analysis.
48 CFR 1511.011-76 - Legal analysis.
Code of Federal Regulations, 2014 CFR
2014-10-01
Federal Acquisition Regulations System; Acquisition Planning; Describing Agency Needs; 1511.011-76 Legal analysis: Contracting Officers shall insert the clause at 1552.211-76 when it is determined that the contract involves legal analysis.
48 CFR 1511.011-76 - Legal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; Acquisition Planning; Describing Agency Needs; 1511.011-76 Legal analysis: Contracting Officers shall insert the clause at 1552.211-76 when it is determined that the contract involves legal analysis.
DOT National Transportation Integrated Search
1988-10-01
An analysis of the current environment within the Acquisition stage of the Weapon System Life Cycle pertaining to the Logistics Support Analysis (LSA) process, the Logistics Support Analysis Record (LSAR), and other Logistics Support data was underta...
48 CFR 570.402-6 - Cost-benefit analysis.
Code of Federal Regulations, 2014 CFR
2014-10-01
Federal Acquisition Regulations System; General Services Administration; Continued Space Requirements; 570.402-6 Cost-benefit analysis: (a) The cost-benefit analysis must consider...
48 CFR 570.402-6 - Cost-benefit analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; General Services Administration; Continued Space Requirements; 570.402-6 Cost-benefit analysis: (a) The cost-benefit analysis must consider...
Deconvoluting complex structural histories archived in brittle fault zones
NASA Astrophysics Data System (ADS)
Viola, G.; Scheiber, T.; Fredin, O.; Zwingmann, H.; Margreth, A.; Knies, J.
2016-11-01
Brittle deformation can saturate the Earth's crust with faults and fractures in an apparently chaotic fashion. The details of brittle deformational histories and their implications for, for example, seismotectonics and landscape evolution can thus be difficult to untangle. Fortunately, brittle faults archive subtle details of the stress and physical/chemical conditions at the time of initial strain localization and of eventual subsequent slip(s). Hence, reading those archives offers the possibility to deconvolute protracted brittle deformation. Here we report K-Ar isotopic dating of synkinematic/authigenic illite coupled with structural analysis to illustrate an innovative approach to the high-resolution deconvolution of brittle faulting and fluid-driven alteration of a reactivated fault in western Norway. Permian extension preceded coaxial reactivation in the Jurassic, followed by Early Cretaceous fluid-related alteration with pervasive clay authigenesis. This approach represents important progress towards time-constrained structural models, in which illite characterization and K-Ar analysis are a fundamental tool to date faulting and alteration in crystalline rocks.
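For context, illite K-Ar fault dating of this kind rests on the standard K-Ar age equation (a textbook relation, not a result of this study), where ⁴⁰Ar* is the accumulated radiogenic argon, λ the total decay constant of ⁴⁰K, and λ_EC its partial decay constant for the electron-capture branch to ⁴⁰Ar:

t = \frac{1}{\lambda}\,\ln\!\left[\,1 + \frac{\lambda}{\lambda_{\mathrm{EC}}}\cdot\frac{^{40}\mathrm{Ar}^{*}}{^{40}\mathrm{K}}\,\right], \qquad \lambda = \lambda_{\mathrm{EC}} + \lambda_{\beta^{-}}

Dating several gouge grain-size fractions then ties each illite generation to a distinct faulting or alteration episode.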
The AMBRE project: Parameterisation of FGK-type stars from the ESO:HARPS archived spectra
NASA Astrophysics Data System (ADS)
De Pascale, M.; Worley, C. C.; de Laverny, P.; Recio-Blanco, A.; Hill, V.; Bijaoui, A.
2014-10-01
Context. The AMBRE project is a collaboration between the European Southern Observatory (ESO) and the Observatoire de la Côte d'Azur (OCA). It has been established to determine the stellar atmospheric parameters of the archived spectra of four ESO spectrographs. Aims: The analysis of the ESO:HARPS archived spectra for the determination of their atmospheric parameters (effective temperature, surface gravity, global metallicities, and abundance of α-elements over iron) is presented. The sample being analysed (AMBRE:HARPS) covers the period from 2003 to 2010 and comprises 126 688 scientific spectra corresponding to ~17 218 different stars. Methods: For the analysis of the AMBRE:HARPS spectral sample, the automated pipeline developed for the analysis of the AMBRE:FEROS archived spectra has been adapted to the characteristics of the HARPS spectra. Within the pipeline, the stellar parameters are determined by the MATISSE algorithm, which has been developed at OCA for the analysis of large samples of stellar spectra in the framework of galactic archaeology. In the present application, MATISSE uses the AMBRE grid of synthetic spectra, which covers FGKM-type stars for a range of gravities and metallicities. Results: We first determined the radial velocity and its associated error for the ~15% of AMBRE:HARPS spectra for which this velocity had not been derived by the ESO:HARPS reduction pipeline. The stellar atmospheric parameters and the associated chemical index [α/Fe], with their associated errors, have then been estimated for all the spectra of the AMBRE:HARPS archived sample. Based on key quality criteria, we accepted and delivered the parameterisation of 93 116 spectra (74% of the total sample) to ESO. These spectra correspond to ~10 706 stars, each observed between one and several hundred times. This automatic parameterisation of the AMBRE:HARPS spectra shows that the large majority of these stars are cool main-sequence dwarfs with metallicities greater than -0.5 dex (as expected, given that HARPS has been extensively used for planet searches around GK-stars).
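For illustration only: MATISSE itself projects each spectrum onto precomputed basis functions, but the essence of grid-based parameterisation can be sketched as a chi-squared match of an observed spectrum against a grid of synthetic spectra (a minimal Python sketch under that simplification; the array names and use of plain least squares are assumptions, not the AMBRE pipeline):

import numpy as np

def best_grid_match(obs_flux, grid_fluxes, grid_params):
    """Return the grid parameters (Teff, logg, [M/H], [alpha/Fe]) whose
    synthetic spectrum minimizes chi-squared against the observation."""
    # grid_fluxes: one synthetic spectrum per row, same wavelength sampling
    chi2 = np.sum((grid_fluxes - obs_flux) ** 2, axis=1)
    return grid_params[int(np.argmin(chi2))]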
Trends of brominated diphenyl ethers in fresh and archived Great Lakes fish (1979-2005)
Batterman, Stuart; Chernyak, Sergei; Gwynn, Erica; Cantonwine, David; Jia, Chunrong; Begnoche, Linda J.; Hickey, James P.
2007-01-01
While few environmental measurements of brominated diphenyl ethers (BDEs) were completed prior to the mid-1990s, analysis of appropriately archived samples might enable the determination of contaminant trends back to the introduction of these chemicals. In this paper, we first investigate the stability of BDEs in archived frozen and extracted fish samples, and then characterize trends of these chemicals in rainbow smelt (Osmerus mordax) and lake trout (Salvelinus namaycush) in each of the Great Lakes between 1979 and 2005. We focus on the four most common congeners (BDE-47, 100, 99 and 153) and use a change-point analysis to detect shifts in trends. Analyses of archived fish samples yielded precise BDE concentration measurements with only small losses (0.8% per year in frozen fish tissues, 2.2% per year in refrigerated extracts). Trends in fish from all Great Lakes showed large increases in BDE concentrations that started in the early to mid-1980s with fairly consistent doubling times (generally 2–4 years except in Lake Erie smelt where levels increased very slowly), though concentrations and trends show differences by congener, fish species and lake. The most recent data show that accumulation rates are slowing, and concentrations of penta- and hexa-congeners in trout from Lakes Ontario and Michigan and smelt from Lake Ontario started to decrease in the mid-1990s. Trends in smelt and trout are evolving somewhat differently, and trout concentrations in the five lakes are now ranked as Michigan > Superior = Ontario > Huron = Erie, and smelt concentrations as Michigan > Ontario > Huron > Superior > Erie. The analysis of properly archived samples permits the reconstruction of historical trends, congener distributions, biomagnification and other information that can aid the understanding and management of these contaminants.
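The doubling times quoted above follow from exponential (log-linear) growth, for which t_double = ln 2 / k. A minimal sketch of that calculation (illustrative only; not the authors' change-point code):

import numpy as np

def doubling_time(years, concentrations):
    """Fit log-linear growth and return the doubling time in years."""
    slope, _ = np.polyfit(years, np.log(concentrations), 1)
    return np.log(2.0) / slope  # meaningful while the trend is increasing

# Example: a series doubling every ~3 years recovers ~3.0
years = np.array([1980.0, 1983.0, 1986.0, 1989.0])
conc = 2.0 ** ((years - 1980.0) / 3.0)
print(round(doubling_time(years, conc), 1))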
The HARPS-N archive through a Cassandra, NoSQL database suite?
NASA Astrophysics Data System (ADS)
Molinari, Emilio; Guerra, Jose; Harutyunyan, Avet; Lodi, Marcello; Martin, Adrian
2016-07-01
The TNG-INAF is developing the science archive for the WEAVE instrument. The underlying architecture of the archive is based on a non-relational database, more precisely on an Apache Cassandra cluster, which uses NoSQL technology. In order to test and validate this architecture, we created a local archive populated with all the HARPS-N spectra collected at the TNG since the instrument's start of operations in mid-2012, and developed tools for the analysis of this data set. The HARPS-N data set is two orders of magnitude smaller than WEAVE's, but we want to demonstrate the ability to walk through a complete data set and produce scientific output as valuable as that produced by an ordinary pipeline, without directly accessing the FITS files. The analytics are done with Apache Solr and Spark and on a relational PostgreSQL database. As an example, we produce observables such as metallicity indexes for the targets in the archive and compare the results with those coming from the HARPS-N regular data reduction software. The aim of this experiment is to explore the viability of a high-availability cluster and distributed NoSQL database as a platform for complex scientific analytics on a large data set, which will then be ported to the WEAVE Archive System (WAS) that we are developing for the WEAVE multi-object fiber spectrograph.
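As a rough sketch of what such a spectral archive might look like in Cassandra (the keyspace, table, and column names here are hypothetical, not the TNG-INAF schema), using the DataStax Python driver:

from cassandra.cluster import Cluster

cluster = Cluster(["cassandra-node1"])  # hypothetical contact point
session = cluster.connect()
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS harpsn
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS harpsn.spectra (
        target    text,
        obs_time  timestamp,
        snr       float,
        keywords  map<text, text>,  -- header-derived metadata, not the pixels
        PRIMARY KEY (target, obs_time)
    )
""")

Partitioning by target and clustering by observation time makes "all spectra of this star, in time order" a cheap query, which suits the walk-through-the-archive analytics described above.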
Preserving the Pyramid of STI Using Buckets
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maly, Kurt
2004-01-01
The product of research projects is information. Through the life cycle of a project, information comes from many sources and takes many forms. Traditionally, this body of information is summarized in a formal publication, typically a journal article. While formal publications enjoy the benefits of peer review and technical editing, they are also often compromises in media format and length. As such, we consider a formal publication to represent an abstract of a larger body of work: a pyramid of scientific and technical information (STI). While this abstract may be sufficient for some applications, an in-depth use or analysis is likely to require the supporting layers of the pyramid. We have developed buckets to preserve this pyramid of STI. Buckets provide an archive- and protocol-independent container construct in which all related information objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, with people, or with arbitrary network services. Buckets are an implementation of the Smart Object, Dumb Archive (SODA) DL model. In SODA, data objects are more important than the archives that hold them. Much of the functionality traditionally associated with archives is pushed down into the objects, such as enforcing terms and conditions, negotiating display, and content maintenance. In this paper, we discuss the motivation, design, and implications of bucket use in DLs with respect to grey literature.
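A toy sketch of the "smart object" idea (a hypothetical class, not the actual bucket implementation): the container groups every artifact of one work and enforces its own terms and conditions rather than delegating that to the archive.

class Bucket:
    """Toy 'smart object': groups all artifacts of one work and carries
    its own access behavior, leaving the archive 'dumb'."""

    def __init__(self, identifier, terms=None):
        self.identifier = identifier
        self.elements = {}   # name -> payload (report, data, software, ...)
        self.terms = terms   # callable deciding whether access is allowed

    def add(self, name, payload):
        self.elements[name] = payload

    def display(self, requester):
        # The object, not the archive, negotiates access.
        if self.terms is not None and not self.terms(requester):
            raise PermissionError("terms and conditions not met")
        return sorted(self.elements)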
Scientific Benefits of Space Science Models Archiving at Community Coordinated Modeling Center
NASA Technical Reports Server (NTRS)
Kuznetsova, Maria M.; Berrios, David; Chulaki, Anna; Hesse, Michael; MacNeice, Peter J.; Maddox, Marlo M.; Pulkkinen, Antti; Rastaetter, Lutz; Taktakishvili, Aleksandre
2009-01-01
The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. CCMC provides a web-based Run-on-Request system, by which interested scientists can request simulations for a broad range of space science problems. To allow the models to be driven by data relevant to particular events, CCMC developed a tool that automatically downloads data from data archives and transforms them into the required formats. CCMC also provides a tailored web-based visualization interface for the model output, as well as the capability to download the simulation output in a portable format. CCMC offers a variety of visualization and output analysis tools to aid scientists in the interpretation of simulation results. In the eight years since the Run-on-Request system became available, CCMC has archived the results of almost 3,000 runs covering significant space weather events and time intervals of interest identified by the community. The simulation results archived at CCMC also include a library of general-purpose runs with modeled conditions that are used for education and research. Archiving the results of simulations performed in support of several Modeling Challenges helps to evaluate the progress in space weather modeling over time. We will highlight the scientific benefits of the CCMC space science model archive and discuss plans for further development of advanced methods to interact with simulation results.
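The download-and-transform step might be sketched as follows (everything here - the URL, the CSV layout, the output format - is an invented placeholder, not the CCMC tool):

import urllib.request

SOURCE = "https://example.org/solarwind?start={start}&stop={stop}"  # placeholder

def fetch_model_input(start, stop, outfile):
    """Download event data and rewrite it in a model's expected column format."""
    with urllib.request.urlopen(SOURCE.format(start=start, stop=stop)) as resp:
        lines = resp.read().decode().splitlines()
    with open(outfile, "w") as out:
        for line in lines:
            time, density, speed, bz = line.split(",")[:4]
            out.write(f"{time} {density} {speed} {bz}\n")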
Dose-Response Analysis of RNA-Seq Profiles in Archival ...
Use of archival resources has been limited to date by inconsistent methods for genomic profiling of degraded RNA from formalin-fixed paraffin-embedded (FFPE) samples. RNA-sequencing offers a promising way to address this problem. Here we evaluated transcriptomic dose responses using RNA-sequencing in paired FFPE and frozen (FROZ) samples from two archival studies in mice, one 20 years old. Experimental treatments included three different doses of di(2-ethylhexyl)phthalate or dichloroacetic acid for the recently archived and older studies, respectively. Total RNA was ribo-depleted and sequenced using the Illumina HiSeq platform. In the recently archived study, FFPE samples had 35% lower total counts compared to FROZ samples but high concordance in fold-change values of differentially expressed genes (DEGs) (r² = 0.99), highly enriched pathways (90% overlap with FROZ), and benchmark dose estimates for preselected target genes (2% difference vs FROZ). In contrast, older FFPE samples had markedly lower total counts (3% of FROZ) and poor concordance in global DEGs and pathways. However, counts from FFPE and FROZ samples still positively correlated (r² = 0.84 across all transcripts) and showed comparable dose responses for more highly expressed target genes. These findings highlight potential applications and issues in using RNA-sequencing data from FFPE samples. Recently archived FFPE samples were highly similar to FROZ samples in sequencing q...
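The concordance metric quoted above (r² of fold-changes between FFPE and FROZ samples) can be reproduced in outline with a few lines of NumPy (a sketch of the metric only, not the study's pipeline; the pseudocount handling is an assumption):

import numpy as np

def log2_fold_change(treated, control, pseudocount=1.0):
    """Per-gene log2 fold-change from raw count vectors."""
    return np.log2((treated + pseudocount) / (control + pseudocount))

def concordance_r2(fc_ffpe, fc_frozen):
    """Squared Pearson correlation between two fold-change vectors."""
    r = np.corrcoef(fc_ffpe, fc_frozen)[0, 1]
    return r ** 2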
Archiving Software Systems: Approaches to Preserve Computational Capabilities
NASA Astrophysics Data System (ADS)
King, T. A.
2014-12-01
A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
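Whatever the preservation vehicle (VM image or container), the indispensable companion is metadata recording the technology stack. A minimal sketch of such a record (the field names are illustrative, not a standard):

import json
import platform
import sys

def environment_record(image_file, packages):
    """Describe the stack an archived image needs to be re-runnable."""
    return {
        "image": image_file,               # archived VM/container snapshot
        "os": platform.platform(),         # OS the software was built for
        "python": sys.version.split()[0],  # interpreter version
        "packages": packages,              # pinned dependency versions
    }

with open("software_provenance.json", "w") as f:
    json.dump(environment_record("analysis-vm.qcow2",
                                 {"numpy": "1.26.4"}), f, indent=2)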
ERIC Educational Resources Information Center
Khoja, Suleiman; Ventura, Frank
1997-01-01
Determines the extent to which physics textbooks contribute to physics teaching objectives and knowledge acquisition in Libya. Analysis of seventh- through ninth-grade physics textbooks and their cognitive demand shows a limited effect of textbook content on knowledge acquisition and educational objectives. Suggestions are made for promoting the acquisition of…
Seabird tissue archival and monitoring project: Protocol for collecting and banking seabird eggs
Weston-York, Geoff; Porter, Barbara J.; Pugh, Rebecca S.; Roseneau, David G.; Simac, Kristin S.; Becker, Paul R.; Thorsteinson, Lyman K.; Wise, Stephen A.
2001-01-01
Archiving biological and environmental samples for retrospective analysis is a major component of systematic environmental monitoring. The long-term storage of carefully selected, representative samples in an environmental specimen bank is an important complement to the real-time monitoring of the environment. These archived samples permit:
• The use of subsequently developed innovative analytical technology that was not available at the time the samples were archived, for clear state-of-the-art identification and quantification of analytes of interest;
• The identification and quantification of analytes that are of subsequent interest but that were not of interest at the time the samples were archived; and
• The comparison of present and past analytical techniques and values, providing continued credibility of past analytical values and allowing flexibility in environmental monitoring programs.
Seabirds, including albatrosses, pelicans, cormorants, terns, kittiwakes, murres, guillemots, and puffins, spend most of their lives at sea and have special adaptations for feeding in the marine environment, including the ability to excrete the excess salt obtained from ingesting seawater. Many species nest in dense groups (colonies) on steep, precipitous sea-cliffs and headlands. Seabirds are long-lived and slow to mature. They occupy high positions in the marine food web and are considered sensitive indicators for the marine environment (prey includes krill, small fish, and squid). Breeding success, timing of nesting, diets, and survival rates may provide early indications of changing environmental conditions (e.g., see Hatch et al., 1993). Chemical analysis of seabird tissues, including egg contents, can be particularly useful in determining whether contaminants (and potential biological effects) associated with human industrial activities, such as offshore petroleum and mineral exploration and development, are accumulating in marine environments. The collection and archival of seabird tissues over a period of several years will be a resource for future analyses, providing samples that can be used to determine historical baseline contaminant levels.
48 CFR 315.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; Health and Human Services; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 315.404 Proposal analysis.
48 CFR 1415.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; Department of the Interior; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 1415.404 Proposal analysis.
48 CFR 815.404 - Proposal analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; Department of Veterans Affairs; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 815.404 Proposal analysis.
48 CFR 315.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Federal Acquisition Regulations System; Health and Human Services; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 315.404 Proposal analysis.
48 CFR 815.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Federal Acquisition Regulations System; Department of Veterans Affairs; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 815.404 Proposal analysis.
48 CFR 1415.404 - Proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Federal Acquisition Regulations System; Department of the Interior; Contracting Methods and Contract Types; Contracting by Negotiation; Contract Pricing; 1415.404 Proposal analysis.
A Community Seismic Experiment in the ENAM Primary Site
NASA Astrophysics Data System (ADS)
Van Avendonk, H. J.
2012-12-01
Eastern North America (ENAM) was chosen as a GeoPRISMS Rift Initiation and Evolution primary site because it represents a mature continental margin with onshore and offshore rift basins in which the record of extension and continental break-up is preserved. The degree to which syn-rift magmatism and preexisting lithospheric weaknesses controlled the evolution of the margin can be further investigated if we image its 3-D structure at small and large length scales with active-source and earthquake seismic imaging. In the summer of 2012 we submitted a proposal to the US National Science Foundation for an ambitious plan for data acquisition on a 400 km wide section of the mid-Atlantic East Coast margin around Cape Hatteras, from unextended continental lithosphere onshore to mature oceanic lithosphere offshore. This area includes an important along-strike transition in the morphology of the margin from the Carolina Trough to the Baltimore Canyon Trough, and two major fracture zones that are associated with significant offsets at the modern Mid-Atlantic Ridge. The study area also covers several features representing the post-rift modification of the margin by slope instability and fluid flow. As the EarthScope Transportable Array reaches the East Coast of the US in 2013 and 2014, we will have an unprecedented opportunity to image the detailed structure of the rifted margin. To make effective use of the research infrastructure, including the seismic vessel R/V Marcus Langseth, the EarthScope seismic instrumentation, and the US OBS Instrument Pool, we propose to collect a suite of seismic data at the mid-Atlantic margin in the context of a community-driven experiment with completely open data access. This multi-faceted seismic experiment offers an immense opportunity for the education of young scientists, and we propose an integrated education effort during and after acquisition. The science and field parties for data acquisition will largely consist of young scientists, who will be chosen by application. Following the cruise, we propose to hold two short courses on the processing of multi-channel seismic reflection and wide-angle reflection and refraction data using the new seismic data. The acquisition of all seismic data, archiving of the data in existing databases, and distribution to the community will take two years. Afterwards, proposals developed by any member of the science community can be submitted for further data analysis and testing of current scientific hypotheses regarding the evolution and dynamics of the ENAM margin.
System Security Authorization Agreement (SSAA) for the WIRE Archive and Research Facility
NASA Technical Reports Server (NTRS)
2002-01-01
The Wide-Field Infrared Explorer (WIRE) Archive and Research Facility (WARF) is operated and maintained by the Department of Physics, USAF Academy. The lab is located in Fairchild Hall, 2354 Fairchild Dr., Suite 2A103, USAF Academy, CO 80840. The WARF will be used for research and education in support of the NASA Wide-Field Infrared Explorer (WIRE) satellite and for related high-precision photometry missions and activities. The WARF will also contain the WIRE preliminary and final archives prior to their delivery to the National Space Science Data Center (NSSDC). The WARF consists of a suite of equipment purchased under several NASA grants in support of WIRE research. The core system consists of a Red Hat Linux workstation with twin 933 MHz PIII processors, 1 GB of RAM, 133 GB of hard disk space, and DAT and DLT tape drives. The WARF is also supported by several additional networked Linux workstations. Only one of these (an older 450 MHz PIII computer running Red Hat Linux) is currently running, but the addition of several more is expected over the next year. In addition, a printer will soon be added. The WARF will serve as the primary research facility for the analysis and archiving of data from the WIRE satellite, together with limited quantities of other high-precision astronomical photometry data from both ground- and space-based facilities. However, the archive to be created here will not be the final archive; rather, the archive will be duplicated at the NSSDC, and public access to the data will generally take place through that site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, W.N.
1998-03-01
LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh HyperCard database to function as both an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (HyperCard 2.2 or later), and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user's manual, a brief overview of the LUGSAN II computer code is included with suggested resources for programmers.
Archive & Data Management Activities for ISRO Science Archives
NASA Astrophysics Data System (ADS)
Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.
2012-07-01
ISRO has kept a step ahead by extending remote sensing missions to planetary and astronomical exploration. It started with Chandrayaan-1, which successfully completed moon imaging during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive, and disseminate the acquired science data to the user community for analysis and scientific use. These science missions will last from a few months to a few years, but the data received must remain archived, interoperable, and seamlessly accessible to the user community into the future. ISRO has laid out definite plans to archive these data sets in specified standards and to develop relevant access tools to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore. It is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 was the first of the planetary missions launched/to be launched by ISRO, and we took up the challenge and developed a system for data archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer in the Planetary Data System (PDS 3.0) standard, through the automated pipeline. But a data set, once stored, is of no use unless it is made public, which requires a web-based dissemination system accessible to all the planetary scientists/data users working in this field. Towards this, a web-based browse and dissemination system has been developed, wherein users can register, search for their area of interest, and view the data archived for TMC & HYSI with relevant browse chips and metadata. Users can also order the data and get it on their desktop in the PDS. For other AO payloads, users can view the metadata, and the data are available through an FTP site. This same archival and dissemination strategy will be extended to the next moon mission, Chandrayaan-2. ASTROSAT is going to be the first multi-wavelength astronomical mission for which the data are archived at ISSDC. It consists of five astronomical payloads that allow simultaneous multi-wavelength observations, from X-ray to ultraviolet (UV), of astronomical objects. It is planned to archive these data sets in FITS format. The archive of ASTROSAT will be done in the archive layer at ISSDC. The browse of the archive will be available through the ISDA (Indian Science Data Archive) web site and will be IVOA compliant, with a search mechanism using VOTable. The data will be available to users only on a request basis via an FTP site after the lock-in period is over. It is planned that the Level-2 pipeline software and various modules for processing the data sets will also be available on the web site. This paper describes the archival procedure of Chandrayaan-1 and the archive plan for ASTROSAT, Chandrayaan-2, and other future missions of ISRO, including a discussion of data management activities.
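Since the ASTROSAT products are to be archived in FITS, a user-side inspection could be as simple as the following Astropy sketch (the file name is hypothetical):

from astropy.io import fits

with fits.open("astrosat_product.fits") as hdul:
    hdul.info()              # list the HDUs in the archived product
    header = hdul[0].header  # primary header keywords
    data = hdul[1].data      # first extension's table or image data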
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-03
RIN 9000-AL74, Federal Acquisition Regulation; Time-and-Materials and Labor-Hour Contracts for... A final rule amending the Federal Acquisition Regulation (FAR) to implement Government Accountability... discussed in the following sections. II. Discussion and Analysis: The Civilian Agency Acquisition Council and...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Acquisition Programs and Major Automated Information System Acquisition Programs. To comply with NEPA and... Environmental Impact Analysis Process (EIAP), § 989.1 Purpose: (a) This part implements the Air Force Environmental Impact Analysis Process (EIAP) and provides procedures for environmental impact analysis both within the United...
Application of Independent Component Analysis to Legacy UV Quasar Spectra
NASA Astrophysics Data System (ADS)
Richards, Gordon
2017-08-01
We propose to apply a novel analysis technique to UV spectroscopy of quasars in the HST archive. We endeavor to analyze all of the archival quasar spectra, but will first focus on those quasars that also have optical spectroscopy from SDSS. An archival investigation by Sulentic et al. (2007) revealed 130 known quasars with UV coverage of CIV complementing optical emission line coverage. Today, the sample has grown considerably and now includes COS spectroscopy. Our proposal includes a proof-of-concept demonstration of the power of a technique called Independent Component Analysis (ICA). ICA allows us to reduce the complexity of quasar spectra to just a handful of numbers. In addition to providing a uniform set of traditional line measurements (and carefully calibrated redshifts), we will provide ICA weights to the community with examples of how they can be used to do science that previously would have been quite difficult. The time is ripe for such an investigation because 1) it has been a decade since the last significant archival investigation of UV emission lines from HST quasars, 2) the future is uncertain for obtaining new UV quasar spectroscopy, and 3) the rise of machine learning has provided us with powerful new tools. Thus our proposed work will provide a true UV legacy database for quasar-based investigations.
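As a small illustration of the core idea, scikit-learn's FastICA reduces a spectrum-per-row matrix to per-object weights (the synthetic input and component count below are placeholders, not the proposal's actual pipeline):

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.normal(1.0, 0.01, size=(130, 2000))  # stand-in for normalized spectra

ica = FastICA(n_components=10, random_state=0)
weights = ica.fit_transform(X)  # (n_spectra, 10): per-quasar ICA weights
components = ica.components_    # (10, n_pixels): shared spectral components

# Each spectrum is approximated by a weighted sum of the ten components,
# so a handful of weights summarizes the entire spectrum.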
Restoration and PDS Archive of Apollo Lunar Rock Sample Data
NASA Technical Reports Server (NTRS)
Garcia, P. A.; Todd, N. S.; Lofgren, G. E.; Stefanov, W. L.; Runco, S. K.; LaBasse, D.; Gaddis, L. R.
2011-01-01
In 2008, scientists at the Johnson Space Center (JSC) Lunar Sample Laboratory and Image Science & Analysis Laboratory (under the auspices of the Astromaterials Research and Exploration Science Directorate, or ARES) began work on a 4-year project to digitize the original film negatives of Apollo lunar rock sample photographs. These rock samples, together with lunar regolith and core samples, were collected as part of the lander missions of Apollos 11, 12, 14, 15, 16, and 17. The original film negatives are stored at JSC under cryogenic conditions. This effort is data restoration in the truest sense: the images represent the only record available to scientists that allows them to view the rock samples when making a sample request. As the negatives are scanned, they are also being formatted and documented for permanent archive in the NASA Planetary Data System (PDS). The ARES group is working collaboratively with the Imaging Node of the PDS on the archiving.
Characterizing Space Environments with Long-Term Space Plasma Archive Resources
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.
2009-01-01
A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access they afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as in the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes, which characterize the mean and worst-case environments that may be encountered during a mission. In addition, analyses of large data sets are important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits for environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.
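A percentile-based specification is one simple way such archives feed design environments (a minimal sketch; the 95th percentile is a commonly quoted level, not a statement of MSFC practice):

import numpy as np

def environment_spec(flux_series):
    """Summarize a long flux time series as mean and extreme design levels."""
    flux = np.asarray(flux_series, dtype=float)
    return {
        "mean": flux.mean(),             # climatological mean environment
        "p95": np.percentile(flux, 95),  # elevated design level
        "worst_observed": flux.max(),    # archive-wide upper bound
    }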
FTS Spectra from the Mayall 4-m Telescope, 1975-1995
NASA Astrophysics Data System (ADS)
Pilachowski, Catherine A.; Hinkle, Kenneth H.; Young, Michael; Dennis, Harold; Gopu, Arvind; Henschel, Robert; Hayashi, Soichi
2017-01-01
The complete archive of spectra obtained with the Fourier Transform Spectrometers in use at the Mayall 4-m telescope at the Kitt Peak National Observatory from 1975 through 1995 is now available to the community. The archive is hosted at Indiana University Bloomington and includes nearly 10,000 individual spectra of more than 800 different astronomical sources. The FTS produced spectra in the wavelength regime from roughly 0.9 to 5 microns (11,000 to 2,000 cm⁻¹), mostly at relatively high spectral resolution. The archive can be searched to identify specific spectra of interest, and the spectra can be viewed online and downloaded in FITS format for analysis. Once a spectrum of interest has been identified, all spectra taken on the same date are provided to allow users to identify appropriate hot-star spectra for telluric line division. The archive can be accessed on the web at https://sparc.sca.iu.edu.
ERIC Educational Resources Information Center
Goldschneider, Jennifer M.; DeKeyser, Robert M.
2005-01-01
This meta-analysis pools data from 25 years of research on the order of acquisition of English grammatical morphemes by students of English as a second language (ESL). Some researchers have posited a "natural" order of acquisition common to all ESL learners, but no single cause has been shown for this phenomenon. Our study investigated…
NASA Technical Reports Server (NTRS)
Goodman, Steven J.; Wright, Pat; Christian, Hugh; Blakeslee, Richard; Buechler, Dennis; Scharfen, Greg
1991-01-01
The global lightning signatures were analyzed from the DMSP Optical Linescan System (OLS) imagery archived at the National Snow and Ice Data Center. A transition to analysis of the digital archive will occur as it becomes available, allowing comparison of annual, interannual, and seasonal variations with other global data sets. An initial survey of the quality of the existing film archive was completed, and lightning signatures were digitized for the summer months of 1986 to 1987. The relationship is studied between (1) global and regional lightning activity and rainfall, and (2) storm electrical development and environment. Remote sensing data sets obtained from field programs are used in conjunction with satellite/radar/lightning data to develop and improve precipitation estimation algorithms, and to provide a better understanding of the co-evolving electrical, microphysical, and dynamical structure of storms.
Resolution analysis of archive films for the purpose of their optimal digitization and distribution
NASA Astrophysics Data System (ADS)
Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek
2017-09-01
With the recent high demand for ultra-high-definition (UHD) content to be screened not only in high-end digital movie theaters but also in the home environment, film archives full of movies in high definition and above are in the scope of UHD content providers. Movies captured with traditional film technology represent a virtually unlimited source of UHD content. The goal of maintaining complete image information is also related to the choice of scanning resolution and of spatial resolution for further distribution. It might seem that scanning the film material at the highest possible resolution using state-of-the-art film scanners, and also distributing it at this resolution, is the right choice. The information content of the digitized images is, however, limited, and various degradations further reduce it. Digital distribution of the content at the highest image resolution might therefore be unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in the captured scene image and the factors that lower the final resolution are discussed. Methods are proposed to determine the spatial detail of the film picture based on the analysis of its digitized image data. These procedures allow recommendations to be made for the optimal distribution of digitized video content intended for various display devices with lower resolutions. The results obtained are illustrated with a spatial downsampling use case, and a performance evaluation of the proposed techniques is presented.
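One simple proxy for "usable detail" is the variance of the Laplacian computed at candidate resolutions (an illustrative sketch, not the authors' method; naive decimation stands in for proper resampling):

import numpy as np
from scipy import ndimage

def detail_score(frame):
    """Variance of the Laplacian: a simple proxy for fine spatial detail."""
    return float(ndimage.laplace(frame.astype(float)).var())

def score_after_downsampling(frame, factor):
    return detail_score(frame[::factor, ::factor])

# If halving the resolution barely lowers the score, the extra pixels carry
# little scene information and a smaller distribution format may suffice.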
48 CFR 970.1504-1 - Price analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
Federal Acquisition Regulations System; Department of Energy Agency Supplementary Regulations; DOE Management and Operating Contracts; Contracting by Negotiation; 970.1504-1 Price analysis.
48 CFR 970.1504-1 - Price analysis.
Code of Federal Regulations, 2011 CFR
2011-10-01
Federal Acquisition Regulations System; Department of Energy Agency Supplementary Regulations; DOE Management and Operating Contracts; Contracting by Negotiation; 970.1504-1 Price analysis.
48 CFR 47.305-7 - Quantity analysis, direct delivery, and reduction of crosshauling and backhauling.
Code of Federal Regulations, 2010 CFR
2010-10-01
Federal Acquisition Regulations System; 47.305-7 Quantity analysis, direct delivery, and reduction of crosshauling and backhauling: (a) Quantity analysis. (1) The requiring activity shall consider the acquisition of carload or truckload...
Pointers, Lessons Learned, and Rules of Thumb for Successful Vibro-Acoustic Data Acquisition
NASA Technical Reports Server (NTRS)
Rossoni, Peter
1998-01-01
This presentation contains helpful pointers for successful vibroacoustic data acquisition in the following three areas: Instrumentation, Vibration Control and Pyro-shock data acquisition and analysis. A helpful bibliography is provided.
NASA Astrophysics Data System (ADS)
Linick, J. P.; Pieri, D. C.; Sanchez, R. M.
2014-12-01
The physical and temporal systematics of the world's volcanic activity is a compelling and productive arena for the exercise of orbital remote sensing techniques, informing studies ranging from basic volcanology to societal risk. Comprising over 160,000 frames and spanning 15 years of the Terra platform mission, the ASTER Volcano Archive (AVA: http://ava.jpl.nasa.gov) is the world's largest (100+ TB) high spatial resolution (15/30/90 m/pixel), multi-spectral (visible-SWIR-TIR), downloadable (KML-enabled) dedicated archive of volcano imagery. We will discuss the development of the AVA and describe its growing capability to provide easy public access to ASTER global volcano remote sensing data. The AVA system architecture is designed to facilitate parameter-based data mining and the implementation of archive-wide data analysis algorithms. Such search and analysis capabilities exploit AVA's unprecedented time-series data compilations for over 1,550 volcanoes worldwide (Smithsonian Holocene catalog). Results include thermal anomaly detection and mapping, as well as detection of SO2 plumes from explosive eruptions and passive SO2 emissions confined to the troposphere. We are also implementing retrospective ASTER image retrievals based on volcanic activity reports from Volcanic Ash Advisory Centers (VAACs) and the US Air Force Weather Agency (AFWA). A major planned expansion of the AVA is currently underway, with the ingest of the full 1972-present LANDSAT, and NASA EO-1, volcano imagery for comparison and integration with ASTER data. Work described here is carried out under contract to NASA at the Jet Propulsion Laboratory, California Institute of Technology.
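A minimal sketch of archive-wide thermal anomaly screening (an illustrative robust threshold, not the AVA algorithm):

import numpy as np

def thermal_anomalies(tir_kelvin, sigma=4.0):
    """Flag pixels hotter than the scene background by `sigma` spreads."""
    background = np.median(tir_kelvin)   # robust background estimate
    spread = np.std(tir_kelvin)
    return np.argwhere(tir_kelvin > background + sigma * spread)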
Analysis of civilian labor costs within the Department of the Navy
2017-06-01
LITERATURE REVIEW: As of the writing of this report, the author was able to locate seven reports written by the RAND Corporation and the Government... a model that mimics the growth of data from 2007 to 2016 (adapted from C. Greaver, non-archived email, January 27, 2017). [Figure 1: Changes of ... projections. Figure 2: Difference between predictions with and without NSPS Hire Type.]
Collaborative Metadata Curation in Support of NASA Earth Science Data Stewardship
NASA Technical Reports Server (NTRS)
Sisco, Adam W.; Bugbee, Kaylin; le Roux, Jeanne; Staton, Patrick; Freitag, Brian; Dixon, Valerie
2018-01-01
A growing collection of NASA Earth science data is archived and distributed by EOSDIS's 12 Distributed Active Archive Centers (DAACs). Each collection and granule is described by a metadata record housed in the Common Metadata Repository (CMR). Multiple metadata standards are in use, and core elements of each are mapped to and from a common model, the Unified Metadata Model (UMM). This work was done by the Analysis and Review of CMR (ARC) Team.
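The mapping step can be pictured as a field-by-field translation into common element names (a toy sketch with a deliberately tiny, hypothetical mapping table; the real UMM mappings are far richer):

# Hypothetical native-to-common field mapping (illustrative only).
NATIVE_TO_COMMON = {
    "Entry_Title": "EntryTitle",
    "Summary": "Abstract",
    "Temporal_Coverage": "TemporalExtent",
}

def to_common_model(native_record):
    """Project a native metadata record onto the common (UMM-like) elements."""
    return {common: native_record[native]
            for native, common in NATIVE_TO_COMMON.items()
            if native in native_record}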
Paiva, Anthony; Shou, Wilson Z
2016-08-01
The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving, and retrieving HRMS data, are being continuously refined to tackle the issue of the large data file sizes typical of HRMS analyses.
Third International Symposium on Space Mission Operations and Ground Data Systems, part 2
NASA Technical Reports Server (NTRS)
Rash, James L. (Editor)
1994-01-01
Under the theme of 'Opportunities in Ground Data Systems for High Efficiency Operations of Space Missions,' the SpaceOps '94 symposium included presentations of more than 150 technical papers spanning five topic areas: Mission Management, Operations, Data Management, System Development, and Systems Engineering. The symposium papers focus on improvements in the efficiency, effectiveness, and quality of data acquisition, ground systems, and mission operations. New technology, methods, and human systems are discussed. Accomplishments are also reported in the application of information systems to improve data retrieval, reporting, and archiving; the management of human factors; the use of telescience and teleoperations; and the design and implementation of logistics support for mission operations. This volume covers expert systems, systems development tools and approaches, and systems engineering issues.
Ortel, Terry W.; Spies, Ryan R.
2015-11-19
Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).
ESAC RFI Survey in the SMOS 1400-1427MHz Passive Band
NASA Astrophysics Data System (ADS)
Castillo-Fraile, Manuel; Uranga, Ekhi
2016-08-01
The SMOS (Soil Moisture and Ocean Salinity) satellite was launched on 2 November 2009 and is ESA's second Earth Explorer Opportunity mission. After 6 years of successful operations, the status of the SMOS mission is excellent: it is providing very reliable acquisition, nominal and NRT data processing, archiving, and dissemination services for Level 1 and Level 2 processed data around the globe. However, SMOS observations are significantly affected by RF interference (RFI) in several areas of the world. In this context, a new RFI detection and monitoring tool has been implemented at ESAC to provide the national radiofrequency authorities with a proper detection and localization method for the illegal ground emitters, in order to ensure the protection of the SMOS 1400-1427 MHz passive band for scientific applications.
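One common first-pass RFI screen at L-band flags brightness temperatures above any plausible natural emission (a sketch of that generic idea, not the ESAC tool; the threshold value is illustrative):

import numpy as np

def flag_rfi(brightness_temp_k, tmax=340.0):
    """Mark samples whose brightness temperature exceeds a physical bound."""
    tb = np.asarray(brightness_temp_k, dtype=float)
    return tb > tmax  # boolean mask of suspected RFI-contaminated samples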
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory G.
2005-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. In addition to providing data, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools (to avoid downloading all the data unnecessarily). The existing data management infrastructure at the GES DISC supports a wide spectrum of options, from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.
48 CFR 3046.792 - Cost benefit analysis (USCG).
Code of Federal Regulations, 2013 CFR
2013-10-01
Federal Acquisition Regulations System; Department of Homeland Security; 3046.792 Cost benefit analysis (USCG): If a specific warranty is considered not to be cost beneficial by the...
48 CFR 3046.792 - Cost benefit analysis (USCG).
Code of Federal Regulations, 2014 CFR
2014-10-01
Federal Acquisition Regulations System; Department of Homeland Security; 3046.792 Cost benefit analysis (USCG): If a specific warranty is considered not to be cost beneficial by the...
Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Wiese, Dana S.
2008-01-01
In July of 2002, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Ada, Crystal, Jennie, Mary, Rice, and Sylvan, central Florida, as part of the USGS Lakes and Coastal Aquifers (LCA) study. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) files, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided. The USGS Florida Integrated Science Center (FISC) - St. Petersburg assigns a unique identifier to each cruise or field activity. For example, 02LCA02 tells us the data were collected in 2002 for the Lakes and Coastal Aquifers (LCA) study and that the data were collected during the second field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and, when discharged, emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. Table 2 lists trackline statistics. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for download instructions. The printable profiles provided here are GIF images that were filtered and gained using Seismic Unix software. Refer to the Software page for details about the processing and examples of the processing scripts. The processed SEG-Y data were exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce an interactive Web page of the profile, which allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
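Besides Seismic Unix, the archived SEG-Y traces can be read in Python, for example with ObsPy's SEG-Y reader (the file name below is a made-up example, not an actual archive file):

from obspy import read

stream = read("02LCA02_line01.sgy", format="SEGY")  # hypothetical file name
for trace in stream[:3]:
    # each trace carries its samples plus header-derived stats
    print(trace.stats.npts, trace.stats.sampling_rate)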
Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.
2012-01-01
In July of 2005, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey (FGS), conducted a geophysical survey of the Atlantic Ocean offshore of Florida's east coast from Flagler Beach to Daytona Beach. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report. The USGS Saint Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 05FGS01 tells us the data were collected in 2005 for cooperative work with the FGS and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. The boomer subbottom profiling system consists of an acoustic energy source that is made up of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and, when discharged, emits a short acoustic pulse, or shot, which propagates through the water column and the shallow stratigraphy below. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver (a hydrophone streamer), and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975), except that an ASCII format is used for the first 3,200 bytes of the card image header instead of the standard EBCDIC format. For a detailed description of the recorded trace headers, refer to the SEG-Y Format page. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (Cohen and Stockwell, 2005). See the How To Download SEG-Y Data page for download instructions. The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts. The processed SEG-Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce a geospatially interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position; this information is displayed in the status bar of the browser. Please note that clicking on the profile image switches it to "Expanded View" (a compressed image of the entire line) and cursor tracking is not available in this mode.
The Operation and Architecture of the Keck Observatory Archive
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Gelino, C. R.; Laity, A.; Kong, M.; Swain, M.; Holt, J.; Goodrich, R.; Mader, J.; Tran, H. D.
2014-05-01
The Infrared Processing and Analysis Center (IPAC) and the W. M. Keck Observatory (WMKO) are collaborating to build an archive for the twin 10-m Keck Telescopes, located near the summit of Mauna Kea. The Keck Observatory Archive (KOA) takes advantage of IPAC's long experience with managing and archiving large and complex data sets from active missions and serving them to the community, and of the Observatory's knowledge of the operation of its sophisticated instrumentation and the organization of the data products. By the end of 2013, KOA will contain data from all eight active observatory instruments, with an anticipated volume of 28 TB. The data include raw science and calibration observations, quick-look products, weather information, and, for some instruments, reduced and calibrated products. The goal of including data from all instruments has driven a rapid expansion of the archive's holdings: data from four new instruments have already been added since October 2012, and one more active instrument, the integral field spectrograph OSIRIS, is scheduled for ingestion in December 2013. After preparation for ingestion into the archive, the data are transmitted electronically from WMKO to IPAC for curation in the physical archive. This process includes validation of the science content of the data and verification that data were not corrupted in transmission. The archived data include both newly acquired observations and all previously acquired observations. The older data extend back to the date of instrument commissioning; for some instruments, such as HIRES, these data can extend as far back as 1994. KOA will continue to ingest all newly obtained observations, at an anticipated volume of 4 TB per year, and plans to ingest data from two decommissioned instruments. Access to these data is governed by a data use policy that guarantees Principal Investigators (PIs) exclusive access to their data for at least 18 months and allows for extensions as granted by institutional Selecting Officials. Approximately one-half of the data in the archive are public. The archive architecture takes advantage of existing software and is designed for sustainability. The data preparation and quality assurance software exploits the software infrastructure at WMKO, and the physical archive at IPAC re-uses the portable, component-based architecture developed originally for the Infrared Science Archive, with custom extensions for KOA as needed. We will discuss the science services available to end users. These include web and program query interfaces, interactive tabulation of data and metadata, association of calibration files with science files, and interactive visualization of data products. We will discuss how the growth in the archive holdings has led to a growth in usage and published science results. Finally, we will discuss the future of KOA, including the provision of data reduction pipelines and interoperability with worldwide archives and data centers, including VO-compliant services.
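As one plausible form of the transmission-verification step mentioned above, an archive can compare a checksum computed at the telescope with one recomputed after transfer. The sketch below assumes MD5 and a locally recorded expected digest; KOA's actual validation procedure is not specified in this abstract.

```python
import hashlib

def verify_transfer(path: str, expected_md5: str) -> bool:
    """Recompute a file's MD5 after network transfer and compare it with
    the digest recorded before transmission, flagging corruption."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest() == expected_md5.lower()

# Hypothetical usage with a placeholder file name and digest:
# ok = verify_transfer("KOA_0001.fits", "d41d8cd98f00b204e9800998ecf8427e")
```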
NASA Astrophysics Data System (ADS)
Raugh, Anne; Henneken, Edwin
The Planetary Data System (PDS) is actively involved in designing both metadata and interfaces to make the assignment of Digital Object Identifiers (DOIs) to archival data a part of the archiving process for all data creators. These DOIs will be registered through DataCite, a non-profit organization whose members are all deeply concerned with archival research data, provenance tracking through the literature, and proper acknowledgement of the various types of efforts that contribute to the creation of an archival reference data set. Making the collection of citation metadata and its ingestion into the DataCite DOI database easy - and easy to do correctly - is in the best interests of all stakeholders: the data creators; the curators; the indexing organizations like the Astrophysics Data System (ADS); and the data users. But in order to realize the promise of DOIs, there are three key issues to address: 1) How do we incorporate the metadata collection process simply and naturally into the PDS archive creation process; 2) How do we encourage journal editors to require references to previously published data with the same rigor with which they require references to previously published research and analysis; and finally, 3) How can we change the culture of academic and research employers to recognize that the effort required to prepare a PDS archival data set is a career achievement on par with contributing to a refereed article in the professional literature. Data archives and scholarly publications are the long-term return on investment that funding agencies and the science community expect in exchange for research spending. The traceability and reproducibility ensured by the integration of DOIs and their related metadata into indexing and search services is an essential part of providing and optimizing that return.
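To make the DataCite ingestion concrete, the sketch below assembles a minimal metadata payload of the kind DataCite's registration service accepts (creators, titles, publisher, publicationYear, types). The DOI, names, and landing-page URL are placeholders, not actual PDS values.

```python
import json

# Hypothetical PDS bundle metadata; attribute names follow the DataCite
# metadata schema (creators, titles, publisher, publicationYear, types, url).
payload = {
    "data": {
        "type": "dois",
        "attributes": {
            "doi": "10.17189/example-pds-bundle",  # placeholder DOI
            "creators": [{"name": "Example Mission Science Team"}],
            "titles": [{"title": "Example Instrument Raw Data Bundle"}],
            "publisher": "NASA Planetary Data System",
            "publicationYear": 2016,
            "types": {"resourceTypeGeneral": "Dataset"},
            "url": "https://pds.nasa.gov/",  # placeholder landing page
        },
    }
}

print(json.dumps(payload, indent=2))
```

Collecting these fields as a routine part of archive preparation, rather than after the fact, is precisely what makes the registration "easy to do correctly" for data creators.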
Long-term EEJ variations by using the improved EE-index
NASA Astrophysics Data System (ADS)
Fujimoto, A.; Uozumi, T.; Abe, Sh.; Matsushita, H.; Imajo, Sh.; Ishitsuka, J. K.; Yoshikawa, A.
2016-03-01
In 2008, the International Center for Space Weather Science and Education, Kyushu University (ICSWSE) proposed the EE-index, an index for monitoring equatorial geomagnetic phenomena. The EE-index has since been improved, drawing on the development of the MAGnetic Data Acquisition System / Circum-pan Pacific Magnetometer Network (MAGDAS/CPMN) and the archive of more than 10 years of MAGDAS/CPMN data accumulated since the initial article. Using the improved EE-index, we examined the solar cycle variation of the equatorial electrojet (EEJ) through time series analysis of EUEL (one component of the EE-index) at Ancon, Peru, and of the solar activity from September 18, 1998 to March 31, 2015. We found that the long-term variation of the daily EEJ peak intensity has a trend similar to that of F10.7 (the solar activity). The power spectrum of the daily EEJ peak clearly shows two dominant peaks throughout the analysis interval: 14.5 days and 180 days (semi-annual). The solar cycle variation of the daily EEJ peak correlates well with that of F10.7 (correlation coefficient 0.99). We conclude that the daily EEJ peak intensity is roughly determined as the sum of a long-period trend driven by the solar cycle and day-to-day variations caused by various sources such as lunar tides, geometric effects, and magnetospheric and atmospheric phenomena. This work presents the primary evidence for solar cycle variations of the EEJ in a long-term study of the EE-index.
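The spectral and correlation analysis described above can be illustrated in a few lines of Python. The series below are synthetic stand-ins for the Ancon EUEL record and F10.7 (the real analysis uses the MAGDAS/CPMN archive), constructed only to show how the 14.5-day and semi-annual lines and the solar-cycle correlation would be extracted.

```python
import numpy as np

# Synthetic daily series: a ~11-year solar-cycle trend plus fortnightly
# (lunar-tide) and semi-annual lines, with noise. All amplitudes are made up.
rng = np.random.default_rng(0)
days = np.arange(6000)
solar_cycle = 80 + 40 * np.sin(2 * np.pi * days / (11 * 365.25))
eej = (0.5 * solar_cycle
       + 10 * np.sin(2 * np.pi * days / 14.5)    # fortnightly line
       + 15 * np.sin(2 * np.pi * days / 180.0)   # semi-annual line
       + rng.normal(0, 5, days.size))
f107 = solar_cycle + rng.normal(0, 2, days.size)

# Periodogram of the detrended daily-peak series (the trend is known
# exactly here because the series is synthetic); the dominant periods
# appear as maxima of |FFT|^2 near 1/14.5 and 1/180 cycles per day.
detrended = eej - 0.5 * solar_cycle
freqs = np.fft.rfftfreq(days.size, d=1.0)
power = np.abs(np.fft.rfft(detrended - detrended.mean())) ** 2
top = freqs[np.argsort(power)[-2:]]
print("dominant periods (days):", sorted(np.round(1.0 / top, 1)))

# Solar-cycle correlation: smooth both series with a 1-year running mean
# and correlate, mirroring the reported r = 0.99 against F10.7.
kernel = np.ones(365) / 365
r = np.corrcoef(np.convolve(eej, kernel, mode="valid"),
                np.convolve(f107, kernel, mode="valid"))[0, 1]
print("solar-cycle correlation:", round(r, 2))
```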
J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter
2016-02-02
This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.
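As a loose illustration of how such a parameter ensemble might be assembled (the actual data set used a Null-Space Monte Carlo analysis constrained by calibration to soil temperatures, which this sketch does not reproduce), the following Python draws an unconstrained ensemble from assumed uniform ranges; every range and name here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 1000  # ensemble size (arbitrary for this sketch)

# Hypothetical plausible ranges for the listed thermal-hydro parameters.
ranges = {
    "porosity_peat":            (0.70, 0.90),  # [-]
    "porosity_mineral":         (0.35, 0.55),  # [-]
    "thermal_cond_peat":        (0.05, 0.70),  # W/(m K)
    "thermal_cond_mineral":     (0.50, 2.50),  # W/(m K)
    "residual_saturation_peat": (0.01, 0.20),  # [-]
}

ensemble = {name: rng.uniform(lo, hi, N) for name, (lo, hi) in ranges.items()}

# Each member would parameterize one 1D column run (polygon center, rim,
# or trough) in a simulator such as ATS; here we just summarize the draws.
for name, values in ensemble.items():
    print(f"{name:>26s}: mean={values.mean():.3f} sd={values.std():.3f}")
```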
Edward Bermingham and the Archives of Clinical Surgery: America's First Surgical Journal.
Rutkow, Ira
2015-04-01
This study explores the life of Edward J. Bermingham (1853-1922) and his founding, in 1876, of the Archives of Clinical Surgery, the nation's first surgical journal. Beginning in the 1870s, American medicine found itself in the middle of a revolution marked by fundamental economic, scientific, and social transformations. For those physicians who wanted to be regarded as surgeons, the push toward specialization was foremost among these changes. The rise of surgery as a specialty was accomplished through various new initiatives; among them was the development of a dedicated literature in the form of specialty journals to disseminate news of surgical research and technical innovations in a timely fashion. The study rests on an analysis of the published medical and lay literature and unpublished documents relating to Edward J. Bermingham and the Archives of Clinical Surgery. At a time when surgery was not considered a separate branch of medicine but a mere technical mode of treatment, Bermingham's publication of the Archives of Clinical Surgery was a milestone event in the ensuing rise of surgery as a specialty within the whole of medicine. The long forgotten Archives of Clinical Surgery provides a unique window into the world of surgery as it existed when the medical revolution and the process of specialization were just beginning. For this reason, the Archives is among the more important primary resources with which to gain an understanding of prescientific surgery as it reached its endpoint in America.
NASA Astrophysics Data System (ADS)
Percy Plasencia Linares, Milton; Russi, Marino; Pesaresi, Damiano; Cravos, Claudio
2010-05-01
The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) operates the Antarctic Seismographic Argentinean Italian Network (ASAIN), made up of 7 seismic stations located in the Scotia Sea region in Antarctica and in Tierra del Fuego, Argentina; data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links provided by the Instituto Antártico Argentino (IAA). Data are collected and archived primarily in Güralp Compressed Format (GCF) through the Scream! software at OGS and IAA, and also transmitted in real time to the Observatories and Research Facilities for European Seismology (ORFEUS). The main real-time seismic data acquisition and processing system of the ASAIN network is based on the EarthWorm 7.3 (open source) software suite installed on a Linux server at the OGS headquarters in Trieste. It runs several software modules for data collection, archiving, and publication on dedicated web servers (wave_serverV, Winston Wave Server), as well as for data analysis and real-time monitoring through the Swarm program. OGS also operates, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, which uses the Antelope commercial software suite from BRTT as its main acquisition system. As a test of the global capabilities of the Antelope software suite, we also set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismographic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. The first tests indicated that more than 80% of the earthquakes with magnitude M>5.0 listed in the Preliminary Determination of Epicenters (PDE) catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also automatically and correctly detected by Antelope, with an average location error of 0.05 degrees and an average body wave magnitude (Mb) estimation error below 0.1. The average time between event origin and event determination by Antelope was about 45 minutes; compared with the 20-minute IASPEI91 P-wave travel time for a 180-degree epicentral distance plus the roughly 25-minute data latency of our test system, this indicates that Antelope is a serious candidate for regional and global early warning systems.
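The detection statistics quoted above imply a matching step between Antelope's automatic events and the PDE reference catalogue. A hedged sketch of that bookkeeping, with hypothetical events, tolerances, and field names:

```python
from dataclasses import dataclass

@dataclass
class Event:
    origin_time: float  # seconds since an arbitrary epoch
    lat: float
    lon: float
    mb: float

# Hypothetical catalogues for illustration only.
pde = [Event(0.0, -10.0, -75.0, 5.3), Event(3600.0, 35.0, 140.0, 6.1)]
auto = [Event(2.5, -10.03, -75.04, 5.25), Event(3601.0, 35.06, 139.98, 6.04)]

TIME_TOL = 30.0  # s
DIST_TOL = 1.0   # degrees (coarse flat-earth distance, adequate for a sketch)

matched, loc_err, mb_err = 0, [], []
for ref in pde:
    for det in auto:
        ddeg = ((ref.lat - det.lat) ** 2 + (ref.lon - det.lon) ** 2) ** 0.5
        if abs(ref.origin_time - det.origin_time) < TIME_TOL and ddeg < DIST_TOL:
            matched += 1
            loc_err.append(ddeg)
            mb_err.append(abs(ref.mb - det.mb))
            break

print(f"detected {matched}/{len(pde)} ({100 * matched / len(pde):.0f}%)")
print(f"mean location error: {sum(loc_err) / len(loc_err):.3f} deg")
print(f"mean |dMb|: {sum(mb_err) / len(mb_err):.2f}")
```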
48 CFR 2815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Information to support proposal analysis. All requests for field pricing support shall be made by the... Section 2815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...
48 CFR 815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Information to support proposal analysis. In evaluating start-up and other non-recurring costs... Section 815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...
48 CFR 815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Information to support proposal analysis. In evaluating start-up and other non-recurring costs... Section 815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...
48 CFR 2815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Information to support proposal analysis. All requests for field pricing support shall be made by the... Section 2815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...
48 CFR 2815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Information to support proposal analysis. All requests for field pricing support shall be made by the... Section 2815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...
48 CFR 2815.404-2 - Information to support proposal analysis.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Information to support proposal analysis. All requests for field pricing support shall be made by the... Section 2815.404-2, Federal Acquisition Regulations System, DEPARTMENT OF...