Sample records for archive observation model

  1. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Grosvenor, Sandy; Jones, Jeremy; Koratkar, Anuradha; Li, Connie; Mackey, Jennifer; Neher, Ken; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations more efficiently. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper examines the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what its successes and challenges have been.

  2. Linking Science Analysis with Observation Planning: A Full Circle Data Lifecycle

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    A clear goal of the Virtual Observatory (VO) is to enable new science through analysis of integrated astronomical archives. An additional and powerful possibility of the VO is to link and integrate these new analyses with planning of new observations. By providing tools that can be used for observation planning in the VO, the VO will allow the data lifecycle to come full circle: from theory to observations to data and back around to new theories and new observations. The Scientist's Expert Assistant (SEA) Simulation Facility (SSF) is working to combine the ability to access existing archives with the ability to model and visualize new observations. Integrating the two will allow astronomers to better use the integrated archives of the VO to plan and predict the success of potential new observations. The full circle lifecycle enabled by SEA can allow astronomers to make substantial leaps in the quality of data and science returns on new observations. Our paper will examine the exciting potential of integrating archival analysis with new observation planning, such as performing data calibration analysis on archival images and using that analysis to predict the success of new observations, or performing dynamic signal-to-noise analysis combining historical results with modeling of new instruments or targets. We will also describe how the development of the SSF is progressing and what its successes and challenges have been.

  3. Providing comprehensive and consistent access to astronomical observatory archive data: the NASA archive model

    NASA Astrophysics Data System (ADS)

    McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzarella, Joseph M.; Ebert, Rick; Pevunova, Olga; Imel, David; Berriman, Graham B.; Teplitz, Harry I.; Groom, Steve L.; Desai, Vandana R.; Landry, Walter

    2016-07-01

    Since the turn of the millennium, astronomical archives have begun providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory, built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data, and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and the needs of the community change.

  4. Providing Comprehensive and Consistent Access to Astronomical Observatory Archive Data: The NASA Archive Model

    NASA Technical Reports Server (NTRS)

    McGlynn, Thomas; Fabbiano, Giuseppina; Accomazzi, Alberto; Smale, Alan; White, Richard L.; Donaldson, Thomas; Aloisi, Alessandra; Dower, Theresa; Mazzarella, Joseph M.; Ebert, Rick; et al.

    2016-01-01

    Since the turn of the millennium, astronomical archives have begun providing data to the public through standardized protocols, unifying data from disparate physical sources and wavebands across the electromagnetic spectrum into an astronomical virtual observatory (VO). In October 2014, NASA began support for the NASA Astronomical Virtual Observatories (NAVO) program to coordinate the efforts of NASA astronomy archives in providing data to users through implementation of protocols agreed within the International Virtual Observatory Alliance (IVOA). A major goal of the NAVO collaboration has been to step back from a piecemeal implementation of IVOA standards and define what the appropriate presence for the US and NASA astronomy archives in the VO should be. This includes evaluating what optional capabilities in the standards need to be supported, the specific versions of standards that should be used, and returning feedback to the IVOA to support modifications as needed. We discuss a standard archive model developed by the NAVO for data archive presence in the virtual observatory, built upon a consistent framework of standards defined by the IVOA. Our standard model provides for discovery of resources through the VO registries, access to observation and object data, downloads of image and spectral data, and general access to archival datasets. It defines specific protocol versions, minimum capabilities, and all dependencies. The model will evolve as the capabilities of the virtual observatory and the needs of the community change.

  5. STARS 2.0: 2nd-generation open-source archiving and query software

    NASA Astrophysics Data System (ADS)

    Winegar, Tom

    2008-07-01

    The Subaru Telescope is in the process of developing an open-source alternative to the 1st-generation software and databases (STARS 1) used for archiving and query. For STARS 2, we have chosen PHP and Python for scripting and MySQL as the database software. We have collected feedback from staff and observers, and used this feedback to significantly improve the design and functionality of our future archiving and query software. Archiving - We identified two weaknesses in 1st-generation STARS archiving software: a complex and inflexible table structure, and uncoordinated system administration for our business model: taking pictures from the summit and archiving them in both Hawaii and Japan. We adopted a simplified and normalized table structure with passive keyword collection, and we are designing an archive-to-archive file transfer system that automatically reports real-time status and error conditions and permits error recovery. Query - We identified several weaknesses in 1st-generation STARS query software: inflexible query tools, poor sharing of calibration data, and no automatic file transfer mechanisms to observers. We are developing improved query tools, better sharing of calibration data, and multi-protocol unassisted file transfer mechanisms for observers. In the process, we have redefined a 'query': from an invisible search result that can be transferred only once, in-house and immediately, with little status and error reporting and no error recovery - to a stored search result that can be monitored, transferred to different locations with multiple protocols, reporting status and error conditions and permitting recovery from errors.
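
The abstract's redefinition of a 'query' as a stored, monitorable search result can be sketched as a small state machine. This is an illustrative sketch only; the class, field and protocol names below are our assumptions, not the actual STARS 2 schema.

```python
from dataclasses import dataclass, field
from enum import Enum

class QueryState(Enum):
    STORED = "stored"          # search result saved, not yet transferred
    TRANSFERRING = "transferring"
    DONE = "done"
    ERROR = "error"            # recoverable: the transfer can be retried

@dataclass
class StoredQuery:
    """A persistent search result that can be monitored and re-transferred."""
    query_id: str
    frame_ids: list
    state: QueryState = QueryState.STORED
    errors: list = field(default_factory=list)

    def transfer(self, destination: str, protocol: str = "sftp") -> None:
        # Unlike a STARS 1 result (one immediate in-house transfer, little
        # status reporting), a stored query can be sent to different
        # destinations over multiple protocols, reporting its state and
        # permitting retry after errors.
        self.state = QueryState.TRANSFERRING
        try:
            # ... perform the actual file transfer to `destination` here ...
            self.state = QueryState.DONE
        except OSError as exc:
            self.errors.append(str(exc))
            self.state = QueryState.ERROR

q = StoredQuery("q-0001", ["SUPA0001", "SUPA0002"])
q.transfer("observer-host:/data", protocol="http")
```

The key design point is that the query outlives any single transfer attempt, so monitoring and error recovery become possible.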

  6. Satellite Remote Sensing is Key to Water Cycle Integrator

    NASA Astrophysics Data System (ADS)

    Koike, T.

    2016-12-01

    To promote effective multi-sectoral, interdisciplinary collaboration based on coordinated and integrated efforts, the Global Earth Observation System of Systems (GEOSS) is now developing a "GEOSS Water Cycle Integrator (WCI)", which integrates "Earth observations", "modeling", "data and information", "management systems" and "education systems". GEOSS/WCI sets up "work benches" by which partners can share data, information and applications in an interoperable way, exchange knowledge and experiences, deepen mutual understanding and work together effectively to ultimately respond to issues of both mitigation and adaptation. (A work bench is a virtual geographical or phenomenological space where experts and managers collaborate to use information to address a problem within that space.) GEOSS/WCI enhances the coordination of efforts to strengthen individual, institutional and infrastructure capacities, especially for effective interdisciplinary coordination and integration. GEOSS/WCI archives various satellite data to provide hydrological information such as cloud, rainfall, soil moisture, and land-surface snow. These satellite products were validated against in-situ land observations. Water cycle models can be developed by coupling in-situ and satellite data. River flows and other hydrological parameters can be simulated and validated against in-situ data. Model outputs from weather-prediction, seasonal-prediction, and climate-prediction models are archived. Some of these outputs are archived online, while others, e.g., from climate-prediction models, are archived offline. After the models are evaluated and their biases corrected, the outputs can be used as inputs to the hydrological models for predicting hydrological parameters. Additionally, we have already developed a data-assimilation system combining satellite data with the models. This system improves our capability to predict hydrological phenomena.
The WCI can provide better predictions of the hydrological parameters for integrated water resources management (IWRM) and also assess the impact of climate change and calculate adaptation needs.

  7. ModelArchiver—A program for facilitating the creation of groundwater model archives

    USGS Publications Warehouse

    Winston, Richard B.

    2018-03-01

    ModelArchiver is a program designed to facilitate the creation of groundwater model archives that meet the requirements of the U.S. Geological Survey (USGS) policy (Office of Groundwater Technical Memorandum 2016.02, https://water.usgs.gov/admin/memo/GW/gw2016.02.pdf, https://water.usgs.gov/ogw/policy/gw-model/). ModelArchiver version 1.0 leads the user step-by-step through the process of creating a USGS groundwater model archive. The user specifies the contents of each of the subdirectories within the archive and provides descriptions of the archive contents. Descriptions of some files can be specified automatically using file extensions. Descriptions also can be specified individually. Those descriptions are added to a readme.txt file provided by the user. ModelArchiver moves the content of the archive to the archive folder and compresses some folders into .zip files. As part of the archive, the modeler must create a metadata file describing the archive. The program has a built-in metadata editor and provides links to websites that can aid in creation of the metadata. The built-in metadata editor is also available as a stand-alone program named FgdcMetaEditor version 1.0, which also is described in this report. ModelArchiver updates the metadata file provided by the user with descriptions of the files in the archive. An optional archive list file generated automatically by ModelMuse can streamline the creation of archives by identifying input files, output files, model programs, and ancillary files for inclusion in the archive.
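
The extension-based description mechanism described above can be sketched as follows. This is a hedged illustration, not ModelArchiver's code: the extension-to-description mapping, file names and override are hypothetical.

```python
import os

# Illustrative sketch: assign readme.txt descriptions automatically by file
# extension, falling back to a per-file override when one is specified
# individually. The mapping below is a hypothetical example.
EXTENSION_DESCRIPTIONS = {
    ".nam": "MODFLOW name file",
    ".dis": "MODFLOW discretization file",
    ".hds": "Binary head output",
}

def describe(filename, overrides=None):
    """Return a readme.txt description line for one archived file."""
    overrides = overrides or {}
    if filename in overrides:          # individually specified description
        return overrides[filename]
    _, ext = os.path.splitext(filename)
    return EXTENSION_DESCRIPTIONS.get(ext.lower(), "(no description)")

# Description lines destined for the user-provided readme.txt:
lines = [f"{name}: {describe(name, {'model.hds': 'Simulated heads'})}"
         for name in ["model.nam", "model.dis", "model.hds"]]
```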

  8. The Paleoclimate Uncertainty Cascade: Tracking Proxy Errors Via Proxy System Models.

    NASA Astrophysics Data System (ADS)

    Emile-Geay, J.; Dee, S. G.; Evans, M. N.; Adkins, J. F.

    2014-12-01

    Paleoclimatic observations are, by nature, imperfect recorders of climate variables. Empirical approaches to their calibration are challenged by the presence of multiple sources of uncertainty, which may confound the interpretation of signals and the identifiability of the noise. In this talk, I will demonstrate the utility of proxy system models (PSMs; Evans et al., 2013, 10.1016/j.quascirev.2013.05.024) to quantify the impact of all known sources of uncertainty. PSMs explicitly encode the mechanistic knowledge of the physical, chemical, biological and geological processes from which paleoclimatic observations arise. PSMs may be divided into sensor, archive and observation components, all of which may conspire to obscure climate signals in actual paleo-observations. As an example, we couple a PSM for the δ18O of speleothem calcite to an isotope-enabled climate model (Dee et al, submitted) to analyze the potential of this measurement as a proxy for precipitation amount. A simple soil/karst model (Partin et al., 2013, 10.1130/G34718.1) is used as the sensor model, while a hiatus-permitting chronological model (Haslett & Parnell, 2008, 10.1111/j.1467-9876.2008.00623.x) is used as part of the observation model. This subdivision allows us to explicitly model the transformation from precipitation amount to speleothem calcite δ18O as a multi-stage process via a physical and chemical sensor model and a stochastic archive model. By illustrating the PSM's behavior within the context of the climate simulations, we show how estimates of climate variability may be affected by each submodel's transformation of the signal. By specifying idealized climate signals (periodic vs. episodic, slow vs. fast) to the PSM, we investigate how frequency and amplitude patterns are modulated by the sensor and archive submodels.
To the extent that the PSM and the climate models are representative of real world processes, then the results may help us more accurately interpret existing paleodata, characterize their uncertainties, and design sampling strategies that exploit their strengths while mitigating their weaknesses.
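
The sensor / archive / observation decomposition above can be illustrated with a toy pseudoproxy pipeline. Everything here is an idealized assumption for illustration (a linear "amount effect" sensor, moving-average karst mixing as the archive, Gaussian analytical noise as the observation); these are stand-ins, not the models cited in the abstract.

```python
import math
import random

def sensor_model(precip_mm):
    """Sensor: map precipitation amount to calcite d18O (permil) via a
    hypothetical linear amount effect; the slope is illustrative only."""
    return -8.0 - 0.01 * precip_mm

def archive_model(series, smoothing=3):
    """Archive: karst storage mixes successive values (a crude trailing
    moving average standing in for a soil/karst transfer function)."""
    out = []
    for i in range(len(series)):
        window = series[max(0, i - smoothing + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def observation_model(series, noise_sd=0.1, seed=42):
    """Observation: add analytical noise (chronological errors omitted)."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, noise_sd) for x in series]

# An idealized periodic climate signal, pushed through each submodel in turn:
precip = [100 + 50 * math.sin(2 * math.pi * t / 10) for t in range(40)]
pseudoproxy = observation_model(archive_model([sensor_model(p) for p in precip]))
```

Comparing amplitudes before and after the archive stage shows the kind of signal modulation each submodel introduces.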

  9. Dynamic Data Management Based on Archival Process Integration at the Centre for Environmental Data Archival

    NASA Astrophysics Data System (ADS)

    Conway, Esther; Waterfall, Alison; Pepler, Sam; Newey, Charles

    2015-04-01

    In this paper we describe a business process modelling approach to the integration of existing archival activities. We provide a high-level overview of existing practice and discuss how procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. The main types of archival processes considered are:
    • Management processes that govern the operation of an archive, including archival governance (preservation state management, selection of archival candidates and strategic management).
    • Operational processes that constitute the core activities of the archive and maintain the value of research assets: acquisition, ingestion, deletion, generation of metadata and preservation activities.
    • Supporting processes, which include planning, risk analysis and monitoring of the community/preservation environment.
    We then describe the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected the British Atmospheric Data Centre and National Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 PB of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We conclude by presenting a dynamic data management information model that is capable of describing the preservation state of archival holdings throughout the data lifecycle.
    We discuss the following core model entities and their relationships:
    • Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives.
    • Risk entities, which act as drivers for change within the data lifecycle: Acquisitional Risks, Technical Risks, Strategic Risks and External Risks.
    • Plan entities, which detail the actions to bring about change within an archive: Acquisition Plans, Preservation Plans and Monitoring Plans.
    • Result entities, which describe the successful outcomes of the executed plans: Acquisitions, Mitigations and Accepted Risks.
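
One minimal way to encode the entity relationships named in the abstract (risks drive plans, plans produce results) is sketched below; the class and field names are our own assumptions, not the CEDA information model's schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PreservationObjective:    # aspirational entity
    description: str

@dataclass
class Risk:                     # driver for change within the data lifecycle
    category: str               # "acquisitional" | "technical" | "strategic" | "external"
    description: str

@dataclass
class Plan:                     # action to bring about change in the archive
    kind: str                   # "acquisition" | "preservation" | "monitoring"
    mitigates: List[Risk]

@dataclass
class Result:                   # successful outcome of an executed plan
    outcome: str                # "acquisition" | "mitigation" | "accepted risk"
    plan: Plan

# A format-based risk, as surfaced by an archival format audit, drives a
# preservation plan whose execution yields a mitigation result:
format_risk = Risk("technical", "obsolete file format in archival holdings")
plan = Plan("preservation", mitigates=[format_risk])
result = Result("mitigation", plan)
```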

  10. "Small" data in a big data world: archiving terrestrial ecology data at ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Boyer, A.; Deb, D.; Hook, L.; Shrestha, R.; Thornton, M.; Virdi, M.; Wei, Y.; Wright, D.

    2016-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, http://daac.ornl.gov), a NASA-funded data center, archives a diverse collection of terrestrial biogeochemistry and ecological dynamics observations and models in support of NASA's Earth Science program. The ORNL DAAC has been addressing the increasing challenge of publishing diverse small data products into an online archive while dealing with the enhanced need for integration and availability of these data to address big science questions. This paper will show examples of "small" diverse data holdings, ranging from the Daymet model output data to site-based soil moisture observation data; we define "small" by the data volume of these products compared to petabyte-scale observations. We will highlight the use of tools and services for visualizing diverse data holdings, and subsetting services such as the MODIS land products subsets tool (at ORNL DAAC) that provides big MODIS data in small chunks. Digital Object Identifiers (DOIs) and data citations have enhanced the availability of data. The challenge data publishers now face is the increased number of publishable data products and, most importantly, the difficulty of publishing small diverse data products into an online archive. This paper will also present our experiences designing a data curation system for these types of data. The characteristics of these data will be examined and their scientific value demonstrated via data citation metrics. We will present case studies of leveraging specialized tools and services that have enabled small data sets to realize their "big" scientific potential. Overall, we will provide a holistic view of the challenges and potential of small diverse terrestrial ecology data sets, from data curation to distribution.

  11. Status of the TESS Science Processing Operations Center

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Twicken, Joseph D.; Campbell, Jennifer; Tenenbaum, Peter; Sanderfer, Dwight; Davies, Misty D.; Smith, Jeffrey C.; Morris, Rob; Mansouri-Samani, Masoud; Girouard, Forrest; et al.

    2017-01-01

    The Transiting Exoplanet Survey Satellite (TESS) science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler Mission science pipeline. Like the Kepler pipeline, the TESS science pipeline will provide calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars, observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline will search through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline will generate a suite of diagnostic metrics for each transit-like signature discovered, and extract planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search will be modeled on the Kepler transit search products (tabulated numerical results, time series products, and pdf reports) all of which will be archived to MAST.

  12. Earth observation archive activities at DRA Farnborough

    NASA Technical Reports Server (NTRS)

    Palmer, M. D.; Williams, J. M.

    1993-01-01

    Space Sector, Defence Research Agency (DRA), Farnborough have been actively involved in the acquisition and processing of Earth Observation data for over 15 years. During that time an archive of over 20,000 items has been built up. This paper describes the major archive activities, including: operation and maintenance of the main DRA Archive, the development of a prototype Optical Disc Archive System (ODAS), the catalog systems in use at DRA, the UK Processing and Archive Facility for ERS-1 data, and future plans for archiving activities.

  13. NASA Langley Atmospheric Science Data Center (ASDC) Experience with Aircraft Data

    NASA Astrophysics Data System (ADS)

    Perez, J.; Sorlie, S.; Parker, L.; Mason, K. L.; Rinsland, P.; Kusterer, J.

    2011-12-01

    Over the past decade the NASA Langley ASDC has archived and distributed a variety of aircraft mission data sets. These datasets posed unique challenges for archiving, ranging from the rigidity of the archiving system and formats to the lack of metadata. The ASDC developed a state-of-the-art data archive and distribution system to serve the atmospheric sciences data provider and researcher communities. The system, called Archive - Next Generation (ANGe), is designed with a distributed, multi-tier, service-based, message-oriented architecture enabling new methods for searching, accessing, and customizing data. The ANGe system provides the ease and flexibility to ingest and archive aircraft data through an ad hoc workflow or to develop a new workflow to suit the provider's needs. The ASDC will describe the challenges encountered in preparing aircraft data for archiving and distribution. The ASDC is currently providing guidance to the DISCOVER-AQ (Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality) Earth Venture-1 project on developing collection, granule, and browse metadata, as well as supporting the ADAM (Airborne Data For Assessing Models) site.

  14. Europlanet/IDIS: Combining Diverse Planetary Observations and Models

    NASA Astrophysics Data System (ADS)

    Schmidt, Walter; Capria, Maria Teresa; Chanteur, Gerard

    2013-04-01

    Planetary research involves a diversity of research fields, from astrophysics and plasma physics to atmospheric physics, climatology, spectroscopy and surface imaging. Data from all these disciplines are collected from various space-borne platforms or telescopes, supported by modelling teams and laboratory work. In order to interpret one set of data, supporting data from different disciplines and other missions are often needed, while the scientist does not always have the detailed expertise to access and utilize these observations. The Integrated and Distributed Information System (IDIS) [1], developed in the framework of the Europlanet-RI project, implements a Virtual Observatory approach ([2] and [3]), where different data sets, stored in archives around the world and in different formats, are accessed, re-formatted and combined to meet the user's requirements without the need of familiarizing oneself with the different technical details. While observational astrophysical data from different observatories could already be accessed earlier via Virtual Observatories, this concept is now extended to diverse planetary data and related model data sets, spectral data bases, etc. A dedicated XML-based Europlanet Data Model (EPN-DM) [4] was developed, based on data models from the planetary science community and the Virtual Observatory approach. A dedicated editor simplifies the registration of new resources. As the EPN-DM is a super-set of existing data models, existing archives as well as new spectroscopic or chemical data bases for the interpretation of atmospheric or surface observations, or even modeling facilities at research institutes in Europe or Russia, can be easily integrated and accessed via a Table Access Protocol (EPN-TAP) [5] adapted from the corresponding protocol of the International Virtual Observatory Alliance (IVOA-TAP) [6].
EPN-TAP allows users to search catalogues, retrieve data and make them available through standard IVOA tools if the access to the archive is compatible with IVOA standards. For some major data archives with different standards, adaptation tools are available to make the access transparent to the user. EuroPlaNet-IDIS has contributed to the definition of PDAP, the Planetary Data Access Protocol of the International Planetary Data Alliance (IPDA) [7], to access the major planetary data archives of NASA in the USA [8], ESA in Europe [9] and JAXA in Japan [10]. Acknowledgement: Europlanet-RI was funded by the European Commission under the 7th Framework Programme, grant 228319 "Capacities Specific Programme" - Research Infrastructures Action. References: [1] Details on IDIS and Europlanet-RI via the web site: http://www.idis.europlanet-ri.eu/ [2] Demonstrator implementation for Plasma-VO AMDA: http://cdpp-amda.cesr.fr/DDHTML/index.html [3] Demonstrator implementation for the IDIS-VO: http://www.idis-dyn.europlanet-ri.eu/vodev.shtml [4] Europlanet Data Model EPN-DM: http://www.europlanet-idis.fi/documents/public_documents/EPN-DM-v2.0.pdf [5] Europlanet Table Access Protocol EPN-TAP: http://www.europlanet-idis.fi/documents/public_documents/EPN-TAPV_0.26.pdf [6] International Virtual Observatory Alliance IVOA: http://www.ivoa.net [7] International Planetary Data Alliance IPDA: http://planetarydata.org/ [8] NASA's Planetary Data System: http://pds.jpl.nasa.gov/ [9] ESA's Planetary Science Archive PSA: http://www.sciops.esa.int/index.php?project=PSA [10] JAXA's Data Archive and Transmission System DARTS: http://darts.isas.jaxa.jp/
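
As a concrete illustration of the table-access pattern, a synchronous TAP request can be formed as below. The service URL and schema name are placeholders, not a real endpoint; REQUEST, LANG and QUERY are the standard TAP query parameters, and granule_uid, target_name and dataproduct_type are EPNCore columns.

```python
from urllib.parse import urlencode

# Sketch of a synchronous TAP query of the kind used by EPN-TAP services.
# The service URL and table name below are placeholder assumptions.
service = "https://example.org/tap/sync"
adql = ("SELECT granule_uid, access_url FROM example.epn_core "
        "WHERE target_name = 'Mars' AND dataproduct_type = 'sp'")
params = urlencode({"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql})
request_url = f"{service}?{params}"
# urllib.request.urlopen(request_url) would return a VOTable of matching
# granules; standard IVOA tools consume the same response format.
```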

  15. Current status of the international Halley Watch infrared net archive

    NASA Technical Reports Server (NTRS)

    Mcguinness, Brian B.

    1988-01-01

    The primary purposes of the International Halley Watch (IHW) have been to promote Halley observations, coordinate and standardize the observing where useful, and to archive the results in a database readily accessible to cometary scientists. The intention of the IHW is to store the observations themselves, along with any information necessary to allow users to understand and use the data, but to exclude interpretations of these data. Each of the archives produced by the IHW will appear in two versions: a printed archive and a digital archive on CD-ROMs. The archive is expected to have a very long lifetime. The IHW has already produced an archive for P/Crommelin, consisting of one printed volume and two 1600 bpi tapes. The Halley archive will contain at least twenty gigabytes of information.

  16. F-CHROMA. Flare Chromospheres: Observations, Models and Archives

    NASA Astrophysics Data System (ADS)

    Cauzzi, Gianna; Fletcher, Lyndsay; Mathioudakis, Mihalis; Carlsson, Mats; Heinzel, Petr; Berlicki, Arek; Zuccarello, Francesca

    2014-06-01

    F-CHROMA is a collaborative project newly funded under the EU Framework Programme 7 "FP7-SPACE-2013-1", involving seven European research institutes and universities. The goal of F-CHROMA is to substantially advance our understanding of the physics of energy dissipation and radiation in the flaring solar atmosphere, with a particular focus on the flares' chromosphere. A major outcome of the F-CHROMA project will be the creation of an archive of chromospheric flare observations and models, to be made available to the community for further research. In this poster we describe the structure and milestones of the project, the different activities planned, as well as early results. Emphasis will be given to the project's dissemination efforts to make the results of these activities available to and usable by the community.

  17. The Planetary Archive

    NASA Astrophysics Data System (ADS)

    Penteado, Paulo F.; Trilling, David; Szalay, Alexander; Budavári, Tamás; Fuentes, César

    2014-11-01

    We are building the first system that will allow efficient data mining in the astronomical archives for observations of Solar System bodies. While the Virtual Observatory has enabled data-intensive research making use of large collections of observations across multiple archives, Planetary Science has largely been denied this opportunity: most astronomical data services are built based on sky positions, and moving objects are often filtered out. To identify serendipitous observations of Solar System objects, we ingest the archive metadata. The coverage of each image in an archive is a volume in a 3D space (RA, Dec, time), which we can represent efficiently through a hierarchical triangular mesh (HTM) for the spatial dimensions, plus a contiguous time interval. In this space, an asteroid occupies a curve, which we determine by integrating its orbit into the past. Thus when an asteroid trajectory intercepts the volume of an archived image, we have a possible observation of that body. Our pipeline then looks in the archive's catalog for a source with the corresponding coordinates, to retrieve its photometry. All these matches are stored in a database, which can be queried by object identifier. This database consists of archived observations of known Solar System objects. This means that it grows not only from the ingestion of new images, but also from the growth in the number of known objects. As new bodies are discovered, our pipeline can find archived observations where they could have been recorded, providing colors for these newly-found objects. This growth becomes more relevant with the new generation of wide-field surveys, particularly LSST. We also present one use case of our prototype archive: after ingesting the metadata for SDSS, 2MASS and GALEX, we were able to identify serendipitous observations of Solar System bodies in these 3 archives.
Cross-matching these occurrences provided us with colors from the UV to the IR, a much wider spectral range than that commonly used for asteroid taxonomy. We present here archive-derived spectrophotometry from searching for 440 thousand asteroids, from 0.3 to 3 µm. In the future we will expand to other archives, including HST, Spitzer, WISE and Pan-STARRS.
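The spatio-temporal matching described in the abstract can be sketched in a few lines. This is a simplified stand-in, not the authors' implementation: a real system indexes image footprints with an HTM for the spatial dimensions, while here each image is reduced to a bounding box in (RA, Dec) plus a time interval, and the asteroid's orbit integration is represented by pre-computed (time, RA, Dec) samples.

```python
from dataclasses import dataclass

@dataclass
class ImageFootprint:
    """Simplified archive image footprint: a box in (RA, Dec) in degrees
    plus a contiguous time interval (e.g. MJD)."""
    image_id: str
    ra_min: float
    ra_max: float
    dec_min: float
    dec_max: float
    t_start: float
    t_end: float

    def contains(self, t: float, ra: float, dec: float) -> bool:
        return (self.t_start <= t <= self.t_end
                and self.ra_min <= ra <= self.ra_max
                and self.dec_min <= dec <= self.dec_max)

def match_trajectory(trajectory, footprints):
    """trajectory: iterable of (t, ra, dec) samples from integrating the orbit.
    Returns the set of image_ids whose spatio-temporal volume the curve crosses,
    i.e. possible serendipitous observations of the body."""
    hits = set()
    for t, ra, dec in trajectory:
        for fp in footprints:
            if fp.contains(t, ra, dec):
                hits.add(fp.image_id)
    return hits
```

Each hit would then be cross-checked against the archive's source catalog at those coordinates to retrieve photometry.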

  18. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark A.

    2017-01-01

NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly.
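One way the subsetting challenge plays out: with data in web object storage there is no file seek, so a service must translate an array subset into an HTTP Range request. The sketch below computes the byte range for a contiguous run of rows of a flat, row-major binary array; the layout (fixed-size header, no chunking or compression) is an assumption for illustration, not how any particular NASA product is stored.

```python
def byte_range_for_rows(row_start: int, row_end: int, row_len: int,
                        itemsize: int, header_bytes: int = 0):
    """Inclusive byte range covering rows [row_start, row_end) of a 2-D array
    stored row-major as a flat object, suitable for an HTTP Range request
    against web object storage instead of a local file seek."""
    first = header_bytes + row_start * row_len * itemsize
    last = header_bytes + row_end * row_len * itemsize - 1
    return first, last
```

The returned pair maps directly onto a request header such as `Range: bytes=80-159`, so only the subset (not the whole object) crosses the network.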

  19. The ISO Data Archive and Interoperability with Other Archives

    NASA Astrophysics Data System (ADS)

    Salama, Alberto; Arviset, Christophe; Hernández, José; Dowson, John; Osuna, Pedro

ESA's Infrared Space Observatory (ISO), an unprecedented observatory for infrared astronomy launched in November 1995, successfully made nearly 30,000 scientific observations in its 2.5-year mission. The ISO data can be retrieved from the ISO Data Archive (IDA), which comprises about 150,000 observations, including parallel and serendipity mode observations. A user-friendly Java interface permits queries to the database and data retrieval. The interface currently offers a wide variety of links to other archives, such as name resolution with NED and SIMBAD, access to electronic articles from ADS and CDS/VizieR, and access to IRAS data. In the past year, development has focused on improving the IDA's interoperability with other astronomical archives, either by accessing other relevant archives or by providing direct access to the ISO data for external services. A mechanism of information transfer has been developed, allowing direct queries to the IDA via a Java Server Page that returns quick-look ISO images and relevant, observation-specific information embedded in an HTML page. This method has been used to link from the CDS/VizieR Data Centre and ADS, and work with IPAC to allow access to the ISO Archive from IRSA, including display capabilities of the observed sky regions onto other mission images, is in progress. Prospects for further links to and from other archives and databases are also addressed.

  20. The HIPPO Project Archive: Carbon Cycle and Greenhouse Gas Data

    NASA Astrophysics Data System (ADS)

    Christensen, S. W.; Aquino, J.; Hook, L.; Williams, S. F.

    2012-12-01

    The HIAPER (NSF/NCAR Gulfstream V Aircraft) Pole-to-Pole Observations (HIPPO) project measured a comprehensive suite of atmospheric trace gases and aerosols pertinent to understanding the global carbon cycle from the surface to the tropopause and approximately pole-to-pole over the Pacific Ocean. Flights took place over five missions during different seasons from 2009 to 2011. Data and documentation are available to the public from two archives: (1) NCAR's Earth Observing Laboratory (EOL) provides complete aircraft and flight operational data, and (2) the U.S. DOE's Carbon Dioxide Information Analysis Center (CDIAC) provides integrated measurement data products. The integrated products are more generally useful for secondary analyses. Data processing is nearing completion, although improvements to the data will continue to evolve and analyses will continue many years into the future. Periodic new releases of integrated measurement (merged) products will be generated by EOL when individual measurement data have been updated as directed by the Lead Principal Investigator. The EOL and CDIAC archives will share documentation and supplemental links and will ensure that the latest versions of data products are available to users of both archives. The EOL archive (http://www.eol.ucar.edu/projects/hippo/) provides the underlying investigator-provided data, including supporting data sets (e.g. operational satellite, model output, global observations, etc.), and ancillary flight operational information including field catalogs, data quality reports, software, documentation, publications, photos/imagery, and other detailed information about the HIPPO missions. The CDIAC archive provides integrated measurement data products, user documentation, and metadata through the HIPPO website (http://hippo.ornl.gov). 
These merged products were derived by consistently combining the aircraft state parameters for position, time, temperature, pressure, and wind speed with meteorological, atmospheric chemistry and aerosol measurements made by several teams of investigators. Files are in ASCII text format. Selected data products have been loaded into a relational database for customized data subsetting and export formatting. We anticipate adding model-generated products to the archive. Metadata records have been compiled into a searchable CDIAC index and have been submitted to climate change research metadata clearinghouses (e.g., GCMD). Each data product is given a complete bibliographic citation and a persistent identifier (DOI) to facilitate attribution and access. A data policy was adopted that balances the needs of the project investigators with the interests of the scientific user community.

  1. Mars Observer data production, transfer, and archival: The data production assembly line

    NASA Technical Reports Server (NTRS)

    Childs, David B.

    1993-01-01

    This paper describes the data production, transfer, and archival process designed for the Mars Observer Flight Project. It addresses the developmental and operational aspects of the archive collection production process. The developmental aspects cover the design and packaging of data products for archival and distribution to the planetary community. Also discussed is the design and development of a data transfer and volume production process capable of handling the large throughput and complexity of the Mars Observer data products. The operational aspects cover the main functions of the process: creating data and engineering products, collecting the data products and ancillary products in a central repository, producing archive volumes, validating volumes, archiving, and distributing the data to the planetary community.

  2. The Planetary Data System Information Model for Geometry Metadata

    NASA Astrophysics Data System (ADS)

    Guinness, E. A.; Gordon, M. K.

    2014-12-01

The NASA Planetary Data System (PDS) has recently developed a new set of archiving standards based on a rigorously defined information model. An important part of the new PDS information model is the model for geometry metadata, which includes, for example, attributes of the lighting and viewing angles of observations, position and velocity vectors of a spacecraft relative to the Sun and observing body at the time of observation, and the location and orientation of an observation on the target. The PDS geometry model is based on requirements gathered from the planetary research community, data producers, and software engineers who build search tools. A key requirement for the model is that it fully supports the breadth of PDS archives that include a wide range of data types from missions and instruments observing many types of solar system bodies such as planets, ring systems, and smaller bodies (moons, comets, and asteroids). Thus, important design aspects of the geometry model are that it standardizes the definition of the geometry attributes and provides consistency of geometry metadata across planetary science disciplines. The model specification also includes parameters so that the context of values can be unambiguously interpreted. For example, the reference frame used for specifying geographic locations on a planetary body is explicitly included with the other geometry metadata parameters. The structure and content of the new PDS geometry model are designed to enable both science analysis and efficient development of search tools. The geometry model is implemented in XML, as is the main PDS information model, and uses XML schema for validation. The initial version of the geometry model is focused on geometry for remote sensing observations conducted by flyby and orbiting spacecraft. Future releases of the PDS geometry model will be expanded to include metadata for landed and rover spacecraft.
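Since the model is implemented in XML, a geometry fragment can be assembled programmatically. The element names below are illustrative only (the real PDS4 schema defines its own vocabulary and namespaces); the point being demonstrated is that context parameters such as the reference frame travel alongside the angle values so they can be interpreted unambiguously.

```python
import xml.etree.ElementTree as ET

def make_geometry_label(incidence_deg, emission_deg, phase_deg, reference_frame):
    """Build a toy geometry-metadata fragment. Element names are invented
    for illustration and are not the actual PDS4 vocabulary."""
    geom = ET.Element("Geometry")
    illum = ET.SubElement(geom, "Illumination_Angles")
    ET.SubElement(illum, "incidence_angle", unit="deg").text = str(incidence_deg)
    ET.SubElement(illum, "emission_angle", unit="deg").text = str(emission_deg)
    ET.SubElement(illum, "phase_angle", unit="deg").text = str(phase_deg)
    # The frame is carried with the values, so a consumer never has to guess
    # which coordinate system the geometry quantities refer to.
    ET.SubElement(geom, "reference_frame").text = reference_frame
    return ET.tostring(geom, encoding="unicode")
```

In the real system such fragments would additionally be validated against the published XML schema.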

  3. L'archivage a long terme de la maquette numerique trois-dimensionnelle annotee

    NASA Astrophysics Data System (ADS)

    Kheddouci, Fawzi

The use of engineering drawings in the development of mechanical products, including the exchange of engineering data as well as for archiving, is common industry practice. Traditionally, paper was the means of meeting those needs. However, these practices have evolved in favour of computerized tools and methods for the creation, diffusion and preservation of data involved in the process of developing aeronautical products characterized by life cycles that can exceed 70 years. Therefore, it is necessary to redefine how to maintain this data in a context whereby engineering drawings are being replaced by the 3D annotated digital mock-up. This thesis addresses the issue of long-term archiving of 3D annotated digital mock-ups, which include geometric and dimensional tolerances, as well as other notes and specifications, in compliance with the requirements formulated by the aviation industry, including regulatory and legal requirements. First, we review the requirements imposed by the aviation industry in the context of long-term archiving of 3D annotated digital mock-ups. We then consider alternative solutions. We begin by identifying the theoretical approach behind the choice of a conceptual model for digital long-term archiving. Then we evaluate, among the proposed alternatives, an archiving format that will guarantee the preservation of the integrity of the 3D annotated model (geometry, tolerances and other metadata) and its sustainability. The evaluation of 3D PDF PRC as a potential archiving format is carried out on a sample of 185 3D CATIA V5 models (parts and assemblies) provided by industrial partners. This evaluation is guided by a set of criteria including the transfer of geometry, 3D annotations, views, captures and part positioning in assemblies. The results indicate that the exact geometry is successfully maintained when transferring CATIA V5 models to 3D PDF PRC. 
Concerning the transfer of 3D annotations, we observed degradation associated with their display on the 3D model. This problem can, however, be solved by converting the native model to STEP first, and then to 3D PDF PRC. Given current tools, 3D PDF PRC is considered a potential solution for long-term archiving of 3D annotated models of individual parts. However, this solution is not currently deemed adequate for archiving assemblies. The practice of 2D drawing will thus remain, in the short term, for assemblies.

  4. A comprehensive cost model for NASA data archiving

    NASA Technical Reports Server (NTRS)

    Green, J. L.; Klenk, K. F.; Treinish, L. A.

    1990-01-01

    A simple archive cost model has been developed to help predict NASA's archiving costs. The model covers data management activities from the beginning of the mission through launch, acquisition, and support of retrospective users by the long-term archive; it is capable of determining the life cycle costs for archived data depending on how the data need to be managed to meet user requirements. The model, which currently contains 48 equations with a menu-driven user interface, is available for use on an IBM PC or AT.
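The abstract does not reproduce the model's 48 equations, but the flavor of a life-cycle archive cost model can be sketched with a toy version. The cost drivers (setup, ingest, storage, distribution to retrospective users) follow the lifecycle described above, while the functional form and all coefficients here are invented for illustration.

```python
def archive_lifecycle_cost(volume_gb: float, years: float, requests_per_year: float,
                           ingest_per_gb: float = 10.0,
                           storage_per_gb_year: float = 1.0,
                           cost_per_request: float = 5.0,
                           fixed_setup: float = 50000.0) -> float:
    """Toy life-cycle cost of archiving a data set: one-time setup plus
    volume-driven ingest, duration-driven storage, and user-driven
    distribution costs. All coefficients are hypothetical."""
    ingest = ingest_per_gb * volume_gb
    storage = storage_per_gb_year * volume_gb * years
    distribution = cost_per_request * requests_per_year * years
    return fixed_setup + ingest + storage + distribution
```

Varying the user-requirement inputs (retention period, expected request load) is what lets such a model compare management strategies for the same data.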

  5. Between Oais and Agile a Dynamic Data Management Approach

    NASA Astrophysics Data System (ADS)

    Bennett, V. L.; Conway, E. A.; Waterfall, A. M.; Pepler, S.

    2015-12-01

In this paper we describe an approach to the integration of existing archival activities that lies between compliance with the more rigid OAIS/TRAC standards and a more flexible "Agile" approach to the curation and preservation of Earth Observation data. We provide a high-level overview of existing practice and discuss how these procedures can be extended and supported through the description of preservation state, the aim being to facilitate the dynamic, controlled management of scientific data through its lifecycle. While processes are considered, they are not statically defined but rather driven by human interactions, in the form of risk management/review procedures that produce actionable plans responsive to change. We then describe the feasibility testing of extended risk management and planning procedures which integrate current practices. This was done through the CEDA Archival Format Audit, which inspected British Atmospheric Data Centre and NERC Earth Observation Data Centre archival holdings. These holdings are extensive, comprising around 2 petabytes of data and 137 million individual files, which were analysed and characterised in terms of format-based risk. We are then able to present an overview of the format-based risk burden faced by a large-scale archive attempting to maintain the usability of heterogeneous environmental data sets. We continue by presenting a dynamic data management information model and discuss the following core model entities and their relationships: Aspirational entities, which include Data Entity definitions and their associated Preservation Objectives; Risk entities, which act as drivers for change within the data lifecycle and include Acquisitional Risks, Technical Risks, Strategic Risks and External Risks; and Plan entities, which detail the actions to bring about change within an archive. 
These include Acquisition Plans, Preservation Plans and Monitoring Plans, which support responsive interactions with the community. The Result entities describe the outcomes of the plans, including Acquisitions, Mitigations and Accepted Risks, with risk acceptance permitting imperfect but functional solutions that can realistically be supported within an archive's resource levels.
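The entity relationships described above can be sketched as plain data classes. The class names follow the abstract's model (aspirational, risk, plan, result entities), while the fields and types are assumptions for illustration, not the project's actual schema.

```python
from dataclasses import dataclass
from enum import Enum

class RiskKind(Enum):
    """Risk categories named in the model."""
    ACQUISITIONAL = "acquisitional"
    TECHNICAL = "technical"
    STRATEGIC = "strategic"
    EXTERNAL = "external"

@dataclass
class PreservationObjective:          # aspirational entity
    description: str

@dataclass
class DataEntity:                     # aspirational entity
    name: str
    objectives: list                  # list of PreservationObjective

@dataclass
class Risk:                           # driver for change in the lifecycle
    kind: RiskKind
    description: str

@dataclass
class Plan:                           # action to bring about change
    addresses: Risk
    actions: list                     # e.g. acquisition/preservation/monitoring steps

@dataclass
class Result:                         # outcome of a plan
    plan: Plan
    outcome: str                      # e.g. "acquired", "mitigated", "accepted"
```

An "accepted" outcome records the model's notion of a consciously tolerated, imperfect-but-functional solution.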

  6. Metadata and Buckets in the Smart Object, Dumb Archive (SODA) Model

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Croom, Delwin R., Jr.; Robbins, Steven W.

    2004-01-01

    We present the Smart Object, Dumb Archive (SODA) model for digital libraries (DLs), and discuss the role of metadata in SODA. The premise of the SODA model is to "push down" many of the functionalities generally associated with archives into the data objects themselves. Thus the data objects become "smarter", and the archives "dumber". In the SODA model, archives become primarily set managers, and the objects themselves negotiate and handle presentation, enforce terms and conditions, and perform data content management. Buckets are our implementation of smart objects, and da is our reference implementation for dumb archives. We also present our approach to metadata translation for buckets.

  7. The Global Streamflow Indices and Metadata archive (G-SIM): A compilation of global streamflow time series indices and meta-data

    NASA Astrophysics Data System (ADS)

Do, Hong; Gudmundsson, Lukas; Leonard, Michael; Westra, Seth; Seneviratne, Sonia

    2017-04-01

In-situ observations of daily streamflow with global coverage are a crucial asset for understanding large-scale freshwater resources, which are an essential component of the Earth system and a prerequisite for societal development. Here we present the Global Streamflow Indices and Metadata archive (G-SIM), a collection of indices derived from more than 20,000 daily streamflow time series across the globe. These indices are designed to support global assessments of change in wet and dry extremes, and have been compiled from 12 free-to-access online databases (seven national databases and five international collections). The G-SIM archive also includes substantial metadata to support detailed understanding of streamflow dynamics, including drainage-area shapefiles and many essential catchment properties such as land cover type and soil and topographic characteristics. The automated data handling and quality control procedures of the project make G-SIM a reproducible, extensible archive that can be utilised for many purposes in large-scale hydrology. Some potential applications include the identification of observational trends in hydrological extremes, the assessment of climate change impacts on streamflow regimes, and the validation of global hydrological models.
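The kind of indices an archive like G-SIM derives from a daily series can be sketched as follows. The index names and the percentile convention here are illustrative, not the archive's official variable definitions.

```python
def streamflow_indices(daily_flow):
    """Illustrative wet/dry-extreme indices from one station's daily
    streamflow series (units as provided, e.g. m3/s)."""
    n = len(daily_flow)
    ordered = sorted(daily_flow)
    return {
        "mean_flow": sum(daily_flow) / n,
        "max_flow": ordered[-1],                       # wet extreme
        "low_flow_p05": ordered[int(0.05 * (n - 1))],  # dry extreme: 5th percentile, nearest-rank
        "zero_flow_days": sum(1 for q in daily_flow if q == 0.0),
    }
```

Computing only such indices (rather than redistributing the raw series) is also what lets an archive combine databases with heterogeneous licensing.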

  8. HotGAS: A Public Archive of Ready-to-Go Chandra High Energy Grating Spectral Products for Active Galactic Nuclei

    NASA Astrophysics Data System (ADS)

    Yaqoob, T.

    2005-12-01

    We describe a public WWW archive (HotGAS) containing data products from Chandra observations using the High Energy Grating Spectrometer (HETGS). Spectral products are available from the archive in various formats and are suitable for use by non-experts and experts alike. Lightcurves and cross-dispersion profiles are also available. Easy and user-friendly access for non X-ray astronomers to reprocessed, publishable quality grating data products should help to promote inter-disciplinary and multi-wavelength research on active galactic nuclei (AGN). The archive will also be useful to X-ray astronomers who have not yet had experience with high resolution X-ray spectroscopy, as well as experienced X-ray astronomers who need quick access to clean and ready-to-go data products. Theoreticians may find the archive useful for testing their models without having to deal with the fine details of data processing and reduction. We also anticipate that the archive will be useful for training graduate students in high-resolution X-ray spectroscopy and for providing a resource for projects for high-school and graduate students. We plan to eventually expand the archive to include AGN data from the Chandra Low Energy Grating Spectrometer (LETGS), and the XMM-Newton Reflection-Grating Spectrometer (RGS). Further in the future we plan to extend the archive to include data from other astrophysical sources aside from AGN. The project thus far is funded by an archival Chandra grant.

  9. Archive Management of NASA Earth Observation Data to Support Cloud Analysis

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Baynes, Kathleen; McInerney, Mark

    2017-01-01

NASA collects, processes and distributes petabytes of Earth Observation (EO) data from satellites, aircraft, in situ instruments and model output, with an order of magnitude increase expected by 2024. Cloud-based web object storage (WOS) of these data can simplify accommodating such an increase. More importantly, it can also facilitate user analysis of those volumes by making the data available to the massively parallel computing power in the cloud. However, storing EO data in cloud WOS has a ripple effect throughout the NASA archive system, with unexpected challenges and opportunities. One challenge is modifying data servicing software (such as Web Coverage Service servers) to access and subset data that are no longer on a directly accessible file system, but rather in cloud WOS. Opportunities include refactoring of the archive software to a cloud-native architecture; virtualizing data products by computing on demand; and reorganizing data to be more analysis-friendly. Reviewed by Mark McInerney, ESDIS Deputy Project Manager.

  10. Aeolian Dunes: New High-Resolution Archives of Past Wind-Intensity and -Direction

    NASA Astrophysics Data System (ADS)

    Lindhorst, S.; Betzler, C.

    2017-12-01

The understanding of the long-term variability of local wind fields is highly relevant for calibrating climate models and for predicting the socio-economic consequences of climate change. Continuous instrument-based weather observations go back less than two centuries; aeolian dunes, however, contain an archive of past wind-field fluctuations which is basically unread. We present new ways to reconstruct annual to seasonal changes of wind intensity and predominant wind direction from dune-sediment composition and geometry based on ground-penetrating radar (GPR) data, grain-size analyses and different age-dating approaches. The resulting proxy-based wind data series are validated against instrument-based weather observations. Our approach can be applied to both recent and fossil dunes. Potential applications include the validation of climate models, the reconstruction of past supra-regional wind systems and the monitoring of future shifts in the climate system.

  11. NASA's SPICE System Models the Solar System

    NASA Technical Reports Server (NTRS)

    Acton, Charles

    1996-01-01

    SPICE is NASA's multimission, multidiscipline information system for assembling, distributing, archiving, and accessing space science geometry and related data used by scientists and engineers for mission design and mission evaluation, detailed observation planning, mission operations, and science data analysis.

  12. An Ontology-Based Archive Information Model for the Planetary Science Community

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Daniel J.; Mattmann, Chris

    2008-01-01

    The Planetary Data System (PDS) information model is a mature but complex model that has been used to capture over 30 years of planetary science data for the PDS archive. As the de-facto information model for the planetary science data archive, it is being adopted by the International Planetary Data Alliance (IPDA) as their archive data standard. However, after seventeen years of evolutionary change the model needs refinement. First a formal specification is needed to explicitly capture the model in a commonly accepted data engineering notation. Second, the core and essential elements of the model need to be identified to help simplify the overall archive process. A team of PDS technical staff members have captured the PDS information model in an ontology modeling tool. Using the resulting knowledge-base, work continues to identify the core elements, identify problems and issues, and then test proposed modifications to the model. The final deliverables of this work will include specifications for the next generation PDS information model and the initial set of IPDA archive data standards. Having the information model captured in an ontology modeling tool also makes the model suitable for use by Semantic Web applications.

  13. New Developments At The Science Archives Of The NASA Exoplanet Science Institute

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce

    2018-06-01

The NASA Exoplanet Science Institute (NExScI) at Caltech/IPAC is the science center for NASA's Exoplanet Exploration Program, and as such NExScI operates three scientific archives: the NASA Exoplanet Archive (NEA), the Exoplanet Follow-up Observation Program website (ExoFOP), and the Keck Observatory Archive (KOA). The NASA Exoplanet Archive supports research and mission planning by the exoplanet community by operating a service that provides confirmed and candidate planets, numerous project and contributed data sets, and integrated analysis tools. The ExoFOP provides an environment for exoplanet observers to share and exchange data, observing notes, and information regarding the Kepler, K2, and TESS candidates. KOA serves all raw science and calibration observations acquired by all active and decommissioned instruments at the W. M. Keck Observatory, as well as reduced data sets contributed by Keck observers. In the coming years, the NExScI archives will support a series of major endeavours allowing flexible, interactive analysis of the data available at the archives. These endeavours exploit a common infrastructure based upon modern interfaces such as JupyterLab and Python. The first service will enable reduction and analysis of precision radial velocity data from the HIRES Keck instrument. The Exoplanet Archive is developing a JupyterLab environment based on the HIRES PRV interactive environment. Additionally, KOA is supporting an Observatory initiative to develop modern, Python-based pipelines, and as part of this work it has delivered a NIRSPEC reduction pipeline. The ensemble of pipelines will be accessible through the same environments.

  14. Building a Trustworthy Environmental Science Data Repository: Lessons Learned from the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Santhana Vannan, S. K.; Boyer, A.; Beaty, T.; Deb, D.; Hook, L.

    2017-12-01

The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, https://daac.ornl.gov) for biogeochemical dynamics is one of NASA's Earth Observing System Data and Information System (EOSDIS) data centers. The mission of the ORNL DAAC is to assemble, distribute, and provide data services for a comprehensive archive of terrestrial biogeochemistry and ecological dynamics observations and models to facilitate research, education, and decision-making in support of NASA's Earth Science. Since its establishment in 1994, ORNL DAAC has been continuously building itself into a trustworthy environmental science data repository by not only ensuring the quality and usability of its data holdings, but also optimizing its data publication and management process. This paper describes the lessons learned from ORNL DAAC's effort toward this goal. ORNL DAAC has been proactively implementing international community standards throughout its data management life cycle, including data publication, preservation, discovery, visualization, and distribution. Data files in standard formats, detailed documentation, and metadata following standard models are prepared to improve the usability and longevity of data products. Assignment of a Digital Object Identifier (DOI) ensures the identifiability and accessibility of every data product, including the different versions and revisions of its life cycle. ORNL DAAC's data citation policy assures data producers receive appropriate recognition of use of their products. Web service standards, such as OpenSearch and Open Geospatial Consortium (OGC) services, promote the discovery, visualization, distribution, and integration of ORNL DAAC's data holdings. Recently, ORNL DAAC began efforts to optimize and standardize its data archival and data publication workflows, to improve the efficiency and transparency of its data archival and management processes.

  15. GEOSS interoperability for Weather, Ocean and Water

    NASA Astrophysics Data System (ADS)

    Richardson, David; Nyenhuis, Michael; Zsoter, Ervin; Pappenberger, Florian

    2013-04-01

"Understanding the Earth system — its weather, climate, oceans, atmosphere, water, land, geodynamics, natural resources, ecosystems, and natural and human-induced hazards — is crucial to enhancing human health, safety and welfare, alleviating human suffering including poverty, protecting the global environment, reducing disaster losses, and achieving sustainable development. Observations of the Earth system constitute critical input for advancing this understanding." With this in mind, the Group on Earth Observations (GEO) started implementing the Global Earth Observation System of Systems (GEOSS). GEOWOW, short for "GEOSS interoperability for Weather, Ocean and Water", is supporting this objective. GEOWOW's main challenge is to improve Earth observation data discovery, accessibility and exploitability, and to evolve GEOSS in terms of interoperability, standardization and functionality. One of the main goals behind the GEOWOW project is to demonstrate the value of the TIGGE archive in interdisciplinary applications, providing a vast amount of useful and easily accessible information to the users through the GEO Common Infrastructure (GCI). GEOWOW aims at developing functionalities that will allow easy discovery, access and use of TIGGE archive data and of in-situ observations, e.g. from the Global Runoff Data Centre (GRDC), to support applications such as river discharge forecasting. TIGGE (THORPEX Interactive Grand Global Ensemble) is a key component of THORPEX: a World Weather Research Programme to accelerate improvements in the accuracy of 1-day to 2-week high-impact weather forecasts for the benefit of humanity. The TIGGE archive consists of ensemble weather forecast data from ten global NWP centres, starting from October 2006, which has been made available for scientific research. The TIGGE archive has been used to analyse hydro-meteorological forecasts of flooding in Europe as well as in China. 
In general the analyses have been favourable in terms of forecast skill and have concluded that the use of a multi-model forecast is beneficial. Long-term analyses of individual centres, such as the European Centre for Medium-Range Weather Forecasts (ECMWF), have been conducted in the past. However, no long-term, large-scale study including different global numerical models has been performed so far. Here we present some initial results from such a study.

  16. 2011 Tohoku, Japan tsunami data available from the National Oceanic and Atmospheric Administration/National Geophysical Data Center

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Mccullough, H. L.; Mungov, G.; Harris, E.

    2012-12-01

The U.S. National Oceanic and Atmospheric Administration (NOAA) has primary responsibility for providing tsunami warnings to the Nation, and a leadership role in tsunami observations and research. A key component of this effort is easy access to authoritative data on past tsunamis, a responsibility of the National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics. Archive responsibilities include the global historical tsunami database, coastal tide-gauge data from US/NOAA-operated stations, the Deep-ocean Assessment and Reporting of Tsunami (DART®) data, damage photos, as well as other related hazards data. Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Understanding the severity and timing of tsunami effects is important for tsunami hazard mitigation and warning. The global historical tsunami database includes the date, time, and location of the source event, magnitude of the source, event validity, maximum wave height, the total number of fatalities and dollar damage. The database contains additional information on run-ups (locations where tsunami waves were observed by eyewitnesses, field reconnaissance surveys, tide gauges, or deep ocean sensors). The run-up table includes arrival times, distance from the source, measurement type, maximum wave height, and the number of fatalities and damage for the specific run-up location. Tide gauge data are required for modeling the interaction of tsunami waves with the coast and for verifying propagation and inundation models. NGDC is the long-term archive for all NOAA coastal tide gauge data and is currently archiving 15-second to 1-minute water level data from the NOAA Center for Operational Oceanographic Products and Services (CO-OPS) and the NOAA Tsunami Warning Centers. 
DART® buoys, which are essential components of tsunami warning systems, are now deployed in all oceans, giving coastal communities faster and more accurate tsunami warnings. NOAA's National Data Buoy Center disseminates real-time DART® data, and NGDC processes and archives post-event 15-second high-resolution bottom-pressure time series data. An event-specific archive of DART® observations recorded during recent significant tsunamis, including the March 2011 Tohoku, Japan event, is now available through new tsunami event pages integrated with the NGDC global historical tsunami database. These pages deliver comprehensive summaries of each tsunami event, including socio-economic impacts, tsunami travel time maps, raw observations, de-tided residuals, spectra of the tsunami signal compared to the energy of the background noise, and wavelets. These data are invaluable to tsunami researchers and educators, providing a more thorough understanding of tsunamis, their propagation in the open ocean, and their subsequent inundation of coastal communities. For the 2011 Tohoku event, NGDC has collected 289 tide-gauge observations, 34 DART® and bottom pressure recorder (BPR) station observations, and over 5,000 eyewitness reports and post-tsunami field survey measurements.
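The run-up table fields described above can be sketched as a simple record type. This is a minimal illustration only; the field names and example values are assumptions, not the actual NGDC schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RunupRecord:
    """One run-up observation linked to a source event (field names assumed,
    mirroring those listed for the NGDC run-up table)."""
    event_id: int                        # link to the source event record
    location: str                        # where the wave was observed
    arrival_time_min: Optional[float]    # minutes after the source event
    distance_from_source_km: Optional[float]
    measurement_type: str                # eyewitness, field survey, tide gauge, or deep-ocean sensor
    max_wave_height_m: Optional[float]
    fatalities: Optional[int]
    damage_musd: Optional[float]         # damage in millions of US dollars

# A hypothetical tide-gauge run-up entry for a distant coastline:
r = RunupRecord(event_id=5413, location="Crescent City, CA",
                arrival_time_min=620.0, distance_from_source_km=7800.0,
                measurement_type="tide gauge", max_wave_height_m=2.47,
                fatalities=1, damage_musd=20.0)
```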

  17. HST archive primer, version 4.1

    NASA Technical Reports Server (NTRS)

    Fruchter, A. (Editor); Baum, S. (Editor)

    1994-01-01

This version of the HST Archive Primer provides the basic information a user needs to access the HST archive via StarView, the new user interface to the archive. Using StarView, users can search for observations of interest, find calibration reference files, and retrieve data from the archive. Both the terminal version of StarView and the X-windows version feature a name resolver that simplifies searches of the HST archive based on target name. In addition, the X-windows version of StarView allows preview of all public HST data; compressed versions of public images are displayed via SAOIMAGE, while spectra are plotted using the public plotting package XMGR. Finally, the version of StarView described here features screens designed for observers preparing Cycle 5 HST proposals.

  18. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.

  19. Building Community Around Hydrologic Data Models Within CUAHSI

    NASA Astrophysics Data System (ADS)

    Maidment, D.

    2007-12-01

The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has a Hydrologic Information Systems project that aims to provide better data access and capacity for data synthesis for the nation's water information, both that collected by academic investigators and that collected by water agencies. These data include observations of streamflow, water quality, groundwater levels, weather and climate, and aquatic biology. Each water agency or research investigator has a unique method of formatting its data (syntactic heterogeneity) and describing its variables (semantic heterogeneity). The result is a large agglomeration of data in many formats and descriptions whose full content is hard to interpret and analyze. CUAHSI is helping to resolve syntactic heterogeneity through the development of WaterML, a standard XML markup language for communicating water observations data through web services, and a standard relational database structure for archiving data called the Observations Data Model. Variables in these data archiving and communication systems are indexed against a controlled vocabulary of descriptive terms to provide the capacity to synthesize common data types from disparate data sources.
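The controlled-vocabulary indexing that resolves semantic heterogeneity can be conveyed with a toy mapping. The source names and vocabulary terms below are invented for illustration, not CUAHSI's actual vocabulary:

```python
# Agency-specific variable names mapped onto a single controlled-vocabulary
# term, so the same physical quantity can be queried across data sources.
CONTROLLED_VOCAB = {
    "discharge, cubic feet per second": "Streamflow",
    "Q_cfs": "Streamflow",
    "gw_level_ft": "Groundwater level",
    "WaterTemp_C": "Water temperature",
}

def harmonize(source_variable: str) -> str:
    """Return the controlled-vocabulary term for a source-specific name."""
    try:
        return CONTROLLED_VOCAB[source_variable]
    except KeyError:
        raise ValueError(f"unmapped variable: {source_variable!r}")
```

Two agencies' differently named discharge series then resolve to the same term: `harmonize("Q_cfs") == harmonize("discharge, cubic feet per second")`.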

  20. Status of the TESS Science Processing Operations Center

    NASA Astrophysics Data System (ADS)

    Jenkins, Jon Michael; Caldwell, Douglas A.; Davies, Misty; Li, Jie; Morris, Robert L.; Rose, Mark; Smith, Jeffrey C.; Tenenbaum, Peter; Ting, Eric; Twicken, Joseph D.; Wohler, Bill

    2018-06-01

The Transiting Exoplanet Survey Satellite (TESS) was selected by NASA's Explorer Program to conduct a search for Earth's closest cousins starting in 2018. TESS will conduct an all-sky transit survey of F, G, and K dwarf stars between magnitudes 4 and 12 and M dwarf stars within 200 light years. TESS is expected to discover 1,000 small planets less than twice the size of Earth, and to measure the masses of at least 50 of these small worlds. The TESS science pipeline is being developed by the Science Processing Operations Center (SPOC) at NASA Ames Research Center based on the highly successful Kepler science pipeline. Like the Kepler pipeline, the TESS pipeline provides calibrated pixels, simple and systematic error-corrected aperture photometry, and centroid locations for all 200,000+ target stars observed over the 2-year mission, along with associated uncertainties. The pixel and light curve products are modeled on the Kepler archive products and will be archived to the Mikulski Archive for Space Telescopes (MAST). In addition to the nominal science data, the 30-minute Full Frame Images (FFIs) simultaneously collected by TESS will also be calibrated by the SPOC and archived at MAST. The TESS pipeline searches through all light curves for evidence of transits that occur when a planet crosses the disk of its host star. The Data Validation pipeline generates a suite of diagnostic metrics for each transit-like signature, and then extracts planetary parameters by fitting a limb-darkened transit model to each potential planetary signature. The results of the transit search are modeled on the Kepler transit search products (tabulated numerical results, time series products, and PDF reports), all of which will be archived to MAST. Synthetic sample data products are available at https://archive.stsci.edu/tess/ete-6.html. Funding for the TESS Mission has been provided by the NASA Science Mission Directorate.

  1. Archive of observations of periodic comet Crommelin made during its 1983-84 apparition

    NASA Technical Reports Server (NTRS)

    Sekanina, Z. (Editor); Aronsson, M.

    1985-01-01

    This is an archive of 680 reduced observations of Periodic Comet Crommelin made during its 1984 apparition. The archive integrates reports by members of the eight networks of the International Halley Watch (IHW) and presents the results of a trial run designed to test the preparedness of the IHW organization for the current apparition of Periodic Comet Halley.

  2. The role of ocean climate data in operational Naval oceanography

    NASA Technical Reports Server (NTRS)

    Chesbrough, Radm G.

    1992-01-01

Local application of global-scale models describes the U.S. Navy's basic philosophy for operational oceanography in support of fleet operations. Real-time data, climatologies, coupled air/ocean models, and large-scale computers are the essential components of the Navy's system for providing warfighters with the performance predictions and tactical decision aids they need to operate safely and efficiently. In peacetime, these oceanographic predictions are important for safety of navigation and flight. The paucity and uneven distribution of real-time data mean we have to fall back on climatology to provide the basic data to operate our models. The Navy is both a producer and a user of climatologies; it provides observations to the national archives and in turn employs data from these archives to establish data bases. Suggestions for future improvements to ocean climate data are offered.

  3. MODIS land data at the EROS data center DAAC

    USGS Publications Warehouse

    Jenkerson, Calli B.; Reed, B.C.

    2001-01-01

The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services supports the use of MODIS land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.

  4. Queuing Models of Tertiary Storage

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1996-01-01

Large-scale scientific projects generate and use large amounts of data. For example, the NASA Earth Observing System Data and Information System (EOSDIS) project is expected to archive one petabyte per year of raw satellite data. This data is made automatically available for processing into higher-level data products and for dissemination to the scientific community. Such large volumes of data can only be stored in robotic storage libraries (RSLs) for near-line access. A characteristic of RSLs is the use of a robot arm that transfers media between a storage rack and the read/write drives, thus multiplying the capacity of the system. The performance of the RSLs can be a critical limiting factor for the performance of the archive system. However, the many interacting components of an RSL make a performance analysis difficult. In addition, different RSL components can have widely varying performance characteristics. This paper describes our work to develop performance models of an RSL in isolation. Next we show how the RSL model can be incorporated into a queuing network model. We use the models to make some example performance studies of archive systems. The models described in this paper, developed for the NASA EOSDIS project, are implemented in C with a well-defined interface. The source code, accompanying documentation, and sample Java applets are available at: http://www.cis.ufl.edu/ted/
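The flavor of such component-level queuing analysis can be conveyed with the textbook M/M/1 results. This is a toy sketch for a single server (say, one tape drive), not the paper's actual RSL or queuing network model:

```python
def mm1_metrics(arrival_rate: float, service_rate: float):
    """Steady-state M/M/1 metrics: utilization rho, mean number of jobs in
    the system L = rho/(1-rho), and mean time in system W = L/lambda
    (Little's law). Rates must satisfy lambda < mu for stability."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = arrival_rate / service_rate      # fraction of time the drive is busy
    L = rho / (1.0 - rho)                  # mean jobs queued or in service
    W = L / arrival_rate                   # mean request latency
    return rho, L, W

# e.g. 4 mount requests/hour offered to a drive completing 5 mounts/hour:
rho, L, W = mm1_metrics(4.0, 5.0)   # rho = 0.8, L = 4.0 jobs, W = 1.0 hour
```

The steep growth of L and W as rho approaches 1 is exactly why robot-arm and drive contention can dominate archive performance at high load.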

  5. Ultraviolet Spectra of Subluminous Objects Found in the Kiso Schmidt Survey and Systematic Reanalysis of the Archived Ultraviolet Spectra of White Dwarfs Observed with the IUE Satellite Under the Astrophysics Data Program (ADP)

    NASA Technical Reports Server (NTRS)

    Wegner, Gary A.

    1988-01-01

Recent research under NASA grant NAG5-971 consisted of two projects in conjunction with the International Ultraviolet Explorer (IUE) satellite: (1) examining the ultraviolet spectra of subluminous stars identified from visual-wavelength spectroscopy that had originally been discovered in the Kiso Schmidt survey for ultraviolet-excess stars, and (2) carrying out a systematic reanalysis of the archived IUE spectra of white dwarfs. This report presents information on the progress of the re-reduction of over 600 IUE white dwarf spectra and their subsequent analysis employing model atmospheres, and on the observation of the Kiso ultraviolet-excess stars.

  6. Wave Processes in Arctic Seas, Observed from TerraSAR-X

    DTIC Science & Technology

    2015-09-30

Susanne Lehner, German Aerospace Center, Maritime Safety and Security Lab, Henrich-Focke-Str. 4, 28199 Bremen, Germany ... of high resolution sea state forecast models in the German Bight, The International Archives of the Photogrammetry, Remote Sensing and Spatial

  7. Interpreting the Latitudinal Structure of Differences Between Modeled and Observed Temperature Trends (Invited)

    NASA Astrophysics Data System (ADS)

    Santer, B. D.; Mears, C. A.; Gleckler, P. J.; Solomon, S.; Wigley, T.; Arblaster, J.; Cai, W.; Gillett, N. P.; Ivanova, D. P.; Karl, T. R.; Lanzante, J.; Meehl, G. A.; Stott, P.; Taylor, K. E.; Thorne, P.; Wehner, M. F.; Zou, C.

    2010-12-01

    We perform the most comprehensive comparison to date of simulated and observed temperature trends. Comparisons are made for different latitude bands, timescales, and temperature variables, using information from a multi-model archive and a variety of observational datasets. Our focus is on temperature changes in the lower troposphere (TLT), the mid- to upper troposphere (TMT), and at the sea surface (SST). For SST, TLT, and TMT, trend comparisons over the satellite era (1979 to 2009) always yield closest agreement in mid-latitudes of the Northern Hemisphere. There are pronounced discrepancies in the tropics and in the Southern Hemisphere: in both regions, the multi-model average warming is consistently larger than observed. At high latitudes in the Northern Hemisphere, the observed tropospheric warming exceeds multi-model average trends. The similarity in the latitudinal structure of this discrepancy pattern across different temperature variables and observational data sets suggests that these trend differences are real, and are not due to residual inhomogeneities in the observations. The interpretation of these results is hampered by the fact that the CMIP-3 multi-model archive analyzed here convolves errors in key external forcings with errors in the model response to forcing. Under a "forcing error" interpretation, model-average temperature trends in the Southern Hemisphere extratropics are biased warm because many models neglect (and/or inaccurately specify) changes in stratospheric ozone and the indirect effects of aerosols. An alternative "response error" explanation for the model trend errors is that there are fundamental problems with model clouds and ocean heat uptake over the Southern Ocean. When SST changes are compared over the longer period 1950 to 2009, there is close agreement between simulated and observed trends poleward of 50°S. 
This result is difficult to reconcile with the hypothesis that the trend discrepancies over 1979 to 2009 are primarily attributable to response errors. Our results suggest that biases in multi-model average temperature trends over the satellite era can be plausibly linked to forcing errors. Better partitioning of the forcing and response components of model errors will require a systematic program of numerical experimentation, with a focus on exploring the climate response to uncertainties in key historical forcings.

  8. An enhanced archive facilitating climate impacts analysis

    USGS Publications Warehouse

    Maurer, E.P.; Brekke, L.; Pruitt, T.; Thrasher, B.; Long, J.; Duffy, P.; Dettinger, M.; Cayan, D.; Arnold, J.

    2014-01-01

We describe the expansion of a publicly available archive of downscaled climate and hydrology projections for the United States. Those studying or planning to adapt to future climate impacts demand downscaled climate model output for local or regional use. The archive we describe attempts to fulfill this need by providing data in several formats, selectable to meet user needs. Our archive has served as a resource for climate impacts modelers, water managers, educators, and others. Over 1,400 individuals have transferred more than 50 TB of data from the archive. In response to user demands, the archive has expanded from monthly downscaled data to include daily data to facilitate investigations of phenomena sensitive to daily to monthly temperature and precipitation, including extremes in these quantities. New developments include downscaled output from the new Coupled Model Intercomparison Project phase 5 (CMIP5) climate model simulations at both the monthly and daily time scales, as well as simulations of surface hydrological variables. The web interface allows the extraction of individual projections or ensemble statistics for user-defined regions, promoting the rapid assessment of model consensus and uncertainty for future projections of precipitation, temperature, and hydrology. The archive is accessible online (http://gdo-dcp.ucllnl.org/downscaled_cmip_projections).

  9. Application service provider (ASP) financial models for off-site PACS archiving

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Liu, Brent J.; McCoy, J. Michael; Enzmann, Dieter R.

    2003-05-01

For the replacement of its legacy Picture Archiving and Communication Systems (approximate annual workload of 300,000 procedures), UCLA Medical Center has evaluated and adopted an off-site data-warehousing solution based on an ASP financial model with a one-time single payment per study archived. Different financial models for long-term data archive services were compared to the traditional capital/operational costs of on-site digital archives. Total cost of ownership (TCO), including direct and indirect expenses and savings, was compared for each model. Beyond the financial parameters, we considered the logistic and operational advantages and disadvantages of ASP models versus traditional archiving systems. Our initial analysis demonstrated that the traditional linear ASP business model for data storage is unsuitable for large institutions: the overall cost markedly exceeds the TCO of an in-house archive infrastructure (when support and maintenance costs are included). We demonstrated, however, that non-linear ASP pricing models can be cost-effective alternatives for large-scale data storage, particularly if they are based on a scalable off-site data-warehousing service and the prices are adapted to the specific size of a given institution. The added value of ASP is that it does not require iterative data migrations from legacy media to new storage media at regular intervals.
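The difference between linear and non-linear ASP pricing is easy to see in a toy cost comparison. All fees and the volume cap below are hypothetical, not UCLA's actual contract terms:

```python
def linear_asp_cost(per_study_fee: float, studies_per_year: int, years: int) -> float:
    """Cumulative cost of a linear ASP model: a flat fee per study archived."""
    return per_study_fee * studies_per_year * years

def tiered_asp_cost(base_fee: float, marginal_fee: float, cap: int,
                    studies_per_year: int, years: int) -> float:
    """A non-linear (tiered) model: full rate up to a yearly volume cap,
    discounted marginal rate beyond it. All parameters are hypothetical."""
    total = 0.0
    for _ in range(years):
        n = studies_per_year
        total += base_fee * min(n, cap) + marginal_fee * max(0, n - cap)
    return total

# At an institution archiving ~300,000 studies/year, the linear model's cost
# grows in direct proportion to volume, while the tiered model flattens the
# marginal cost of each additional study.
```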

  10. Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center

    USGS Publications Warehouse

    Jones, B.; Tolk, B.; ,

    2002-01-01

    The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER collects broad spectral coverage with high spatial resolution at near infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.

  11. A Chandra Search for Coronal X Rays from the Cool White Dwarf GD 356

    NASA Technical Reports Server (NTRS)

    Weisskopf, Martin C.; Wu, Kinwah; Trimble, Virginia; ODell, Stephen L.; Elsner, Ronald F.; Zavlin, Vyacheslav E.; Kouveliotou, Chryssa

    2006-01-01

We report observations with the Chandra X-ray Observatory of the single, cool, magnetic white dwarf GD 356. For consistent comparison with other X-ray observations of single white dwarfs, we also re-analyzed archival ROSAT data for GD 356 (GJ 1205), G 99-47 (GR 290 = V1201 Ori), GD 90, G 195-19 (EG 250 = GJ 339.1), and WD 2316+123, and archival Chandra data for LHS 1038 (GJ 1004) and GD 358 (V777 Her). Our Chandra observation detected no X rays from GD 356, setting the most restrictive upper limit to the X-ray luminosity from any cool white dwarf: Lx < 6.0 × 10^25 erg s^-1, at 99.7% confidence, for a 1-keV thermal-bremsstrahlung spectrum. The corresponding limit to the electron density is no less than 4.4 × 10^11 cm^-3. Our re-analysis of the archival data confirmed the non-detections reported by the original investigators. We discuss the implications of our and prior observations on models for coronal emission from white dwarfs. For magnetic white dwarfs, we emphasize the more stringent constraints imposed by cyclotron radiation. In addition, we describe (in an appendix) a statistical methodology for detecting a source and for constraining the strength of a source, which applies even when the number of source or background events is small.
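Small-count limits of the kind quoted above are commonly derived from Poisson statistics. As a generic sketch (not the appendix's exact methodology): the classical upper limit at confidence level CL is the smallest mean mu for which P(N <= n_obs | mu) <= 1 - CL, which can be found numerically:

```python
import math

def poisson_upper_limit(n_obs: int, cl: float = 0.997, tol: float = 1e-10) -> float:
    """Classical Poisson upper limit: smallest mean mu such that
    P(N <= n_obs | mu) <= 1 - cl, located by bisection. A generic
    small-count recipe, not the paper's specific method."""
    def cdf(mu: float) -> float:
        return sum(math.exp(-mu) * mu**k / math.factorial(k)
                   for k in range(n_obs + 1))
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) > 1.0 - cl:
            lo = mid        # mu still too small: non-detection too probable
        else:
            hi = mid
    return hi

# With zero detected counts, the 99.7%-confidence limit is -ln(0.003) ~ 5.81
# expected source counts, which scales to a luminosity limit via the
# exposure time, effective area, and an assumed spectrum.
```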

  12. The function of the earth observing system - Data information system Distributed Active Archive Centers

    NASA Technical Reports Server (NTRS)

    Lapenta, C. C.

    1992-01-01

The functionality of the Distributed Active Archive Centers (DAACs), which are significant elements of the Earth Observing System Data and Information System (EOSDIS), is discussed. Each DAAC encompasses the information management system, the data archival and distribution system, and the product generation system. The EOSDIS DAACs are expected to improve access to the earth science data sets needed for global change research.

  13. The Growth of the User Community of the La Silla Paranal Observatory Science Archive

    NASA Astrophysics Data System (ADS)

    Romaniello, M.; Arnaboldi, M.; Da Rocha, C.; De Breuck, C.; Delmotte, N.; Dobrzycki, A.; Fourniol, N.; Freudling, W.; Mascetti, L.; Micol, A.; Retzlaff, J.; Sterzik, M.; Sequeiros, I. V.; De Breuck, M. V.

    2016-03-01

The archive of the La Silla Paranal Observatory has grown steadily into a powerful science resource for the ESO astronomical community. Established in 1998, the Science Archive Facility (SAF) stores both the raw data generated by all ESO instruments and selected processed (science-ready) data. The growth of the SAF user community is analysed through access and publication statistics. Statistics are presented for archival users, who do not contribute to observing proposals, and contrasted with those for regular and archival users who are successful in competing for observing time. Archival data from the SAF contribute to about one paper in four that use data from ESO facilities. This study reveals that the blend of users constitutes a mixture of the traditional ESO community making novel use of the data and a new community being built around the SAF.

  14. NASA'S Earth Science Data Stewardship Activities

    NASA Technical Reports Server (NTRS)

    Lowe, Dawn R.; Murphy, Kevin J.; Ramapriyan, Hampapuram

    2015-01-01

NASA has been collecting Earth observation data for over 50 years using instruments on board satellites, aircraft and ground-based systems. With the inception of the Earth Observing System (EOS) Program in 1990, NASA established the Earth Science Data and Information System (ESDIS) Project and initiated development of the Earth Observing System Data and Information System (EOSDIS). A set of Distributed Active Archive Centers (DAACs) was established at locations based on science discipline expertise. Today, EOSDIS consists of 12 DAACs and 12 Science Investigator-led Processing Systems (SIPS), processing data from the EOS missions, as well as the Suomi National Polar-orbiting Partnership mission and other satellite and airborne missions. The DAACs archive and distribute the vast majority of data from NASA's Earth science missions, with data holdings exceeding 12 petabytes. The data held by EOSDIS are available to all users consistent with NASA's free and open data policy, which has been in effect since 1990. The EOSDIS archives consist of raw instrument data counts (level 0 data), as well as higher-level standard products (e.g., geophysical parameters, products mapped to standard spatio-temporal grids, results of Earth system models using multi-instrument observations, and long time series of Earth System Data Records resulting from multiple satellite observations of a given type of phenomenon). EOSDIS data stewardship responsibilities include ensuring that the data and information content are reliable, of high quality, easily accessible, and usable for as long as they are considered to be of value.

  15. NASDA's earth observation satellite data archive policy for the earth observation data and information system (EOIS)

    NASA Technical Reports Server (NTRS)

    Sobue, Shin-ichi; Yoshida, Fumiyoshi; Ochiai, Osamu

    1996-01-01

NASDA's new Advanced Earth Observing Satellite (ADEOS) is scheduled for launch in August 1996. ADEOS carries 8 sensors to observe earth environmental phenomena and sends their data to NASDA, NASA, and other foreign ground stations around the world. The downlink data rate for ADEOS is 126 MB/s, and the total volume of data is about 100 GB per day. To archive and manage such a large quantity of data with high reliability and easy accessibility, it was necessary to develop a new mass storage system with a catalogue information database using advanced database management technology. The data will be archived and maintained in the Master Data Storage Subsystem (MDSS), one subsystem of NASDA's new Earth Observation data and Information System (EOIS). The MDSS is based on a SONY ID1 digital tape robotics system. This paper provides an overview of the EOIS system, with a focus on the Master Data Storage Subsystem and the NASDA Earth Observation Center (EOC) archive policy for earth observation satellite data.

  16. Integrating Wind Profiling Radars and Radiosonde Observations with Model Point Data to Develop a Decision Support Tool to Assess Upper-Level Winds for Space Launch

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Flinn, Clay

    2013-01-01

On the day of launch, the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers, including NASA's Launch Services Program and NASA's Ground Systems Development and Operations Program. They currently do not have the capability to display and overlay profiles of upper-level observations and numerical weather prediction model forecasts. The LWOs requested that the Applied Meteorology Unit (AMU) develop a tool in the form of a graphical user interface (GUI) that will allow them to plot upper-level wind speed and direction observations from the Kennedy Space Center (KSC) 50 MHz tropospheric wind profiling radar, the KSC Shuttle Landing Facility 915 MHz boundary layer wind profiling radar, and Cape Canaveral Air Force Station (CCAFS) Automated Meteorological Processing System (AMPS) radiosondes, and then overlay forecast wind profiles from model point data, including the North American Mesoscale (NAM) model, the Rapid Refresh (RAP) model, and the Global Forecast System (GFS) model, to assess the performance of these models. The AMU developed an Excel-based tool that provides an objective method for the LWOs to compare the model-forecast upper-level winds to the KSC wind profiling radar and CCAFS AMPS observations, to assess the models' potential to accurately forecast changes in the upper-level profile through the launch count. The AMU wrote Excel Visual Basic for Applications (VBA) scripts to automatically retrieve model point data for CCAFS (XMR) from the Iowa State University Archive Data Server (http://mtarchive.qeol.iastate.edu) and the 50 MHz, 915 MHz, and AMPS observations from the NASA/KSC Spaceport Weather Data Archive web site (http://trmm.ksc.nasa.gov). The AMU then developed code in Excel VBA to automatically ingest and format the observations and model point data in Excel, readying the data for generating Excel charts for the LWOs. 
The resulting charts allow the LWOs to compare the three models' 0-hour forecasts against the observations to determine the best-performing model, and then to overlay the model forecasts on time-matched observations during the launch countdown to further assess model performance. This paper will demonstrate the integration of observed and predicted atmospheric conditions into a decision support tool and show how the GUI is implemented in operations.
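The model-versus-observation comparison the charts support can be sketched generically: interpolate the model profile to the observed altitudes, then score the difference as an RMS vector wind error. This is a hypothetical illustration of the scoring idea, not the AMU's VBA code:

```python
import math

def interp(x, xs, ys):
    """Linear interpolation of ys at x over a sorted altitude grid xs."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            f = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + f * (ys[i + 1] - ys[i])
    raise ValueError("altitude outside model profile")

def rms_wind_error(obs_alts, obs_u, obs_v, mdl_alts, mdl_u, mdl_v):
    """RMS vector wind difference (same units as the winds) between a model
    profile and radar/radiosonde observations, with the model interpolated
    to the observation altitudes. Illustrative scoring only."""
    sq = 0.0
    for z, u, v in zip(obs_alts, obs_u, obs_v):
        du = u - interp(z, mdl_alts, mdl_u)
        dv = v - interp(z, mdl_alts, mdl_v)
        sq += du * du + dv * dv
    return math.sqrt(sq / len(obs_alts))
```

Scoring each model's 0-hour profile this way gives an objective ranking of which model best matches the radars and radiosondes before the countdown.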

  17. Scientific Benefits of Space Science Models Archiving at Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Kuznetsova, Maria M.; Berrios, David; Chulaki, Anna; Hesse, Michael; MacNeice, Peter J.; Maddox, Marlo M.; Pulkkinen, Antti; Rastaetter, Lutz; Taktakishvili, Aleksandre

    2009-01-01

The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. CCMC provides a web-based Run-on-Request system by which interested scientists can request simulations for a broad range of space science problems. To allow the models to be driven by data relevant to particular events, CCMC developed a tool that automatically downloads data from data archives and transforms them into the required formats. CCMC also provides a tailored web-based visualization interface for the model output, as well as the capability to download the simulation output in a portable format. CCMC offers a variety of visualization and output analysis tools to aid scientists in the interpretation of simulation results. In the eight years since the Run-on-Request system became available, the CCMC has archived the results of almost 3,000 runs covering significant space weather events and time intervals of interest identified by the community. The simulation results archived at CCMC also include a library of general-purpose runs with modeled conditions that are used for education and research. Archiving the results of simulations performed in support of several Modeling Challenges helps to evaluate the progress in space weather modeling over time. We will highlight the scientific benefits of the CCMC space science model archive and discuss plans for further development of advanced methods to interact with simulation results.

  18. National Operational Hydrologic Remote Sensing Center - The ultimate source

    Science.gov Websites

    Site topics: airborne snow surveys, satellite snow cover mapping, and snow modeling and data assimilation analyses based on polar-orbiting and geostationary satellite imagery. Maps are provided for the U.S. and the northern…

  19. Clinical experiences with an ASP model backup archive for PACS images

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Cao, Fei; Documet, Luis; Huang, H. K.; Muldoon, Jean

    2003-05-01

    Last year we presented a fault-tolerant backup archive using an Application Service Provider (ASP) model for disaster recovery. The purpose of this paper is to update and report clinical experiences with implementing the ASP model archive solution for short-term backup of clinical PACS image data, as well as possible applications other than disaster recovery. The ASP backup archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost and with little human intervention. This solution can be used for a variety of scheduled and unscheduled downtimes that occur on the main PACS archive. A backup archive server with hierarchical storage was implemented offsite from the main PACS archive location. Clinical data from a hospital PACS are sent to this ASP storage server in parallel with the exams being archived in the main server. Initially, connectivity between the main archive and the ASP storage server is established via a T-1 connection. In the future, more cost-effective means of connectivity, such as Internet2, will be researched. We have integrated the ASP model backup archive with a clinical PACS at Saint John's Health Center, where it has been operational for over six months. Pitfalls encountered during integration with a live clinical PACS and the impact on clinical workflow will be discussed. In addition, estimates of the cost of establishing such a solution, as well as the cost charged to users, will be included. Clinical downtime scenarios, such as a scheduled mandatory downtime and an unscheduled downtime due to a disaster event at the main archive, were simulated, and the PACS exams were sent successfully from the offsite ASP storage server back to the hospital PACS in less than one day. The ASP backup archive was able to recover PACS image data for comparison studies with no complex operational procedures.
Furthermore, no image data loss was encountered during the recovery. During any clinical downtime scenario, the ASP backup archive server can repopulate a clinical PACS quickly with the majority of studies available for comparison during the interim until the main PACS archive is fully recovered.
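
The parallel-archiving workflow described above, in which each acquired exam goes to the main PACS archive and the offsite ASP server at the same time, can be sketched as follows. This is a hedged toy illustration: `send_to_archive` stands in for a real DICOM network store, and the archive names are hypothetical.

```python
import threading

def send_to_archive(archive_name, exam_id, results):
    """Stand-in for a DICOM store to one archive; a real system would
    transmit the study over the network here."""
    results[archive_name] = f"{exam_id} stored"

def archive_exam(exam_id, archives):
    """Send one exam to every archive in parallel, mirroring the
    main-plus-ASP-backup workflow described above."""
    results = {}
    threads = [threading.Thread(target=send_to_archive, args=(a, exam_id, results))
               for a in archives]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

status = archive_exam("exam-001", ["main_pacs", "asp_backup"])
```

Because the backup store runs concurrently with, rather than after, the main archive store, the backup never lags the primary copy.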

  20. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released the first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.
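
As a toy illustration of placing an OAIS-style archive behind a service interface, the sketch below exposes an ingest operation (a Submission Information Package in, an Archival Information Package stored with a fixity checksum) and an access operation (a Dissemination Information Package out). The class and field names are illustrative assumptions, not part of the OAIS standard or the authors' design.

```python
import hashlib
import json

class OAISArchiveService:
    """Toy service facade over an OAIS-style archive: ingest accepts a
    Submission Information Package (SIP), stores an Archival Information
    Package (AIP) with a fixity checksum, and access returns a
    Dissemination Information Package (DIP)."""

    def __init__(self):
        self._store = {}  # aip_id -> AIP

    def ingest(self, sip):
        payload = json.dumps(sip["content"], sort_keys=True).encode()
        digest = hashlib.sha256(payload).hexdigest()
        aip_id = digest[:12]
        self._store[aip_id] = {
            "content": sip["content"],
            "metadata": sip.get("metadata", {}),
            "fixity_sha256": digest,  # supports later authenticity checks
        }
        return aip_id

    def access(self, aip_id):
        aip = self._store[aip_id]
        return {"content": aip["content"], "metadata": aip["metadata"]}

svc = OAISArchiveService()
aip_id = svc.ingest({"content": {"temp_c": [1.2, 3.4]}, "metadata": {"site": "X"}})
dip = svc.access(aip_id)
```

A service-based extension in the SOA sense would expose `ingest` and `access` as network endpoints so that cloud resources can call them.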

  1. SODA: Smart Objects, Dumb Archives

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt; Zubair, Mohammad; Shen, Stewart N. T.

    2004-01-01

    We present the Smart Object, Dumb Archive (SODA) model for digital libraries (DLs). The SODA model transfers functionality traditionally associated with archives to the archived objects themselves. We are exploiting this shift of responsibility to facilitate other DL goals, such as interoperability, object intelligence and mobility, and heterogeneity. Objects in a SODA DL negotiate presentation of content and handle their own terms and conditions. In this paper we present implementations of our smart objects, buckets, and our dumb archive (DA). We discuss the status of buckets and DA and how they are used in a variety of DL projects.
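
The division of responsibility in the SODA model can be sketched as a toy pair of classes: the smart object carries its own representations and terms and conditions, while the dumb archive only stores and retrieves by identifier. This is an illustrative sketch under invented names, not the buckets/DA implementation described in the paper.

```python
class SmartObject:
    """Toy 'smart object': the archived object itself holds its content in
    several formats plus its own terms and conditions, so the archive
    only needs to hand the object over."""

    def __init__(self, formats, terms):
        self.formats = formats  # e.g. {"text/plain": ..., "text/html": ...}
        self.terms = terms      # condition the requester must accept

    def negotiate(self, accepted_formats, accepts_terms):
        """Content negotiation and terms enforcement live in the object."""
        if not accepts_terms:
            raise PermissionError(self.terms)
        for fmt in accepted_formats:  # first acceptable format wins
            if fmt in self.formats:
                return fmt, self.formats[fmt]
        raise LookupError("no acceptable representation")

class DumbArchive:
    """The archive just stores and retrieves objects by identifier."""
    def __init__(self):
        self._objects = {}
    def put(self, oid, obj):
        self._objects[oid] = obj
    def get(self, oid):
        return self._objects[oid]

archive = DumbArchive()
archive.put("obj-1", SmartObject(
    {"text/plain": "hello", "text/html": "<p>hello</p>"},
    terms="cite the original authors"))
fmt, body = archive.get("obj-1").negotiate(["text/html", "text/plain"],
                                           accepts_terms=True)
```

Because all presentation logic travels with the object, the same object can move between archives without losing behavior, which is the interoperability and mobility point the abstract makes.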

  2. Understanding the dust cycle at high latitudes: integrating models and observations

    NASA Astrophysics Data System (ADS)

    Albani, S.; Mahowald, N. M.; Maggi, V.; Delmonte, B.; Winckler, G.; Potenza, M. A. C.; Baccolo, G.; Balkanski, Y.

    2017-12-01

    Changing climate conditions affect dust emissions and the global dust cycle, which in turn affects climate and biogeochemistry. Paleodust archives from land, ocean, and ice sheets preserve the history of dust deposition across a range of spatial scales, from close to the major hemispheric sources to remote sinks such as the polar ice sheets. In each hemisphere, common features on the glacial-interglacial time scale mark the baseline evolution of the dust cycle and inspired the hypothesis that increased dust deposition to the ocean stimulated the glacial biological pump, contributing to the reduction of atmospheric carbon dioxide levels. On the other hand, finer geographical- and temporal-scale features are superposed on these glacial-interglacial trends, offering the chance of a more sophisticated understanding of the dust cycle, for instance allowing distinctions in source availability or transport patterns as recorded by different records. As such, paleodust archives can prove invaluable sources of information, especially when characterized by a quantitative estimation of mass accumulation rates and interpreted in connection with climate models. We review our past work and present ongoing research showing how climate models can help in the interpretation of paleodust records, as well as the potential of the same observations for constraining the representation of the global dust cycle embedded in Earth System Models, both in terms of magnitude and of physical parameters related to particle sizes and optical properties. Finally, we show the impacts on climate, based on this kind of observationally constrained model simulation.

  3. The CAnadian Surface Prediction ARchive (CaSPAr): A Platform to Enhance Environmental Modelling in Canada and Globally

    NASA Astrophysics Data System (ADS)

    Tolson, B.; Mai, J.; Kornelsen, K. C.; Coulibaly, P. D.; Anctil, F.; Fortin, V.; Leahy, M.; Hall, B.

    2017-12-01

    Environmental models are tools for modern society with a wide range of applications, such as flood and drought monitoring, carbon storage and release estimates, predictions of power generation amounts, and reservoir management, among others. Environmental models differ in the types of processes they incorporate: land surface models focus on the energy, water, and carbon cycles of the land, while hydrological models concentrate mainly on the water cycle. All these models, however, have in common that they rely on environmental input data from ground observations, such as temperature, precipitation, and/or radiation, to force the model. If the same model is run in forecast mode, numerical weather predictions (NWPs) are needed to replace these ground observations. Therefore, it is critical that NWP data be available to develop models and validate forecast performance. These data are provided by the Meteorological Service of Canada (MSC) on a daily basis. MSC provides multiple products ranging from large-scale global models (~33 km/grid cell) to high-resolution pan-Canadian models (~2.5 km/grid cell). Operational products providing forecasts in real time are made publicly available only at the time of issue through various means, with new forecasts issued 2-4 times per day. Unfortunately, long-term storage of these data is offline and relatively inaccessible to the research and operational communities. The new Canadian Surface Prediction Archive (CaSPAr) platform is an accessible rolling archive of 10 of MSC's NWP products. The 500 TB platform will allow users to extract specific time periods, regions of interest, and variables of interest in an easy-to-access NetCDF format. CaSPAr and community-contributed post-processing scripts and tools are being developed so that users can, for example, interpolate the data according to their needs or auto-generate model forcing files. We will present the CaSPAr platform and provide some insights into the current development of the web-based user interface (frontend) and the implementations used to retrieve MSC's data and deliver them to the user in the required shape (backend).
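
The extraction service described above, selecting a time period, a region of interest, and a variable of interest, can be sketched in pure Python on a toy gridded archive. A production backend would operate on NetCDF files rather than nested lists; all names and values here are invented.

```python
def subset(archive, variable, t_start, t_end, lat_bounds, lon_bounds):
    """Extract one variable over a time window and lat/lon box from a
    toy gridded archive stored as nested lists [time][lat][lon]."""
    times, lats, lons, data = (archive["time"], archive["lat"],
                               archive["lon"], archive[variable])
    ti = [i for i, t in enumerate(times) if t_start <= t <= t_end]
    yi = [j for j, y in enumerate(lats) if lat_bounds[0] <= y <= lat_bounds[1]]
    xi = [k for k, x in enumerate(lons) if lon_bounds[0] <= x <= lon_bounds[1]]
    return [[[data[i][j][k] for k in xi] for j in yi] for i in ti]

# Illustrative two-step, 2x3 grid -- not real NWP output.
archive = {
    "time": [0, 6],                      # forecast hours
    "lat": [45.0, 47.5],
    "lon": [-80.0, -77.5, -75.0],
    "precip": [[[1, 2, 3], [4, 5, 6]],
               [[7, 8, 9], [10, 11, 12]]],
}
box = subset(archive, "precip", 0, 0, (46.0, 48.0), (-78.0, -74.0))
```

The same three index lists (time, latitude, longitude) map directly onto NetCDF dimension slicing, which is why the archive can serve arbitrary user-defined subsets cheaply.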

  4. How Continuous Observations of Shortwave Reflectance Spectra Can Narrow the Range of Shortwave Climate Feedbacks

    NASA Astrophysics Data System (ADS)

    Feldman, D.; Collins, W. D.; Wielicki, B. A.; Shea, Y.; Mlynczak, M. G.; Kuo, C.; Nguyen, N.

    2017-12-01

    Shortwave feedbacks are a persistent source of uncertainty for climate models and a large contributor to the diagnosed range of equilibrium climate sensitivity (ECS) for the international multi-model ensemble. The processes that contribute to these feedbacks affect top-of-atmosphere energetics and produce spectral signatures that may be time-evolving. We explore the value of such spectral signatures for providing an observational constraint on model ECS by simulating top-of-atmosphere shortwave reflectance spectra across much of the energetically-relevant shortwave bandpass (300 to 2500 nm). We present centennial-length shortwave hyperspectral simulations from low, medium, and high ECS models that reported to the CMIP5 archive as part of an Observing System Simulation Experiment (OSSE) in support of the CLimate Absolute Radiance and Refractivity Observatory (CLARREO). Our framework interfaces with CMIP5 archive results and is agnostic to the choice of model. We simulated spectra from the INM-CM4 model (ECS of 2.08 K/2xCO2), the MIROC5 model (ECS of 2.70 K/2xCO2), and the CSIRO Mk3-6-0 model (ECS of 4.08 K/2xCO2) based on those models' integrations of the RCP8.5 scenario for the 21st century. This approach allows us to explore how perfect data records can exclude models of lower or higher climate sensitivity. We find that spectral channels covering visible and near-infrared water-vapor overtone bands can potentially exclude a low or high sensitivity model with under 15 years of absolutely-calibrated data. These different spectral channels are sensitive to model cloud radiative effect and cloud height changes, respectively. These unprecedented calculations lay the groundwork for spectral simulations of perturbed-physics ensembles in order to identify those shortwave observations that can help narrow the range in shortwave model feedbacks and ultimately help reduce the stubbornly-large range in model ECS.

  5. Throughfall Displacement Experiment (TDE) Ecosystem Model Intercomparison Project Data Archive

    DOE Data Explorer

    Hanson, Paul J. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Amthor, Jeffrey S. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Baldocchi, Dennis D. [University of California, Berkeley; Grant, Robert F. [University of Alberta, Canada; Hartley, Anne E. [Ohio State University; Hui, Dafeng [University of Oklahoma; Hunt, Jr., E. Raymond [Agricultural Research Service, U.S. Department of Agriculture; Johnson, Dale W. [University of Nevada, Reno; Kimball, John S. [University of Montana; King, Anthony W. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA); Luo, Yiqi [University of Oklahoma; McNulty, Steven G. [Southern Global Change Program, U.S. Forest Service, U.S. Department of Agriculture; Sun, Ge [North Carolina State University, Raleigh, NC (USA); Thornton, Peter E. [University of Montana; Wang, Shusen [Geomatics Canada - Canada Centre for Remote Sensing Natural Resources, Canada; Williams, Matthew [University of Edinburgh, United Kingdom; Wilson, Kell B. [National Oceanic and Atmospheric Administration, U.S. Department of Commerce; Wullschleger, Stanley D. [Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (USA)

    2002-08-01

    This archive provides and documents data from a project whose purpose is to compare the output of various ecosystem models when they are run with the data from the Throughfall Displacement Experiment (TDE) at Walker Branch Watershed, Oak Ridge, Tennessee. The project is not designed to determine which models are "best" for diagnosis (i.e., explaining the current functioning of the system) or prognosis (i.e., predicting the response of the system to future conditions), but, rather, to clarify similarities and differences among the models and their components, so that all models can be improved. Data archive: ftp://cdiac.ornl.gov/ftp/tdemodel/. TDE data archive web site: http://cdiac.ess-dive.lbl.gov/epubs/ndp/ndp078a/ndp078a.html.
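
A minimal sketch of the intercomparison idea described above, in the spirit of clarifying similarities and differences rather than ranking models: compute each model's mean absolute deviation from the multi-model mean over a common output series. The model names and values are invented for illustration.

```python
def multi_model_mean(outputs):
    """Element-wise mean across models run on the same forcing data."""
    n = len(outputs)
    length = len(next(iter(outputs.values())))
    return [sum(series[i] for series in outputs.values()) / n
            for i in range(length)]

def deviation_from_mean(outputs):
    """Mean absolute deviation of each model from the multi-model mean,
    highlighting similarities and differences among models."""
    mean = multi_model_mean(outputs)
    return {name: sum(abs(v - m) for v, m in zip(series, mean)) / len(mean)
            for name, series in outputs.items()}

# Illustrative annual evapotranspiration (mm) from three hypothetical models.
outputs = {
    "model_1": [600.0, 620.0, 580.0],
    "model_2": [640.0, 660.0, 620.0],
    "model_3": [560.0, 580.0, 540.0],
}
devs = deviation_from_mean(outputs)
```

Models with similar deviations cluster together, pointing analysts toward the shared components that deserve closer comparison.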

  6. Improvements in Space Geodesy Data Discovery at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, C.; Pollack, N.; Michael, P.

    2011-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. Several activities are currently under development at the CDDIS to aid users in data discovery, both within the current community and beyond. The CDDIS is cooperating in the development of Geodetic Seamless Archive Centers (GSAC) with colleagues at UNAVCO and SIO. This activity will provide web services to facilitate data discovery within and across participating archives. In addition, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. These enhancements will permit information about CDDIS archive holdings to be made available through other data portals such as the Earth Observing System (EOS) Clearinghouse (ECHO) and integrated into the Global Geodetic Observing System (GGOS) portal.

  7. Lessons learned from planetary science archiving

    NASA Astrophysics Data System (ADS)

    Zender, J.; Grayzeck, E.

    2006-01-01

    The need for scientific archiving of past, current, and future planetary scientific missions, laboratory data, and modeling efforts is indisputable. To quote G. Santayana's words carved over the entrance of the US National Archives in Washington, DC: “Those who cannot remember the past are doomed to repeat it.” The design, implementation, maintenance, and validation of planetary science archives are, however, disputed by the involved parties, and the inclusion of the archives into the scientific heritage is problematic. For example, there is an imbalance between space agency requirements and institutional and national interests. The disparity between long-term archive requirements and immediate data analysis requests is significant. The discrepancy between a space mission's archive budget and the effort required to design and build the data archive is large. An imbalance exists between new instrument development and existing, well-proven archive standards. The authors present their view on the problems and risk areas in archiving concepts, based on experience acquired within NASA's Planetary Data System (PDS) and ESA's Planetary Science Archive (PSA). Individual risks and potential problem areas are discussed based on a model derived from an upfront system analysis. The major risk for a planetary mission science archive is seen in the combination of minimal involvement by Mission Scientists and inadequate funding. The authors outline how these risks can be reduced. The paper ends with the authors' view on future planetary archive implementations, including the archive interoperability aspect.

  8. Environmental System Science Data Infrastructure for a Virtual Ecosystem (ESS-DIVE) - A New U.S. DOE Data Archive

    NASA Astrophysics Data System (ADS)

    Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.

    2017-12-01

    The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE-sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) focus on implementation of data standards in collaboration with the community; (4) support for community-built tools for data search, interpretation, analysis, and visualization; (5) a data fusion database supporting search across data extracted from submitted packages and data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility, with replicas at DataONE nodes.
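
Two of the capabilities mentioned above, checking a submitted data package against a minimum metadata bar and archive-style versioning, can be sketched as follows. The required-field list and package structure are illustrative assumptions, not the actual ESS-DIVE schema.

```python
REQUIRED_FIELDS = {"title", "creators", "abstract", "data_files"}

def validate_package(package):
    """Return a list of problems with a submitted data package; an empty
    list means the package meets the (illustrative) minimum metadata bar."""
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - package.keys())]
    if not package.get("data_files"):
        problems.append("package contains no data files")
    return problems

def new_version(package):
    """Archive-style versioning: a revised package gets an incremented
    version number while its identifier stays stable."""
    revised = dict(package)
    revised["version"] = package.get("version", 1) + 1
    return revised

pkg = {"title": "Soil respiration at site X", "creators": ["A. Author"],
       "abstract": "Hourly flux data.", "data_files": ["flux.csv"],
       "version": 1}
problems = validate_package(pkg)
pkg_v2 = new_version(pkg)
```

Validating at submission time is what lets an archive incentivize well-structured contributions before a DOI is minted.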

  9. Conceptual Data Visualization in Archival Finding Aids: Preliminary User Responses

    ERIC Educational Resources Information Center

    Bahde, Anne

    2017-01-01

    This paper explores possibilities for marrying data visualization to online archival finding aids, which have continually suffered from usability issues in their long history. This paper describes a project in which two different data visualization models were built to replace sections of an archival finding aid. Users were then shown the models,…

  10. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control form the foundation of AstroCloud. Across the entire data life cycle, the archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.

  11. A new archival infrastructure for highly-structured astronomical data

    NASA Astrophysics Data System (ADS)

    Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo

    2018-03-01

    With the advent of the 2020 Radio Astronomy Telescopes era, the amount and format of radioastronomical data are becoming a massive and performance-critical challenge. Such an evolution of data models and data formats requires new data archiving techniques that allow massive, fast storage of data that can at the same time be processed efficiently. Useful expertise in efficient archiving has been obtained through data archiving of the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats, such as FITS, MBFITS, and VLBI's XML, which includes description and ancillary files. The modeling and architecture of the archive fulfill the requirements of both data persistence and easy data discovery and exploitation. The archive already complies with the Virtual Observatory directives, so future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.

  12. Searching for Unresolved Binary Brown Dwarfs

    NASA Astrophysics Data System (ADS)

    Albretsen, Jacob; Stephens, Denise

    2007-10-01

    There are currently L and T brown dwarfs (BDs) with errors in their classification of +/- 1 to 2 spectral types. Metallicity and gravitational differences have accounted for some of these discrepancies, and recent studies have shown that unresolved binary BDs may offer some explanation as well. However, limitations in technology and resources often make it difficult to clearly resolve an object that may be binary in nature. Stephens and Noll (2006) identified statistically strong binary source candidates from Hubble Space Telescope (HST) images of Trans-Neptunian Objects (TNOs) that were apparently unresolved, using model point-spread functions for single and binary sources. The HST archive contains numerous observations of BDs using the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) that have never been rigorously analyzed for binary properties. Using the methods developed by Stephens and Noll (2006), BD observations from the HST data archive are being analyzed for possible unresolved binaries. Preliminary results will be presented. This technique will identify potential candidates for future observations to determine orbital information.
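
The single-versus-binary model comparison can be illustrated with a one-dimensional toy: fit an "observed" blended profile with a single point-spread function and with a binary (two-component) model, and flag the object when the binary model fits markedly better. This idealized sketch uses Gaussian PSFs and invented parameters; it is not the Stephens and Noll (2006) pipeline.

```python
import math

def gaussian_psf(x, center, amplitude, sigma=1.0):
    """Toy 1-D point-spread function."""
    return amplitude * math.exp(-0.5 * ((x - center) / sigma) ** 2)

def residual(observed, model_profile):
    """Sum of squared residuals between observed and model profiles."""
    return sum((o - m) ** 2 for o, m in zip(observed, model_profile))

xs = [0.1 * i for i in range(-40, 41)]
# Illustrative "observed" profile: an unresolved blended pair (invented,
# noiseless data -- not real NICMOS photometry).
observed = [gaussian_psf(x, -0.4, 1.0) + gaussian_psf(x, 0.4, 0.7) for x in xs]

# Best single-source fit over a coarse brute-force grid of centers/amplitudes.
single = min(
    residual(observed, [gaussian_psf(x, c, a) for x in xs])
    for c in [0.05 * k for k in range(-10, 11)]
    for a in [1.2, 1.4, 1.6, 1.7, 1.8])

# Binary model evaluated at the true configuration fits perfectly here.
binary = residual(observed,
                  [gaussian_psf(x, -0.4, 1.0) + gaussian_psf(x, 0.4, 0.7)
                   for x in xs])

is_binary_candidate = binary < single
```

With real, noisy images the comparison would use proper chi-square statistics and a significance threshold rather than a strict inequality.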

  13. The amplitude of decadal to multidecadal variability in precipitation simulated by state-of-the-art climate models

    NASA Astrophysics Data System (ADS)

    Ault, T. R.; Cole, J. E.; St. George, S.

    2012-11-01

    We assess the magnitude of decadal to multidecadal (D2M) variability in Coupled Model Intercomparison Project 5 (CMIP5) simulations that will be used to understand, and plan for, climate change as part of the Intergovernmental Panel on Climate Change's 5th Assessment Report. Model performance on D2M timescales is evaluated using metrics designed to characterize the relative and absolute magnitude of variability at these frequencies. In observational data, we find that between 10% and 35% of the total variance occurs on D2M timescales. Regions characterized by the high end of this range include Africa, Australia, western North America, and the Amazon region of South America. In these areas D2M fluctuations are especially prominent and linked to prolonged drought. D2M fluctuations account for considerably less of the total variance (between 5% and 15%) in the CMIP5 archive of historical (1850-2005) simulations. The discrepancy between observation- and model-based estimates of D2M prominence reflects two features of the CMIP5 archive. First, interannual components of variability are generally too energetic. Second, decadal components are too weak in several key regions. Our findings imply that projections of the future lack sufficient decadal variability, presenting a limited view of prolonged drought and pluvial risk.
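
One simple way to estimate the fraction of variance on D2M timescales, in the spirit of the metrics described above, is to low-pass a series with a roughly decadal moving average and take the ratio of the smoothed variance to the total variance. The window length and the synthetic series below are illustrative assumptions, not the paper's exact metric.

```python
import math

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def moving_average(xs, window):
    """Centered moving average; drops window//2 points at each end."""
    h = window // 2
    return [sum(xs[i - h:i + h + 1]) / window for i in range(h, len(xs) - h)]

def d2m_variance_fraction(series, window=11):
    """Fraction of total variance in the low-pass (decadal-scale)
    component, estimated with an 11-point moving average on annual data."""
    return variance(moving_average(series, window)) / variance(series)

# Illustrative annual series: a 20-yr oscillation plus a stronger-than-
# decadal interannual component -- not observed precipitation.
series = [math.sin(2 * math.pi * t / 20.0)
          + 0.5 * math.sin(2 * math.pi * t / 2.5) for t in range(100)]
frac = d2m_variance_fraction(series)
```

A spectral estimate would be more precise, but the moving-average ratio already separates a model whose interannual band is too energetic from one with realistic decadal power.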

  14. Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe

    2007-01-01

    A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for, and draft an information model based on, the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use case scenarios was first developed, from which a set of requirements was derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the PDS. The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources, including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.
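
The role a formal data dictionary plays in archive standards like these can be sketched as label validation: check that a product label carries the required elements, no unknown elements, and values of the declared types. The element names and types below are invented for illustration, not actual Planetary Science Data Dictionary entries.

```python
# Illustrative element definitions -- hypothetical, not real PSDD contents.
DATA_DICTIONARY = {
    "INSTRUMENT_NAME": str,
    "TARGET_NAME": str,
    "EXPOSURE_DURATION": float,
}
REQUIRED = {"INSTRUMENT_NAME", "TARGET_NAME"}

def validate_label(label):
    """Check a product label against the dictionary: required elements
    present, no unknown elements, and values of the declared type."""
    errors = [f"missing: {k}" for k in sorted(REQUIRED - label.keys())]
    for key, value in sorted(label.items()):
        if key not in DATA_DICTIONARY:
            errors.append(f"unknown element: {key}")
        elif not isinstance(value, DATA_DICTIONARY[key]):
            errors.append(f"bad type for {key}")
    return errors

label = {"INSTRUMENT_NAME": "CAM-1", "TARGET_NAME": "MARS",
         "EXPOSURE_DURATION": 1.5}
errors = validate_label(label)
```

Driving validation from a machine-readable dictionary is what makes the same labels checkable by software at every participating agency.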

  15. OceanNOMADS: Real-time and retrospective access to operational U.S. ocean prediction products

    NASA Astrophysics Data System (ADS)

    Harding, J. M.; Cross, S. L.; Bub, F.; Ji, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Operational Model Archive and Distribution System (NOMADS) provides both real-time and archived atmospheric model output from servers at the National Centers for Environmental Prediction (NCEP) and the National Climatic Data Center (NCDC), respectively (http://nomads.ncep.noaa.gov/txt_descriptions/marRutledge-1.pdf). The NOAA National Ocean Data Center (NODC), with NCEP, is developing a complementary capability called OceanNOMADS for operational ocean prediction models. An NCEP ftp server currently provides real-time ocean forecast output (http://www.opc.ncep.noaa.gov/newNCOM/NCOM_currents.shtml), with retrospective access through NODC. A joint effort between the Northern Gulf Institute (NGI; a NOAA Cooperative Institute) and the NOAA National Coastal Data Development Center (NCDDC; a division of NODC) created the developmental version of the retrospective OceanNOMADS capability (http://www.northerngulfinstitute.org/edac/ocean_nomads.php) under the NGI Ecosystem Data Assembly Center (EDAC) project (http://www.northerngulfinstitute.org/edac/). Complementary funding support for the developmental OceanNOMADS from the U.S. Integrated Ocean Observing System (IOOS), through the Southeastern University Research Association (SURA) Model Testbed (http://testbed.sura.org/), this past year provided NODC the analogue that facilitated the creation of an NCDDC production version of OceanNOMADS (http://www.ncddc.noaa.gov/ocean-nomads/). Access tool development and storage of initial archival data sets occur on the NGI/NCDDC developmental servers, with transition to NODC/NCDDC production servers as the model archives mature and operational space and distribution capability grow. Navy operational global ocean forecast subsets for U.S. waters comprise the initial ocean prediction fields resident on the NCDDC production server.
The NGI/NCDDC developmental server currently includes the Naval Research Laboratory Intra-Americas Sea Nowcast/Forecast System over the Gulf of Mexico from 2004 to March 2011, the operational Naval Oceanographic Office (NAVOCEANO) regional USEast ocean nowcast/forecast system from early 2009 to present, and the NAVOCEANO operational regional AMSEAS (Gulf of Mexico/Caribbean) ocean nowcast/forecast system from its inception on 25 June 2010 to present. AMSEAS provided one of the real-time ocean forecast products accessed by NOAA's Office of Response and Restoration from the NGI/NCDDC developmental OceanNOMADS during the Deepwater Horizon oil spill last year. The developmental server also includes archived, real-time Navy coastal forecast products off coastal Japan in support of joint U.S./Japanese efforts following the 2011 tsunami. Real-time NAVOCEANO output from regional prediction systems off Southern California and around Hawaii, currently available on the NCEP ftp server, is scheduled for archival on the developmental OceanNOMADS by late 2011, along with the next-generation Navy/NOAA global ocean prediction output. Accession and archival of additional regions is planned as server capacities increase.

  16. Meteorology Research in DOE's Atmosphere to Electrons (A2e) Program

    NASA Astrophysics Data System (ADS)

    Cline, J.; Haupt, S. E.; Shaw, W. J.

    2017-12-01

    DOE's Atmosphere to Electrons (A2e) program is performing cutting-edge research to allow optimization of wind plants. This talk will summarize the atmospheric science portion of A2e, with an overview of recent and planned observation and modeling projects designed to bridge the terra incognita between the mesoscale and the microscales that affect wind plants. A2e is a major focus of the Wind Energy Technologies Office (WETO) within the Office of Energy Efficiency & Renewable Energy (EERE) at the DOE. The overall objective of A2e is to optimize wind power production by integrating improved knowledge of atmospheric inflow (fuel), turbine and plant aerodynamics, and control systems. The atmospheric component of the work addresses both the need for improved forecasting of hub-height winds and the need for improved turbulence characterization for turbine inflows under realistic atmospheric conditions and terrain. Several projects to address observations of meteorological variables in regions not typically observed will be discussed. The modeling needs are addressed through major multi-institutional integrated studies comprising both theoretical and numerical advances to improve models and field observations for physical insight. Model improvements are subjected to formal verification and validation, and numerical and observational data are archived and disseminated to the public through the A2e Data Archive and Portal (DAP; http://a2e.energy.gov). The overall outcome of this work will be increased annual energy production from wind plants and improved turbine lifetimes through a better understanding of atmospheric loading. We will briefly describe major components of the atmospheric part of the A2e strategy and the work being done and planned.

  17. Operationally Monitoring Sea Ice at the Canadian Ice Service

    NASA Astrophysics Data System (ADS)

    de Abreu, R.; Flett, D.; Carrieres, T.; Falkingham, J.

    2004-05-01

The Canadian Ice Service (CIS) of the Meteorological Service of Canada promotes safe and efficient maritime operations and protects Canada's environment by providing reliable and timely information about ice and iceberg conditions in Canadian waters. Daily and seasonal charts describing the extent, type and concentration of sea ice and icebergs are provided to support navigation and other activities (e.g. oil and gas) in coastal waters. The CIS relies on a suite of spaceborne visible, infrared and microwave sensors to operationally monitor ice conditions in Canadian coastal and inland waterways. These efforts are complemented by operational sea ice models that are customized and run at the CIS. These data constitute a 35-year archive of ice conditions and have proven to be a valuable dataset for historical sea ice analysis. This presentation will describe the daily integration of remote sensing observations and modelled ice conditions used to produce ice and iceberg products. A review of the decadal evolution of this process will be presented, as well as a glimpse into the future of ice and iceberg monitoring. Examples of the utility of the CIS digital sea ice archive for climate studies will also be presented.

  18. The Panchromatic STARBurst IRregular Dwarf Survey (STARBIRDS): Observations and Data Archive

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D.

    2015-06-01

Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations of a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available in which all images are cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging. Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESA), and the Canadian Astronomy Data Centre (CADC/NRC/CSA).

  19. The Role of Data Archives in Synoptic Solar Physics

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin

    The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations, that is, a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple databases, independently maintained, to be seamlessly joined, allowing complex searches spanning multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.

  20. The digital archive of the International Halley Watch

    NASA Technical Reports Server (NTRS)

    Klinglesmith, D. A., III; Niedner, M. B.; Grayzeck, E.; Aronsson, M.; Newburn, R. L.; Warnock, A., III

    1992-01-01

The International Halley Watch was established to coordinate, collect, archive, and distribute the scientific data from Comet P/Halley that would be obtained from both the ground and space. This paper describes one of the end products of that effort, namely the IHW Digital Archive. The IHW Digital Archive consists of 26 CD-ROMs containing over 32 gigabytes of data from the 9 IHW disciplines as well as data from the 5 spacecraft missions flown to comets P/Halley and P/Giacobini-Zinner. The total archive contains over 50,000 observations by 1,500 observers from at least 40 countries. The first 24 CDs, which are currently available, contain data from the 9 IHW disciplines. The two remaining CDs will have the spacecraft data and should be available within the next year. A test CD-ROM of these data has been created and is currently under review.

  1. Observational and modeling constraints on global anthropogenic enrichment of mercury.

    PubMed

    Amos, Helen M; Sonke, Jeroen E; Obrist, Daniel; Robins, Nicholas; Hagan, Nicole; Horowitz, Hannah M; Mason, Robert P; Witt, Melanie; Hedgecock, Ian M; Corbitt, Elizabeth S; Sunderland, Elsie M

    2015-04-07

Centuries of anthropogenic releases have resulted in a global legacy of mercury (Hg) contamination. Here we use a global model to quantify the impact of uncertainty in Hg atmospheric emissions and cycling on anthropogenic enrichment and discuss implications for future Hg levels. The plausibility of sensitivity simulations is evaluated against multiple independent lines of observation, including natural archives and direct measurements of present-day environmental Hg concentrations. It has been previously reported that pre-industrial enrichments recorded in sediment and peat disagree by more than a factor of 10. We find this difference is largely spurious, caused by comparing peat and sediment against different reference time periods. After correcting this inconsistency, median enrichment in Hg accumulation since the pre-industrial period (1760 to 1880) is a factor of 4.3 for peat and 3.0 for sediment. Pre-industrial accumulation in peat and sediment is a factor of ∼5 greater than in the precolonial era (3000 BC to 1550 AD). Model scenarios that omit atmospheric emissions of Hg from early mining are inconsistent with observational constraints on the present-day atmospheric, oceanic, and soil Hg reservoirs, as well as the magnitude of enrichment in archives. Future reductions in anthropogenic emissions will initiate a decline in atmospheric concentrations within 1 year, but stabilization of subsurface and deep ocean Hg levels requires aggressive controls. These findings are robust to the ranges of uncertainty in past emissions and Hg cycling.
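The reference-window correction at the heart of this abstract can be illustrated with a minimal sketch. The function name, time windows, and data below are hypothetical; the point is only that every archive must be ratioed against the same reference window before enrichment factors are compared:

```python
def enrichment_factor(dates, flux, modern=(1990, 2010), reference=(1760, 1880)):
    """Enrichment factor for an Hg accumulation record: the mean flux in a
    modern window divided by the mean flux in a shared pre-industrial
    reference window. Comparing archives against *different* reference
    windows produces the kind of spurious factor-of-10 disagreement the
    abstract describes."""
    def window_mean(lo, hi):
        vals = [f for d, f in zip(dates, flux) if lo <= d <= hi]
        return sum(vals) / len(vals)
    return window_mean(*modern) / window_mean(*reference)
```

For example, a record with mean pre-industrial flux 1.0 and mean modern flux 4.0 (in any consistent units) yields an enrichment factor of 4.0, regardless of the absolute calibration of the archive.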

  2. Characterizing the LANDSAT Global Long-Term Data Record

    NASA Technical Reports Server (NTRS)

    Arvidson, T.; Goward, S. N.; Williams, D. L.

    2006-01-01

    The effects of global climate change are fast becoming politically, sociologically, and personally important: increasing storm frequency and intensity, lengthening cycles of drought and flood, expanding desertification and soil salinization. A vital asset in the analysis of climate change on a global basis is the 34-year record of Landsat imagery. In recognition of its increasing importance, a detailed analysis of the Landsat observation coverage within the US archive was commissioned. Results to date indicate some unexpected gaps in the US-held archive. Fortunately, throughout the Landsat program, data have been downlinked routinely to International Cooperator (IC) ground stations for archival, processing, and distribution. These IC data could be combined with the current US holdings to build a nearly global, annual observation record over this 34-year period. Today, we have inadequate information as to which scenes are available from which IC archives. Our best estimate is that there are over four million digital scenes in the IC archives, compared with the nearly two million scenes held in the US archive. This vast pool of Landsat observations needs to be accurately documented, via metadata, to determine the existence of complementary scenes and to characterize the potential scope of the global Landsat observation record. Of course, knowing the extent and completeness of the data record is but the first step. It will be necessary to assure that the data record is easy to use, internally consistent in terms of calibration and data format, and fully accessible in order to fully realize its potential.

  3. Metadata Design in the New PDS4 Standards - Something for Everybody

    NASA Astrophysics Data System (ADS)

    Raugh, Anne C.; Hughes, John S.

    2015-11-01

    The Planetary Data System (PDS) archives, supports, and distributes data of diverse targets, from diverse sources, to diverse users. One of the core problems addressed by the PDS4 data standard redesign was that of metadata - how to accommodate the increasingly sophisticated demands of search interfaces, analytical software, and observational documentation into label standards without imposing limits and constraints that would impinge on the quality or quantity of metadata that any particular observer or team could supply. And yet, as an archive, PDS must have detailed documentation for the metadata in the labels it supports, or the institutional knowledge encoded into those attributes will be lost - putting the data at risk.The PDS4 metadata solution is based on a three-step approach. First, it is built on two key ISO standards: ISO 11179 "Information Technology - Metadata Registries", which provides a common framework and vocabulary for defining metadata attributes; and ISO 14721 "Space Data and Information Transfer Systems - Open Archival Information System (OAIS) Reference Model", which provides the framework for the information architecture that enforces the object-oriented paradigm for metadata modeling. Second, PDS has defined a hierarchical system that allows it to divide its metadata universe into namespaces ("data dictionaries", conceptually), and more importantly to delegate stewardship for a single namespace to a local authority. This means that a mission can develop its own data model with a high degree of autonomy and effectively extend the PDS model to accommodate its own metadata needs within the common ISO 11179 framework. 
Finally, within a single namespace - even the core PDS namespace - existing metadata structures can be extended and new structures added to the model as new needs are identified. This poster illustrates the PDS4 approach to metadata management and highlights the expected return on the development investment for PDS, users, and data preparers.
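The namespace-delegation idea described above can be sketched in miniature. This is not PDS software; the class, namespace, and attribute names are hypothetical illustrations of the ISO 11179-style registry pattern, in which a central framework defines attributes while stewardship of each namespace is delegated to a local authority:

```python
class MetadataRegistry:
    """Toy registry: attribute definitions grouped into namespaces,
    each namespace owned by a delegated steward."""

    def __init__(self):
        self._spaces = {}    # namespace -> {attribute: definition}
        self._stewards = {}  # namespace -> steward

    def register_namespace(self, ns, steward):
        self._spaces.setdefault(ns, {})
        self._stewards[ns] = steward

    def define(self, ns, attribute, definition):
        if ns not in self._spaces:
            raise KeyError(f"namespace {ns!r} not registered")
        self._spaces[ns][attribute] = definition

    def lookup(self, qualified_name):
        """Resolve 'namespace:attribute' to (steward, definition)."""
        ns, attribute = qualified_name.split(":", 1)
        return self._stewards[ns], self._spaces[ns][attribute]

reg = MetadataRegistry()
reg.register_namespace("pds", "PDS Engineering Node")
reg.register_namespace("insight", "InSight mission")  # delegated stewardship
reg.define("pds", "start_date_time", "UTC start of the observation")
reg.define("insight", "seismometer_mode", "Operating mode of SEIS")
```

A mission extends the model simply by defining attributes in its own namespace; core attributes remain under central stewardship and are looked up through the same interface.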

  4. Long-term archiving and data access: modelling and standardization

    NASA Technical Reports Server (NTRS)

    Hoc, Claude; Levoir, Thierry; Nonon-Latapie, Michel

    1996-01-01

This paper reports on the multiple difficulties inherent in the long-term archiving of digital data, and in particular on the different possible causes of definitive data loss. It defines the basic principles which must be respected when creating long-term archives. Such principles concern both the archival systems and the data. The archival systems should have two primary qualities: independence of architecture with respect to technological evolution, and generic-ness, i.e., the capability of ensuring identical service for heterogeneous data. These characteristics are implicit in the Reference Model for Archival Services, currently being designed within an ISO-CCSDS framework. A system prototype has been developed at the French Space Agency (CNES) in conformance with these principles, and its main characteristics will be discussed in this paper. Moreover, the data archived should be capable of abstract representation regardless of the technology used, and should, to the extent that it is possible, be organized, structured and described with the help of existing standards. The immediate advantage of standardization is illustrated by several concrete examples. Both the positive facets and the limitations of this approach are analyzed. The advantages of developing an object-oriented data model within this context are then examined.

  5. The LCOGT Observation Portal, Data Pipeline and Science Archive

    NASA Astrophysics Data System (ADS)

    Lister, Tim; LCOGT Science Archive Team

    2014-01-01

    Las Cumbres Observatory Global Telescope (LCOGT) is building and deploying a world-wide network of optical telescopes dedicated to time-domain astronomy. During 2012-2013, we successfully deployed and commissioned nine new 1m telescopes at McDonald Observatory (Texas), CTIO (Chile), SAAO (South Africa) and Siding Spring Observatory (Australia). New, improved cameras and additional telescopes will be deployed during 2014. To enable the diverse LCOGT user community of scientific and educational users to request observations on the LCOGT Network and to see their progress and get access to their data, we have developed an Observation Portal system. This Observation Portal integrates proposal submission and observation requests with seamless access to the data products from the data pipelines in near-realtime and long-term products from the Science Archive. We describe the LCOGT Observation Portal and the data pipeline, currently in operation, which makes use of the ORAC-DR automated recipe-based data reduction pipeline and illustrate some of the new data products. We also present the LCOGT Science Archive, which is being developed in partnership with the Infrared Processing and Analysis Center (IPAC) and show some of the new features the Science Archive provides.

  6. The Operation and Architecture of the Keck Observatory Archive

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Gelino, C. R.; Laity, A.; Kong, M.; Swain, M.; Holt, J.; Goodrich, R.; Mader, J.; Tran, H. D.

    2014-05-01

The Infrared Processing and Analysis Center (IPAC) and the W. M. Keck Observatory (WMKO) are collaborating to build an archive for the twin 10-m Keck Telescopes, located near the summit of Mauna Kea. The Keck Observatory Archive (KOA) takes advantage of IPAC's long experience with managing and archiving large and complex data sets from active missions and serving them to the community, and of the Observatory's knowledge of the operation of its sophisticated instrumentation and the organization of the data products. By the end of 2013, KOA will contain data from all eight active observatory instruments, with an anticipated volume of 28 TB. The data include raw science observations, quick-look products, weather information, and, for some instruments, reduced and calibrated products. The inclusion of data from all instruments has driven a rapid expansion of the archive's holdings: data from four new instruments have already been added since October 2012. One more active instrument, the integral field spectrograph OSIRIS, is scheduled for ingestion in December 2013. After preparation for ingestion into the archive, the data are transmitted electronically from WMKO to IPAC for curation in the physical archive. This process includes validation of the science content of the data and verification that data were not corrupted in transmission. The archived data include both newly-acquired observations and all previously acquired observations. The older data extend back to the date of instrument commissioning; for some instruments, such as HIRES, as far back as 1994. KOA will continue to ingest all newly obtained observations, at an anticipated volume of 4 TB per year, and plans to ingest data from two decommissioned instruments.
Access to these data is governed by a data use policy that guarantees Principal Investigators (PIs) exclusive access to their data for at least 18 months, and allows for extensions as granted by institutional Selecting Officials. Approximately one-half of the data in the archive are public. The archive architecture takes advantage of existing software and is designed for sustainability. The data preparation and quality assurance software exploits the software infrastructure at WMKO, and the physical archive at IPAC re-uses the portable, component-based architecture developed originally for the Infrared Science Archive, with extensions custom to KOA as needed. We will discuss the science services available to end-users. These include web and program query interfaces, interactive tabulation of data and metadata, association of calibration files with science files, and interactive visualization of data products. We will discuss how the growth in the archive holdings has led to a growth in usage and published science results. Finally, we will discuss the future of KOA, including the provision of data reduction pipelines and interoperability with world-wide archives and data centers, including VO-compliant services.
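The abstract does not say how transmission integrity is verified; one common approach is to compare a cryptographic digest computed at the source with one computed after transfer. A minimal sketch, with hypothetical function names and SHA-256 as an assumed (not documented) choice of digest:

```python
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks, so large FITS
    files need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(path, expected_digest):
    """Compare the received file's digest with the digest computed at
    the source; a mismatch indicates corruption in transmission."""
    return file_sha256(path) == expected_digest
```

In practice the source-side digest would travel with the file (e.g. in a manifest), and ingestion would be rejected or retried on mismatch.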

  7. Validation of the 1/12 degrees Arctic Cap Nowcast/Forecast System (ACNFS)

    DTIC Science & Technology

    2010-11-04

    IBM Power 6 ( Davinci ) at NAVOCEANO with a 2 hr time step for the ice model and a 30 min time step for the ocean model. All model boundaries are...run using 320 processors on the Navy DSRC IBM Power 6 ( Davinci ) at NAVOCEANO. A typical one-day hindcast takes approximately 1.0 wall clock hour...meter. As more observations become available, further studies of ice draft will be used as a validation tool . The IABP program archived 102 Argos

  8. Validation of the 1/12 deg Arctic Cap Nowcast/Forecast System (ACNFS)

    DTIC Science & Technology

    2010-11-04

    IBM Power 6 ( Davinci ) at NAVOCEANO with a 2 hr time step for the ice model and a 30 min time step for the ocean model. All model boundaries are...run using 320 processors on the Navy DSRC IBM Power 6 ( Davinci ) at NAVOCEANO. A typical one-day hindcast takes approximately 1.0 wall clock hour...meter. As more observations become available, further studies of ice draft will be used as a validation tool . The IABP program archived 102 Argos

  9. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    PubMed Central

    Sippel, Sebastian; Mahecha, Miguel D.; Hauhs, Michael; Bodesheim, Paul; Kaminski, Thomas; Gans, Fabian; Rosso, Osvaldo A.

    2016-01-01

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. We demonstrate here that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations is captured well. 
Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: The measures are largely insensitive to climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics. PMID:27764187
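One widely used information-theory quantifier of this kind is the Bandt-Pompe permutation entropy; the paper's actual quantifiers may differ, but a minimal sketch of the idea looks like this (function name and window order are illustrative):

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy in [0, 1].

    Slides a window of length `order` over the series, maps each window
    to its ordinal pattern (the permutation that sorts it), and computes
    the Shannon entropy of the pattern distribution, normalized by
    log(order!). Low values indicate regular dynamics; values near 1
    indicate noise-like dynamics."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(series) - order + 1
    for i in range(n):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    probs = [c / n for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))
```

A strictly increasing series yields entropy 0 (one ordinal pattern), while an irregular series approaches 1; because the measure depends only on ordinal patterns, it is robust to monotone calibration differences between a model and an observations-based GPP product.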

  10. The new European Hubble archive

    NASA Astrophysics Data System (ADS)

    De Marchi, Guido; Arevalo, Maria; Merin, Bruno

    2016-01-01

The European Hubble Archive (hereafter eHST), hosted at ESA's European Space Astronomy Centre, was released for public use in October 2015. The eHST is now fully integrated with the other ESA science archives to ensure long-term preservation of the Hubble data, consisting of more than 1 million observations from 10 different scientific instruments. The public HST data, the Hubble Legacy Archive, and the high-level science data products are now all available to scientists through a single, carefully designed and user-friendly web interface. In this talk, I will show how the eHST can help boost archival research, including how to search for sources in the field of view thanks to precise footprints projected onto the sky, how to obtain enhanced previews of imaging data and interactive spectral plots, and how to directly link observations with already published papers. To maximise the scientific exploitation of Hubble's data, the eHST offers connectivity to virtual observatory tools, easily integrates with the recently released Hubble Source Catalog, and is fully accessible through ESA's archives multi-mission interface.

  11. The SpaceInn SISMA archive

    NASA Astrophysics Data System (ADS)

    Rainer, Monica; Poretti, Ennio; Mistò, Angelo; Rosa Panzera, Maria

    2017-10-01

The Spectroscopic Indicators in a SeisMic Archive (SISMA) has been built in the framework of the FP7 SpaceInn project to contain the 7013 HARPS spectra observed during the CoRoT asteroseismic ground-based program, along with their variability and asteroseismic indicators. The spectra pertain to 261 stars spread around the whole Hertzsprung-Russell diagram: 72 of them were CoRoT targets, while the others were observed in order to better characterize their variability classes. The Legacy Data lightcurves of the CoRoT targets are also stored in the archive.

  12. Getting here: five steps forwards and four back

    NASA Astrophysics Data System (ADS)

    Griffin, R. Elizabeth

The concept of libraries of stellar spectra is by no means new, though access to on-line ones is a relatively recent achievement. The road to the present state has been rocky, and we are still far short of what is needed and what can easily be attained. Spectra as by-products of individual research projects are inhomogeneous, biased, and can be dangerously inadequate for modelling complex stellar systems. Archival products are eclectic, but unique in the time domain. Getting telescope time for the required level of homogeneity, inclusivity and completeness for new libraries requires strong scientific arguments that must be competitive. Using synthetic spectra builds misconceptions into the modelling. Attempts to set up the initial requirements (archives of observed spectra) encountered dogged resistance, much of which has never been resolved. Those struggles, and the indelible effects they have upon our science, will be reviewed, and the basics of a promotional programme outlined.

  13. Evaluating Transient Global and Regional Model Simulations: Bridging the Model/Observations Information Gap

    NASA Astrophysics Data System (ADS)

    Rutledge, G. K.; Karl, T. R.; Easterling, D. R.; Buja, L.; Stouffer, R.; Alpert, J.

    2001-05-01

A major transition in our ability to evaluate transient Global Climate Model (GCM) simulations is occurring. Real-time and retrospective numerical weather prediction analyses, model runs, climate simulations and assessments are proliferating from a handful of national centers to dozens of groups across the world. It is clear that it is no longer sufficient for any one national center to develop its data services alone. The comparison of transient GCM results with the observational climate record is difficult for several reasons. One limitation is that the global distributions of a number of basic climate quantities, such as precipitation, are not well known. Similarly, observational limitations exist with model re-analysis data. Both the NCEP/NCAR and the ECMWF re-analyses eliminate the problems of changing analysis systems, but observational data also contain time-dependent biases. These changes in input data are blended with the natural variability, making estimates of true variability uncertain. Data homogeneity is critical to questions related to our ability to evaluate simulations of past climate. One approach to correcting for time-dependent biases and data-sparse regions is the development and use of high-quality 'reference' data sets. The primary U.S. national responsibility for the archive and service of weather and climate data rests with the National Climatic Data Center (NCDC). However, as supercomputers increase the temporal and spatial resolution of both Numerical Weather Prediction (NWP) and GCM models, the volume and varied formats of data presented for archive at NCDC, under current communications technologies and data management techniques, are limiting scientific access to these data. To address this ever-expanding need for climate and NWP information, NCDC, along with the National Centers for Environmental Prediction (NCEP), has initiated the NOAA Operational Model Archive and Distribution System (NOMADS).
NOMADS is a collaboration between the Center for Ocean-Land-Atmosphere Studies (COLA); the Geophysical Fluid Dynamics Laboratory (GFDL); George Mason University (GMU); the National Center for Atmospheric Research (NCAR); the NCDC; NCEP; the Pacific Marine Environmental Laboratory (PMEL); and the University of Washington. The objective of NOMADS is to preserve and provide retrospective access to GCMs and reference-quality, long-term observational and high-volume three-dimensional data, as well as NCEP NWP models and re-start and re-analysis information. NOMADS features a format-independent data-distribution methodology enabling scientific collaboration between researchers. The NOMADS configuration will allow a researcher to transparently browse, extract and intercompare retrospective observational and model data products from any of the participating centers. NOMADS will provide the ability to easily initialize and compare the results of ongoing climate model assessments and NWP output. Beyond the ingest and access capability soon to be implemented with NOMADS lies the challenge of algorithm development for the inter-comparison of large-array data (e.g., satellite and radar) with surface, upper-air, and sub-surface ocean observational data. The implementation of NOMADS should foster the development of new quality control processes by taking advantage of distributed data access.

  14. Assessing High-Resolution Weather Research and Forecasting (WRF) Forecasts Using an Object-Based Diagnostic Evaluation

    DTIC Science & Technology

    2014-02-01

    Operational Model Archive and Distribution System ( NOMADS ). The RTMA product was generated using a 2-D variational method to assimilate point weather...observations and satellite-derived measurements (National Weather Service, 2013). The products were downloaded using the NOMADS General Regularly...of the completed WRF run" read Start_Date echo $Start_Date echo " " echo "Enter 2- digit , zulu, observation hour (HH) for remapping" read oHH

  15. Archival of Amateur Observations in Support to ESA/Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Shirinian, R.; Yanamandra-Fisher, P. A.; Buratti, B. J.

    2016-12-01

    The European Space Agency's Rosetta mission to comet 67P/Churyumov-Gerasimenko (CG) has included a global ground-based observing campaign consisting of both professional and amateur observers. While professional observers have access to world class observatories with multi-spectral instruments, amateur observers use smaller aperture telescopes that mainly cover the optical spectrum. Amateur observers however, have the advantage of being able to observe as needed since their time is not competed by other observers as it is in professional facilities. This allows amateurs to create a temporal baseline of observations throughout a mission to complement professional observations with context. The Rosetta mission has had an active amateur observer campaign for over 2 years, from January 2014 to August 2016 and has nearly 150 active observers from around the globe. As the Rosetta mission and its observer campaign come to an end in September 2016, an important goal of the project is the collection and archival of the amateur observational data. The ESA's Planetary Science Archive (PSA) has created a unique system that provides firewalled user-specific directories for amateur observers to upload and archive their data, allowing professionals and amateurs to crowdsource data for future science analyses. Possible future science products could include analysis of luminosity, dust cover, position angle, and tail length, all of which can be analyzed over time due to the consistent amateur data taken for over two years. A challenge for the project is that amateur observers have varying amounts of data, ranging from a few megabytes to several gigabytes. Our project addresses the retrieval of amateur observations, renaming, reformatting, and upload to the PSA. The final steps of the archival of amateur observations are the quality check of the data, some of the possible analyses, and identification of data that can be integrated with professional data analysis. 
The unique integration of amateur observations and professional observations will be crowdsourced for future scientific analysis. This research project was conducted at NASA/JPL/Caltech with funding from NASA and the NSF REU site Consortium for Undergraduate Research Experiences (CURE), run through LACC.

  16. Expansion of the On-line Archive "Statistically Downscaled WCRP CMIP3 Climate Projections"

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Das, T.; Duffy, P.; White, K.

    2009-12-01

    Presentation highlights status and plans for a public-access archive of downscaled CMIP3 climate projections. Incorporating climate projection information into long-term evaluations of water and energy resources requires analysts to have access to projections at "basin-relevant" resolution. Such projections would ideally be bias-corrected to account for climate model tendencies to systematically simulate historical conditions different than observed. In 2007, the U.S. Bureau of Reclamation, Santa Clara University and Lawrence Livermore National Laboratory (LLNL) collaborated to develop an archive of 112 bias-corrected and spatially disaggregated (BCSD) CMIP3 temperature and precipitation projections. These projections were generated using 16 CMIP3 models to simulate three emissions pathways (A2, A1b, and B1) from one or more initializations (runs). Projections are specified on a monthly time step from 1950-2099 and at 0.125 degree spatial resolution within the North American Land Data Assimilation System domain (i.e. contiguous U.S., southern Canada and northern Mexico). Archive data are freely accessible at LLNL Green Data Oasis (url). Since being launched, the archive has served over 3500 data requests by nearly 500 users in support of a range of planning, research and educational activities. Archive developers continue to look for ways to improve the archive and respond to user needs. One request has been to serve the intermediate datasets generated during the BCSD procedure, helping users to interpret the relative influences of the bias-correction and spatial disaggregation on the transformed CMIP3 output. This request has been addressed with intermediate datasets now posted at the archive web-site. Another request relates closely to studying hydrologic and ecological impacts under climate change, where users are asking for projected diurnal temperature information (e.g., projected daily minimum and maximum temperature) and daily time step resolution. 
In response, archive developers are adding content in 2010, teaming with the Scripps Institution of Oceanography (through their NOAA-RISA California-Nevada Applications Program and the California Climate Change Center) to apply a new daily downscaling technique to a sub-ensemble of the archive's CMIP3 projections. The new technique, Bias-Corrected Constructed Analogs (BCCA), combines the bias-correction step of BCSD with a recently developed technique that preserves the daily sequencing structure of CMIP3 projections (Constructed Analogs, or CA). Such data will more easily serve hydrologic and ecological impacts assessments, and offer an opportunity to evaluate projection uncertainty associated with downscaling technique. Looking ahead to the arrival of CMIP5 projections, archive collaborators plan to apply both BCSD and BCCA over the contiguous U.S., consistent with the CMIP3 applications above, and also to apply BCSD globally at 0.5 degree spatial resolution. The latter effort involves collaboration with the U.S. Army Corps of Engineers (USACE) and Climate Central.
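
    The bias-correction (BC) step shared by BCSD and BCCA is, in essence, quantile mapping between the model's and the observations' historical distributions. A minimal sketch of empirical quantile mapping (the function name and interface are illustrative assumptions, not the archive's actual code):

```python
import numpy as np

def bias_correct(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: map each future model value to the
    observed value at the same quantile of the historical distributions."""
    model_hist = np.sort(np.asarray(model_hist, dtype=float))
    obs_hist = np.asarray(obs_hist, dtype=float)
    # Quantile of each future value within the historical model distribution
    q = np.searchsorted(model_hist, model_fut) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    # Read the observed historical distribution at those quantiles
    return np.quantile(obs_hist, q)
```

    With a constant +2 bias in the model's historical run, a future value of 52 maps back to roughly 50; in practice the correction is applied per calendar month, before the spatial-disaggregation step.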

  17. THE PANCHROMATIC STARBURST IRREGULAR DWARF SURVEY (STARBIRDS): OBSERVATIONS AND DATA ARCHIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McQuinn, Kristen B. W.; Mitchell, Noah P.; Skillman, Evan D., E-mail: kmcquinn@astro.umn.edu

    2015-06-22

    Understanding star formation in resolved low mass systems requires the integration of information obtained from observations at different wavelengths. We have combined new and archival multi-wavelength observations on a set of 20 nearby starburst and post-starburst dwarf galaxies to create a data archive of calibrated, homogeneously reduced images. Named the panchromatic “STARBurst IRregular Dwarf Survey” (STARBIRDS) archive, the data are publicly accessible through the Mikulski Archive for Space Telescopes. This first release of the archive includes images from the Galaxy Evolution Explorer Telescope (GALEX), the Hubble Space Telescope (HST), and the Spitzer Space Telescope (Spitzer) Multiband Imaging Photometer instrument. The data sets include flux-calibrated, background-subtracted images that are registered to the same world coordinate system. Additionally, a set of images is available that are all cropped to match the HST field of view. The GALEX and Spitzer images are available with foreground and background contamination masked. Larger GALEX images extending to 4 times the optical extent of the galaxies are also available. Finally, HST images convolved with a 5″ point spread function and rebinned to the larger pixel scale of the GALEX and Spitzer 24 μm images are provided. Future additions are planned that will include data at other wavelengths such as Spitzer IRAC, ground-based Hα, Chandra X-ray, and Green Bank Telescope H i imaging.
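
    The last step above, convolving to a common PSF and rebinning to a coarser pixel scale, is a standard resolution-matching operation. A minimal numpy-only sketch (the function names and the Gaussian-PSF assumption are illustrative; the survey's actual pipeline may differ):

```python
import numpy as np

def gaussian_kernel(fwhm_pix):
    """Normalized 1-D Gaussian kernel for a PSF of the given FWHM in pixels."""
    sigma = fwhm_pix / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    half = int(np.ceil(3 * sigma))
    x = np.arange(-half, half + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def match_resolution(img, fwhm_pix, block):
    """Convolve with a Gaussian PSF, then block-average to a coarser grid."""
    k = gaussian_kernel(fwhm_pix)
    # Separable convolution: smooth rows, then columns ('same' keeps shape)
    sm = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    sm = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, sm)
    # Trim so the image divides evenly into blocks, then average each block
    ny, nx = (s - s % block for s in sm.shape)
    t = sm[:ny, :nx]
    return t.reshape(ny // block, block, nx // block, block).mean(axis=(1, 3))
```

    Block-averaging (rather than interpolation) keeps the mean surface brightness of each coarse pixel consistent with the smoothed fine-scale image.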

  18. Diagnosing the Dynamics of Observed and Simulated Ecosystem Gross Primary Productivity with Time Causal Information Theory Quantifiers

    DOE PAGES

    Sippel, Sebastian; Lange, Holger; Mahecha, Miguel D.; ...

    2016-10-20

    Data analysis and model-data comparisons in the environmental sciences require diagnostic measures that quantify time series dynamics and structure, and are robust to noise in observational data. This paper investigates the temporal dynamics of environmental time series using measures quantifying their information content and complexity. The measures are used to classify natural processes on one hand, and to compare models with observations on the other. The present analysis focuses on the global carbon cycle as an area of research in which model-data integration and comparisons are key to improving our understanding of natural phenomena. We investigate the dynamics of observed and simulated time series of Gross Primary Productivity (GPP), a key variable in terrestrial ecosystems that quantifies ecosystem carbon uptake. However, the dynamics, patterns and magnitudes of GPP time series, both observed and simulated, vary substantially on different temporal and spatial scales. Here we demonstrate that information content and complexity, or Information Theory Quantifiers (ITQ) for short, serve as robust and efficient data-analytical and model benchmarking tools for evaluating the temporal structure and dynamical properties of simulated or observed time series at various spatial scales. At continental scale, we compare GPP time series simulated with two models and an observations-based product. This analysis reveals qualitative differences between model evaluation based on ITQ compared to traditional model performance metrics, indicating that good model performance in terms of absolute or relative error does not imply that the dynamics of the observations are captured well.
Furthermore, we show, using an ensemble of site-scale measurements obtained from the FLUXNET archive in the Mediterranean, that model-data or model-model mismatches as indicated by ITQ can be attributed to and interpreted as differences in the temporal structure of the respective ecological time series. At global scale, our understanding of C fluxes relies on the use of consistently applied land models. Here, we use ITQ to evaluate model structure: the measures are largely insensitive to the climatic scenarios, land use and atmospheric gas concentrations used to drive them, but clearly separate the structure of 13 different land models taken from the CMIP5 archive and an observations-based product. In conclusion, diagnostic measures of this kind provide data-analytical tools that distinguish different types of natural processes based solely on their dynamics, and are thus highly suitable for environmental science applications such as model structural diagnostics.
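
    A widely used family of such Information Theory Quantifiers is built from Bandt-Pompe ordinal patterns. A minimal sketch of normalized permutation entropy, one common ITQ (the function name and defaults are illustrative, not the paper's actual implementation):

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series:
    0 for a fully predictable ordinal structure, near 1 for white noise."""
    x = np.asarray(x, dtype=float)
    # Count the ordinal (rank-order) pattern of each sliding window
    patterns = Counter(
        tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(order))  # normalize by max entropy log(order!)
```

    A monotone ramp yields 0, while uniform noise approaches 1; the statistical complexity measures used alongside entropy in such analyses add a distance from the uniform pattern distribution, separating noise from structured dynamics.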

  20. Supporting users through integrated retrieval, processing, and distribution systems at the Land Processes Distributed Active Archive Center

    USGS Publications Warehouse

    Kalvelage, Thomas A.; Willems, Jennifer

    2005-01-01

    The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), which is a series of polar-orbiting and low inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. Therefore the LP DAAC has implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection. These systems also support the wide breadth of products that are handled by the LP DAAC. The LP DAAC is the primary archive for Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission/Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of “land products” generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.

  1. SysML model of exoplanet archive functionality and activities

    NASA Astrophysics Data System (ADS)

    Ramirez, Solange

    2016-08-01

    The NASA Exoplanet Archive is an online service that serves data and information on exoplanets and their host stars to support astronomical research related to the search for and characterization of extrasolar planetary systems. In order to provide the most up-to-date data sets to users, the exoplanet archive performs weekly updates that include additions to the database and updates to the services as needed. These weekly updates are complex due to interfaces within the archive. I will be presenting a SysML model that helps us perform these update activities on a weekly basis.

  2. A Search for H I Lyα Counterparts to Ultrafast X-Ray Outflows

    NASA Astrophysics Data System (ADS)

    Kriss, Gerard A.; Lee, Julia C.; Danehkar, Ashkbiz

    2018-06-01

    Prompted by the H I Lyα absorption associated with the X-ray ultrafast outflow at ‑17,300 km s^-1 in the quasar PG 1211+143, we have searched archival UV spectra at the expected locations of H I Lyα absorption for a large sample of ultrafast outflows identified in XMM-Newton and Suzaku observations. Sixteen of the X-ray outflows have predicted H I Lyα wavelengths falling within the bandpass of spectra from either the Far Ultraviolet Spectroscopic Explorer or the Hubble Space Telescope, although none of the archival observations were simultaneous with the X-ray observations in which ultrafast X-ray outflows (UFOs) were detected. In our spectra, broad features with FWHM of 1000 km s^-1 have 2σ upper limits on the H I column density of generally ≲ 2 × 10^13 cm^-2. Using grids of photoionization models covering a broad range of spectral energy distributions (SEDs), we find that producing Fe XXVI Lyα X-ray absorption with equivalent widths > 30 eV and associated H I Lyα absorption with N(H I) < 2 × 10^13 cm^-2 requires total absorbing column densities N(H) > 5 × 10^22 cm^-2 and ionization parameters log ξ ≳ 3.7. Nevertheless, a wide range of SEDs would predict observable H I Lyα absorption if ionization parameters are only slightly below peak ionization fractions for Fe XXV and Fe XXVI. The lack of Lyα features in the archival UV spectra indicates that the UFOs have very high ionization parameters, that they have very hard UV-ionizing spectra, or that they were not present at the time of the UV spectral observations owing to variability.

  3. National Satellite Land Remote Sensing Data Archive

    USGS Publications Warehouse

    Faundeen, John L.; Kelly, Francis P.; Holm, Thomas M.; Nolt, Jenna E.

    2013-01-01

    The National Satellite Land Remote Sensing Data Archive (NSLRSDA) resides at the U.S. Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center. Through the Land Remote Sensing Policy Act of 1992, the U.S. Congress directed the Department of the Interior (DOI) to establish a permanent Government archive containing satellite remote sensing data of the Earth's land surface and to make this data easily accessible and readily available. This unique DOI/USGS archive provides a comprehensive, permanent, and impartial observational record of the planet's land surface obtained throughout more than five decades of satellite remote sensing. Satellite-derived data and information products are primary sources used to detect and understand changes such as deforestation, desertification, agricultural crop vigor, water quality, invasive plant species, and certain natural hazards such as flood extent and wildfire scars.

  4. BAO plate archive digitization

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Nikoghosyan, E. H.; Gigoyan, K. S.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Khachatryan, K. G.; Vardanyan, A. V.; Gyulzadyan, M. V.; Mikayelyan, G. A.; Farmanyan, S. V.; Knyazyan, A. V.

    Astronomical plate archives created on the basis of numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films, obtained at the 2.6m telescope, the 1m and 0.5m Schmidt telescopes, and other smaller ones during 1947-1991. In 2015, we started a project to digitize the whole BAO Plate Archive, create an electronic database, and enable its scientific use. A Science Program Board was created to evaluate the observing material, investigate new possibilities, and propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use two EPSON Perfection V750 Pro scanners for the digitization. The project will run for three years (2015-2017), and the final result will be an electronic database and an online interactive sky map to be used for further research projects.

  5. A New Archive of UKIRT Legacy Data at CADC

    NASA Astrophysics Data System (ADS)

    Bell, G. S.; Currie, M. J.; Redman, R. O.; Purves, M.; Jenness, T.

    2014-05-01

    We describe a new archive of legacy data from the United Kingdom Infrared Telescope (UKIRT) at the Canadian Astronomy Data Centre (CADC) containing all available data from the Cassegrain instruments. The desire was to archive the raw data in as close to the original format as possible, so where the data followed our current convention of having a single data file per observation, it was archived without alteration, except for minor fixes to the headers of FITS-format data to allow them to pass fitsverify and be accepted by CADC. Some of the older data comprised multiple integrations in separate files per observation, stored in either Starlink NDF or Figaro DST format. These were placed inside HDS container files, and DST files were rearranged into NDF format. The metadata describing the observations is ingested into the CAOM-2 repository via an intermediate MongoDB header database, which will also be used to guide the ORAC-DR pipeline in generating reduced data products.

  6. Building a VO-compliant Radio Astronomical DAta Model for Single-dish radio telescopes (RADAMS)

    NASA Astrophysics Data System (ADS)

    Santander-Vela, Juan de Dios; García, Emilio; Leon, Stephane; Espigares, Victor; Ruiz, José Enrique; Verdes-Montenegro, Lourdes; Solano, Enrique

    2012-11-01

    The Virtual Observatory (VO) is becoming the de-facto standard for astronomical data publication. However, the number of radio astronomical archives is still low in general, and even lower is the number of radio astronomical data sets available through the VO. In order to facilitate the building of new radio astronomical archives, while easing their interoperability with the VO framework, we have developed a VO-compliant data model which provides interoperable data semantics for radio data. That model, which we call the Radio Astronomical DAta Model for Single-dish (RADAMS), has been built using standards of (and recommendations from) the International Virtual Observatory Alliance (IVOA). This article describes the RADAMS and its components, including archived entities and their relationships to VO metadata. We show that by using IVOA principles and concepts, the effort needed for both the development of the archives and their VO compatibility has been lowered, and the joint development of two radio astronomical archives has been possible. We plan to adapt RADAMS to deal with interferometry data in the future.

  7. CDDIS: NASA's Archive of Space Geodesy Data and Products Supporting GGOS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick

    2016-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data and products in a central archive, to maintain information about the archival of these data, to disseminate these data and information in a timely manner to a global scientific research community, and to provide user-based tools for the exploration and use of the archive. The CDDIS data system and its archive are key components in several of the geometric services within the International Association of Geodesy (IAG) and its observing system, the Global Geodetic Observing System (GGOS), including the IGS, the International DORIS Service (IDS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation and Reference Systems Service (IERS). The CDDIS provides on-line access to over 17 Tbytes of data and derived products in support of the IAG services and GGOS. The system's archive continues to grow and improve as new activities are supported and enhancements are implemented. Recently, the CDDIS has established a real-time streaming capability for GNSS data and products. Furthermore, enhancements to metadata describing the contents of the archive have been developed to facilitate data discovery. This poster will provide a review of the improvements in the system infrastructure that CDDIS has made over the past year for the geodetic community and describe future plans for the system.

  8. Optimisation of solar synoptic observations

    NASA Astrophysics Data System (ADS)

    Klvaña, Miroslav; Sobotka, Michal; Švanda, Michal

    2012-09-01

    The development of instrumental and computer technologies is connected with steadily increasing needs for archiving of large data volumes. The current trend to meet this requirement includes data compression and growth of storage capacities. This approach, however, has technical and practical limits. A further reduction of the archived data volume can be achieved by optimising the archiving itself: selecting data without losing the useful information. We describe a method of optimised archiving of solar images, based on the selection of images that contain new information. The new information content is evaluated by analysing changes detected in the images. We present characteristics of different kinds of image changes and divide them into fictitious changes with a disturbing effect and real changes that provide new information. In block diagrams describing the selection and archiving, we demonstrate the influence of clouds, the recording of images during an active event on the Sun (including a period before the event onset), and the archiving of the long-term history of solar activity. The described optimisation technique is not suitable for helioseismology, because it does not preserve the uniform time step in the archived sequence and removes the information about solar oscillations. In the case of long-term synoptic observations, optimised archiving can save a large amount of storage capacity. The actual saving will depend on the setting of the change-detection sensitivity and on the capability to exclude the fictitious changes.
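
    The selection idea above, archive a frame only when it carries new information, can be sketched as a simple change-detection filter. A minimal illustration (the function name, the mean-absolute-difference metric, and the unit-mean normalization are illustrative assumptions, not the authors' actual method):

```python
import numpy as np

def select_frames(frames, threshold):
    """Archive a frame only if it differs enough from the last archived one.
    Frames are normalized to unit mean first, so fictitious global changes
    (e.g. brightness jitter from thin clouds) do not trigger archiving."""
    archived = [frames[0]]
    for f in frames[1:]:
        ref = archived[-1]
        # Mean absolute pixel change between normalized frames
        d = np.mean(np.abs(f / f.mean() - ref / ref.mean()))
        if d > threshold:
            archived.append(f)
    return archived
```

    A pure brightness scaling of the previous frame is skipped as a fictitious change, while a localized brightening (a real change) is archived; tuning `threshold` trades storage savings against sensitivity.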

  9. Social and Behavioral Problems of Children with Agenesis of the Corpus Callosum

    ERIC Educational Resources Information Center

    Badaruddin, Denise H.; Andrews, Glena L.; Bolte, Sven; Schilmoeller, Kathryn J.; Schilmoeller, Gary; Paul, Lynn K.; Brown, Warren S.

    2007-01-01

    Archival data from a survey of parent observations was used to determine the prevalence of social and behavioral problems in children with agenesis of the corpus callosum (ACC). Parent observations were surveyed using the Child Behavior Checklist (CBCL) for 61 children with ACC who were selected from the archive based on criteria of motor…

  10. Constraint based scheduling for the Goddard Space Flight Center distributed Active Archive Center's data archive and distribution system

    NASA Technical Reports Server (NTRS)

    Short, Nick, Jr.; Bedet, Jean-Jacques; Bodden, Lee; Boddy, Mark; White, Jim; Beane, John

    1994-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational since October 1, 1993. Its mission is to support the Earth Observing System (EOS) by providing rapid access to EOS data and analysis products, and to test Earth Observing System Data and Information System (EOSDIS) design concepts. One of the challenges is to ensure quick and easy retrieval of any data archived within the DAAC's Data Archive and Distribution System (DADS). Over the 15-year life of the EOS project, an estimated several petabytes (10^15 bytes) of data will be permanently stored. Accessing that amount of information is a formidable task that will require innovative approaches. As a precursor of the full EOS system, the GSFC DAAC, with a few Terabits of storage, has implemented a prototype of a constraint-based task and resource scheduler to improve the performance of the DADS. This Honeywell Task and Resource Scheduler (HTRS), developed by the Honeywell Technology Center in cooperation with the Information Science and Technology Branch/935, the Code X Operations Technology Program, and the GSFC DAAC, makes better use of limited resources, prevents backlogs of data, and provides information about resource bottlenecks and performance characteristics. The prototype, which was developed concurrently with the GSFC Version 0 (V0) DADS, models DADS activities such as ingestion and distribution with priority, precedence, resource requirements (disk and network bandwidth), and temporal constraints. HTRS supports schedule updates, insertions, and retrieval of task information via an Application Program Interface (API). The prototype has demonstrated, with a few examples, the substantial advantages of using HTRS over scheduling algorithms such as a First In First Out (FIFO) queue.
The kernel scheduling engine for HTRS, called Kronos, has been successfully applied to several other domains such as space shuttle mission scheduling, demand flow manufacturing, and avionics communications scheduling.
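
    The FIFO-versus-priority contrast above can be illustrated with a toy resource scheduler. A minimal sketch (the function name and single-resource model are illustrative; HTRS itself also handles precedence and temporal constraints):

```python
import heapq

def schedule(tasks, bandwidth):
    """Run the highest-priority tasks that fit the available bandwidth.
    tasks: (priority, demand, name) tuples; lower number = higher priority.
    A FIFO queue would instead serve the oldest request first, regardless
    of its priority or resource demand."""
    heap = list(tasks)
    heapq.heapify(heap)  # orders by priority (first tuple element)
    order, deferred = [], []
    free = bandwidth
    while heap:
        prio, demand, name = heapq.heappop(heap)
        if demand <= free:
            free -= demand
            order.append(name)
        else:
            deferred.append(name)  # revisit in the next scheduling cycle
    return order, deferred
```

    With requests of demand 5, 3, and 4 against a bandwidth of 8, the two highest-priority tasks run and the third is deferred, whereas a FIFO queue could stall everything behind one oversized request.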

  11. Next-generation forest change mapping across the United States: the landscape change monitoring system (LCMS)

    Treesearch

    Sean P. Healey; Warren B. Cohen; Yang Zhiqiang; Ken Brewer; Evan Brooks; Noel Gorelick; Mathew Gregory; Alexander Hernandez; Chengquan Huang; Joseph Hughes; Robert Kennedy; Thomas Loveland; Kevin Megown; Gretchen Moisen; Todd Schroeder; Brian Schwind; Stephen Stehman; Daniel Steinwand; James Vogelmann; Curtis Woodcock; Limin Yang; Zhe Zhu

    2015-01-01

    Forest change information is critical in forest planning, ecosystem modeling, and in updating forest condition maps. The Landsat satellite platform has provided consistent observations of the world’s ecosystems since 1972. A number of innovative change detection algorithms have been developed to use the Landsat archive to identify and characterize forest change. The...

  12. Speeches Archive

    Science.gov Websites

    Speeches Archive Former AF Top 3 Viewpoints and Speeches Air Force Warrior Games 2017 Events 2018 Air Force Strategic Documents Desert Storm 25th Anniversary Observances DoD Warrior Games Portraits in Courage

  13. Data Archival and Retrieval Enhancement (DARE) Metadata Modeling and Its User Interface

    NASA Technical Reports Server (NTRS)

    Hyon, Jason J.; Borgen, Rosana B.

    1996-01-01

    The Defense Nuclear Agency (DNA) has acquired terabytes of valuable data which need to be archived and effectively distributed to the entire nuclear weapons effects community and others... This paper describes the DARE (Data Archival and Retrieval Enhancement) metadata model and explains how it is used as a source for generating HyperText Markup Language (HTML) or Standard Generalized Markup Language (SGML) documents for access through web browsers such as Netscape.

  14. Announcing a Community Effort to Create an Information Model for Research Software Archives

    NASA Astrophysics Data System (ADS)

    Million, C.; Brazier, A.; King, T.; Hayes, A.

    2018-04-01

    An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.

  15. Clouds in ECMWF's 30 KM Resolution Global Atmospheric Forecast Model (TL639)

    NASA Technical Reports Server (NTRS)

    Cahalan, R. F.; Morcrette, J. J.

    1999-01-01

    Global models of the general circulation of the atmosphere resolve a wide range of length scales, and in particular cloud structures extend from planetary scales down to the smallest resolvable scales, now about 30 km in state-of-the-art models. Even the highest resolution models do not resolve small-scale cloud phenomena seen, for example, in Landsat and other high-resolution satellite images of clouds. Unresolved small-scale disturbances often grow into larger ones through non-linear processes that transfer energy upscale. Understanding upscale cascades is of crucial importance in predicting current weather, and in parameterizing cloud-radiative processes that control long-term climate. Several movie animations provide examples of the temporal and spatial variation of cloud fields produced in 4-day runs of the forecast model at the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, England, at particular times and locations of simultaneous measurement field campaigns. Model resolution is approximately 30 km horizontally (triangular truncation TL639) with 31 vertical levels from surface to stratosphere. The timestep of the model is about 10 minutes, but animation frames are 3 hours apart, at timesteps when the radiation is computed. The animations were prepared from an archive of several 4-day runs at the highest available model resolution, archived at ECMWF. Cloud, wind, and temperature fields in an approximately 1000 km × 1000 km box were retrieved from the archive; approximately 60 Mb Vis5d files were then prepared with the help of Graeme Kelly of ECMWF and compressed into MPEG files of less than 3 Mb each. We discuss the interaction of clouds and radiation in the model, and compare the variability of cloud liquid as a function of scale to that seen in cloud observations made in intensive field campaigns.
Comparison of high-resolution global runs to cloud-resolving models, and to lower resolution climate models is leading to better understanding of the upscale cascade and suggesting new cloud-radiation parameterizations for climate models.

  16. The SpeX Prism Library for Ultracool Dwarfs: A Resource for Stellar, Exoplanet and Galactic Science and Student-Led Research

    NASA Astrophysics Data System (ADS)

    Burgasser, Adam

    The NASA Infrared Telescope Facility's (IRTF) SpeX spectrograph has been an essential tool in the discovery and characterization of ultracool dwarf (UCD) stars, brown dwarfs and exoplanets. Over ten years of SpeX data have been collected on these sources, and a repository of low-resolution (R ~ 100) SpeX prism spectra has been maintained by the PI at the SpeX Prism Spectral Libraries website since 2008. As the largest existing collection of NIR UCD spectra, this repository has facilitated a broad range of investigations in UCD, exoplanet, Galactic and extragalactic science, contributing to over 100 publications in the past 6 years. However, this repository remains highly incomplete, has not been uniformly calibrated, lacks sufficient contextual data for observations and sources, and most importantly provides no data visualization or analysis tools for the user. To fully realize the scientific potential of these data for community research, we propose a two-year program to (1) calibrate and expand the existing repository and archival data, and make it virtual-observatory compliant; (2) serve the data through a searchable web archive with basic visualization tools; and (3) develop and distribute an open-source, Python-based analysis toolkit for users to analyze the data. These resources will be generated through an innovative, student-centered research model, with undergraduate and graduate students building and validating the analysis tools through carefully designed coding challenges and research validation activities. The resulting data archive, the SpeX Prism Library, will be a legacy resource for IRTF and SpeX, and will facilitate numerous investigations using current and future NASA capabilities.
These include deep/wide surveys of UCDs to measure Galactic structure and chemical evolution, and probe UCD populations in satellite galaxies (e.g., JWST, WFIRST); characterization of directly imaged exoplanet spectra (e.g., FINESSE), and development of low-temperature theoretical models of UCD and exoplanet atmospheres. Our program will also serve to validate the IRTF data archive during its development, by reducing and disseminating non-proprietary archival observations of UCDs to the community. The proposed program directly addresses NASA's strategic goals of exploring the origin and evolution of stars and planets that make up our universe, and discovering and studying planets around other stars.

  17. STScI Archive Manual, Version 7.0

    NASA Astrophysics Data System (ADS)

    Padovani, Paolo

    1999-06-01

    The STScI Archive Manual provides the information a user needs to access the HST archive via its two user interfaces: StarView and a World Wide Web (WWW) interface. It provides descriptions of the StarView screens used to access information in the database and the format of that information, and introduces the user to the WWW interface. Using the two interfaces, users can search for observations, preview public data, and retrieve data from the archive. Using StarView one can also find calibration reference files and perform detailed association searches. With the WWW interface archive users can access, and obtain information on, all Multimission Archive at Space Telescope (MAST) data, a collection of mainly optical and ultraviolet datasets which include, amongst others, the International Ultraviolet Explorer (IUE) Final Archive. Both interfaces feature a name resolver which simplifies searches based on target name.

  18. Stewardship of very large digital data archives

    NASA Technical Reports Server (NTRS)

    Savage, Patric

    1991-01-01

    An archive is a permanent store. There are relatively few very large digital data archives in existence. Most business records are expired within five or ten years. Many kinds of business records that do have long lives are embedded in data bases that are continually updated and re-issued cyclically. Also, many permanent business records are actually archived as microfilm, fiche, or optical disk images - their digital version being an operational convenience rather than an archive. The problems foreseen in stewarding the very large digital data archives that will accumulate during the mission of the Earth Observing System (EOS) are addressed. It focuses on the function of shepherding archived digital data into an endless future. Stewardship entails storing and protecting the archive and providing meaningful service to the community of users. The steward will (1) provide against loss due to physical phenomena; (2) assure that data is not lost due to storage technology obsolescence; and (3) maintain data in a current formatting methodology.

  19. Satellite and earth science data management activities at the U.S. geological survey's EROS data center

    USGS Publications Warehouse

    Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.

    1991-01-01

    The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development and archiving; data set preservation, maintenance and conversion to more durable and accessible archive media; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.

  20. WIFIRE Data Model and Catalog for Wildfire Data and Tools

    NASA Astrophysics Data System (ADS)

    Altintas, I.; Crawl, D.; Cowart, C.; Gupta, A.; Block, J.; de Callafon, R.

    2014-12-01

    The WIFIRE project (wifire.ucsd.edu) is building an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. WIFIRE may be used by wildfire management authorities in the future to predict wildfire rate of spread and direction, and assess the effectiveness of high-density sensor networks in improving fire and weather predictions. WIFIRE has created a data model for wildfire resources including sensed and archived data, sensors, satellites, cameras, modeling tools, workflows and social information including Twitter feeds. This data model and associated wildfire resource catalog includes a detailed description of the HPWREN sensor network, SDG&E's Mesonet, and NASA MODIS. In addition, the WIFIRE data model describes how to integrate the data from multiple heterogeneous sources to provide detailed fire-related information. The data catalog describes 'Observables' captured by each instrument using multiple ontologies including OGC SensorML and NASA SWEET. Observables include measurements such as wind speed, air temperature, and relative humidity, as well as their accuracy and resolution. We have implemented a REST service for publishing to and querying from the catalog using the Web Application Description Language (WADL). We are creating web-based user interfaces and mobile apps that use the REST interface for dissemination to the wildfire modeling community and project partners, covering academic, private, and government laboratories, while providing value to emergency officials and the general public. Additionally, the Kepler scientific workflow system is instrumented to interact with this data catalog to access real-time streaming and archived wildfire data and stream it into dynamic data-driven wildfire models at scale.
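    A catalog exposed through a REST service is typically queried by building a parameterized URL and decoding a structured response that carries each observable with its accuracy and resolution. A minimal sketch of the pattern in Python (the endpoint, parameter names, and response fields here are invented for illustration; they are not the actual WIFIRE interface):

    ```python
    import json
    from urllib.parse import urlencode

    # Hypothetical base URL; the real WIFIRE catalog endpoint is not given above.
    CATALOG_BASE = "https://example.org/wifire/catalog/observables"

    def build_query(station=None, observable=None, limit=50):
        """Build a REST query URL for the resource catalog (illustrative only)."""
        params = {"limit": limit}
        if station:
            params["station"] = station
        if observable:
            params["observable"] = observable
        # Sort keys so the generated URL is deterministic.
        return CATALOG_BASE + "?" + urlencode(sorted(params.items()))

    # A response might describe each observable with its accuracy and resolution:
    sample_response = json.loads("""
    {"results": [{"observable": "wind_speed",
                  "station": "HPWREN-SMER",
                  "units": "m/s",
                  "accuracy": 0.5,
                  "resolution": 0.1}]}
    """)

    url = build_query(station="HPWREN-SMER", observable="wind_speed")
    first = sample_response["results"][0]
    ```

    The same query-building helper could back both a web interface and a mobile app, which is the main appeal of putting a REST layer in front of the catalog.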

  1. Operational environmental satellite archives in the 21st Century

    NASA Astrophysics Data System (ADS)

    Barkstrom, Bruce R.; Bates, John J.; Privette, Jeff; Vizbulis, Rick

    2007-09-01

    NASA, NOAA, and USGS collections of Earth science data are large, federated, and have active user communities and collections. Our experience raises five categories of issues for long-term archival:
    * Organization of the data in the collections is not well-described by text-based categorization principles
    * Metadata organization for these data is not well-described by Dublin Core and needs attention to data access and data use patterns
    * Long-term archival requires risk management approaches to dealing with the unique threats to knowledge preservation specific to digital information
    * Long-term archival requires careful attention to archival cost management
    * Professional data stewards for these collections may require special training.
    This paper suggests three mechanisms for improving the quality of long-term archival:
    * Using a maturity model to assess the readiness of data for accession, for preservation, and for future data usefulness
    * Developing a risk management strategy for systematically dealing with threats of data loss
    * Developing a life-cycle cost model for continuously evolving the collections and the data centers that house them.

  2. Contents of the JPL Distributed Active Archive Center (DAAC) archive, version 2-91

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1991-01-01

    The Distributed Active Archive Center (DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea surface height, surface wind vector, sea surface temperature, atmospheric liquid water, and surface pigment concentration. The Jet Propulsion Laboratory DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for the Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  3. The Kanzelhöhe Online Data Archive

    NASA Astrophysics Data System (ADS)

    Pötzi, W.; Hirtenfellner-Polanec, W.; Temmer, M.

    The Kanzelhöhe Observatory provides high-cadence full-disk observations of solar activity phenomena like sunspots, flares and prominence eruptions on a regular basis. The data are available for download from the KODA (Kanzelhöhe Observatory Data Archive) which is freely accessible. The archive offers sunspot drawings back to 1950 and high cadence H-α data back to 1973. Images from other instruments, like white-light and CaIIK, are available since 2007 and 2010, respectively. In the following we describe how to access the archive and the format of the data.

  4. Calibration of EFOSC2 Broadband Linear Imaging Polarimetry

    NASA Astrophysics Data System (ADS)

    Wiersema, K.; Higgins, A. B.; Covino, S.; Starling, R. L. C.

    2018-03-01

    The European Southern Observatory Faint Object Spectrograph and Camera v2 (EFOSC2) is one of the workhorse instruments on ESO's New Technology Telescope, and is one of the most popular instruments at the La Silla observatory. It is mounted at a Nasmyth focus, and therefore exhibits strong, wavelength- and pointing-direction-dependent instrumental polarisation. In this document, we describe our efforts to calibrate the broadband imaging polarimetry mode, and provide a calibration for the broadband B, V, and R filters to a level that satisfies most use cases (i.e. polarimetric calibration uncertainty ≲ 0.1%). We make our calibration codes public. This calibration effort can be used to enhance the yield of future polarimetric programmes with EFOSC2, by allowing good calibration with a greatly reduced number of standard star observations. Similarly, our calibration model can be combined with archival calibration observations to post-process data taken in past years, to form an EFOSC2 legacy archive with substantial scientific potential.

  5. Flamsteed's stars. New perspectives on the life and work of the first Astronomer Royal (1646 - 1719).

    NASA Astrophysics Data System (ADS)

    Willmoth, F.

    Contents: 1. Introduction: the King's "astronomical observer". 2. Flamsteed's career in astronomy: nobility, morality and public utility (J. Bennett). 3. Astronomy and strife: John Flamsteed and the Royal Society (M. Feingold). 4. Models for the practice of astronomy: Flamsteed, Horrocks and Tycho (F. Willmoth). 5. Flamsteed's optics and the identity of the astronomical observer (A. Johns). 6. Equipping an observatory: Flamsteed and Molyneux discuss an astronomical quadrant (H. Higton). 7. Mathematical characters: Flamsteed and Christ's Hospital Royal Mathematical School (R. Iliffe). 8. "Professor" John Flamsteed (I. G. Stewart). 9. Edmond Halley and John Flamsteed at the Royal Observatory (A. Cook). 10. A unique copy of Flamsteed's Historia coelestis (1712) (O. Gingerich). 11. "Labour harder than thrashing": John Flamsteed, property and intellectual labour in nineteenth-century England (W. J. Ashworth). 12. The Flamsteed papers in the archives of the Royal Greenwich Observatory. (A. Perkins). A summary catalogue of Flamsteed's papers in the Royal Greenwich Observatory archives (compiled by F. Willmoth).

  6. MODIS Collection 6 Data at the National Snow and Ice Data Center (NSIDC)

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Steiker, A. E.; Johnston, T.; Haran, T. M.; Fowler, C.; Wyatt, P.

    2015-12-01

    For over 15 years, the NASA National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC) has archived and distributed snow and sea ice products derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on the NASA Earth Observing System (EOS) Aqua and Terra satellites. Collection 6 represents the next revision to NSIDC's MODIS archive, mainly affecting the snow-cover products. Collection 6 specifically addresses the needs of the MODIS science community by targeting the scenarios that have historically confounded snow detection and introduced errors into the snow-cover and fractional snow-cover maps. Although MODIS snow-cover maps are typically 90 percent accurate or better under good observing conditions, Collection 6 uses revised algorithms to discriminate between snow and clouds, resolve uncertainties along the edges of snow-covered regions, and detect summer snow cover in mountains. Furthermore, Collection 6 applies modified and additional snow detection screens and new Quality Assessment protocols that enhance the overall accuracy of the snow maps compared with Collection 5. Collection 6 also introduces several new MODIS snow products, including a daily Climate Modelling Grid (CMG) cloud-gap-filled (CGF) snow-cover map, which generates cloud-free maps by using the most recent clear observations. The MODIS Collection 6 sea ice extent and ice surface temperature algorithms and products are much the same as in Collection 5; however, Collection 6 updates to algorithm inputs, in particular the L1B calibrated radiances, land and water mask, and cloud mask products, have improved the sea ice outputs. The MODIS sea ice products are currently available at NSIDC, and the snow cover products are to follow in 2016. NSIDC offers a variety of methods for obtaining these data. Users can download data directly from an online archive or use the NASA Reverb Search & Order Tool to perform spatial, temporal, and parameter subsetting, reformatting, and re-projection of the data.
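    The cloud-gap-filling step described above can be pictured as carrying forward, per pixel, the most recent clear observation together with its age in days. A minimal sketch with invented values (this illustrates the carry-forward idea only, not the actual Collection 6 algorithm details):

    ```python
    CLOUD = None  # sentinel for a cloud-obscured pixel in this toy example

    def cloud_gap_fill(daily_maps):
        """For each pixel, carry forward the most recent clear (non-cloud)
        value and track its age in days, as a cloud-gap-filled product does."""
        npix = len(daily_maps[0])
        filled = [CLOUD] * npix   # last clear value seen per pixel
        age = [None] * npix       # days since that clear value was observed
        results = []
        for day in daily_maps:
            for i, v in enumerate(day):
                if v is not CLOUD:
                    filled[i], age[i] = v, 0
                elif age[i] is not None:
                    age[i] += 1
            results.append((list(filled), list(age)))
        return results

    # Three days of a 3-pixel snow-cover strip (values: fractional snow cover).
    days = [[0.8, CLOUD, 0.0],
            [CLOUD, 0.5, 0.1],
            [CLOUD, CLOUD, 0.2]]
    maps = cloud_gap_fill(days)
    final_map, final_age = maps[-1]
    ```

    On day three every pixel has a value, and the age map records how stale each carried-forward observation is, which is what lets users judge the reliability of a gap-filled pixel.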

  7. Skill Testing a Three-Dimensional Global Tide Model to Historical Current Meter Records

    DTIC Science & Technology

    2013-12-17

    up to 20% weaker skill in the Southern Ocean. Citation: Timko, P. G., B. K. Arbic, J. G. Richman, R. B. Scott, E. J. Metzger, and A. J. Wallcraft (2013)… model were identified from a current meter archive (CMA) of approximately 9000 unique time series previously used by Scott et al. [2010] and Timko et al. [2012]. The CMA spans 40 years of observations. Some of the velocity records used in this study represent individual depth bins from ADCPs. The…

  8. NVST Data Archiving System Based On FastBit NoSQL Database

    NASA Astrophysics Data System (ADS)

    Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo

    2014-06-01

    The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high-resolution imaging and spectral observations, including measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012, and produces up to 120 thousand observation records (files) in a day. Given the large number of files, effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (i.e., MySQL, My Structured Query Language), the FastBit database shows distinctive advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study offers a new approach to massive astronomical data archiving and can contribute to the design of data management systems for other astronomical telescopes.
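    FastBit's query speed comes from bitmap indexing: each distinct value of an indexed field maps to a bitmap of matching rows, so a multi-field combined query reduces to cheap bitwise ANDs of precomputed bitmaps. A toy Python sketch of the principle (this is not the FastBit API, and the field names are invented):

    ```python
    from collections import defaultdict

    class BitmapIndex:
        """Toy equality-encoded bitmap index: one integer bitmask per value."""
        def __init__(self):
            self.bitmaps = defaultdict(int)  # value -> bitmask over row ids
            self.nrows = 0

        def add(self, value):
            self.bitmaps[value] |= 1 << self.nrows
            self.nrows += 1

    def query_and(*bitmasks):
        """AND the per-field bitmasks and return the matching row ids."""
        combined = bitmasks[0]
        for m in bitmasks[1:]:
            combined &= m
        return [i for i in range(combined.bit_length()) if combined >> i & 1]

    # Index two fields of four hypothetical observation records.
    instrument, quality = BitmapIndex(), BitmapIndex()
    for inst, q in [("TiO", "good"), ("Ha", "good"), ("TiO", "bad"), ("TiO", "good")]:
        instrument.add(inst)
        quality.add(q)

    # Multi-field combined query: instrument == "TiO" AND quality == "good".
    rows = query_and(instrument.bitmaps["TiO"], quality.bitmaps["good"])
    ```

    Because each extra predicate costs only one bitwise AND, adding fields to a query barely slows it down; that is the behavior the 15x speedup above reflects.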

  9. Rnomads: An R Interface with the NOAA Operational Model Archive and Distribution System

    NASA Astrophysics Data System (ADS)

    Bowman, D. C.; Lees, J. M.

    2014-12-01

    The National Oceanic and Atmospheric Administration Operational Model Archive and Distribution System (NOMADS) facilitates rapid delivery of real time and archived environmental data sets from multiple agencies. These data are distributed free to the scientific community, industry, and the public. The rNOMADS package provides an interface between NOMADS and the R programming language. Like R itself, rNOMADS is open source and cross platform. It utilizes server-side functionality on the NOMADS system to subset model outputs for delivery to client R users. There are currently 57 real time and 10 archived models available through rNOMADS. Atmospheric models include the Global Forecast System and North American Mesoscale. Oceanic models include WAVEWATCH III and U. S. Navy Operational Global Ocean Model. rNOMADS has been downloaded 1700 times in the year since it was released. At the time of writing, it is being used for wind and solar power modeling, climate monitoring related to food security concerns, and storm surge/inundation calculations, among others. We introduce this new package and show how it can be used to extract data for infrasonic waveform modeling in the atmosphere.

  10. The Chandra X-ray Observatory: An Astronomical Facility Available to the World

    NASA Technical Reports Server (NTRS)

    Smith, Randall K.

    2006-01-01

    The Chandra X-ray Observatory, one of NASA's "Great Observatories," provides high angular and spectral resolution X-ray data which are freely available to all. In this review I describe the instruments on Chandra along with their current calibration, as well as the Chandra proposal system, the freely available Chandra analysis software package CIAO, and the Chandra archive. As Chandra is in its 6th year of operation, the archive already contains calibrated observations of a large range of X-ray sources. The Chandra X-ray Center is committed to assisting astronomers from any country who wish to use data from the archive or propose for observations.

  11. A new technique for measuring aerosols with moonlight observations and a sky background model

    NASA Astrophysics Data System (ADS)

    Jones, Amy; Noll, Stefan; Kausch, Wolfgang; Kimeswenger, Stefan; Szyszka, Ceszary; Unterguggenberger, Stefanie

    2014-05-01

    There have been an ample number of studies on aerosols in urban, daylight conditions, but few for remote, nocturnal aerosols. We have developed a new technique for investigating such aerosols using our sky background model and astronomical observations. With a dedicated observing proposal we have successfully tested this technique for nocturnal, remote aerosol studies. This technique relies on three requirements: (a) sky background model, (b) observations taken with scattered moonlight, and (c) spectrophotometric standard star observations for flux calibrations. The sky background model was developed for the European Southern Observatory and is optimized for the Very Large Telescope at Cerro Paranal in the Atacama desert in Chile. This is a remote location with almost no urban aerosols. It is well suited for studying remote background aerosols that are normally difficult to detect. Our sky background model has an uncertainty of around 20 percent and the scattered moonlight portion is even more accurate. The last two requirements are having astronomical observations with moonlight and of standard stars at different airmasses, all during the same night. We had a dedicated observing proposal at Cerro Paranal with the instrument X-Shooter to use as a case study for this method. X-Shooter is a medium resolution, echelle spectrograph which covers the wavelengths from 0.3 to 2.5 micrometers. We observed plain sky at six different distances (7, 13, 20, 45, 90, and 110 degrees) to the Moon for three different Moon phases (between full and half). Also direct observations of spectrophotometric standard stars were taken at two different airmasses for each night to measure the extinction curve via the Langley method. This is an ideal data set for testing this technique. The underlying assumption is that all components, other than the atmospheric conditions (specifically aerosols and airglow), can be calculated with the model for the given observing parameters. 
The scattered moonlight model is designed for the average atmospheric conditions at Cerro Paranal. The Mie scattering is calculated for the average distribution of aerosol particles, but this input can be modified. We can avoid the airglow emission lines, and near full Moon the airglow continuum can be ignored. In the case study, by comparing the scattered moonlight for the various angles and wavelengths along with the extinction curve from the standard stars, we can iteratively find the optimal aerosol size distribution for the time of observation. We will present this new technique, the results from this case study, and how it can be implemented for investigating aerosols using the X-Shooter archive and other astronomical archives.
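    The extinction measurement above relies on the Langley method: the observed magnitude of a star grows linearly with airmass, m_obs = m_0 + k·X, so a straight-line fit to standard-star observations at different airmasses yields the extinction coefficient k and the out-of-atmosphere magnitude m_0. A minimal sketch with synthetic numbers (two airmasses, as in the abstract; these are not the actual X-Shooter measurements):

    ```python
    def langley_fit(airmass, mag):
        """Ordinary least-squares fit of observed magnitude vs. airmass.
        Returns (m0, k): out-of-atmosphere magnitude and extinction
        coefficient in magnitudes per airmass."""
        n = len(airmass)
        mx = sum(airmass) / n
        my = sum(mag) / n
        k = sum((x - mx) * (y - my) for x, y in zip(airmass, mag)) / \
            sum((x - mx) ** 2 for x in airmass)
        m0 = my - k * mx
        return m0, k

    # Synthetic standard-star observations at two airmasses:
    airmass = [1.1, 1.8]
    mags = [12.055, 12.09]  # fainter (larger magnitude) at higher airmass
    m0, k = langley_fit(airmass, mags)
    ```

    Comparing the fitted k across wavelengths gives the extinction curve, which is then matched against the Mie-scattering predictions for candidate aerosol size distributions.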

  12. Multiphoton fluorescence lifetime imaging of chemotherapy distribution in solid tumors

    NASA Astrophysics Data System (ADS)

    Carlson, Marjorie; Watson, Adrienne L.; Anderson, Leah; Largaespada, David A.; Provenzano, Paolo P.

    2017-11-01

    Doxorubicin is a commonly used chemotherapeutic employed to treat multiple human cancers, including numerous sarcomas and carcinomas. Furthermore, doxorubicin possesses strong fluorescent properties that make it an ideal reagent for modeling drug delivery by examining its distribution in cells and tissues. However, while doxorubicin fluorescence and lifetime have been imaged in live tissue, its behavior in archival samples that frequently result from drug and treatment studies in human and animal patients, and murine models of human cancer, has to date been largely unexplored. Here, we demonstrate imaging of doxorubicin intensity and lifetimes in archival formalin-fixed paraffin-embedded sections from mouse models of human cancer with multiphoton excitation and multiphoton fluorescence lifetime imaging microscopy (FLIM). Multiphoton excitation imaging reveals robust doxorubicin emission in tissue sections and captures spatial heterogeneity in cells and tissues. However, quantifying the amount of doxorubicin signal in distinct cell compartments, particularly the nucleus, often remains challenging due to strong signals in multiple compartments. The addition of FLIM analysis to display the spatial distribution of excited state lifetimes clearly distinguishes between signals in distinct compartments such as the cell nuclei versus cytoplasm and allows for quantification of doxorubicin signal in each compartment. Furthermore, we observed a shift in lifetime values in the nuclei of transformed cells versus nontransformed cells, suggesting a possible diagnostic role for doxorubicin lifetime imaging to distinguish normal versus transformed cells. Thus, data here demonstrate that multiphoton FLIM is a highly sensitive platform for imaging doxorubicin distribution in normal and diseased archival tissues.

  13. Compendium of NASA Data Base for the Global Tropospheric Experiment's Transport and Chemical Evolution Over the Pacific (TRACE-P). Volume 2; P-3B

    NASA Technical Reports Server (NTRS)

    Kleb, Mary M.; Scott, A. Donald, Jr.

    2003-01-01

    This report provides a compendium of NASA aircraft data that are available from NASA's Global Tropospheric Experiment's (GTE) Transport and Chemical Evolution over the Pacific (TRACE-P) Mission. The broad goal of TRACE-P was to characterize the transit and evolution of the Asian outflow over the western Pacific. Conducted from February 24 through April 10, 2001, TRACE-P integrated airborne, satellite- and ground-based observations, as well as forecasts from aerosol and chemistry models. The format of this compendium utilizes data plots (time series) of selected data acquired aboard the NASA/Dryden DC-8 (vol. 1) and NASA/Wallops P-3B (vol. 2) aircraft during TRACE-P. The purpose of this document is to provide a representation of aircraft data that are available in archived format via NASA Langley's Distributed Active Archive Center (DAAC) and through the GTE Project Office archive. The data format is not intended to support original research/analyses, but to assist the reader in identifying data that are of interest.

  14. Compendium of NASA Data Base for the Global Tropospheric Experiment's Transport and Chemical Evolution Over the Pacific (TRACE-P). Volume 1; DC-8

    NASA Technical Reports Server (NTRS)

    Kleb, Mary M.; Scott, A. Donald, Jr.

    2003-01-01

    This report provides a compendium of NASA aircraft data that are available from NASA's Global Tropospheric Experiment's (GTE) Transport and Chemical Evolution over the Pacific (TRACE-P) Mission. The broad goal of TRACE-P was to characterize the transit and evolution of the Asian outflow over the western Pacific. Conducted from February 24 through April 10, 2001, TRACE-P integrated airborne, satellite- and ground-based observations, as well as forecasts from aerosol and chemistry models. The format of this compendium utilizes data plots (time series) of selected data acquired aboard the NASA/Dryden DC-8 (vol. 1) and NASA/Wallops P-3B (vol. 2) aircraft during TRACE-P. The purpose of this document is to provide a representation of aircraft data that are available in archived format via NASA Langley's Distributed Active Archive Center (DAAC) and through the GTE Project Office archive. The data format is not intended to support original research/analyses, but to assist the reader in identifying data that are of interest.

  15. The American Archival Profession and Information Technology Standards.

    ERIC Educational Resources Information Center

    Cox, Richard J.

    1992-01-01

    Discussion of the use of standards by archivists highlights the U.S. MARC AMC (Archives-Manuscript Control) format for reporting archival records and manuscripts; their interest in specific standards being developed for the OSI (Open Systems Interconnection) reference model; and the management of records in electronic formats. (16 references) (LAE)

  16. Archive, Access, and Supply of Scientifically Derived Data: A Data Model for Multi-Parameterized Querying Where Spectral Data Base Meets GIS-Based Mapping Archive

    NASA Astrophysics Data System (ADS)

    Nass, A.; D'Amore, M.; Helbert, J.

    2018-04-01

    Based on recent discussions within information science and management, an archiving structure and a reference level for derived, already published data significantly support the scientific community by enabling a constant rise of knowledge and understanding.

  17. The Archival Photograph and Its Meaning: Formalisms for Modeling Images

    ERIC Educational Resources Information Center

    Benson, Allen C.

    2009-01-01

    This article explores ontological principles and their potential applications in the formal description of archival photographs. Current archival descriptive practices are reviewed and the larger question is addressed: do archivists who are engaged in describing photographs need a more formalized system of representation, or do existing encoding…

  18. Wavelet data compression for archiving high-resolution icosahedral model data

    NASA Astrophysics Data System (ADS)

    Wang, N.; Bao, J.; Lee, J.

    2011-12-01

    With the increase of the resolution of global circulation models, it becomes ever more important to develop highly effective solutions to archive the huge datasets produced by those models. While lossless data compression guarantees the accuracy of the restored data, it can achieve only a limited reduction in data size. Wavelet-transform-based data compression offers significant potential for data size reduction, and it has been shown to be very effective in transmitting data for remote visualization. However, for data archive purposes, a detailed study has to be conducted to evaluate its impact on datasets that will be used in further numerical computations. In this study, we carried out two sets of experiments, for the summer and winter seasons. An icosahedral grid weather model and a highly efficient wavelet data compression package were used for this study. Initial conditions were compressed and input to the model, which was run out to 10 days. The forecast results were then compared to forecasts from the model run with the original, uncompressed initial conditions. Several visual comparisons, as well as statistics of the numerical comparisons, are presented. These results indicate that, with specified minimum accuracy losses, wavelet data compression achieves significant data size reduction while maintaining minimal numerical impact on the datasets. In addition, some issues are discussed regarding increasing archive efficiency while retaining a complete set of metadata for each archived file.
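    As a toy illustration of the idea (the compression software used in the study is not named above, and real codes use multi-level, multi-dimensional transforms), a single-level Haar transform with hard thresholding of the detail coefficients shows how specifying a minimum accuracy bounds the loss: small details compress away to zeros, large ones survive intact.

    ```python
    def haar_step(data):
        """One level of the (average/difference) Haar transform; even length."""
        avgs = [(a + b) / 2 for a, b in zip(data[0::2], data[1::2])]
        dets = [(a - b) / 2 for a, b in zip(data[0::2], data[1::2])]
        return avgs, dets

    def haar_inverse(avgs, dets):
        """Exact inverse of haar_step when no coefficients were discarded."""
        out = []
        for a, d in zip(avgs, dets):
            out.extend([a + d, a - d])
        return out

    def compress(data, threshold):
        """Lossy compression: zero out detail coefficients below threshold."""
        avgs, dets = haar_step(data)
        dets = [d if abs(d) >= threshold else 0.0 for d in dets]
        return avgs, dets

    # A 1-D strip of a model field: smooth regions plus one sharp feature.
    field = [10.0, 10.1, 10.0, 9.9, 20.0, 24.0, 20.2, 20.1]
    avgs, dets = compress(field, threshold=0.5)
    restored = haar_inverse(avgs, dets)
    max_err = max(abs(a - b) for a, b in zip(field, restored))
    ```

    The zeroed detail coefficients are what make the compressed stream highly compressible, while the surviving coefficient preserves the sharp 20.0/24.0 feature; max_err stays below the chosen threshold, which is the "specified minimum accuracy loss" behavior the study evaluates.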

  19. TAPAS, a VO archive at the IRAM 30-m telescope

    NASA Astrophysics Data System (ADS)

    Leon, Stephane; Espigares, Victor; Ruíz, José Enrique; Verdes-Montenegro, Lourdes; Mauersberger, Rainer; Brunswig, Walter; Kramer, Carsten; Santander-Vela, Juan de Dios; Wiesemeyer, Helmut

    2012-07-01

    Astronomical observatories today generate increasingly large volumes of data. For efficient use of these data, databases have been built following the standards proposed by the International Virtual Observatory Alliance (IVOA), providing a common protocol to query them and make them interoperable. The IRAM 30-m radio telescope, located in Sierra Nevada (Granada, Spain), is a millimeter-wavelength telescope with a constantly renewed, extensive choice of instruments, capable of covering the frequency range between 80 and 370 GHz. It continuously produces a large amount of data thanks to the more than 200 scientific projects observed each year. The TAPAS archive at the IRAM 30-m telescope aims to provide public access to the headers describing the observations performed with the telescope, according to a defined data policy, while also making the technical data available to IRAM staff members. Special emphasis has been placed on making it Virtual Observatory (VO) compliant, and on offering a VO-compliant web interface that makes the information available to the scientific community. TAPAS is built using the Django Python framework on top of a relational MySQL database, and is fully integrated with the telescope control system. The TAPAS data model (DM) is based on the Radio Astronomical DAta Model for Single dish radio telescopes (RADAMS), to allow for easy integration into the VO infrastructure. A metadata modeling layer is used by the data-filler to allow an implementation free from assumptions about the control system and the underlying database. TAPAS and its public web interface ( http://tapas.iram.es ) provide a scalable system that can evolve with new instruments and observing modes. A meta-description of the DM has been introduced in TAPAS both to avoid undesired coupling between the code and the DM and to provide better management of the archive. A subset of the header data stored in TAPAS will be made available at the CDS.
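    The decoupling that the metadata modeling layer buys can be sketched as a mapping table consumed by the data-filler, so that control-system key names and unit conversions live in data rather than in code. A minimal illustration (the field names, header keys, and converters below are hypothetical, not the actual RADAMS/TAPAS schema):

    ```python
    # Hypothetical mapping layer: each entry pairs a data-model field with the
    # control-system header key it comes from and a converter. Swapping control
    # systems or schemas means editing this table, not the filler code.
    DM_MAPPING = [
        # (data-model field, control-system key, converter)
        ("frequency_ghz", "LINE_FREQ_HZ", lambda v: float(v) / 1e9),
        ("source_name",   "SOURCE",       str.strip),
        ("obs_mode",      "OBSMODE",      str.lower),
    ]

    def fill_record(raw_header):
        """Translate a raw control-system header into data-model fields."""
        return {field: conv(raw_header[key]) for field, key, conv in DM_MAPPING}

    record = fill_record({"LINE_FREQ_HZ": "230538000000",
                          "SOURCE": " M82 ",
                          "OBSMODE": "OTF"})
    ```

    Keeping the mapping declarative is what lets the archive "evolve with new instruments and observing modes": a new instrument contributes new mapping rows, and the filler itself is untouched.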

  20. SPASE 2010 - Providing Access to the Heliophysics Data Environment

    NASA Astrophysics Data System (ADS)

    Thieman, J. R.; King, T. A.; Roberts, D.; Spase Consortium

    2010-12-01

    The Heliophysics Division of NASA has adopted the Space Physics Archive Search and Extract (SPASE) Data Model for use within the Heliophysics Data Environment, which is composed of virtual observatories, value-added services, resident and active archives, and other data providers. The SPASE Data Model has also been adopted by Japan's Inter-university Upper atmosphere Global Observation NETwork (IUGONET), NOAA's National Geophysical Data Center (NGDC), and the Canadian Space Science Data Portal (CSSDP). Europe's HELIO project harvests information from SPASE descriptions of resources, as does the Planetary Plasma Interactions (PPI) Node of NASA's Planetary Data System (PDS). All of the data sets in the Heliophysics Data Environment are intended to be described by the SPASE Data Model, and many have already been described in this way. The current version of the SPASE Data Model (2.2.0) may be found on the SPASE web site at http://www.spase-group.org . SPASE data set descriptions are not as difficult to create as it might seem: help is available in both the documentation and the many tools created to support SPASE description creators, and there are now a number of very experienced users who are willing to help as well. The SPASE consortium has advanced to the next step in the odyssey to achieve a well-coordinated federation of resource providers by designing and implementing a set of core services to facilitate the exchange of metadata and delivery of data packages; an example is the registry service shown at http://vmo.igpp.ucla.edu/registry . SPASE also incorporates new technologies that are useful to the overall effort, such as cloud storage. A review of the advances, uses of the SPASE data model, and the role of services in a federated environment is presented.

  1. Community Exoplanet Follow-up Program

    NASA Technical Reports Server (NTRS)

    Howell, Steve

    2017-01-01

    During the Kepler mission, our team provided the community with the highest resolution images available anywhere of exoplanet host stars. Using speckle interferometry on the 3.5-m WIYN and 8-m Gemini telescopes, thousands of observations have been obtained reaching the diffraction limit of the telescope. From these public data, available at the NASA Exoplanet Archive, numerous publications and scientific results have followed for exoplanets, including the finding that high-resolution imaging is critical to fully characterize the planet host stars and the planets themselves (e.g., planet radius and incident flux). Exoplanet host star observations have also occurred (and continue) for K2 mission candidates, with archival data available as well. Observational programs for TESS candidates, WFIRST program stars, and zodiacal light candidates are currently ongoing. Opportunities to propose for or obtain such observations are available through (1) collaboration with our team, (2) successfully proposing to WIYN or Gemini for telescope time, or (3) using publicly available archival data. This poster will highlight the observational program, how time is allocated, how our queue observing program works, and the new features and observational modes that are available now.

  2. The NASA Navigator Program Ground Based Archives at the Michelson Science Center: Supporting the Search for Habitable Planets

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Ciardi, D. R.; Good, J. C.; Laity, A. C.; Zhang, A.

    2006-07-01

    At ADASS XIV, we described how the W. M. Keck Observatory Archive (KOA) re-uses and extends the component-based architecture of the NASA/IPAC Infrared Science Archive (IRSA) to ingest and serve level 0 observations made with HIRES, the High Resolution Echelle Spectrometer. Since August 18, the KOA has ingested 325 GB of data from 135 nights of observations. The architecture exploits a service layer between the mass storage layer and the user interface. This service layer consists of standalone utilities, called through a simple executive, that perform generic query and retrieval functions such as query generation, database table sub-setting, and return page generation. It has been extended to implement proprietary access to data through deployment of query management middleware developed for the National Virtual Observatory. The MSC archives have recently extended this design to query and retrieve complex data sets describing the properties of potential target stars for the Terrestrial Planet Finder (TPF) missions. The archives can now support knowledge-based retrieval, as well as data retrieval. This paper describes how extensions to the IRSA architecture, which is applicable across all wavelengths and astronomical datatypes, support the design and development of the MSC Navigator Program archives at modest cost.

  3. A WWW-Based Archive and Retrieval System for Multimedia

    NASA Technical Reports Server (NTRS)

    Hyon, J.; Sorensen, S.; Martin, M.; Kawasaki, K.; Takacs, M.

    1996-01-01

    This paper describes the Data Distribution Laboratory (DDL) and discusses issues involved in building multimedia CD-ROMs. It describes the modeling philosophy for cataloging multimedia products and the worldwide-web (WWW)-based multimedia archive and retrieval system (Webcat) built on that model.

  4. A Bayesian Approach to Evaluating Consistency between Climate Model Output and Observations

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Cressie, N.; Teixeira, J.

    2010-12-01

    Like other scientific and engineering problems that involve physical modeling of complex systems, climate models can be evaluated and diagnosed by comparing their output to observations of similar quantities. Though the global remote sensing data record is relatively short by climate research standards, these data offer opportunities to evaluate model predictions in new ways. For example, remote sensing data are spatially and temporally dense enough to provide distributional information that goes beyond simple moments, allowing quantification of temporal and spatial dependence structures. In this talk, we propose a new method for exploiting these rich data sets using a Bayesian paradigm. For a collection of climate models, we calculate the posterior probability that each member best represents the physical system it seeks to reproduce. The posterior probability is based on the likelihood that a chosen summary statistic, computed from observations, would be obtained when the model's output is considered as a realization from a stochastic process. By exploring how posterior probabilities change with different statistics, we may paint a more quantitative and complete picture of the strengths and weaknesses of the models relative to the observations. We demonstrate our method using model output from the CMIP archive, and observations from NASA's Atmospheric Infrared Sounder.
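    The model-comparison step described above can be sketched in a few lines. Assuming, purely for illustration, that each model's summary statistic is approximately Gaussian with a known mean and spread, the posterior over models is just the normalized product of prior and likelihood. The model names and numbers below are invented:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a Normal(mu, sigma) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_probabilities(s_obs, model_stats, priors=None):
    """Posterior probability of each model given an observed summary
    statistic s_obs. model_stats maps a model name to the (mean, sd)
    of the statistic under that model's simulated output."""
    names = list(model_stats)
    if priors is None:
        priors = {n: 1.0 / len(names) for n in names}  # uniform prior
    weighted = {n: gaussian_pdf(s_obs, *model_stats[n]) * priors[n] for n in names}
    z = sum(weighted.values())
    return {n: w / z for n, w in weighted.items()}

# Hypothetical observed statistic and per-model distributions:
post = posterior_probabilities(2.1, {"modelA": (2.0, 0.3), "modelB": (3.0, 0.5)})
```

Repeating the calculation with different choices of summary statistic, as the abstract suggests, shows which aspects of the observations each model captures well.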

  5. O Star Wind Mass-Loss Rates and Shock Physics from X-ray Line Profiles in Archival XMM RGS Data

    NASA Astrophysics Data System (ADS)

    Cohen, David

    O stars are characterized by their dense, supersonic stellar winds. These winds are the site of X-ray emission from shock-heated plasma. By analyzing high-resolution X-ray spectra of these O stars, we can learn about the wind-shock heating and X-ray production mechanism. But in addition, the X-rays can also be used to measure the mass-loss rate of the stellar wind, which is a key observational quantity whose value affects stellar evolution and energy, momentum, and mass input to the Galactic interstellar medium. We make this X-ray based mass-loss measurement by analyzing the profile shapes of the X-ray emission lines observed at high resolution with the Chandra and XMM-Newton grating spectrometers. One advantage of our method is that it is insensitive to small-scale clumping that affects density-squared diagnostics. We are applying this analysis technique to O stars in the Chandra archive, and are finding mass-loss rates lower than those traditionally assumed for these O stars, and in line with more recent independent determinations that do account for clumping. By extending this analysis to the XMM RGS data archive, we will make significant contributions to the understanding of both X-ray production in O stars and to addressing the issue of the actual mass-loss rates of O stars. The XMM RGS data archive provides several extensions and advantages over the smaller Chandra HETGS archive: (1) there are roughly twice as many O and early B stars in the XMM archive; (2) the longer wavelength response of the RGS provides access to diagnostically important lines of nitrogen and carbon; (3) the very long, multiple exposures of zeta Pup provide the opportunity to study this canonical O supergiant's X-ray spectrum in unprecedented detail, including looking at the time variability of X-ray line profiles. 
Our research team has developed a sophisticated empirical line profile model as well as a computational infrastructure for fitting the model to high-resolution X-ray spectra in order to determine the values of physically meaningful model parameters, and to place confidence limits on them. We have incorporated second-order effects into our models, including resonance scattering. We have also developed tools for modeling the X-ray opacity of the cold, X-ray absorbing wind component, which is a crucial ingredient of the technique we have developed for determining wind mass-loss rates from analyzing the ensemble of emission lines from a given star's X-ray spectrum. In addition to testing state-of-the-art wind shock models and measuring O star mass-loss rates, an important component of our proposed research program is the education of talented undergraduates. Swarthmore undergraduates have made significant contributions to the development of our line profile modeling, the wind opacity modeling, and related research topics such as laboratory astrophysics before going on to PhD programs. Two have been named as finalists for the APS's Apker prize. The research we propose here will involve two undergraduates and will likely lead to honors theses, refereed papers, and the opportunity to present their research results at national and international meetings. By measuring mass-loss rates for all the O stars for which high-resolution X-ray spectra exist and by constraining X-ray production mechanisms, we will address issues important to our understanding of stellar and galactic evolution: including the frequency of core collapse supernovae, the energetics of the Galactic interstellar medium, and the radiation conditions in star formation regions where not only new, solar-type stars form, but also where their planetary systems form and are subject to effects of high-energy emission from nearby stars. 
In this way, the work we are proposing in this project will make a contribution to NASA's mission to understand cosmic evolution and the conditions for generating and sustaining life in the Universe.
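    The confidence-limit machinery mentioned above can be illustrated generically. The team's actual infrastructure fits their empirical line-profile model to grating spectra; the sketch below instead applies the standard delta-chi-square = 1 rule (the 68% interval for one interesting parameter) to a toy parabolic chi-square surface over a single hypothetical parameter:

```python
def confidence_interval(param_grid, chi2_values, delta=1.0):
    """1-D confidence interval from a chi-square scan: the range of
    parameter values whose chi-square lies within `delta` of the
    minimum (delta = 1.0 gives the 68% interval for one parameter)."""
    chi2_min = min(chi2_values)
    inside = [p for p, c in zip(param_grid, chi2_values) if c <= chi2_min + delta]
    return min(inside), max(inside)

# Toy parabolic chi-square surface for an illustrative parameter
# (e.g. a scaled mass-loss rate); a real fit would evaluate the
# line-profile model at each grid point instead.
grid = [0.5 + 0.01 * i for i in range(101)]    # scan 0.5 .. 1.5
chi2 = [100.0 * (p - 1.0) ** 2 for p in grid]  # minimum at p = 1.0
lo, hi = confidence_interval(grid, chi2)       # roughly (0.9, 1.1)
```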

  6. The effect of channel deepening on tides and storm surge: A case study of Wilmington, NC

    NASA Astrophysics Data System (ADS)

    Familkhalili, R.; Talke, S. A.

    2016-09-01

    In this study we investigate the hypothesis that increasing channel depth in estuaries can amplify both tides and storm surge by developing an idealized numerical model representing the 1888, 1975, and 2015 bathymetric conditions of the Cape Fear River Estuary, NC. Archival tide gauge data recovered from the U.S. National Archives indicate that mean tidal range in Wilmington has doubled to 1.55 m since the 1880s, with a much smaller increase of 0.07 m observed near the ocean boundary. These tidal changes are reproduced by simulating channel depths of 7 m (1888 condition) and 15.5 m (modern condition). Similarly, model sensitivity studies using idealized, parametric tropical cyclones suggest that the storm surge in the worst-case CAT-5 event may have increased from 3.8 ± 0.25 m to 5.6 ± 0.6 m since the nineteenth century. The amplification of both tides and storm surge is influenced by reduced hydraulic drag caused by greater mean depths.
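    A back-of-envelope sketch of the drag mechanism invoked here: in a linearized 1-D tidal momentum balance, the effective friction coefficient scales roughly as Cd·|U|/h, so deepening the channel from 7 m to 15.5 m weakens frictional damping by more than a factor of two. The drag coefficient and velocity below are assumed, order-of-magnitude values, not numbers from the study:

```python
def linear_friction(cd, u, h):
    """Linearized bottom-friction coefficient r = Cd*|U|/h  [1/s]."""
    return cd * abs(u) / h

CD = 2.5e-3   # quadratic drag coefficient (assumed, typical estuarine value)
U = 1.0       # characteristic tidal velocity in m/s (assumed)

r_1888 = linear_friction(CD, U, 7.0)     # 1888 channel depth
r_modern = linear_friction(CD, U, 15.5)  # modern channel depth

reduction = r_1888 / r_modern  # friction roughly halved (~2.2x weaker)
```

Weaker damping alone does not fix the amplitude of the response; the study's numerical model is what translates it into the observed tide and surge amplification.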

  7. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop

    PubMed Central

    Sali, Andrej; Berman, Helen M.; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K.; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M.J.J.; Chiu, Wah; Dal Peraro, Matteo; Di Maio, Frank; Ferrin, Thomas E.; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L.; Meiler, Jens; Marti-Renom, Marc A.; Montelione, Gaetano T.; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J.; Saibil, Helen; Schröder, Gunnar F.; Schwieters, Charles D.; Seidel, Claus A. M.; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L.; Velankar, Sameer; Westbrook, John D.

    2016-01-01

    Summary Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? PMID:26095030

  8. From the Cluster Temperature Function to the Mass Function at Low Z

    NASA Technical Reports Server (NTRS)

    Mushotzky, Richard (Technical Monitor); Markevitch, Maxim

    2004-01-01

    This XMM project consisted of three observations of the nearby, hot galaxy cluster Triangulum Australis: one of the cluster center and two offset pointings. The goal was to measure the radial gas temperature profile out to large radii and derive the total gravitating mass within the radius of average mass overdensity 500. The central pointing also provides data for a detailed two-dimensional gas temperature map of this interesting cluster. We have analyzed all three observations. The derivation of the temperature map using the central pointing is complete, and the paper is soon to be submitted. During the course of this study and of the analysis of archival XMM cluster observations, it became apparent that the commonly used XMM background flare screening techniques are often not accurate enough for studies of cluster outer regions. The information on a cluster's total mass is contained at large off-center distances, and it is precisely the temperatures of those low-brightness regions that are most affected by the detector background anomalies. In particular, our two offset observations of Triangulum Australis were contaminated by background flares ("bad cosmic weather") to a degree where they could not be used for accurate spectral analysis. This forced us to expand the scope of our project: we needed to devise a more accurate method of screening and modeling the background flares, and to evaluate the uncertainty of the XMM background modeling. To do this, we have analyzed a large number of archival EPIC blank-field and closed-cover observations. As a result, we have derived stricter background screening criteria. It also turned out that mild flares affecting EPIC-pn can be modeled with adequate accuracy. Such modeling has been used to derive our Triangulum temperature map. The results of our XMM background analysis, including the modeling recipes, are presented in a paper which is in final preparation and will be submitted soon.
It will be useful not only for our future analysis but for other XMM cluster observations as well.

  9. Constraining the temperature history of the past millennium using early instrumental observations

    NASA Astrophysics Data System (ADS)

    Brohan, P.; Allan, R.; Freeman, E.; Wheeler, D.; Wilkinson, C.; Williamson, F.

    2012-05-01

    The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree-rings, ice-cores etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and systematic differences between proxy reconstructions and model simulations. As the difference between the estimates extends into the relatively recent period of the early nineteenth century it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. One organisation which systematically made observations and collected the results was the English East-India Company (EEIC), and their archives have been preserved in the British Library. Inspection of those archives revealed 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure, and subjective estimates of wind speed and direction, from voyages across the Atlantic and Indian Oceans between 1789 and 1834. Those records have been extracted and digitised, providing 273 000 new weather records offering an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides a powerful out-of-sample validation for the proxy reconstructions - supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this - such simulations are unlikely to be accurate in this respect.

  10. Constraining the temperature history of the past millennium using early instrumental observations

    NASA Astrophysics Data System (ADS)

    Brohan, P.; Allan, R.; Freeman, E.; Wheeler, D.; Wilkinson, C.; Williamson, F.

    2012-10-01

    The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree-rings, ice-cores, etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and systematic differences between proxy reconstructions and model simulations. As the difference between the estimates extends into the relatively recent period of the early nineteenth century it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. One organisation which systematically made observations and collected the results was the English East India Company (EEIC), and their archives have been preserved in the British Library. Inspection of those archives revealed 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure, and subjective estimates of wind speed and direction, from voyages across the Atlantic and Indian Oceans between 1789 and 1834. Those records have been extracted and digitised, providing 273 000 new weather records offering an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides an out-of-sample validation for the proxy reconstructions - supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this - such simulations are unlikely to be accurate in this respect.

  11. Contents of the NASA ocean data system archive, version 11-90

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A. (Editor); Lassanyi, Ruby A. (Editor)

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Ocean Data System (NODS) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and surface pigment concentration. NODS will become the Data Archive and Distribution Service of the JPL Distributed Active Archive Center for the Earth Observing System Data and Information System (EOSDIS) and will be the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  12. A biological survey on the Ottoman Archive papers and determination of the D10 value

    NASA Astrophysics Data System (ADS)

    Kantoğlu, Ömer; Ergun, Ece; Ozmen, Dilan; Halkman, Hilal B. D.

    2018-03-01

    The Ottoman Archives hold one of the richest archival collections in the world. However, not all of the archived documents are well preserved, and some undergo biodeterioration. A rapid and reliable treatment method is therefore needed to preserve the collection as heritage for future generations, and irradiation is a promising alternative for treating archival materials. In this study, we conducted a survey to determine the contaminating species and the D10 values of samples obtained from the shelves of the Ottoman Archives. The samples also included several insect pests collected using a pheromone trap placed in the archive storage room. With the exception of a few localized problems, no active pest presence was observed. The D10 values of the mold contamination and of a reference mold (A. niger) were found to be 1.0 and 0.68 kGy, respectively. Based on these results, it can be concluded that an absorbed dose of 6 kGy is required to remove the contamination from the materials stored in the Ottoman Archives.
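    The dose recommendation follows directly from the definition of the D10 value: the absorbed dose that reduces a microbial population tenfold. A minimal sketch of that arithmetic (the 6-log target is our reading of the abstract's conclusion, not a stated protocol):

```python
def dose_for_log_reduction(d10_kgy, log_cycles):
    """Absorbed dose (kGy) needed to reduce a microbial population by
    a factor of 10**log_cycles, given the D10 value (dose per decade)."""
    return d10_kgy * log_cycles

# With the measured D10 of 1.0 kGy for the archive mold contamination,
# a 6-log (million-fold) reduction requires:
dose = dose_for_log_reduction(1.0, 6)  # 6.0 kGy, matching the study
```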

  13. Archiving InSight Lander Science Data Using PDS4 Standards

    NASA Astrophysics Data System (ADS)

    Stein, T.; Guinness, E. A.; Slavney, S.

    2017-12-01

    The InSight Mars Lander is scheduled for launch in 2018, and science data from the mission will be archived in the NASA Planetary Data System (PDS) using the new PDS4 standards. InSight is a geophysical lander with a science payload that includes a seismometer, a probe to measure subsurface temperatures and heat flow, a suite of meteorology instruments, a magnetometer, an experiment using radio tracking, and a robotic arm that will provide soil physical property information based on interactions with the surface. InSight is not the first science mission to archive its data using PDS4. However, PDS4 archives do not currently contain examples of the kinds of data that several of the InSight instruments will produce. While the existing common PDS4 standards were sufficient for most of InSight's archiving requirements, the data generated by a few instruments required the development of several extensions to the PDS4 information model. For example, the seismometer will deliver a version of its data in SEED format, the standard of the terrestrial seismology community; this format required the design of a new product type in the PDS4 information model. A local data dictionary has also been developed for InSight that contains attributes that are not part of the common PDS4 dictionary. The local dictionary provides metadata relevant to all InSight data sets, as well as attributes specific to several of the instruments. Additional classes and attributes were designed for the existing PDS4 geometry dictionary to capture metadata for the lander position and orientation, along with camera models for stereo image processing. Much of the InSight archive planning and design work has been done by a Data Archiving Working Group (DAWG), which has members from the InSight project and the PDS. The group coordinates archive design, schedules, and the peer review of the archive documentation and test products.
The InSight DAWG archiving effort for PDS is being led by the PDS Geosciences Node with several other nodes working one-on-one with instruments relevant to their disciplines. Once the InSight mission begins operations, the DAWG will continue to provide oversight on release of InSight data to PDS. Lessons learned from InSight archive work will also feed forward to planning the archives for the Mars 2020 rover.

  14. ROSETTA: How to archive more than 10 years of mission

    NASA Astrophysics Data System (ADS)

    Barthelemy, Maud; Heather, D.; Grotheer, E.; Besse, S.; Andres, R.; Vallejo, F.; Barnes, T.; Kolokolova, L.; O'Rourke, L.; Fraga, D.; A'Hearn, M. F.; Martin, P.; Taylor, M. G. G. T.

    2018-01-01

    The Rosetta spacecraft was launched in 2004 and, after several planetary and two asteroid fly-bys, arrived at comet 67P/Churyumov-Gerasimenko in August 2014. After escorting the comet for two years and executing its scientific observations, the mission ended on 30 September 2016 through a touch down on the comet surface. This paper describes how the Planetary Science Archive (PSA) and the Planetary Data System - Small Bodies Node (PDS-SBN) worked with the Rosetta instrument teams to prepare the science data collected over the course of the Rosetta mission for inclusion in the science archive. As Rosetta is an international mission in collaboration between ESA and NASA, all science data from the mission are fully archived within both the PSA and the PDS. The Rosetta archiving process, supporting tools, archiving systems, and their evolution throughout the mission are described, along with a discussion of a number of the challenges faced during the Rosetta implementation. The paper then presents the current status of the archive for each of the science instruments, before looking to the improvements planned both for the archive itself and for the Rosetta data content. The lessons learned from the first 13 years of archiving on Rosetta are finally discussed with an aim to help future missions plan and implement their science archives.

  15. Characterizing Space Environments with Long-Term Space Plasma Archive Resources

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.

    2009-01-01

    A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as in the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analyses of large data sets are important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.

  16. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Comet Halley (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  17. Decadal changes in shortwave irradiance at the surface in the period from 1960 to 2000 estimated from Global Energy Balance Archive Data

    NASA Astrophysics Data System (ADS)

    Gilgen, H.; Roesch, A.; Wild, M.; Ohmura, A.

    2009-05-01

    Decadal changes in shortwave irradiance at the Earth's surface are estimated for the period from approximately 1960 through to 2000 from pyranometer records stored in the Global Energy Balance Archive. For this observational period, estimates could be calculated for a total of 140 cells of the International Satellite Cloud Climatology Project grid (an equal area 2.5° × 2.5° grid at the equator) using regression models allowing for station effects. In large regions worldwide, shortwave irradiance decreases in the first half of the observational period, recovers from the decrease in the 1980s, and thereafter increases, in line with previous reports. Years of trend reversals are determined for the grid cells which are best described with a second-order polynomial model. This reversal of the trend is observed in the majority of the grid cells in the interior of Europe and in Japan. In China, shortwave irradiance recovers during the 1990s in the majority of the grid cells in the southeast and northeast from the decrease observed in the period from 1960 through to 1990. A reversal of the trend in the 1980s or early 1990s is also observed for two grid cells in North America, and for the grid cells containing the Kuala Lumpur (Malaysia), Singapore, Casablanca (Morocco), Valparaiso (Chile) sites, and, noticeably, the remote South Pole and American Samoa sites. Negative trends persist, i.e., shortwave radiation decreases, for the observational period 1960 through to 2000 at the European coasts, in central and northwest China, and for three grid cells in India and two in Africa.
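    A minimal sketch of the trend-reversal idea behind the second-order polynomial fits above: for a quadratic trend, the year of reversal is the parabola's vertex, -b/(2a). The study fits regression models with station effects to full pyranometer records; the exact three-point fit and the numbers below are purely illustrative:

```python
def vertex_year(y1, v1, y2, v2, y3, v3):
    """Year at which a quadratic trend v = a*y**2 + b*y + c reverses,
    given three exact samples: returns the vertex -b / (2a)."""
    slope12 = (v2 - v1) / (y2 - y1)
    slope13 = (v3 - v1) / (y3 - y1)
    a = (slope13 - slope12) / (y3 - y2)  # second divided difference
    b = slope12 - a * (y1 + y2)
    return -b / (2.0 * a)

# Toy irradiance anomaly (W/m^2, invented) falling before and
# recovering after a mid-1980s minimum:
year = vertex_year(1960, 6.25, 1980, 0.25, 2000, 2.25)  # close to 1985
```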

  18. Modelling the flaring activity of the high-z, hard X-ray-selected blazar IGR J22517+2217: Flaring activity of IGR J22517+2217

    DOE PAGES

    Lanzuisi, G.; De Rosa, A.; Ghisellini, G.; ...

    2012-03-21

    We present new Suzaku and Fermi data and re-analysed archival hard X-ray data from the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) and Swift–Burst Alert Telescope (BAT) surveys to investigate the physical properties of the luminous, high-redshift, hard X-ray-selected blazar IGR J22517+2217, through the modelling of its broad-band spectral energy distribution (SED) in two different activity states. Through analysis of the new Suzaku data and flux-selected data from the archival hard X-ray observations, we build the source SED in two different states: one for the newly discovered flare that occurred in 2005 and one for the following quiescent period. Both SEDs are strongly dominated by the high-energy hump, which peaks at 10^20–10^22 Hz, is at least two orders of magnitude higher than the low-energy (synchrotron) hump at 10^11–10^14 Hz, and varies by a factor of 10 between the two states. In both states the high-energy hump is modelled as inverse Compton emission between relativistic electrons and seed photons produced externally to the jet, while the synchrotron self-Compton component is found to be negligible. In our model the observed variability can be accounted for by a variation of the total number of emitting electrons and by a dissipation region radius changing from inside to outside the broad-line region as the luminosity increases. In its flaring activity, IGR J22517+2217 is revealed as one of the most powerful jets among the population of extreme, hard X-ray-selected, high-redshift blazars observed so far.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    Archive of ArcGIS data from the West Flank FORGE site located in Coso, California. The archive contains the following eight shapefiles: polygon of the 3D geologic model (WestFlank3DGeologicModelExtent); polylines of the traces of the 3D modeled faults (WestFlank3DModeledFaultTraces); polylines of the fault traces from Duffield and Bacon, 1980 (WestFlankFaultsfromDuffieldandBacon); polygon of the West Flank FORGE site (WestFlankFORGEsite); polylines of the traces of the geologic cross-sections (cross-sections in a separate archive in the GDR) (WestFlankGeologicCrossSections); polylines of the traces of the seismic reflection profiles through and adjacent to the West Flank site (seismic reflection profiles in a separate archive in the GDR) (WestFlankSiesmicReflectionProfiles); points of the well collars in and around the West Flank site (WestFlankWellCollars); and polylines of the surface expression of the West Flank well paths (WestFlankWellPaths).

  20. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  1. Lessons Learned While Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Technical Reports Server (NTRS)

    Pilone, Dan; Mclaughlin, Brett; Plofchan, Peter

    2017-01-01

    NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. NASA's Earth Observing System Data and Information System (EOSDIS) is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 6,000 data products across a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of high-impact NASA data spanning its multi-petabyte-scale archive of Earth science data products. Reviewed and approved by Chris Lynnes.

  2. Modeling the plasmasphere based on LEO satellites onboard GPS measurements

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Yao, Yibin; Li, Qinzheng; Yao, Wanqiang

    2017-01-01

    The plasmasphere, which is located above the ionosphere, is a significant component of Earth's atmosphere. A global plasmaspheric model was constructed using the total electron content (TEC) along the signal propagation path calculated using onboard Global Positioning System observations from the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) and MetOp-A, provided by the COSMIC Data Analysis and Archive Center (CDAAC). First, the global plasmaspheric model was established using only COSMIC TEC, and a set of MetOp-A TEC provided by CDAAC served for external evaluation. Results indicated that the established model using only COSMIC data is highly accurate. Then, COSMIC and MetOp-A TEC were combined to produce a new global plasmaspheric model. Finally, the variational characteristics of global plasmaspheric electron content with latitude, local time, and season were investigated using the global plasmaspheric model established in this paper.
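    The core of such an empirical model is a mapping from observation coordinates (e.g., latitude and local time) to an average TEC value. As a minimal illustration only, the sketch below bins TEC samples by latitude and local time and averages each bin; the paper's actual plasmaspheric model uses a far richer fit, and the bin sizes and function names here are hypothetical.

    ```python
    from collections import defaultdict

    def build_tec_climatology(samples, lat_step=10, lt_step=6):
        """Average TEC samples into (latitude, local-time) bins.

        samples: iterable of (latitude_deg, local_time_h, tec) tuples.
        Returns {(lat_bin, lt_bin): mean_tec} -- a toy stand-in for an
        empirical plasmaspheric model built from onboard GPS TEC.
        """
        sums = defaultdict(lambda: [0.0, 0])
        for lat, lt, tec in samples:
            key = (int(lat // lat_step) * lat_step,
                   int(lt // lt_step) * lt_step)
            sums[key][0] += tec
            sums[key][1] += 1
        return {k: s / n for k, (s, n) in sums.items()}

    def predict_tec(model, lat, lt, lat_step=10, lt_step=6):
        """Look up the modeled TEC for a latitude / local-time pair."""
        key = (int(lat // lat_step) * lat_step,
               int(lt // lt_step) * lt_step)
        return model.get(key)
    ```

    The same binned structure is what makes it possible to examine variation with latitude, local time, and season, as the abstract describes.
    
    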

  3. New Developments in NOAA's Comprehensive Large Array-Data Stewardship System

    NASA Astrophysics Data System (ADS)

    Ritchey, N. A.; Morris, J. S.; Carter, D. J.

    2012-12-01

    The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation that gives focus to the building and sustaining of key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing and providing access to a host of near real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October 2011, CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond just remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring various disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), National Geophysical Data Center (NGDC) and the National Oceanographic Data Center (NODC). In parallel with this data migration, CLASS is evolving to a service-oriented architecture utilizing cloud technologies for dissemination in addition to clearly defined interfaces that allow better collaboration with partners. This evolution will require implementation of standard access protocols and metadata, which will lead to cost effective data and information preservation.

  4. VizieR Online Data Catalog: Carbon-enhanced metal-poor star BD+44493 EWs (Roederer+, 2016)

    NASA Astrophysics Data System (ADS)

    Roederer, I. U.; Placco, V. M.; Beers, T. C.

    2016-08-01

    We have obtained new observations of portions of the UV spectrum of BD+44493 using the Cosmic Origins Spectrograph (COS) on HST (13000

  5. The Rosetta Science Archive: Status and Plans for Enhancing the Archive Content

    NASA Astrophysics Data System (ADS)

    Heather, David; Barthelemy, Maud; Besse, Sebastien; Fraga, Diego; Grotheer, Emmanuel; O'Rourke, Laurence; Taylor, Matthew; Vallat, Claire

    2017-04-01

    On 30 September 2016, Rosetta completed its incredible mission by landing on the surface of Comet 67P/Churyumov-Gerasimenko. Although this marked an end to the spacecraft's active operations, intensive work is still ongoing with instrument teams preparing their final science data deliveries for ingestion into ESA's Planetary Science Archive (PSA). In addition, ESA is establishing contracts with some instrument teams to enhance their data and documentation in an effort to provide the best long-term archive possible for the Rosetta mission. Currently, the majority of teams have delivered all of their data from the nominal mission (end of 2015), and are working on their remaining increments from the 1-year mission extension. The aim is to complete the nominal archiving with data from the complete mission by the end of this year, when a full mission archive review will be held. This review will assess the complete data holdings from Rosetta and ensure that the archive is ready for the long-term. With the resources from the operational mission coming to an end, ESA has established a number of 'enhanced archiving' contracts to ensure that the best possible data are delivered to the archive before instrument teams disband. Updates are focused on key aspects of an instrument's calibration or the production of higher level data / information, and are therefore specific to each instrument's needs. These contracts are currently being kicked off, and will run for various lengths depending upon the activities to be undertaken. The full 'archive enhancement' process will run until September 2019, when the post operations activities for Rosetta will end. Within these contracts, most instrument teams will work on providing a Science User Guide for their data, as well as updating calibrations. Several teams will also be delivering higher level and derived products. 
For example, the VIRTIS team will be updating both their spectral and geometrical calibrations, and will aim to deliver mapping products to the final archive. Similarly, the OSIRIS team will be improving their calibrations and delivering data additionally in FITS format. The Rosetta Plasma Consortium (RPC) instruments will complete cross-calibrations and a number of activities individual to each instrument. The MIDAS team will also be working on cross-calibrations and will produce a dust particle catalog from the comet coma. GIADA will be producing dust environment maps, with products in 3D plus time. A contract also exists to produce and deliver data set(s) containing supporting ground-based observations from amateur astronomers. In addition to these contracts, the Rosetta ESA archiving team will produce calibrated data sets for the NAVCAM instrument, and will work to include the latest shape models from the comet into the final Rosetta archive. Work is also underway to provide a centralized solution to the problem of geometry on the comet. This presentation will outline the current status of the Rosetta archive, as well as highlighting some of the 'enhanced archiving' activities planned with the various instrument teams on Rosetta.

  6. PULSAR OBSERVATIONS USING THE FIRST STATION OF THE LONG WAVELENGTH ARRAY AND THE LWA PULSAR DATA ARCHIVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stovall, K.; Dowell, J.; Eftekhari, T.

    2015-08-01

    We present initial pulsar results from the first station of the Long Wavelength Array (LWA1) obtained during the commissioning period of LWA1 and during early science observations. We present detections of periodic emission from 44 previously known pulsars, including 3 millisecond pulsars. The effects of the interstellar medium (ISM) on pulsar emission are significantly enhanced at the low frequencies of the LWA1 band (10–88 MHz), making LWA1 a very sensitive instrument for characterizing changes in the dispersion measure (DM) and other effects from the ISM. Pulsars also often have significant evolution in their pulse profile at low frequency and a break in their spectral index. We report DM measurements for 44 pulsars, mean flux density measurements for 36 pulsars, and multi-frequency component spacing and widths for 15 pulsars with more than one profile component. For 27 pulsars, we report spectral index measurements within our frequency range. We also introduce the LWA1 Pulsar Data Archive, which stores reduced data products from LWA1 pulsar observations. Reduced data products for the observations presented here can be found in the archive. Reduced data products from future LWA1 pulsar observations will also be made available through the archive.

  7. First Light for ASTROVIRTEL Project

    NASA Astrophysics Data System (ADS)

    2000-04-01

    Astronomical data archives increasingly resemble virtual gold mines of information. A new project, known as ASTROVIRTEL or "Accessing Astronomical Archives as Virtual Telescopes", aims to exploit these astronomical treasure troves by allowing scientists to use the archives as virtual telescopes. The competition for observing time on large space- and ground-based observatories such as the ESA/NASA Hubble Space Telescope and the ESO Very Large Telescope (VLT) is intense. On average, less than a quarter of applications for observing time are successful. The fortunate scientist who obtains observing time usually has one year of so-called proprietary time to work with the data before they are made publicly accessible and can be used by other astronomers. Precious data from these large research facilities retain their value far beyond their first birthday and may still be useful decades after they were first collected. The enormous quantity of valuable astronomical data now stored in the archives of the European Southern Observatory (ESO) and the Space Telescope-European Coordinating Facility (ST-ECF) is increasingly attracting the attention of astronomers. Scientists are aware that one set of observations can serve many different scientific purposes, including some that were not considered at all when the observations were first made. ASTROVIRTEL is supported by the European Commission (EC) within the "Access to Research Infrastructures" action under the "Improving Human Potential & the Socio-economic Knowledge Base" programme of the EC's Fifth Framework Programme.
ASTROVIRTEL has been established on behalf of the European Space Agency (ESA) and the European Southern Observatory (ESO) in response to rapid developments currently taking place in the fields of telescope and detector construction, computer hardware, data processing, archiving, and telescope operation. Nowadays astronomical telescopes can image increasingly large areas of the sky. They use more and more different instruments and are equipped with ever-larger detectors. The quantity of astronomical data collected is rising dramatically, generating a corresponding increase in potentially interesting research projects. These large collections of valuable data have led to the useful concept of "data mining", whereby large astronomical databases are exploited to support original research. However, it has become obvious that scientists need additional support to cope efficiently with the massive amounts of data available and so to exploit the true potential of the databases. The strengths of ASTROVIRTEL ASTROVIRTEL is the first virtual astronomical telescope dedicated to data mining. It is currently being established at the joint ESO/Space Telescope-European Coordinating Facility Archive in Garching (Germany). Scientists from EC member countries and associated states will be able to apply for support for a scientific project based on access to and analysis of data from the Hubble Space Telescope (HST), Very Large Telescope (VLT), New Technology Telescope (NTT), and Wide Field Imager (WFI) archives, as well as a number of other related archives, including the Infrared Space Observatory (ISO) archive. Scientists will be able to visit the archive site and collaborate with the archive specialists there. Special software tools that incorporate advanced methods for exploring the enormous quantities of information available will be developed. 
Statements The project co-ordinator, Piero Benvenuti , Head of ST-ECF, elaborates on the advantages of ASTROVIRTEL: "The observations by the ESA/NASA Hubble Space Telescope and, more recently, by the ESO Very Large Telescope, have already been made available on-line to the astronomical community, once the proprietary period of one year has elapsed. ASTROVIRTEL is different, in that astronomers are now invited to regard the archive as an "observatory" in its own right: a facility that, when properly used, may provide an answer to their specific scientific questions. The architecture of the archives as well as their suite of software tools may have to evolve to respond to the new demand. ASTROVIRTEL will try to drive this evolution on the basis of the scientific needs of its users." Peter Quinn , the Head of ESO's Data Management and Operations Division, is of the same opinion: "The ESO/HST Archive Facility at ESO Headquarters in Garching is currently the most rapidly growing astronomical archive resource in the world. This archive is projected to contain more than 100 Terabytes (100,000,000,000,000 bytes) of data within the next four years. The software and hardware technologies for the archive will be jointly developed and operated by ESA and ESO staff and will be common to both HST and ESO data archives. The ASTROVIRTEL project will provide us with real examples of scientific research programs that will push the capabilities of the archive and allow us to identify and develop new software tools for data mining. The growing archive facility will provide the European astronomical community with new digital windows on the Universe." Note [1] This is a joint Press Release by the European Southern Observatory (ESO) and the Space Telescope European Coordinating Facility (ST-ECF). 
Additional information More information about ASTROVIRTEL can be found at the dedicated website at: http://www.stecf.org/astrovirtel The European Southern Observatory (ESO) is an intergovernmental organisation, supported by eight European countries: Belgium, Denmark, France, Germany, Italy, The Netherlands, Sweden and Switzerland. The European Space Agency is an intergovernmental organisation supported by 15 European countries: Austria, Belgium, Denmark, Finland, France, Germany, Ireland, Italy, Netherlands, Norway, Portugal, Spain, Sweden, Switzerland and the United Kingdom. The Space Telescope European Coordinating Facility (ST-ECF) is a co-operation between the European Space Agency and the European Southern Observatory. The Hubble Space Telescope (HST) is a project of international co-operation between NASA and ESA.

  8. Assessing anthropogenic impact on boreal lakes with historical fish species distribution data and hydrogeochemical modeling.

    PubMed

    Valinia, Salar; Englund, Göran; Moldan, Filip; Futter, Martyn N; Köhler, Stephan J; Bishop, Kevin; Fölster, Jens

    2014-09-01

    Quantifying the effects of human activity on the natural environment is dependent on credible estimates of reference conditions to define the state of the environment before the onset of adverse human impacts. In Europe, emission controls that aimed at restoring ecological status were based on hindcasts from process-based models or paleolimnological reconstructions. For instance, 1860 is used in Europe as the target for restoration from acidification concerning biological and chemical parameters. A more practical problem is that the historical states of ecosystems and their function cannot be observed directly. Therefore, we (i) compare estimates of acidification based on long-term observations of roach (Rutilus rutilus) populations with hindcast pH from the hydrogeochemical model MAGIC; (ii) discuss policy implications and possible scope for use of long-term archival data for assessing human impacts on the natural environment and (iii) present a novel conceptual model for interpreting the importance of physico-chemical and ecological deviations from reference conditions. Of the 85 lakes studied, 78 were coherently classified by both methods. In 1980, 28 lakes were classified as acidified with the MAGIC model; however, roach was present in 14 of these. In 2010, MAGIC predicted chemical recovery in 50% of the lakes; however, roach recolonized only five lakes after 1990, showing a lag between chemical and biological recovery. Our study is the first of its kind to use long-term archival biological data in concert with hydrogeochemical modeling for regional assessments of anthropogenic acidification. Based on our results, we show how the conceptual model can be used to understand and prioritize management of physico-chemical and ecological effects of anthropogenic stressors on surface water quality. © 2014 The Authors Global Change Biology Published by John Wiley & Sons Ltd.

  9. Assessing anthropogenic impact on boreal lakes with historical fish species distribution data and hydrogeochemical modeling

    PubMed Central

    Valinia, Salar; Englund, Göran; Moldan, Filip; Futter, Martyn N; Köhler, Stephan J; Bishop, Kevin; Fölster, Jens

    2014-01-01

    Quantifying the effects of human activity on the natural environment is dependent on credible estimates of reference conditions to define the state of the environment before the onset of adverse human impacts. In Europe, emission controls that aimed at restoring ecological status were based on hindcasts from process-based models or paleolimnological reconstructions. For instance, 1860 is used in Europe as the target for restoration from acidification concerning biological and chemical parameters. A more practical problem is that the historical states of ecosystems and their function cannot be observed directly. Therefore, we (i) compare estimates of acidification based on long-term observations of roach (Rutilus rutilus) populations with hindcast pH from the hydrogeochemical model MAGIC; (ii) discuss policy implications and possible scope for use of long-term archival data for assessing human impacts on the natural environment and (iii) present a novel conceptual model for interpreting the importance of physico-chemical and ecological deviations from reference conditions. Of the 85 lakes studied, 78 were coherently classified by both methods. In 1980, 28 lakes were classified as acidified with the MAGIC model; however, roach was present in 14 of these. In 2010, MAGIC predicted chemical recovery in 50% of the lakes; however, roach recolonized only five lakes after 1990, showing a lag between chemical and biological recovery. Our study is the first of its kind to use long-term archival biological data in concert with hydrogeochemical modeling for regional assessments of anthropogenic acidification. Based on our results, we show how the conceptual model can be used to understand and prioritize management of physico-chemical and ecological effects of anthropogenic stressors on surface water quality. PMID:24535943

  10. Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop.

    PubMed

    Sali, Andrej; Berman, Helen M; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M J J; Chiu, Wah; Peraro, Matteo Dal; Di Maio, Frank; Ferrin, Thomas E; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L; Meiler, Jens; Marti-Renom, Marc A; Montelione, Gaetano T; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J; Saibil, Helen; Schröder, Gunnar F; Schwieters, Charles D; Seidel, Claus A M; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L; Velankar, Sameer; Westbrook, John D

    2015-07-07

    Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, on October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Comparison of land-surface humidity between observations and CMIP5 models

    NASA Astrophysics Data System (ADS)

    Dunn, Robert; Willett, Kate; Ciavarella, Andrew; Stott, Peter; Jones, Gareth

    2017-04-01

    We compare the latest observational land-surface humidity dataset, HadISDH, with the CMIP5 model archive spatially and temporally over the period 1973-2015. None of the CMIP5 models or experiments capture the observed temporal behaviour of the globally averaged relative or specific humidity over the entire study period. When using an atmosphere-only model, driven by observed sea-surface temperatures and radiative forcing changes, the behaviour of regional average temperature and specific humidity are better captured, but there is little improvement in the relative humidity. Comparing the observed and historical model climatologies shows that the models are generally cooler everywhere, are drier and less saturated in the tropics and extra tropics, and have comparable moisture levels but are more saturated in the high latitudes. The spatial patterns of linear trends are relatively similar between the models and HadISDH for temperature and specific humidity, but there are large differences for relative humidity, with less moistening shown in the models over the Tropics, and very little at high latitudes. The observed temporal behaviour appears to be a robust climate feature rather than observational error. It has been previously documented and is theoretically consistent with faster warming rates over land compared to oceans. Thus, the poor replication in the models, especially in the atmosphere-only model, leads to questions over future projections of impacts related to changes in surface relative humidity.

  12. Heritage Quay: What Will You Discover? Transforming the Archives of the University of Huddersfield, Yorkshire, UK

    ERIC Educational Resources Information Center

    Wickham, M. Sarah

    2015-01-01

    The University of Huddersfield presents a key case study of the transformation of its Archives Service, using the newly-developed Staff/Space/Collections dependency model for archives and the lessons of the UK's Customer Service Excellence (CSE) scheme in order to examine and illustrate service development. Heritage Lottery Fund (HLF) and…

  13. The Semantic Mapping of Archival Metadata to the CIDOC CRM Ontology

    ERIC Educational Resources Information Center

    Bountouri, Lina; Gergatsoulis, Manolis

    2011-01-01

    In this article we analyze the main semantics of archival description, expressed through Encoded Archival Description (EAD). Our main target is to map the semantics of EAD to the CIDOC Conceptual Reference Model (CIDOC CRM) ontology as part of a wider integration architecture of cultural heritage metadata. Through this analysis, it is concluded…

  14. Will Today's Electronic Journals Be Accessible in the 23rd Century: Issues in Long-Term Archiving (SIG STI, IFP)

    ERIC Educational Resources Information Center

    Lippert, Margaret

    2000-01-01

    This abstract of a planned session on access to scientific and technical journals addresses policy and standard issues related to long-term archives; digital archiving models; economic factors; hardware and software issues; multi-publisher electronic journal content integration; format considerations; and future data migration needs. (LRW)

  15. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
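    The look-up idea behind this method can be sketched simply: precomputed inshore results are indexed by offshore scenario parameters, and a continuous offshore record is converted to an inshore time-series by selecting the nearest archived scenario at each time step. The sketch below is a minimal illustration under that assumption; the scenario parameters and function names are hypothetical, and the USGS method additionally carries probabilistic confidence limits derived from variability within each scenario.

    ```python
    def nearest_scenario(archive, offshore_hs, offshore_tp):
        """Return the archived scenario closest to the offshore conditions.

        archive: {(hs, tp): inshore_hs}, mapping offshore scenario
        parameters (significant wave height, peak period) to the
        precomputed inshore wave height at one model grid point.
        """
        return min(archive, key=lambda s: (s[0] - offshore_hs) ** 2
                                          + (s[1] - offshore_tp) ** 2)

    def construct_timeseries(archive, offshore_record):
        """Build an inshore time-series from an offshore record by
        scenario lookup, with no new numerical wave simulations."""
        return [archive[nearest_scenario(archive, hs, tp)]
                for hs, tp in offshore_record]
    ```

    Because each time step is a table lookup rather than a model run, the computational cost stays low regardless of the length of the reconstructed record.
    
    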

  16. A case Study of Applying Object-Relational Persistence in Astronomy Data Archiving

    NASA Astrophysics Data System (ADS)

    Yao, S. S.; Hiriart, R.; Barg, I.; Warner, P.; Gasson, D.

    2005-12-01

    The NOAO Science Archive (NSA) team is developing a comprehensive domain model to capture the science data in the archive. Java and an object model derived from the domain model well address the application layer of the archive system. However, since RDBMS is the best proven technology for data management, the challenge is the paradigm mismatch between the object and the relational models. Transparent object-relational mapping (ORM) persistence is a successful solution to this challenge. In the data modeling and persistence implementation of NSA, we are using Hibernate, a well-accepted ORM tool, to bridge the object model in the business tier and the relational model in the database tier. Thus, the database is isolated from the Java application. The application queries directly on objects using a DBMS-independent object-oriented query API, which frees the application developers from the low-level JDBC and SQL so that they can focus on the domain logic. We present the detailed design of the NSA R3 (Release 3) data model and object-relational persistence, including mapping, retrieving and caching. Persistence layer optimization and performance tuning will be analyzed. The system is being built on J2EE, so the integration of Hibernate into the EJB container and the transaction management are also explored.
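    The object-relational mismatch the abstract describes is easiest to see in the mapping plumbing itself. Hibernate generates this plumbing from declarative Java mappings; the sketch below writes a tiny mapper out by hand in Python with sqlite3, purely to illustrate the idea (the `Observation` class and its fields are hypothetical, not NSA's actual domain model).

    ```python
    import sqlite3

    class Observation:
        """Toy domain object standing in for an archive record."""
        def __init__(self, obs_id, target, exposure_s):
            self.obs_id = obs_id
            self.target = target
            self.exposure_s = exposure_s

    class ObservationMapper:
        """Maps Observation objects to a relational table and back.

        An ORM tool like Hibernate derives this code from declarative
        mappings, isolating the application from SQL; here the
        translation between the two paradigms is written by hand.
        """
        def __init__(self, conn):
            self.conn = conn
            conn.execute("CREATE TABLE IF NOT EXISTS observation "
                         "(obs_id TEXT PRIMARY KEY, target TEXT, exposure_s REAL)")

        def save(self, obs):
            self.conn.execute(
                "INSERT OR REPLACE INTO observation VALUES (?, ?, ?)",
                (obs.obs_id, obs.target, obs.exposure_s))

        def find(self, obs_id):
            row = self.conn.execute(
                "SELECT obs_id, target, exposure_s FROM observation "
                "WHERE obs_id = ?", (obs_id,)).fetchone()
            return Observation(*row) if row else None
    ```

    Application code then works only with `Observation` objects, which is the isolation the NSA design obtains from Hibernate's object-oriented query API.
    
    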

  17. The PDS4 Information Model and its Role in Agile Science Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D.

    2017-12-01

    PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving Planetary Science Data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international Planetary Science community.
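    The idea of machine-parsable definitions configuring software can be shown generically: attribute definitions drawn from a model double as the configuration for a validator. The sketch below is an illustration only; the attribute names are hypothetical, and the real PDS4 IM is published as XML Schema and Schematron rules rather than Python structures.

    ```python
    # Hypothetical, simplified attribute definitions; the real PDS4
    # Information Model expresses these constraints in XML Schema.
    MODEL = {
        "logical_identifier": {"type": str, "required": True},
        "version_id": {"type": str, "required": True},
        "exposure_duration": {"type": float, "required": False},
    }

    def validate_label(label, model=MODEL):
        """Check a product label (a dict) against the model's
        constraints and return a list of violations. The same model
        definitions could equally drive code generation or UI forms."""
        errors = []
        for name, rule in model.items():
            if name not in label:
                if rule["required"]:
                    errors.append(f"missing required attribute: {name}")
            elif not isinstance(label[name], rule["type"]):
                errors.append(f"wrong type for {name}")
        return errors
    ```

    Because the validator reads the model rather than hard-coding rules, updating the model updates the tooling, which is the "model-driven" property the abstract emphasizes.
    
    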

  18. Evaluation of simulated ocean carbon in the CMIP5 earth system models

    NASA Astrophysics Data System (ADS)

    Orr, James; Brockmann, Patrick; Seferian, Roland; Servonnat, Jérôme; Bopp, Laurent

    2013-04-01

    We maintain a centralized model output archive containing output from the previous generation of Earth System Models (ESMs), 7 models used in the IPCC AR4 assessment. Output is in a common format located on a centralized server and is publicly available through a web interface. Through the same interface, LSCE/IPSL has also made available output from the Coupled Model Intercomparison Project (CMIP5), the foundation for the ongoing IPCC AR5 assessment. The latter includes ocean biogeochemical fields from more than 13 ESMs. Modeling partners across 3 EU projects refer to the combined AR4-AR5 archive and comparison as OCMIP5, building on previous phases of OCMIP (Ocean Carbon Cycle Intercomparison Project) and making a clear link to IPCC AR5 (CMIP5). While now focusing on assessing the latest generation of results (AR5, CMIP5), this effort is also able to put them in context (AR4). For model comparison and evaluation, we have also stored computed derived variables (e.g., those needed to assess ocean acidification) and key fields regridded to a common 1°x1° grid, thus complementing the standard CMIP5 archive. The combined AR4-AR5 output (OCMIP5) has been used to compute standard quantitative metrics, both global and regional, and those have been synthesized with summary diagrams. In addition, for key biogeochemical fields we have deconvolved spatiotemporal components of the mean square error in order to constrain which models go wrong where. Here we will detail results from these evaluations which have exploited gridded climatological data. The archive, interface, and centralized evaluation provide a solid technical foundation, upon which collaboration and communication are being broadened in the ocean biogeochemical modeling community. Ultimately we aim to encourage wider use of the OCMIP5 archive.
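    The abstract does not specify its error deconvolution, but the first step in any such attribution is typically splitting the mean-square error into a systematic (bias) part and a pattern part. As an illustration of that standard split only (not the paper's actual method):

    ```python
    def mse_decomposition(model, obs):
        """Split mean-square error between a model field and
        observations into squared bias plus the variance of the
        mean-removed error, so that a uniform offset can be
        distinguished from a misplaced spatial pattern."""
        n = len(model)
        bias = sum(m - o for m, o in zip(model, obs)) / n
        centered = sum((m - o - bias) ** 2
                       for m, o in zip(model, obs)) / n
        return bias ** 2, centered
    ```

    The two terms sum to the total MSE, so a model with a large bias term "goes wrong" differently from one with a large pattern term, even at equal total error.
    
    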

  19. Implementation of an ASP model offsite backup archive for clinical images utilizing Internet 2

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Chao, Sander S.; Documet, Jorge; Lee, Jasper; Lee, Michael; Topic, Ian; Williams, Lanita

    2005-04-01

    With the development of PACS technology and an increasing demand by medical facilities to become filmless, there is a need for a fast and efficient method of providing data backup for disaster recovery and downtime scenarios. At the Image Processing Informatics Lab (IPI), an ASP Backup Archive was developed using a fault-tolerant server with a T1 connection to serve the PACS at the St. John's Health Center (SJHC) Santa Monica, California. The ASP archive server has been in clinical operation for more than 18 months, and its performance was presented at this SPIE Conference last year. This paper extends the ASP Backup Archive to serve the PACS at the USC Healthcare Consultation Center II (HCC2) utilizing an Internet2 connection. HCC2 is a new outpatient facility that recently opened in April 2004. The Internet2 connectivity between USC's HCC2 and IPI has been established for over one year. There are two novelties of the current ASP model: 1) Use of Internet2 for daily clinical operation, and 2) Modifying the existing backup archive to handle two sites in the ASP model. This paper presents the evaluation of the ASP Backup Archive based on the following two criteria: 1) Reliability and performance of the Internet2 connection between HCC2 and IPI using DICOM image transfer in a clinical environment, and 2) Ability of the ASP Fault-Tolerant backup archive to support two separate clinical PACS sites simultaneously. The performances of using T1 and Internet2 at the two different sites are also compared.

  20. News from the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Dobrzycki, A.; Arnaboldi, M.; Bierwirth, T.; Boelter, M.; Da Rocha, C.; Delmotte, N.; Forchì, V.; Fourniol, N.; klein Gebbinck, M.; Lange, U.; Mascetti, L.; Micol, A.; Moins, C.; Munte, C.; Pluciennik, C.; Retzlaff, J.; Romaniello, M.; Rosse, N.; Sequeiros, I. V.; Vuong, M.-H.; Zampieri, S.

    2015-09-01

    ESO Science Archive Facility (SAF) - one of the world's biggest astronomical archives - combines two roles: operational (ingest, tallying, safekeeping and distribution to observers of raw data taken with ESO telescopes and processed data generated both internally and externally) and scientific (publication and delivery of all flavours of data to external users). This paper presents the “State of the SAF.” SAF, as a living entity, is constantly implementing new services and upgrading the existing ones. We present recent and future developments related to the Archive's Request Handler and metadata handling as well as performance and usage statistics and trends. We also discuss the current and future datasets on offer at SAF.

  1. Data catalog for JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC)

    NASA Technical Reports Server (NTRS)

    Digby, Susan

    1995-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory contains satellite data sets and ancillary in-situ data for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Geophysical parameters available from the archive include sea-surface height, surface-wind vector, surface-wind speed, surface-wind stress vector, sea-surface temperature, atmospheric liquid water, integrated water vapor, phytoplankton pigment concentration, heat flux, and in-situ data. PO.DAAC is an element of the Earth Observing System Data and Information System and is the United States distribution site for TOPEX/POSEIDON data and metadata.

  2. Asteroseismology and the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Suárez, J. C.

    2010-12-01

    The Virtual Observatory (VO) is an international project that aims to solve the problem of interoperability among astronomical archives and the scalability limits of classical methods for retrieving and analyzing astronomical data, in order to deal with huge datasets. This is being tackled through the standardization of astronomical archives, enabling efficient access to them. The project, which is nowadays a reality, is being adopted by more and more fields of science. In this paper I describe the origin of a new era in stellar physics centered on the relationship between asteroseismology and the VO. I summarize the main concerns of both fields and the current development of VO tools for what we might call asteroseismology online, which involves not only observed datasets but also the management of model databases.

  3. THE EFFECT OF CLOUD FRACTION ON THE RADIATIVE ENERGY BUDGET: The Satellite-Based GEWEX-SRB Data vs. the Ground-Based BSRN Measurements

    NASA Astrophysics Data System (ADS)

    Zhang, T.; Stackhouse, P. W.; Gupta, S. K.; Cox, S. J.; Mikovitz, J. C.; Nasa Gewex Srb

    2011-12-01

    The NASA GEWEX-SRB (Global Energy and Water cycle Experiment - Surface Radiation Budget) project produces and archives shortwave and longwave atmospheric radiation data at the top of the atmosphere (TOA) and the Earth's surface. The archive holds uninterrupted records of shortwave/longwave downward/upward radiative fluxes at 1 degree by 1 degree resolution for the entire globe. The latest version in the archive, Release 3.0, is available as 3-hourly, daily and monthly means, spanning 24.5 years from July 1983 to December 2007. Primary inputs to the models used to produce the data include: shortwave and longwave radiances from International Satellite Cloud Climatology Project (ISCCP) pixel-level (DX) data, cloud and surface properties derived therefrom, temperature and moisture profiles from the GEOS-4 reanalysis product obtained from the NASA Global Modeling and Assimilation Office (GMAO), and column ozone amounts constituted from Total Ozone Mapping Spectrometer (TOMS) and TIROS Operational Vertical Sounder (TOVS) archives and the Stratospheric Monitoring-group's Ozone Blended Analysis (SMOBA), an assimilation product from NOAA's Climate Prediction Center. The data in the archive have been validated systematically against ground-based measurements, including the Baseline Surface Radiation Network (BSRN) data, the World Radiation Data Centre (WRDC) data, and the Global Energy Balance Archive (GEBA) data, and generally good agreement has been achieved. In addition to all-sky radiative fluxes, the output data include clear-sky fluxes, cloud optical depth, cloud fraction and so on. The BSRN archive also includes observations that can be used to derive the cloud fraction, which provides a means for analyzing and explaining the SRB-BSRN flux differences. In this paper, we focus on the effect of cloud fraction on the surface shortwave flux and the level of agreement between the satellite-based SRB data and the ground-based BSRN data.
    The satellite and ground-based instruments employ different measurement methodologies, and their data therefore represent means over dramatically different spatial scales. The satellite-based and ground-based measurements are thus not expected to agree all the time, especially under cloudy skies. The flux comparisons are made under different cloud fractions, and it is found that the SRB-BSRN radiative flux discrepancies can be explained to a certain extent by the SRB-BSRN cloud fraction discrepancies. Cloud fraction alone, however, cannot completely define the role of clouds in radiative transfer. Further studies need to incorporate the classification of cloud types, altitudes, cloud optical depths, and so on.
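    The comparison described above — flux differences conditioned on cloud fraction — amounts to a simple binning exercise. The sketch below is illustrative numpy code, not the SRB project's software, and the bin edges are arbitrary choices.

```python
import numpy as np

def binned_flux_bias(cloud_frac, srb_flux, bsrn_flux,
                     bins=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Mean SRB-minus-BSRN flux difference within cloud-fraction bins.

    Returns one mean difference per bin (NaN where a bin is empty),
    so the flux discrepancy can be examined as a function of cloudiness.
    """
    diff = np.asarray(srb_flux, float) - np.asarray(bsrn_flux, float)
    cf = np.asarray(cloud_frac, float)
    idx = np.digitize(cf, bins[1:-1])  # bin index (0..len(bins)-2) per sample
    return [diff[idx == k].mean() if np.any(idx == k) else np.nan
            for k in range(len(bins) - 1)]
```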

  4. Filling the gap of existing MWA-VCS archival data

    NASA Astrophysics Data System (ADS)

    Xue, M.; Bhat, R.; Tremblay, S.; Ord, S.; Sobey, C.; Kirsten, F.

    2017-01-01

    Since July 2014, around 110 hours of high time resolution voltage data observed with the MWA Voltage Capture System (VCS; Tremblay et al. 2015) have been archived on tape at the Pawsey Supercomputing Centre. Excluding short-duration test and calibration data, the total amount of observation data with durations longer than 400 s is 84 hours. These data cover a significant portion of the Southern sky and could serve many science purposes, including a census of radio pulsars and searches for fast radio bursts (FRBs). However, there are still some 'holes' in the sky for which no VCS archival data exist yet. We propose a set of 3-hour MWA-VCS drift scan observations to fill some of these 'holes' and help provide a more complete MWA-VCS data set. We will also survey known (catalogued) pulsars in these areas. These observations would be performed between 170 and 200 MHz. This project will form part of the PhD program of Mengyao Xue.

  5. The Comet Halley archive: Summary volume

    NASA Technical Reports Server (NTRS)

    Sekanina, Zdenek (Editor); Fry, Lori (Editor)

    1991-01-01

    The contents are as follows: The Organizational History of the International Halley Watch; Operations of the International Halley Watch from a Lead Center Perspective; The Steering Group; Astrometry Network; Infrared Studies Network; Large-Scale Phenomena Network; Meteor Studies Network; Near-Nucleus Studies Network; Photometry and Polarimetry Network; Radio Science Network; Spectroscopy and Spectrophotometry Network; Amateur Observation Network; Use of the CD-ROM Archive; The 1986 Passage of Comet Halley; and Recent Observations of Comet Halley.

  6. The shape of dark matter haloes - IV. The structure of stellar discs in edge-on galaxies

    NASA Astrophysics Data System (ADS)

    Peters, S. P. C.; de Geyter, G.; van der Kruit, P. C.; Freeman, K. C.

    2017-01-01

    We present optical and near-infrared archival observations of eight edge-on galaxies. These observations are used to model the stellar content of each galaxy with the FITSKIRT software package. FITSKIRT lets us self-consistently model a galaxy in each band simultaneously while accounting for dust. This allows us to measure accurately both the scalelength and scaleheight of the stellar disc, plus the shape parameters of the bulge. By combining these data with the previously reported integrated magnitudes of each galaxy, we can infer their true luminosities. We have successfully modelled seven of the eight galaxies in our sample. We find that stellar discs can be modelled correctly, but we have not been able to model the stellar bulge reliably. Our sample consists mostly of slowly rotating galaxies, and we find that the average dust layer is much thicker than is reported for faster rotating galaxies.

  7. The Water Production Rate of Recent Comets (2013-2014) by SOHO/SWAN: 2P/Encke (2013), C/2013 R1 (Lovejoy), and C/2013 A1 (Siding Spring)

    NASA Astrophysics Data System (ADS)

    Combi, Michael R.; Mäkinen, J. T.; Bertaux, J. L.; Quémerais, Eric; Ferron, Stéphane

    2014-11-01

    The all-sky hydrogen Lyman-alpha camera, SWAN (Solar Wind Anisotropies), on the SOlar and Heliospheric Observatory (SOHO) satellite makes observations of the hydrogen comae of comets. Most water vapor produced by a comet is ultimately photodissociated into two H atoms and one O atom, producing a huge atomic hydrogen coma that is routinely observed in the daily full-sky SWAN images for comets of sufficient brightness. Water production rates are calculated using our time-resolved model (Mäkinen & Combi, 2005, Icarus 177, 217), typically yielding about one observation every 2 days on average. Here we describe progress in the analysis of comets observed during 2013-2014 and of comets selected from the archive for analysis. These include comets 2P/Encke (2013), 45P/Honda-Mrkos-Pajdusakova (2011), C/2013 R1 (Lovejoy), as well as C/2013 A1 (Siding Spring), for which results are expected. A status report on the entire SOHO/SWAN archive of water production rates in comets will be given. SOHO is an international cooperative mission between ESA and NASA. Support from grants NNX11AH50G from the NASA Planetary Astronomy Program and NNX13AQ66G from the NASA Planetary Mission Data Analysis Program is gratefully acknowledged.

  8. VO for Education: Archive Prototype

    NASA Astrophysics Data System (ADS)

    Ramella, M.; Iafrate, G.; De Marco, M.; Molinaro, M.; Knapic, C.; Smareglia, R.; Cepparo, F.

    2014-05-01

    The number of remote-control telescopes dedicated to education is increasing in many countries, leading to a correspondingly growing amount of stored educational data that are usually available only to local observers. Here we present a project for a new infrastructure that will allow teachers using educational telescopes to archive their data and easily publish them within the Virtual Observatory (VO), avoiding the complexity of professional tools. Students and teachers anywhere will be able to access these data, with obvious benefits for the realization of larger-scale collaborative projects. Educational VO data will also be an important resource for teachers without direct access to any educational telescope. We will use the educational telescope at our observatory in Trieste as a prototype for the future VO educational data archive resource. The publishing infrastructure will include user authentication, content and curation validation, data validation and ingestion, and VO-compliant resource generation. All of these steps will be performed by server-side applications accessible through a web graphical user interface (web GUI). Apart from user registration, which will be validated by a person responsible for the archive (after verifying the reliability of the user and inspecting one or more test files), all subsequent steps will be automated. This means that at the very first data submission through the web GUI, a complete resource, including an archive and a published VO service, will be generated, ready to be registered with the VO. The only effort required of a registered user is a self-description at registration and the submission of selected data after each observing session.
    The infrastructure will be file-format independent, and the underlying data model will use a minimal set of standard VO keywords, some specific to outreach and education, possibly including a VO field identification (astronomy, planetary science, solar physics). The published VO resource description will be chosen so as to allow selective access to educational data by VO-aware tools, differentiating them from professional data while treating them with the same procedures, protocols, and tools. The whole system will be flexible and scalable, with the objective of leaving as little work as possible to humans.

  9. The status of the international Halley watch

    NASA Technical Reports Server (NTRS)

    Newburn, Ray L., Jr.; Rahe, Juergen

    1987-01-01

    More than 1000 professional astronomers worldwide actually observed Halley's comet from the ground. Preliminary logs from the observers indicate that 20-40 Gbytes of data were acquired in eight professional disciplines, and as much as 5 Gbytes in the amateur network. The latter will be used to fill gaps in the Archive and to provide a visual light curve. In addition, roughly 400 Mbytes of data were taken on Comet Giacobini-Zinner. Data will be accepted for archiving until early 1989. The permanent archive will consist of a set of CD-ROMs and a set of books, with publication of both to be completed by mid-1990. Data from the space missions will be included, but only on the CDs. From every indication, the ground-based effort and the space missions complemented each other beautifully, both directly in the solution of spacecraft navigation problems and indirectly in the solution of scientific problems. The major remaining concern is that scientists submit their data to the Archive before the 1989 deadline.

  10. VizieR Online Data Catalog: Astro-photometric catalog of the core of NGC 5139 (Bellini+, 2017)

    NASA Astrophysics Data System (ADS)

    Bellini, A.; Anderson, J.; Bedin, L. R.; King, I. R.; van der Marel, R. P.; Piotto, G.; Cool, A.

    2018-01-01

    The core of ω Cen (NGC 5139) has been observed through many of the WFC3 filters since 2009 for calibration purposes, and new observations continue to be scheduled. Table 1 summarizes the massive archive of data, organized by camera and filter. We downloaded from the archive a total of 655 exposures (~205 ks) taken through 26 different HST bands: 18 for WFC3/UVIS (385 exposures) and 8 for WFC3/IR (270 exposures), spanning 2009 July to 2013 March. This data set can be accessed at MAST via this link: http://archive.stsci.edu/doi/resolve/resolve.html?doi=10.17909/T9WG6S (5 data files).

  11. Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive

    NASA Astrophysics Data System (ADS)

    Baker, Scott; Meertens, Charles; Crosby, Christopher

    2017-04-01

    UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The Seamless Synthetic Aperture Radar Archive (SSARA) is a distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observation System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structured data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE, as well as examples for reading and converting the data products into other formats.
Digital object identifiers (DOI) have been incorporated into the InSAR archive allowing users to assign a permanent location for their processed result and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data with particular importance for geohazards and event response.

  12. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 2; Comet Halley

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini- Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  13. The Archive of the Amateur Observation Network of the International Halley Watch. Volume 1; Comet Giacobini-Zinner

    NASA Technical Reports Server (NTRS)

    Edberg, Stephen J. (Editor)

    1996-01-01

    The International Halley Watch (IHW) was organized for the purpose of gathering and archiving the most complete record of the apparition of a comet, Halley's Comet (1982i = 1986 III = 1P/Halley), ever compiled. The redirection of the International Sun-Earth Explorer 3 (ISEE-3) spacecraft, subsequently renamed the International Cometary Explorer (ICE), toward Comet Giacobini-Zinner (1984e = 1985 XIII = 21P/Giacobini-Zinner) prompted the initiation of a formal watch on that comet. All the data collected on P/Giacobini-Zinner and P/Halley have been published on CD-ROM in the Comet Halley Archive. This document contains a printed version of the archive data, collected by amateur astronomers, on these two comets. Volume 1 contains the Comet Giacobini-Zinner data archive and Volume 2 contains the Comet Halley archive. Both volumes include information on how to read the data in both archives, as well as a history of both comet watches (including the organizing of the network of astronomers and lessons learned from that experience).

  14. The New World of 'Big Data' Analytics and High Performance Data: A Paradigm Shift in the Way We Interact with Very Large Earth Observation Datasets (Invited)

    NASA Astrophysics Data System (ADS)

    Purss, M. B.; Lewis, A.; Ip, A.; Evans, B.

    2013-12-01

    The next decade promises an exponential increase in volumes of open data from Earth observing satellites. The ESA Sentinels, the Japan Meteorological Agency's Himawari 8/9 geostationary satellites, various NASA missions, and of course the many EO satellites planned from China will produce petabyte-scale datasets of national and global significance. It is vital that we develop new ways of managing, accessing and using this 'big data' from satellites, to produce value-added information within realistic timeframes. A paradigm shift is required away from traditional 'scene based' (and labour-intensive) approaches, with data storage and delivery for processing at local sites, to emerging High Performance Data (HPD) models where the data are organised and co-located with High Performance Computing (HPC) infrastructure in a way that enables users to bring themselves, their algorithms and the HPC processing power to the data. Automated workflows that allow the entire archive of data to be rapidly reprocessed from raw data to fully calibrated products are a crucial requirement for the effective stewardship of these datasets. New concepts such as arranging and viewing data as 'data objects', which underpin the delivery of 'information as a service', are also integral to realising the transition to HPD analytics. As Australia's national remote sensing and geoscience agency, Geoscience Australia faces a pressing need to solve the problems of 'big data', in particular around the 25-year archive of calibrated Landsat data. The challenge is to ensure standardised information can be extracted from the entire archive and applied to nationally significant problems in hazards, water management, land management, resource development and the environment. Ultimately, these uses justify government investment in these unique systems.
    A key challenge was how best to organise the archive of calibrated Landsat data (estimated to grow to almost 1 PB by the end of 2014) in a way that supports HPD applications yet retains the ability to trace each observation (pixel) back to its original satellite acquisition. The approach taken was to develop a multi-dimensional array (a data cube), underpinned by partitioning the data into tiles without any temporal aggregation. This allows for flexible spatio-temporal queries of the archive while minimising the need to perform geospatial processing just to locate the pixels of interest. Equally important is the development and implementation of international data interoperability standards (such as OGC web services and ISO metadata standards) that will provide advanced access for users to interact with and query the data cube without needing to download any data or to go through specialised data portals. This new approach will vastly improve access to, and the impact of, Australia's Landsat archive holdings.
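    The tiling idea behind the data cube can be illustrated with a minimal sketch: map each coordinate to a fixed-size spatial tile index, so a spatio-temporal query only has to touch the tiles covering its bounding box rather than scan the whole archive. The 1° tile size and the indexing convention here are assumptions for illustration, not Geoscience Australia's actual scheme.

```python
import math

def tile_index(lon, lat, tile_deg=1.0):
    """Map a (lon, lat) coordinate to the (x, y) index of its spatial tile."""
    return (math.floor(lon / tile_deg), math.floor(lat / tile_deg))

def tiles_for_bbox(lon_min, lat_min, lon_max, lat_max, tile_deg=1.0):
    """List every tile index intersecting a query bounding box.

    A query then reads only these tiles, without temporal aggregation,
    and filters by time within each tile.
    """
    x0, y0 = tile_index(lon_min, lat_min, tile_deg)
    x1, y1 = tile_index(lon_max, lat_max, tile_deg)
    return [(x, y) for x in range(x0, x1 + 1) for y in range(y0, y1 + 1)]
```

    For example, a bounding box around Canberra (roughly 149.1°E to 150.2°E, 35.3°S to 34.9°S) touches only four 1° tiles.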

  15. Improving NGDC Track-line Data Quality Control

    NASA Astrophysics Data System (ADS)

    Chandler, M. T.; Wessel, P.

    2004-12-01

    Ship-board gravity, magnetic and bathymetry data archived at the National Geophysical Data Center (NGDC) represent decades of seagoing research, comprising over 4,500 cruises. Cruise data remain relevant despite the prominence of satellite altimetry-derived global grids because many geologic processes remain resolvable only by shipborne oceanographic research. Given the tremendous investment by scientists and taxpayers to compile this vast archive, and the significant errors found within it, additional quality assessment and corrections are warranted. These can best be accomplished by adding to the existing quality control measures at NGDC. We are currently developing open-source software to provide this additional quality control. Along with NGDC's current sanity checking, new data at NGDC will also be subjected to an along-track "sniffer" that will detect and flag suspicious data for later graphical inspection using a visual editor. If new data pass these tests, they will undergo further scrutiny using a crossover error (COE) calculator, which will compare new data values to existing values at points of intersection within the archive. Data passing these tests will be deemed "quality data" and suitable for permanent addition to the archive, while data that fail will be returned to the source institution for correction. Crossover errors will be stored, and an online COE database will be made available. The COE database will allow users to apply corrections to the NGDC track-line database to produce corrected data files. At no time will the archived data itself be modified. An attempt will also be made to reduce navigational errors for pre-GPS navigated cruises. Upon completion, these programs will be used to explore and model systematic errors within the archive, generate correction tables for all cruises, and quantify the error budget in marine geophysical observations.
Software will be released and these procedures will be implemented in cooperation with NGDC staff.
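    A minimal version of the crossover error (COE) calculation described above: find where two straight track segments intersect, linearly interpolate each cruise's observed value at that point, and difference the two. This is an illustrative sketch, not the NGDC/University of Hawaii software.

```python
import numpy as np

def crossover_error(p1, p2, v1, v2, q1, q2, w1, w2):
    """Crossover error between two straight track segments.

    p1, p2 / q1, q2: (x, y) endpoints of segments from cruises A and B;
    v1, v2 / w1, w2: observed values (e.g. gravity) at those endpoints.
    Returns (crossing point, value_A - value_B), or None if the
    segments do not cross.
    """
    p1, p2, q1, q2 = map(np.asarray, (p1, p2, q1, q2))
    d1, d2 = p2 - p1, q2 - q1
    denom = d1[0]*d2[1] - d1[1]*d2[0]        # 2-D cross product d1 x d2
    if denom == 0:
        return None                          # parallel tracks never cross
    t = ((q1[0]-p1[0])*d2[1] - (q1[1]-p1[1])*d2[0]) / denom
    s = ((q1[0]-p1[0])*d1[1] - (q1[1]-p1[1])*d1[0]) / denom
    if not (0 <= t <= 1 and 0 <= s <= 1):
        return None                          # intersection lies off-segment
    xy = p1 + t * d1
    coe = (v1 + t*(v2 - v1)) - (w1 + s*(w2 - w1))  # interpolate along each track
    return xy, coe
```

    Accumulating such COEs over all intersecting cruise pairs is what populates a crossover database like the one proposed above.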

  16. USGS Releases Landsat Orthorectified State Mosaics

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) National Remote Sensing Data Archive, located at the USGS Center for Earth Resources Observation and Science (EROS) in Sioux Falls, South Dakota, maintains the Landsat orthorectified data archive. Within the archive are Landsat Enhanced Thematic Mapper Plus (ETM+) data that have been pansharpened and orthorectified by the Earth Satellite Corporation. This imagery has acquisition dates ranging from 1999 to 2001 and was created to provide users with access to quality-screened, high-resolution satellite images with global coverage over the Earth's landmasses.

  17. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    NASA Astrophysics Data System (ADS)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program (Archival Legacy Investigations of Circumstellar Environments) is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  18. National Centers for Environmental Prediction

    Science.gov Websites

    /NDAS Output Fields (contents, format, grid specs, output frequency, archive): The NWP model The horizontal output grid The vertical grid Access to fields Anonymous FTP Access Permanent Tape Archive

  19. A Search for New Galactic Magnetars in Archival Chandra and XMM-Newton Observations

    NASA Astrophysics Data System (ADS)

    Muno, M. P.; Gaensler, B. M.; Nechita, A.; Miller, J. M.; Slane, P. O.

    2008-06-01

    We present constraints on the number of Galactic magnetars, which we have established by searching for sources with periodic variability in 506 archival Chandra observations and 441 archival XMM-Newton observations of the Galactic plane (|b| < 5°). Our search revealed four sources with periodic variability on timescales of 200-5000 s, all of which are probably accreting white dwarfs. We identify 7 of 12 known Galactic magnetars, but find no new examples with periods between 5 and 20 s. We convert this nondetection into limits on the total number of Galactic magnetars by computing the fraction of the young Galactic stellar population that our survey covered. We find that easily detectable magnetars, modeled after persistent anomalous X-ray pulsars (e.g., with L_X = 10^35 erg s^-1 [0.5-10.0 keV] and A_rms = 12%), could have been identified in ≈5% of the Galactic spiral arms by mass. If we assume that three previously known examples randomly fall within our survey, then there are 59 (+92/-32) in the Galaxy. Barely detectable magnetars (L_X = 3 × 10^33 erg s^-1 and A_rms = 15%) could have been identified throughout ≈0.4% of the spiral arms. The lack of new examples implies that <540 exist in the Galaxy (90% confidence). Similar constraints are found by considering the detectability of transient magnetars in outburst. For assumed lifetimes of 10^4 yr, the birth rate of magnetars is between 0.003 and 0.06 yr^-1. Therefore, the birth rate of magnetars is at least 10% of that for normal radio pulsars, and could exceed that value, unless transient magnetars are active for ≳10^5 yr.
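    The population scaling in this abstract can be reproduced to first order: divide the number of magnetars found by the survey's coverage fraction (3 detections in ≈5% coverage gives ≈60, close to the quoted 59), and for zero detections convert a Poisson upper limit on the expected counts into a population limit. The sketch below is an illustrative simplification of the paper's more careful calculation.

```python
import math

def population_limit(n_detected, coverage, confidence=0.90):
    """Scale survey detections to a Galactic population.

    Returns (point estimate, upper limit). For zero detections the
    point estimate is None; the upper limit uses the Poisson condition
    P(0; mu) = exp(-mu) < 1 - confidence, i.e. mu < -ln(1 - confidence)
    expected sources within the surveyed fraction.
    """
    estimate = n_detected / coverage if n_detected else None
    mu_max = -math.log(1.0 - confidence)   # e.g. 2.30 for 90% confidence
    return estimate, mu_max / coverage
```

    With 0.4% coverage this gives an upper limit of roughly 575, the same order as the ~540 quoted above.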

  20. Lessons Learned while Exploring Cloud-Native Architectures for NASA EOSDIS Applications and Systems

    NASA Astrophysics Data System (ADS)

    Pilone, D.

    2016-12-01

    As new, high-data-rate missions begin collecting data, NASA's Earth Observing System Data and Information System (EOSDIS) archive is projected to grow roughly 20x, to over 300 PB, by 2025. To prepare for the dramatic increase in data and enable broad scientific inquiry into larger time series and datasets, NASA has been exploring the impact of applying cloud technologies throughout EOSDIS. In this talk we provide an overview of NASA's prototyping and lessons learned in applying cloud architectures to: highly scalable and extensible ingest and archive of EOSDIS data; going "all-in" on cloud-based application architectures, including "serverless" data processing pipelines, and evaluating approaches to vendor lock-in; rethinking data distribution and approaches to analysis in a cloud environment; and incorporating and enforcing security controls while minimizing the barrier for research efforts to deploy to NASA-compliant operational environments. NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. EOSDIS is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. On a daily basis, EOSDIS ingests, processes, archives, and distributes over 3 terabytes of data from NASA's Earth science missions, representing over 6000 data products across a range of science disciplines. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of the high-impact NASA data spanning its multi-petabyte-scale archive of Earth science data products.

  1. Testing Numerical Models of Cool Core Galaxy Cluster Formation with X-Ray Observations

    NASA Astrophysics Data System (ADS)

    Henning, Jason W.; Gantner, Brennan; Burns, Jack O.; Hallman, Eric J.

    2009-12-01

    Using archival Chandra and ROSAT data along with numerical simulations, we compare the properties of cool core and non-cool core galaxy clusters, paying particular attention to the region beyond the cluster cores. With the use of single and double β-models, we demonstrate a statistically significant difference in the slopes of observed cluster surface brightness profiles, while the cluster cores remain indistinguishable between the two cluster types. Additionally, through the use of hardness ratio profiles, we find evidence suggesting cool core clusters are cooler beyond their cores than non-cool core clusters of comparable mass and temperature, both in observed and simulated clusters. The similarities between real and simulated clusters support a model presented in earlier work by the authors describing differing merger histories between cool core and non-cool core clusters. Discrepancies between real and simulated clusters will inform upcoming numerical models and simulations as to new ways to incorporate feedback in these systems.
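    The single and double β-models used for the surface brightness profiles have the standard form S(r) = S0 [1 + (r/r_c)²]^(1/2 − 3β), with the double model summing two such components (a core and an outer slope). A minimal numpy sketch, with placeholder parameter values:

```python
import numpy as np

def beta_model(r, s0, rc, beta):
    """Single beta-model X-ray surface brightness profile:
    S(r) = s0 * (1 + (r/rc)^2) ** (0.5 - 3*beta)
    """
    return s0 * (1.0 + (np.asarray(r, float) / rc) ** 2) ** (0.5 - 3.0 * beta)

def double_beta_model(r, s1, rc1, b1, s2, rc2, b2):
    """Double beta-model: sum of a core component and an outer component."""
    return beta_model(r, s1, rc1, b1) + beta_model(r, s2, rc2, b2)
```

    Fitting these forms to a radial profile (e.g. with scipy.optimize.curve_fit) yields the outer slope β, the quantity in which the two cluster types differ here.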

  2. Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output

    NASA Astrophysics Data System (ADS)

    Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.

    2017-12-01

    Weather-related research often requires synthesizing vast amounts of data that need archival solutions that are both economical and viable during and past the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) or private clouds managed by research institutions are providing object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each one containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, assessing the accuracy of the HRRR forecasts, and providing initial and boundary conditions for research simulations. The archive is accessible to researchers at other institutions, both interactively and through automated download procedures that can be tailored by the user to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system storage or tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
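Extracting individual two-dimensional grids from large compressed files, as described above, is commonly done with a sidecar index that maps each field to its byte offset, so a client can issue an HTTP range request for just the grid it needs. A minimal sketch of that pattern follows; the index text mirrors the wgrib2-style ".idx" format, but the field names and byte offsets are invented for illustration:

```python
# Hypothetical index: record_number:start_byte:date:field:level:forecast
SAMPLE_IDX = """\
1:0:d=2017063000:TMP:2 m above ground:anl
2:412596:d=2017063000:UGRD:10 m above ground:anl
3:825411:d=2017063000:VGRD:10 m above ground:anl
"""

def byte_range(idx_text, field):
    """Return (start, end) byte offsets for `field`; end is None for the last record."""
    lines = idx_text.strip().splitlines()
    starts = [int(line.split(":")[1]) for line in lines]
    for i, line in enumerate(lines):
        if f":{field}:" in line:
            # The record ends one byte before the next record starts
            end = starts[i + 1] - 1 if i + 1 < len(lines) else None
            return starts[i], end
    raise KeyError(field)

start, end = byte_range(SAMPLE_IDX, "UGRD")
print(f"Range: bytes={start}-{end}")  # value for an HTTP Range request header
```

The returned offsets would be sent as a `Range: bytes=start-end` header, so only one compressed grid, not the whole multi-field file, crosses the network.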

  3. Migration Stories: Upgrading a PDS Archive to PDS4

    NASA Astrophysics Data System (ADS)

    Kazden, D. P.; Walker, R. J.; Mafi, J. N.; King, T. A.; Joy, S. P.; Moon, I. S.

    2015-12-01

    Increasing bandwidth, storage capacity and computational capabilities have greatly increased our ability to access and use data. A significant challenge, however, is to make data archived under older standards useful in the new data environments. NASA's Planetary Data System (PDS) recently released version 4 of its information model (PDS4). PDS4 adopts the XML standard for metadata, expressing structural requirements with XML Schema and content constraints with Schematron. This allows for thorough validation using off-the-shelf tools, a substantial improvement over previous PDS versions. PDS4 was also designed to improve discoverability of products (resources) in a PDS archive; these additions allow for more uniform metadata harvesting from the collection level down to the product level. New tools and services are being deployed that depend on the data adhering to the PDS4 model. However, the PDS has been an operational archive since 1989 and has large holdings that are compliant with previous versions of the PDS information model. The challenge is to make the older data accessible and usable with the new PDS4-based tools. To provide uniform utility and access to the entire archive, the older data must be migrated to the PDS4 model. At the Planetary Plasma Interactions (PPI) Node of the PDS we have been actively planning and preparing to migrate our legacy archive to the new PDS4 standards for several years. With the release of the PDS4 standards we have begun the migration of our archive. In this presentation we will discuss the preparation of the data for the migration and how we are approaching this task. The presentation will consist of a series of stories describing our experiences and the best practices we have learned.
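The XML-based labels described above are what make uniform metadata harvesting possible with off-the-shelf tools. A minimal sketch using Python's standard library follows; the label below is schematic and greatly simplified (element names follow PDS4 conventions, but it is not a complete or valid PDS4 product, and the identifier is invented):

```python
import xml.etree.ElementTree as ET

# Schematic, much-simplified label in the spirit of a PDS4 product
LABEL = """<?xml version="1.0"?>
<Product_Observational xmlns="http://pds.nasa.gov/pds4/pds/v1">
  <Identification_Area>
    <logical_identifier>urn:nasa:pds:example:data:product_1</logical_identifier>
    <title>Example PPI plasma data product</title>
    <version_id>1.0</version_id>
  </Identification_Area>
</Product_Observational>"""

NS = {"pds": "http://pds.nasa.gov/pds4/pds/v1"}  # PDS4 common namespace
root = ET.fromstring(LABEL)

# Harvest identification metadata with namespace-qualified paths
lid = root.findtext("pds:Identification_Area/pds:logical_identifier", namespaces=NS)
title = root.findtext("pds:Identification_Area/pds:title", namespaces=NS)
print(lid, title)
```

In practice such harvesting is applied uniformly from the collection level down to the product level, and structural validation against the XML Schema and Schematron files would be done with a schema-aware XML library before harvesting.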

  4. Solar radiation observation stations with complete listing of data archived by the National Climatic Center, Asheville, North Carolina and initial listing of data not currently archived

    NASA Technical Reports Server (NTRS)

    Carter, E. A.; Wells, R. E.; Williams, B. B.; Christensen, D. L.

    1976-01-01

    A listing is provided of organizations taking solar radiation data, the 166 stations where observations are made, the type of equipment used, the form of the recorded data, and the period of operation of each station. Included is a listing of the data from 150 solar radiation stations collected over the past 25 years and stored by the National Climatic Center.

  5. BAO Plate Archive Project: Digitization, Electronic Database and Research Programmes

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Farmanyan, S. V.; Gigoyan, K. S.; Gyulzadyan, M. V.; Khachatryan, K. G.; Knyazyan, A. V.; Kostandyan, G. R.; Mikayelyan, G. A.; Nikoghosyan, E. H.; Paronyan, G. M.; Vardanyan, A. V.

    2016-06-01

    The most important part of the astronomical observational heritage is the collection of plate archives created through numerous observations at many observatories. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt-type telescopes and other smaller telescopes during 1947-1991. In 2002-2005, the 1874 plates of the famous Markarian Survey (also called the First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. A large project for the digitization of the whole BAO Plate Archive, the creation of an electronic database and its scientific usage was started in 2015. A Science Program Board has been created to evaluate the observing material, investigate new possibilities and propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization; the Armenian Virtual Observatory (ArVO) database will accommodate all new data. The project will run for 3 years in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects, mainly involving high proper motion stars, variable objects and Solar System bodies.

  6. VizieR Online Data Catalog: HST/COS survey of z<0.9 AGNs. I. (Danforth+, 2016)

    NASA Astrophysics Data System (ADS)

    Danforth, C. W.; Keeney, B. A.; Tilton, E. M.; Shull, J. M.; Stocke, J. T.; Stevans, M.; Pieri, M. M.; Savage, B. D.; France, K.; Syphers, D.; Smith, B. D.; Green, J. C.; Froning, C.; Penton, S. V.; Osterman, S. N.

    2016-05-01

    COS is the fourth-generation UV spectrograph on board HST and is optimized for medium-resolution (R~18000, Δv~17km/s) spectroscopy of point sources in the 1135-1800Å band. To constitute our survey, we selected 82 AGN sight lines from the archive that met the selection criteria. Most of the AGNs observed in Cycles 18-20 under the Guaranteed Time Observation programs (GTO; PI-Green) are included, along with numerous archival data sets collected under various Guest Observer programs. Observational and programmatic details are presented in Table 2; see also section 2.1. (5 data files).

  7. Shape models of asteroids based on lightcurve observations with BlueEye600 robotic observatory

    NASA Astrophysics Data System (ADS)

    Ďurech, Josef; Hanuš, Josef; Brož, Miroslav; Lehký, Martin; Behrend, Raoul; Antonini, Pierre; Charbonnel, Stephane; Crippa, Roberto; Dubreuil, Pierre; Farroni, Gino; Kober, Gilles; Lopez, Alain; Manzini, Federico; Oey, Julian; Poncy, Raymond; Rinner, Claudine; Roy, René

    2018-04-01

    We present physical models, i.e. convex shapes, directions of the rotation axis, and sidereal rotation periods, of 18 asteroids, of which 10 are new models and 8 are refined models based on much larger data sets than in previous work. The models were reconstructed by the lightcurve inversion method from archived publicly available lightcurves and our new observations with the BlueEye600 robotic observatory. One of the new results is the shape model of asteroid (1663) van den Bos with a rotation period of 749 h, which makes it the slowest rotator with a known shape. We describe our strategy for target selection that aims at fast production of new models using the enormous potential of already available photometry stored in public databases. We also briefly describe the control software and scheduler of the robotic observatory, and we discuss the importance of building a database of asteroid models for studying asteroid physical properties in collisional families.

  8. Real and Virtual Heritage - The Plate Archive of Sonneberg Observatory - Digitisation, Preservation and Scientific Programme

    NASA Astrophysics Data System (ADS)

    Kroll, Peter

    The real heritage of Sonneberg Observatory consists of several buildings with seven domes, a number of telescopes for photographic and photoelectric measurements, a plate archive - which is the second-largest in the world - and a scientific library. While the instruments are today mainly used for public observing tours and to a limited degree for continuing sky patrol, the plate archive is systematically scanned in order to make the whole information stored in the emulsion of the plates accessible to the astronomical community and to allow the scientific study of all stars ever recorded. First pilot studies give a taste of what output can be expected from the digitized plate archive.

  9. Learning the Rhythm of the Seasons in the Face of Global Change: Phenological Research in the 21st Century

    NASA Technical Reports Server (NTRS)

    Morisette, Jeffrey T.; Richardson, Andrew D.; Knapp, Alan K.; Fisher, Jeremy I.; Graham, Eric A.; Abatzoglou, John; Wilson, Bruce E.; Breshears, David D.; Hanebry, Geoffrey M.; Hanes, Jonathan M.; hide

    2008-01-01

    Phenology is the study of recurring life-cycle events, of which classic examples include flowering by plants as well as animal migration. Phenological responses are increasingly relevant for addressing applied environmental issues. Yet, challenges remain with respect to spanning scales of observation, integrating observations across taxa, and modeling phenological sequences to enable ecological forecasts in light of future climate change. Recent advances that are helping to address these challenges include refined landscape-scale phenology estimates from satellite data, advanced instrument-based approaches for field measurements, and new cyber-infrastructure for archiving and distribution of products. These advances are aiding in diverse areas including modeling land-surface exchange, evaluating climate-phenology relationships, and aiding land management decisions.

  10. Two decades of numerical modelling to understand long term fluvial archives: Advances and future perspectives

    NASA Astrophysics Data System (ADS)

    Veldkamp, A.; Baartman, J. E. M.; Coulthard, T. J.; Maddy, D.; Schoorl, J. M.; Storms, J. E. A.; Temme, A. J. A. M.; van Balen, R.; van De Wiel, M. J.; van Gorp, W.; Viveen, W.; Westaway, R.; Whittaker, A. C.

    2017-06-01

    The development and application of numerical models to investigate fluvial sedimentary archives has increased during the last decades, resulting in a sustained growth in the number of scientific publications with keywords such as 'fluvial models', 'fluvial process models' and 'fluvial numerical models'. In this context we compile and review the current contributions of numerical modelling to the understanding of fluvial archives. In particular, recent advances, current limitations, previous unexpected results and future perspectives are all discussed. Numerical modelling efforts have demonstrated that fluvial systems can display non-linear behaviour with often unexpected dynamics causing significant delay, amplification, attenuation or blurring of externally controlled signals in their simulated record. Numerical simulations have also demonstrated that fluvial records can be generated by intrinsic dynamics without any change in external controls. Many other model applications demonstrate that fluvial archives, specifically of large fluvial systems, can be convincingly simulated as a function of the interplay of (palaeo) landscape properties and extrinsic climate, base level and crustal controls. All discussed models can, after some calibration, produce believable matches with real-world systems, suggesting that equifinality - where a given end state can be reached through many different pathways starting from different initial conditions and physical assumptions - plays an important role in fluvial records and their modelling. The overall future challenge lies in the development of new methodologies for a more independent validation of system dynamics and research strategies that allow the separation of intrinsic and extrinsic record signals using combined fieldwork and modelling.

  11. 2015 Cataloging Hidden Special Collections and Archives Unconference and Symposium: Innovation, Collaboration, and Models. Proceedings of the CLIR Cataloging Hidden Special Collections and Archives Symposium (Philadelphia, Pennsylvania, March 12-13, 2015)

    ERIC Educational Resources Information Center

    Oestreicher, Cheryl, Ed.

    2015-01-01

    The 2015 CLIR Unconference & Symposium was the capstone event to seven years of grant funding through CLIR's Cataloging Hidden Special Collections and Archives program. These proceedings group presentations by theme. Collaborations provides examples of multi-institutional projects, including one international collaboration; Student and Faculty…

  12. Ocean color - Availability of the global data set

    NASA Technical Reports Server (NTRS)

    Feldman, Gene; Kuring, Norman; Ng, Carolyn; Esaias, Wayne; Mcclain, Chuck; Elrod, Jane; Maynard, Nancy; Endres, Dan

    1989-01-01

    The use of satellite observations of ocean color to provide reliable estimates of marine phytoplankton biomass on synoptic scales is examined. An overview is given of the Coastal Zone Color Scanner data processing system. The archiving and distribution of ocean color data are discussed, and NASA-sponsored archive sites are listed.

  13. ASP archiving solution of regional HUSpacs.

    PubMed

    Pohjonen, Hanna; Kauppinen, Tomi; Ahovuo, Juhani

    2004-09-01

    The application service provider (ASP) model is not novel, but widely used in several non-health care-related business areas. In this article, ASP is described as a potential solution for long-term and back-up archiving of the picture archiving and communication system (PACS) of the Hospital District of Helsinki and Uusimaa (HUS). HUSpacs is a regional PACS for 21 HUS hospitals serving altogether 1.4 million citizens. The ultimate goal of this study was to define the specifications for the ASP archiving service and to compare different commercial options for archiving solutions (costs derived by unofficial requests for proposal): in-house PACS components, the regional ASP concept and the hospital-based ASP concept. In conclusion, the large scale of the HUS installation enables a cost-effective regional ASP archiving, resulting in a four to five times more economical solution than hospital-based ASP.

  14. Reference Model for an Open Archival Information System

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This document is a technical report for use in developing a consensus on what is required to operate a permanent, or indefinite long-term, archive of digital information. It may be useful as a starting point for a similar document addressing the indefinite long-term preservation of non-digital information. This report establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context and should promote greater vendor awareness of, and support of, archival requirements. Through the process of normal evolution, it is expected that expansion, deletion, or modification of this document may occur. This report is therefore subject to CCSDS document management and change control procedures.

  15. A sensor fusion field experiment in forest ecosystem dynamics

    NASA Technical Reports Server (NTRS)

    Smith, James A.; Ranson, K. Jon; Williams, Darrel L.; Levine, Elissa R.; Goltz, Stewart M.

    1990-01-01

    The background of the Forest Ecosystem Dynamics field campaign is presented, a progress report on the analysis of the collected data and related modeling activities is provided, and plans for future experiments at different points in the phenological cycle are outlined. The ecological overview of the study site is presented, and attention is focused on forest stands, needles, and atmospheric measurements. Sensor deployment and thermal and microwave observations are discussed, along with two examples of the optical radiation measurements obtained during the experiment in support of radiative transfer modeling. Future activities pertaining to an archival system, synthetic aperture radar, carbon acquisition modeling, and upcoming field experiments are considered.

  16. The Self-Organized Archive: SPASE, PDS and Archive Cooperatives

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hughes, J. S.; Roberts, D. A.; Walker, R. J.; Joy, S. P.

    2005-05-01

    Information systems with high quality metadata enable uses and services which often go beyond the original purpose. There are two types of metadata: annotations, which comment on or describe the content of a resource, and identification attributes, which describe the external properties of the resource itself. For example, annotations may indicate which columns are present in a table of data, whereas an identification attribute would indicate the source of the table, such as the observatory, instrument, organization, and data type. When the identification attributes are collected and used as the basis of a search engine, a user can constrain on an attribute and the archive can then self-organize around the constraint, presenting the user with a particular view of the archive. In an archive cooperative where each participating data system or archive may have its own metadata standards, providing a multi-system search engine requires that individual archive metadata be mapped to a broad-based standard. To explore how cooperative archives can form a larger self-organized archive we will show how the Space Physics Archive Search and Extract (SPASE) data model will allow different systems to create a cooperative, using the Planetary Data System (PDS) plus existing space physics activities as a demonstration.
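The self-organization idea described above — grouping resources by an identification attribute to present a view of the archive — can be illustrated with a toy faceting function. The resource registry below is entirely invented; real SPASE metadata records are far richer:

```python
from collections import defaultdict

# Toy registry of identification attributes (observatory, instrument, type).
# Names are hypothetical, chosen only for illustration.
RESOURCES = [
    {"id": "ds1", "observatory": "Galileo", "instrument": "MAG", "type": "TimeSeries"},
    {"id": "ds2", "observatory": "Galileo", "instrument": "PLS", "type": "TimeSeries"},
    {"id": "ds3", "observatory": "Cassini", "instrument": "MAG", "type": "TimeSeries"},
]

def organize_by(resources, attribute):
    """Group resources by one identification attribute, yielding a 'view' of the archive."""
    view = defaultdict(list)
    for res in resources:
        view[res[attribute]].append(res["id"])
    return dict(view)

print(organize_by(RESOURCES, "observatory"))  # {'Galileo': ['ds1', 'ds2'], 'Cassini': ['ds3']}
print(organize_by(RESOURCES, "instrument"))   # {'MAG': ['ds1', 'ds3'], 'PLS': ['ds2']}
```

Each attribute a user constrains on produces a different grouping, which is the sense in which the archive "self-organizes" around the constraint; a cooperative of archives would first map each member's metadata onto a shared attribute vocabulary such as SPASE.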

  17. Desired Precision in Multi-Objective Optimization: Epsilon Archiving or Rounding Objectives?

    NASA Astrophysics Data System (ADS)

    Asadzadeh, M.; Sahraei, S.

    2016-12-01

    Multi-objective optimization (MO) aids the decision-making process in water resources engineering and design problems. One of the main goals of solving a MO problem is to archive a set of solutions that is well distributed across a wide range of all the design objectives. Modern MO algorithms use the epsilon dominance concept to define a mesh with a pre-defined grid-cell size (often called epsilon) in the objective space and archive at most one solution in each grid-cell. Epsilon can be set to the desired precision level of each objective function to make sure that the difference between each pair of archived solutions is meaningful. This epsilon archiving process can dominate the computational cost in problems with quick-to-evaluate objective functions. This research explores the applicability of a similar but computationally more efficient approach to respecting the desired precision level of all objectives in the solution archiving process. In this alternative approach, each objective function is rounded to the desired precision level before comparing any new solution to the set of archived solutions, which already have rounded objective function values. This alternative solution archiving approach is compared to the epsilon archiving approach in terms of efficiency and quality of archived solutions when solving mathematical test problems and hydrologic model calibration problems.
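The two archiving strategies being compared can be sketched as follows. This is a minimal illustration of the concepts only, not the authors' implementation: minimization is assumed, and the box-dominance and replacement rules are simplified relative to production epsilon-archiving algorithms:

```python
import math

def box(objs, eps):
    """Epsilon grid-cell index of an objective vector."""
    return tuple(math.floor(o / e) for o, e in zip(objs, eps))

def dominates(a, b):
    """Minimization: a dominates b if no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def eps_archive(solutions, eps):
    """Epsilon archiving: at most one solution per grid cell, dominated boxes discarded."""
    archive = {}  # box index -> objective vector
    for s in solutions:
        b = box(s, eps)
        if any(dominates(bb, b) for bb in archive):   # box-dominated: skip
            continue
        archive = {k: v for k, v in archive.items() if not dominates(b, k)}
        if b not in archive or dominates(s, archive[b]):
            archive[b] = s
    return list(archive.values())

def rounded_archive(solutions, eps):
    """Alternative: round objectives to the desired precision, then plain Pareto archiving."""
    archive = []
    for s in solutions:
        r = tuple(round(o / e) * e for o, e in zip(s, eps))
        if any(dominates(a, r) or a == r for a in archive):
            continue
        archive = [a for a in archive if not dominates(r, a)]
        archive.append(r)
    return archive

sols = [(0.12, 0.93), (0.14, 0.91), (0.55, 0.40), (0.90, 0.10)]
eps = (0.1, 0.1)
ea = eps_archive(sols, eps)      # keeps one representative per 0.1 x 0.1 cell
ra = rounded_archive(sols, eps)  # same effect, but dominance checks use rounded values
print(ea)
print(ra)
```

Both variants collapse the two near-duplicate solutions into one archived representative; the rounding variant avoids recomputing box indices and per-cell bookkeeping, which is the efficiency argument the abstract examines.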

  18. Analysis of the Sheltered Instruction Observation Protocol Model on Academic Performance of English Language Learners

    NASA Astrophysics Data System (ADS)

    Ingram, Sandra W.

    This quantitative comparative descriptive study involved analyzing archival data from end-of-course (EOC) test scores in biology of English language learners (ELLs) taught or not taught using the sheltered instruction observation protocol (SIOP) model. The study includes descriptions and explanations of the benefits of the SIOP model to ELLs, especially in content area subjects such as biology. Researchers have shown that ELLs in high school lag behind their peers in academic achievement in content area subjects. Much of the research on the SIOP model took place in elementary and middle school, and more research was necessary at the high school level. This study involved analyzing student records from archival data to describe and explain if the SIOP model had an effect on the EOC test scores of ELLs taught or not taught using it. The sample consisted of 527 Hispanic students (283 females and 244 males) from Grades 9-12. An independent sample t-test determined if a significant difference existed in the mean EOC test scores of ELLs taught using the SIOP model as opposed to ELLs not taught using the SIOP model. The results indicated that a significant difference existed between EOC test scores of ELLs taught using the SIOP model and ELLs not taught using the SIOP model (p = .02). A regression analysis indicated a significant difference existed in the academic performance of ELLs taught using the SIOP model in high school science, controlling for free and reduced-price lunch (p = .001) in predicting passing scores on the EOC test in biology at the school level. The data analyzed for free and reduced-price lunch together with SIOP data indicated that both together were not significant (p = .175) for predicting passing scores on the EOC test in high school biology. Future researchers should repeat the study with student-level data as opposed to school-level data, and data should span at least three years.
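The study's group comparison can be illustrated with an independent-samples t statistic. The sketch below uses Welch's (unequal-variance) form with a large-sample normal approximation for the two-tailed p-value, which is adequate at group sizes like the study's hundreds of students; the scores themselves are entirely hypothetical:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic with a large-sample (normal) two-tailed p-value.
    Small samples would need the exact t distribution instead of erf."""
    na, nb = len(sample_a), len(sample_b)
    se = math.sqrt(variance(sample_a) / na + variance(sample_b) / nb)
    t = (mean(sample_a) - mean(sample_b)) / se
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))
    return t, p

# Hypothetical EOC-style scores: SIOP-taught group vs comparison group
siop = [78, 82, 75, 88, 91, 84, 79, 86, 90, 83] * 25        # 250 "students"
comparison = [74, 70, 78, 81, 69, 77, 72, 80, 75, 73] * 25  # 250 "students"

t, p = welch_t(siop, comparison)
print(round(t, 1), p < 0.05)
```

A p-value below the chosen significance level, as in the study's p = .02, indicates that the difference in mean EOC scores between the two instructional groups is unlikely under the null hypothesis of equal means.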

  19. Amateur Planetary Radio Data Archived for Science and Education: Radio Jove

    NASA Astrophysics Data System (ADS)

    Thieman, J.; Cecconi, B.; Sky, J.; Garcia, L. N.; King, T. A.; Higgins, C. A.; Fung, S. F.

    2015-12-01

    The Radio Jove Project is a hands-on educational activity in which students, teachers, and the general public build simple radio telescopes, usually from a kit, to observe single-frequency decameter-wavelength radio emissions from Jupiter, the Sun, the galaxy, and the Earth, typically with simple dipole antennas. Some of the amateur observers have upgraded their receivers to spectrographs, and their antennas have become more sophisticated as well. The data records compare favorably to those of more sophisticated professional radio telescopes such as the Long Wavelength Array (LWA) and the Nancay Decametric Array. Since these data are often carefully calibrated and recorded around the clock in widely scattered locations, they represent a valuable database useful not only to amateur radio astronomers but to the professional science community as well. Some interesting phenomena have been noted in the data that are of interest to professionals familiar with such records. The continuous monitoring of radio emissions from Jupiter could serve as useful "ground truth" data during the coming Juno mission's radio observations of Jupiter. Radio Jove has long maintained an archive of thousands of Radio Jove observations, but the database was intended for use by the Radio Jove participants only. Now, increased scientific interest in the use of these data has resulted in several proposals to translate the data into a science community data format standard and store the data in professional archives. Progress is being made in translating Radio Jove data to the Common Data Format (CDF) and in generating new observations in that format as well. Metadata describing the Radio Jove data would follow the Space Physics Archive Search and Extract (SPASE) standard. The proposed archive for long-term preservation would be the Planetary Data System (PDS). Data sharing would be achieved through the PDS, the Paris Astronomical Data Centre (PADC) and the Virtual Wave Observatory (VWO). We believe that Radio Jove represents another fertile area for citizen science to contribute to overall scientific investigation.

  20. Gravitational redshift of galaxies in clusters as predicted by general relativity.

    PubMed

    Wojtak, Radosław; Hansen, Steen H; Hjorth, Jens

    2011-09-28

    The theoretical framework of cosmology is mainly defined by gravity, of which general relativity is the current model. Recent tests of general relativity within the Lambda Cold Dark Matter (ΛCDM) model have found a concordance between predictions and the observations of the growth rate and clustering of the cosmic web. General relativity has not hitherto been tested on cosmological scales independently of the assumptions of the ΛCDM model. Here we report an observation of the gravitational redshift of light coming from galaxies in clusters at the 99 per cent confidence level, based on archival data. Our measurement agrees with the predictions of general relativity and its modification created to explain cosmic acceleration without the need for dark energy (the f(R) theory), but is inconsistent with alternative models designed to avoid the presence of dark matter.

  1. Archival Research Capabilities of the WFIRST Data Set

    NASA Astrophysics Data System (ADS)

    Szalay, Alexander

    WFIRST's unique combination of a large (~0.3 deg²) field of view and HST-like angular resolution and sensitivity in the near infrared will produce spectacular new insights into the origins of stars, galaxies, and structure in the cosmos. We propose a WFIRST Archive Science Investigation Team (SIT-F) to define an archival, query, and analysis system that will enable scientific discovery in all relevant areas of astrophysics and maximize the overall scientific yield of the mission. Guest investigators (GIs), guest observers (GOs), the WFIRST SITs, WFIRST Science Center(s), and astronomers using data from other surveys will all benefit from extensive, easy, fast and reliable use of the WFIRST archives. We propose to develop the science requirements for the archive and work to understand its interactions with other elements of the WFIRST mission. To accomplish this, we will conduct case studies to derive performance requirements for the WFIRST archives. These will clarify what is needed for GIs to make important scientific discoveries across a broad range of astrophysics. While other SITs will primarily address the science capabilities of the WFIRST instruments, we will look ahead to the science-enabling capabilities of the WFIRST archives. We will demonstrate how the archive can be optimized to take advantage of the extraordinary science capabilities of the WFIRST instruments as well as major space and ground observatories to maximize the science return of the mission. We will use the "20 queries" methodology, formulated by Jim Gray, to cover the most important science analysis patterns and use these to establish the performance required of the WFIRST archive. The case studies will be centered on studying galaxy evolution as a function of cosmic time, environment and intrinsic properties. The analyses will require massive angular and spatial cross correlations between key galaxy properties to search for new fundamental scaling relations that may only become apparent when exploring a database of 10⁸ galaxies with multiband photometry and grism spectroscopy. The case studies will require (i) the creation of a unified WFIRST object catalog consisting of data cross-matched to external catalogs, (ii) an easy-to-access, scalable database utilizing the latest data discovery and querying techniques, (iii) in situ analyses of large and/or complex data, (iv) identification of links to supporting data, enabling queries spanning WFIRST and other databases, and (v) combining simulations with modeling software. To accomplish these objectives, we will prototype a system capable of executing complex user-defined scripts, including database access to a shared computational facility with tools for joining WFIRST to other surveys, also enabling comparisons to physical models. Our organizational plan divides the work into several general areas where our team members have specific expertise: (a) apply the 20 queries methodology to derive performance and functionality requirements, (b) develop a practical interactive server-side query system, built on our SDSS experience, (c) apply advanced cross-matching techniques, (d) create mock WFIRST imaging and grism data, (e) develop high-level cross-correlation tools, (f) optimize scripting systems using high-level languages (iPython), (g) perform close integration of cosmological simulations with observational data, and (h) apply advanced machine learning techniques. Our efforts will be coordinated with the WFIRST Science Center (WSC), the other SITs, and the broader community in a manner consistent with direction and review of the Project Office. We will publish our results as milestones are reached, and issue progress reports on a regular basis. We will represent SIT-F at all relevant meetings, including meetings of the other SITs (SITs A-E), and participate in "Big Data" conferences to interact with others in the field and learn new techniques that might be applicable to WFIRST.
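The cross-matching underlying the proposed unified object catalog can be illustrated with a brute-force sketch. The catalog names and coordinates below are invented for the example; a production archive would use a spatial index (e.g., HEALPix or a k-d tree) rather than an all-pairs search:

```python
import math

def ang_sep(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the haversine formula on the sphere."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def cross_match(cat_a, cat_b, radius_deg):
    """For each source in cat_a, find the nearest cat_b source within the match radius.
    Brute force O(n*m); fine for a sketch, not for 10^8 objects."""
    matches = {}
    for name_a, (ra_a, dec_a) in cat_a.items():
        best = min(cat_b, key=lambda n: ang_sep(ra_a, dec_a, *cat_b[n]))
        if ang_sep(ra_a, dec_a, *cat_b[best]) <= radius_deg:
            matches[name_a] = best
    return matches

# Invented coordinates (degrees) for illustration
wfirst_like = {"W1": (150.0001, 2.2001), "W2": (150.500, 2.500)}
external = {"E1": (150.000, 2.200), "E2": (151.900, 2.900)}
matches = cross_match(wfirst_like, external, radius_deg=1 / 3600)  # 1 arcsec radius
print(matches)  # {'W1': 'E1'}
```

At survey scale the same operation is typically pushed into the database layer as a server-side spatial join, which is one of the "20 queries"-style workloads the archive would need to support efficiently.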

  2. Tridacna Derived ENSO Records From The Philippines During The Last Interglacial Show Similar ENSO Activity To The Present Day

    NASA Astrophysics Data System (ADS)

    Welsh, K.; Morgan, Z.; Suzuki, A.

    2016-12-01

    Modeled predictions for the relative strength and frequency of ENSO under mean warming conditions suggest an increase in the number and strength of ENSO events; however, there are limited seasonally resolved records of ENSO variability during previous warm periods, such as the Last Interglacial, with which to test these models, because reliable archives such as corals are generally not well preserved over these time periods. Presented here are two multi-decadal Tridacna gigas stable isotopic time series from a coral terrace on the island of Cebu in the Philippines that formed during MIS 5e, as established from geomorphology and open-system-corrected U/Th dating of corals. The ENSO activity observed in these well-preserved records indicates a level of ENSO activity during the Last Interglacial similar to the present day, based upon comparisons with recent coral-derived stable isotopic records. Though these are relatively short records, they provide further windows into ENSO activity during this important time period and demonstrate that this area may provide more opportunities to gather such archives.

  3. Natural and Cultural Preservation - Complementary Endeavors through Soil Archive Research

    NASA Astrophysics Data System (ADS)

    Ackermann, Oren; Frumin, Suembikya; Kolska Horwitz, Liora; Maeir, Aren M.; Weiss, Ehud; Zhevelev, Helena M.

    2016-04-01

    Soil is an excellent archive for the history of landscape components such as ancient topography, erosion and accumulation processes, and vegetation characterization. In special cases, the soil archive even preserves botanical, faunal and mollusc assemblages, allowing for the production of an archive of organic species as well. Long-term human activities in the past have left their imprints on certain landscape systems, leading to the formation of landscapes composed of both cultural and natural assets. The aim of this presentation is to suggest a conceptual model, based on the soil archive, which enables the preservation and sustainability of such environments. The proposed area (eastern Mediterranean) underwent cycles of ancient site establishment and abandonment. When areas were occupied, the natural vegetation around settlements experienced human interventions such as woodcutting, grazing and horticulture. During site abandonment, these interventions ceased, resulting in vegetation regeneration, a reduction in biodiversity, increased fire hazard, etc. This ultimately led to the deterioration of the landscape system as well as the destruction of cultural assets such as ancient buildings and/or remnants. In order to preserve and restore these sites, a conceptual model that combines both modern natural conservation strategies and restoration of traditional land-use techniques is proposed. This model provides a complementary approach to existing natural and cultural preservation efforts.

  4. Spirals, Bridges and Tails: Star Formation and the Disturbed ISM in Colliding Galaxies before Merger.

    NASA Astrophysics Data System (ADS)

    Struck, Curtis; Appleton, Philip; Charmandaris, Vassilis; Reach, William; Smith, Beverly

    2004-09-01

    We propose to use Spitzer's unprecedented sensitivity and wide spatial and spectral coverage to study the distribution of star formation in a sample of colliding galaxies with a wide range of tidal and splash structures. Star-forming environments like those in strong tidal spirals, and in extra-disk structures like tails, were probably far more common in the early stages of galaxy evolution, and important contributors to the net star formation. Using the Spitzer data and data from other wavebands, we will compare the pattern of SF to maps of gas and dust density and phase distribution. With the help of dynamical modeling, we will relate these in turn to dynamical triggers, to better understand the trigger mechanisms. We expect our observations to complement both the SINGS archive and the archives produced by other GO programs, such as those looking at merger remnants or tidal dwarf formation.

  5. Proper motion of the radio pulsar B1727-47 and its association with the supernova remnant RCW 114

    NASA Astrophysics Data System (ADS)

    Shternin, P. S.; Yu, M.; Kirichenko, A. Yu; Shibanov, Yu A.; Danilenko, A. A.; Voronkov, M. A.; Zyuzin, D. A.

    2017-12-01

    We report preliminary results of the analysis of the proper motion of the bright radio pulsar B1727-47. Using archival Parkes timing data, as well as original and archival ATCA interferometry observations, we constrain, for the first time, the pulsar proper motion at the level of 148 ± 11 mas yr-1. The backward extrapolation of the proper motion vector to the pulsar birth epoch points at the center of the Galactic supernova remnant RCW 114, suggesting a genuine association between the two objects. We discuss the implications of the association and argue that the distance to the system is less than 1 kpc. This value is at least two times lower than the dispersion measure distance estimates, which suggests that existing Galactic electron density models are incomplete in the direction of the pulsar.
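The geometry behind such a backward extrapolation can be sketched in a few lines. This is an illustrative calculation, not the authors' analysis: the proper motion is the value quoted above, while the age and distance used here are hypothetical placeholders.

```python
def birth_offset_arcmin(pm_mas_yr, age_yr):
    """Angular displacement accumulated over the pulsar lifetime, assuming
    a constant proper motion on a flat sky (mas -> arcsec -> arcmin)."""
    return pm_mas_yr * age_yr / 1000.0 / 60.0

def transverse_velocity_kms(pm_arcsec_yr, dist_pc):
    """Transverse velocity: v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]."""
    return 4.74 * pm_arcsec_yr * dist_pc

pm = 148.0    # mas/yr, the measured value quoted above
age = 8.0e4   # yr -- hypothetical characteristic age, for illustration only
track = birth_offset_arcmin(pm, age)                  # ~197 arcmin of backward track
v_t = transverse_velocity_kms(pm / 1000.0, 1000.0)    # ~700 km/s if d = 1 kpc
```

An implausibly large implied transverse velocity at the dispersion-measure distance is one way such an analysis can argue for a smaller true distance.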

  6. Living with a Red Dwarf: A Chandra Archival Study of dM Star Activity and Habitability

    NASA Astrophysics Data System (ADS)

    Engle, Scott

    2017-09-01

    We propose to analyze 6 archival Chandra visits that were not pointed at, but serendipitously include, 3 dM stars of known age. GJ 669 A and B are a common proper-motion pair, each resolved and detected in 3 exposures, and LHS 373 is a much older dM star also detected in 3 exposures. Our photometry of GJ 669 AB began 5 years ago and is ongoing; it has precisely determined rotation rates for both stars and shows evidence of frequent flaring from GJ 669 B. We will analyze the multiple exposures, derive an accurate mean level of X-ray activity for the targets, and separately analyze and model any observed X-ray flares. This proposal will provide not only highly accurate coronal properties for the targets, but also very useful data for stellar evolution and planetary habitability studies.

  7. seNorge2 daily precipitation, an observational gridded dataset over Norway from 1957 to the present day

    NASA Astrophysics Data System (ADS)

    Lussana, Cristian; Saloranta, Tuomo; Skaugen, Thomas; Magnusson, Jan; Tveito, Ole Einar; Andersen, Jess

    2018-02-01

    The conventional climate gridded datasets based on observations only are widely used in the atmospheric sciences; our focus in this paper is on climate and hydrology. On the Norwegian mainland, seNorge2 provides high-resolution fields of daily total precipitation for applications requiring long-term datasets at the regional or national level, where the challenge is to simulate small-scale processes that often take place in complex terrain. The dataset constitutes a valuable meteorological input for snow and hydrological simulations; it is updated daily and presented on a high-resolution grid (1 km grid spacing). The climate archive goes back to 1957. The spatial interpolation scheme builds upon classical methods, such as optimal interpolation and successive-correction schemes. An original approach based on (spatial) scale-separation concepts has been implemented that uses geographical coordinates and elevation as complementary information in the interpolation. seNorge2 daily precipitation fields represent local precipitation features at spatial scales of a few kilometers, depending on the station network density. In the surroundings of a station, or in dense station areas, the predictions are quite accurate even for intense precipitation. For most grid points, the performance is comparable to or better than that of a state-of-the-art pan-European dataset (E-OBS), owing to the higher effective resolution of seNorge2. However, in very data-sparse areas, such as the mountainous region of southern Norway, seNorge2 underestimates precipitation because it does not make use of enough geographical information to compensate for the lack of observations. The evaluation of seNorge2 as the meteorological forcing for the seNorge snow model and the DDD (Distance Distribution Dynamics) rainfall-runoff model shows that both models have been able to make profitable use of seNorge2, partly because of the automatic calibration procedure they incorporate for precipitation. 
The seNorge2 dataset 1957-2015 is available at https://doi.org/10.5281/zenodo.845733. Daily updates from 2015 onwards are available at http://thredds.met.no/thredds/catalog/metusers/senorge2/seNorge2/provisional_archive/PREC1d/gridded_dataset/catalog.html.
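The successive-correction idea mentioned above can be illustrated with a toy one-dimensional analysis. This is a minimal sketch, not the seNorge2 implementation: it uses only Gaussian (Barnes-type) distance weights and omits the elevation and scale-separation information described in the paper; the station values and length scale are invented.

```python
import math

def weight(dist_km, length_scale_km):
    """Gaussian (Barnes-type) distance weight."""
    return math.exp(-(dist_km / length_scale_km) ** 2)

def successive_correction(grid, stations, background, length_scale_km, n_pass=2):
    """Toy 1-D successive-correction analysis: each pass computes observation
    increments at the stations and spreads them onto the grid with distance
    weights, pulling the background field toward the observations."""
    field = {x: background for x in grid}
    for _ in range(n_pass):
        # increments relative to the current analysis (stations sit on grid points)
        increments = [(sx, obs - field.get(sx, background)) for sx, obs in stations]
        for x in grid:
            wsum = sum(weight(abs(x - sx), length_scale_km) for sx, _ in increments)
            if wsum > 0:
                corr = sum(weight(abs(x - sx), length_scale_km) * inc
                           for sx, inc in increments)
                field[x] += corr / wsum
    return field

# two stations at the grid edges; the analysis interpolates between them
field = successive_correction([0, 10, 20], [(0, 4.0), (20, 8.0)],
                              background=0.0, length_scale_km=10.0)
```

After two passes the analysis closely recovers the observed values at the stations and a distance-weighted compromise in between.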

  8. Around 1500 near-Earth-asteroid orbits improved via EURONEAR

    NASA Astrophysics Data System (ADS)

    Vaduvescu, O.; Hudin, L.; Birlan, M.; Popescu, M.; Tudorica, A.; Toma, R.

    2014-07-01

    Born in 2006 in Paris, the European Near Earth Asteroids Research project (EURONEAR, euronear.imcce.fr) aims ''to study NEAs and PHAs using existing telescopes available to its network and hopefully in the future some automated dedicated 1--2 m facilities''. Although we believe the first aim is fulfilled, the second has not yet been achieved, and will require serious commitment from European NEA researchers and funding agencies. Relying mainly on the volunteer work of about 30 students and amateur astronomers (from Romania, Chile, the UK, France, etc.), the PI, backed by his associates M. Birlan (IMCCE Paris) and J. Licandro (IAC Tenerife) and a few other astronomers of the EURONEAR network with access to a few telescopes, is approaching a milestone of around 1,500 observed NEAs whose orbits were improved based on our astrometric contributions. To achieve this milestone, we used two main resources and a total of 15 facilities: i) observing time at 11 professional 1--4 m class telescopes (Chile, La Palma, France, Germany) plus 3 smaller 30--50 cm educational/public-outreach telescopes (Romania and Germany), adding about 1,000 observed NEAs; and ii) astrometry obtained from data mining of 4 major image archives (ESO/MPG WFI, INT WFC, CFHTLS Megacam and Subaru SuprimeCam), adding about 500 NEAs recovered in archival images. Among the highlights, about 100 NEAs, PHAs and VIs were observed, recovered or precovered in archives at their second opposition (up to about 15 years after discovery) or had their orbital arcs much extended, and a few VIs and PHAs were eliminated. Incidentally, about 15,000 positions of almost 2,000 known MBAs were reported (mostly in the INT, ESO/MPG and Blanco large fields). About 40 new (one-night) NEO candidates and more than 2,000 (one-night) unknown MBAs were reported, including about 150 MBAs credited as EURONEAR discoveries. 
Based on the INT and Blanco data, we derived statistics on the MBA and NEA populations observable with 2 m and 4 m telescopes, proposing a model to rate NEO candidates observed close to opposition. Based on this work, 10 papers and around 100 MPC circulars have been published since 2006.

  9. Using the Saliency-Based Model to Design a Digital Archaeological Game to Motivate Players' Intention to Visit the Digital Archives of Taiwan's Natural Science Museum

    ERIC Educational Resources Information Center

    Hong, Jon-Chao; Hwang, Ming-Yueh; Chen, Yu-Ju; Lin, Pei-Hsin; Huang, Yao-Tien; Cheng, Hao-Yueh; Lee, Chih-Chin

    2013-01-01

    Museums in Taiwan have developed various digital archives, but few people have visited these digital archives. Therefore, this study designed a digital archaeology game for high school students to play. Based on the concept of "learning for playing" (i.e., players who want to win will study more), the digital archaeology game contest…

  10. Radio data archiving system

    NASA Astrophysics Data System (ADS)

    Knapic, C.; Zanichelli, A.; Dovgan, E.; Nanni, M.; Stagni, M.; Righini, S.; Sponza, M.; Bedosti, F.; Orlati, A.; Smareglia, R.

    2016-07-01

    Radio astronomical data models are becoming very complex owing to the huge range of instrumental configurations available with modern radio telescopes. What were once the last frontiers of data formats in terms of efficiency and flexibility are now evolving, with new strategies and methodologies enabling the persistence of very complex, hierarchical and multi-purpose information. Such an evolution of data models and data formats requires new data archiving techniques in order to guarantee data preservation, following the directives of the Open Archival Information System and the International Virtual Observatory Alliance for data sharing and publication. Currently, various formats (FITS, MBFITS, VLBI XML description files and ancillary files) of data acquired with the Medicina and Noto Radio Telescopes can be stored and handled by a common Radio Archive, which is planned to be released to the (inter)national community by the end of 2016. This state-of-the-art archiving system for radio astronomical data aims at delegating to the software, as much as possible, the decisions of how and where the descriptors (metadata) are saved, while users perform user-friendly queries that the web interface translates into complex interrogations of the database to retrieve data. In this way, the Archive is ready to be Virtual Observatory compliant and as user-friendly as possible.
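A minimal sketch of the kind of descriptor (metadata) extraction such an archive automates is shown below. It parses FITS-style header cards ("KEYWORD = value / comment") with plain Python; a real archive would use a full FITS library, and the example cards here are hypothetical.

```python
def parse_card(card):
    """Parse a single FITS-style header card into (keyword, value, comment)."""
    key = card[:8].strip()
    if card[8:10] != "= ":
        return key, None, None            # COMMENT/HISTORY/END cards carry no value
    body = card[10:]
    if body.lstrip().startswith("'"):
        # quoted string value: find the closing quote before splitting off the comment
        start = body.index("'")
        end = body.index("'", start + 1)
        value = body[start + 1:end]
        rest = body[end + 1:].split("/", 1)
        comment = rest[1].strip() if len(rest) > 1 else None
    else:
        parts = body.split("/", 1)
        value = parts[0].strip()
        comment = parts[1].strip() if len(parts) > 1 else None
    return key, value, comment

# hypothetical cards, for illustration only
cards = [
    "TELESCOP= 'Medicina'           / telescope name",
    "EXPTIME =                 30.0 / integration time (s)",
]
meta = {k: v for k, v, _ in map(parse_card, cards)}
```

A dictionary like `meta` is the sort of record a database ingestion layer could persist, leaving the decision of how and where to store each descriptor to the software.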

  11. Clinical experiences utilizing wireless remote control and an ASP model backup archive for a disaster recovery event

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    An Application Service Provider (ASP) archive model for disaster recovery for Saint John's Health Center (SJHC) clinical PACS data has been implemented using a Fault-Tolerant Archive Server at the Image Processing and Informatics Laboratory, Marina del Rey, CA (IPIL) since mid-2002. The purpose of this paper is to provide clinical experiences with the implementation of an ASP model backup archive in conjunction with handheld wireless technologies for a particular disaster recovery scenario, an earthquake, in which the local PACS archive and the hospital are destroyed and the patients are moved from one hospital to another. The three sites involved are: (1) SJHC, the simulated disaster site; (2) IPIL, the ASP backup archive site; and (3) University of California, Los Angeles Medical Center (UCLA), the relocated patient site. An ASP backup archive has been established at IPIL to receive clinical PACS images daily using a T1 line from SJHC for backup and disaster recovery storage. Procedures were established to test the network connectivity and data integrity on a regular basis. In a given disaster scenario where the local PACS archive has been destroyed and the patients need to be moved to a second hospital, a wireless handheld device such as a Personal Digital Assistant (PDA) can be utilized to route images to the second hospital site with a PACS and reviewed by radiologists. To simulate this disaster scenario, a wireless network was implemented within the clinical environment in all three sites: SJHC, IPIL, and UCLA. Upon executing the disaster scenario, the SJHC PACS archive server simulates a downtime disaster event. Using the PDA, the radiologist at UCLA can query the ASP backup archive server at IPIL for PACS images and route them directly to UCLA. Implementation experiences integrating this solution within the three clinical environments as well as the wireless performance are discussed. 
A clinical downtime disaster scenario was implemented and successfully tested. Radiologists were able to successfully query PACS images utilizing a wireless handheld device from the ASP backup archive at IPIL and route the PACS images directly to a second clinical site at UCLA where they and the patients are located at that time. In a disaster scenario, using a wireless device, radiologists at the disaster health care center can route PACS data from an ASP backup archive server to be reviewed in a live clinical PACS environment at a secondary site. This solution allows Radiologists to use a wireless handheld device to control the image workflow and to review PACS images during a major disaster event where patients must be moved to a secondary site.

  12. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost and expertise. We propose a modeling, data and knowledge center that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis and compute environments that are customizable, "archiveable" and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by freeing users from redundantly retrieving and integrating data sets and building modeling analysis codes. The EMC platform also allows users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study "ramp up" times.

  13. ESA Science Archives, VO tools and remote Scientific Data reduction in Grid Architectures

    NASA Astrophysics Data System (ADS)

    Arviset, C.; Barbarisi, I.; de La Calle, I.; Fajersztejn, N.; Freschi, M.; Gabriel, C.; Gomez, P.; Guainazzi, M.; Ibarra, A.; Laruelo, A.; Leon, I.; Micol, A.; Parrilla, E.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2008-08-01

    This paper presents the latest functionalities of the ESA Science Archives located at ESAC, Spain, in particular the following archives: the ISO Data Archive (IDA {http://iso.esac.esa.int/ida}), the XMM-Newton Science Archive (XSA {http://xmm.esac.esa.int/xsa}), the Integral SOC Science Data Archive (ISDA {http://integral.esac.esa.int/isda}) and the Planetary Science Archive (PSA {http://www.rssd.esa.int/psa}), including both the classical and the map-based Mars Express interfaces. Furthermore, the ESA VOSpec {http://esavo.esac.esa.int/vospecapp} spectra analysis tool is described, which allows users to access and display spectral information from VO resources (both real observational and theoretical spectra), including access to a lines database and recent analysis functionalities. In addition, we detail the first implementation of RISA (Remote Interface for Science Analysis), a web service providing remote users the ability to create fully configurable XMM-Newton data analysis workflows and to deploy and run them on the ESAC Grid. RISA makes full use of the interoperability provided by SIAP (Simple Image Access Protocol) services as data input, and at the same time its VO-compatible output can be used directly by general VO tools.

  14. Strongly Misaligned Triple System in SR 24 Revealed by ALMA

    NASA Astrophysics Data System (ADS)

    Fernández-López, M.; Zapata, L. A.; Gabbasov, R.

    2017-08-01

    We report the detection of the 1.3 mm continuum and the molecular emission of the disks of the young triple system SR 24 by analyzing subarcsecond archival observations from ALMA (the Atacama Large Millimeter/Submillimeter Array). We estimate the masses of the disks (0.025 M⊙ and 4 × 10^-5 M⊕ for SR24S and SR24N, respectively) and the dynamical masses of the protostars (1.5 M⊙ and 1.1 M⊙). A kinematic model of the SR24S disk fit to its C18O (2-1) emission allows us to develop an observational method to determine the tilt of a rotating and accreting disk. We derive the size, inclination, position angle, and sense of rotation of each disk, finding that they are strongly misaligned (108°) and possibly rotate in opposite directions as seen, in projection, from Earth. We compare the ALMA observations with 12CO SMA archival observations, which are more sensitive to extended structures. We find three extended structures and estimate their masses: a molecular bridge joining the disks of the system, a molecular gas reservoir associated with SR24N, and a gas streamer associated with SR24S. Finally, we discuss the possible origin of the misaligned SR 24 system, concluding that a closer inspection of the northern gas reservoir is needed to better understand it.
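Dynamical-mass estimates of this kind rest on Keplerian rotation, M = v²r/G. A minimal sketch, with a hypothetical rotation speed and radius chosen only to illustrate the scale of the numbers (these are not the fitted SR 24 values):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
AU = 1.496e11       # astronomical unit, m

def dynamical_mass_msun(v_kms, r_au):
    """Enclosed (dynamical) mass in solar masses from a Keplerian rotation
    speed v at radius r: M = v^2 r / G."""
    v_ms = v_kms * 1.0e3
    r_m = r_au * AU
    return v_ms * v_ms * r_m / (G * M_SUN)

# hypothetical values, for scale only: ~3 km/s at 100 au implies roughly 1 M_sun
m = dynamical_mass_msun(3.06, 100.0)
```

In practice the fit is done against a full position-velocity model of the line emission, but the enclosed-mass scaling above is what ties the observed rotation speeds to the protostellar masses.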

  15. Confirmation and characterization of young planetary companions hidden in the HST NICMOS archive

    NASA Astrophysics Data System (ADS)

    Pueyo, Laurent

    2013-10-01

    We propose to conduct WFC3 high contrast observations of six faint planetary candidates orbiting young {1 to 100 Myr} stars identified in archival HST NICMOS coronagraphic data as part of our team's program AR-12652. Such rare objects are of the utmost importance to comparative exo-planetology, as their physical properties reflect the initial conditions of still poorly constrained planetary formation mechanisms. Moreover, directly imaged systems are precious artifacts in the expanding exo-planetary treasure trove, as they are readily available for spectroscopic characterization. Our statistical analysis, which combines population synthesis models and empirical inspections of the entire NICMOS field of view for all sources observed in coronagraphic mode, almost guarantees that one of these six faint candidates is associated with its putative host star. We will conduct our observations in four near-infrared filters: F125W and F160W to establish the baseline luminosity of our candidates, and F127M and F139M to probe the depth of their water absorption features, characteristic of substellar/exo-planetary atmospheres. Because of the youth of our targets, this program, which requires only a modest 12 HST orbits, will almost certainly identify and image a young or adolescent exo-planet.

  16. JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC) data availability, version 1-94

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) archive at the Jet Propulsion Laboratory (JPL) includes satellite data sets for the ocean sciences and global-change research to facilitate multidisciplinary use of satellite ocean data. Parameters include sea-surface height, surface-wind vector, sea-surface temperature, atmospheric liquid water, and integrated water vapor. The JPL PO.DAAC is an element of the Earth Observing System Data and Information System (EOSDIS) and is the United States distribution site for Ocean Topography Experiment (TOPEX)/POSEIDON data and metadata.

  17. Digital Archiving: Where the Past Lives Again

    NASA Astrophysics Data System (ADS)

    Paxson, K. B.

    2012-06-01

    The process of digital archiving for variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools including a Step Magnitude Calculator and a Julian Date Calculator for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition software conversion is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
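A Julian Date calculator of the kind described can be sketched with the standard Gregorian-calendar conversion (the algorithm given by Meeus). This is a generic illustration, not the Excel tool itself:

```python
def julian_date(year, month, day, ut_hours=0.0):
    """Julian Date for a Gregorian calendar date (standard Meeus algorithm)."""
    if month <= 2:                      # treat Jan/Feb as months 13/14 of prior year
        year -= 1
        month += 12
    a = year // 100
    b = 2 - a + a // 4                  # Gregorian leap-year correction
    return (int(365.25 * (year + 4716)) + int(30.6001 * (month + 1))
            + day + b - 1524.5 + ut_hours / 24.0)

jd_j2000 = julian_date(2000, 1, 1, 12.0)   # 2451545.0, the J2000.0 epoch
```

For archival work the same routine converts historical civil dates (e.g. observations from before 1911) onto the uniform Julian Date scale used by the AAVSO International Database.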

  18. Automated Reduction and Calibration of SCUBA Archive Data Using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N.; Robson, E. I.; Tilanus, R. P. J.; Holland, W. S.

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used for investigating instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data with particular emphasis on the pointing observations. This is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on UKIRT and the JCMT.

  19. Constraints on Rational Model Weighting, Blending and Selecting when Constructing Probability Forecasts given Multiple Models

    NASA Astrophysics Data System (ADS)

    Higgins, S. M. W.; Du, H. L.; Smith, L. A.

    2012-04-01

    Ensemble forecasting on a lead time of seconds over several years generates a large forecast-outcome archive, which can be used to evaluate and weight "models". Challenges which arise as the archive becomes smaller are investigated: in weather forecasting one typically has only thousands of forecasts; however, those launched 6 hours apart are not independent of each other, nor is it justified to mix seasons with different dynamics. Seasonal forecasts, as from ENSEMBLES and DEMETER, typically have fewer than 64 unique launch dates; decadal forecasts fewer than eight, and long-range climate forecasts arguably none. It is argued that one does not weight "models" so much as entire ensemble prediction systems (EPSs), and that the marginal value of an EPS will depend on the other members in the mix. The impact of using different skill scores is examined in the limits of both very large forecast-outcome archives (thereby evaluating the efficiency of the skill score) and very small forecast-outcome archives (illustrating fundamental limitations due to sampling fluctuations and memory in the physical system being forecast). It is shown that blending with climatology (J. Bröcker and L. A. Smith, Tellus A, 60(4), 663-678, (2008)) tends to increase the robustness of the results; a new kernel dressing methodology (simply ensuring that the expected probability mass tends to lie outside the range of the ensemble) is also illustrated. Fair comparisons using seasonal forecasts from the ENSEMBLES project are used to illustrate the importance of these results with fairly small archives. The robustness of these results across the range of small, moderate and huge archives is demonstrated using imperfect models of perfectly known nonlinear (chaotic) dynamical systems. The implications these results hold for distinguishing the skill of a forecast from its value to a user of the forecast are discussed.
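The climatology-blending step can be illustrated with a toy example: probabilities from an overconfident "model" are mixed with the climatological probability and scored with the ignorance (logarithmic) score. All of the numbers, and the blending weight, are illustrative; in the paper's setting the weight would itself be fitted on the forecast-outcome archive.

```python
import math

def ignorance(probs, outcomes):
    """Mean ignorance score in bits: -log2 of the probability assigned
    to the event that actually occurred (binary outcomes)."""
    return -sum(math.log2(p if o else 1.0 - p)
                for p, o in zip(probs, outcomes)) / len(probs)

def blend(p_model, p_clim, alpha):
    """Mix model probabilities with climatology: alpha*p + (1 - alpha)*p_clim."""
    return [alpha * p + (1.0 - alpha) * p_clim for p in p_model]

# illustrative archive: an overconfident model and binary outcomes
p_model  = [0.95, 0.90, 0.05, 0.90, 0.10, 0.05]
outcomes = [1, 1, 0, 0, 1, 0]
p_clim = 0.5

raw     = ignorance(p_model, outcomes)                       # worse than climatology
blended = ignorance(blend(p_model, p_clim, 0.6), outcomes)   # beats climatology
```

The two busted forecasts dominate the raw score; pulling the probabilities toward climatology caps the penalty for any single bust, which is the robustness gain the abstract refers to.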

  20. NOMADS-NOAA Operational Model Archive and Distribution System

    Science.gov Websites


  1. A PDS Archive for Observations of Mercury's Na Exosphere

    NASA Astrophysics Data System (ADS)

    Backes, C.; Cassidy, T.; Merkel, A. W.; Killen, R. M.; Potter, A. E.

    2016-12-01

    We present a data product consisting of ground-based observations of Mercury's sodium exosphere. We have amassed a sizeable dataset of several thousand spectral observations of Mercury's exosphere from the McMath-Pierce solar telescope. Over the last year, a data reduction pipeline has been developed and refined to process and reconstruct these spectral images into low-resolution images of sodium D2 emission. This dataset, which extends over two decades, will provide an unprecedented opportunity to analyze the dynamics of Mercury's mid- to high-latitude exospheric emissions, which have long been attributed to solar wind ion bombardment. This large archive of observations will be of great use to the Mercury science community in studying the effects of space weather on Mercury's tenuous exosphere. When completely processed, the images in this dataset will show the observed spatial distribution of Na D2 in the Mercurian exosphere, provide per-pixel measurements of this sodium emission in units of kilorayleighs, and be available through NASA's Planetary Data System. The overall goal of the presentation is to provide the Planetary Science community with a clear picture of what information and data this archival product will make available.

  2. Land processes distributed active archive center product lifecycle plan

    USGS Publications Warehouse

    Daucsavage, John C.; Bennett, Stacie D.

    2014-01-01

    The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.

  3. Chromospheric heating by acoustic shocks - A confrontation of GHRS observations of Alpha Tauri (K5 III) with ab initio calculations

    NASA Technical Reports Server (NTRS)

    Judge, P. G.; Cuntz, M.

    1993-01-01

    We compare ab initio calculations of semiforbidden C II line profiles near 2325 A with recently published observations of the inactive red giant Alpha Tau (K5 III) obtained using the GHRS on board the Hubble Space Telescope. Our one-dimensional, time-dependent calculations assume that the chromosphere is heated by stochastic acoustic shocks generated by photospheric convection. We calculate various models using results from traditional (mixing length) convection zone calculations as input to hydrodynamical models. The semiforbidden C II line profiles and ratios provide sensitive diagnostics of chromospheric velocity fields, electron densities, and temperatures. We identify major differences between observed and computed line profiles which are related to basic gas dynamics and which are probably not due to technical modeling restrictions. If the GHRS observations are representative of chromospheric conditions at all epochs, then one (or more) of our model assumptions must be incorrect. Several possibilities are examined. We predict time variability of semiforbidden C II lines for comparison with observations. Based upon data from the IUE archives, we argue that photospheric motions associated with supergranulation or global pulsation modes are unimportant in heating the chromosphere of Alpha Tau.

  4. SPRUCE Representing Northern Peatland Microtopography and Hydrology within the Community Land Model: Modeling Archive

    DOE Data Explorer

    Shi, X. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Thornton, P. E. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Ricciuto, D. M. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Hanson, P. J. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Mao, J. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Sebestyen, S. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Griffiths, N. A. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.; Bisht, G. [Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tennessee, U.S.A.

    2016-09-01

    Here we provide model code, inputs, outputs and evaluation datasets for a new configuration of the Community Land Model (CLM) for SPRUCE, which includes a fully prognostic water table calculation. Our structural and process changes to CLM focus on the modifications needed to represent the hydrologic cycle of bog environments with perched water tables, as well as the distinct hydrologic dynamics and vegetation communities of the raised-hummock and sunken-hollow microtopography characteristic of SPRUCE and other peatland bogs. The modified model was parameterized and independently evaluated against observations from an ombrotrophic raised-dome bog in northern Minnesota (S1-Bog), the site of the Spruce and Peatland Responses Under Climatic and Environmental Change (SPRUCE) experiment.
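As a caricature of what a prognostic water-table calculation involves, the toy daily bucket update below tracks the water-table position from precipitation, evapotranspiration, and threshold runoff at a spill point. This is not the CLM implementation; the porosity and spill depth are hypothetical parameters for illustration only.

```python
def step_water_table(z_mm, precip_mm, et_mm, porosity=0.3, spill_mm=0.0):
    """Advance the water-table position by one day. z_mm is the height relative
    to the hollow surface (negative = below ground); storage changes are scaled
    by a drainable porosity, and water ponding above the spill point runs off."""
    z_mm += (precip_mm - et_mm) / porosity
    return min(z_mm, spill_mm)

z = -120.0                                   # start 12 cm below the hollow surface
for precip, et in [(10.0, 2.0), (0.0, 3.0), (25.0, 2.0)]:  # mm/day, illustrative
    z = step_water_table(z, precip, et)
```

A prognostic scheme of this flavor lets the water table respond to the daily water balance rather than being prescribed, which is the behavior the CLM modifications aim to capture for the perched S1-Bog system.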

  5. Atmospheric numerical modeling resource enhancement and model convective parameterization/scale interaction studies

    NASA Technical Reports Server (NTRS)

    Cushman, Paula P.

    1993-01-01

    Research will be undertaken in this contract in the area of Modeling Resource and Facilities Enhancement, including computer, technical and educational support to NASA investigators to facilitate model implementation, execution and analysis of output; facilities linking USRA and the NASA/EADS Computer System as well as resident workstations in ESAD; and a centralized location for documentation, archival and dissemination of modeling information pertaining to NASA's program. Additional research will be undertaken in the area of Numerical Model Scale Interaction/Convective Parameterization Studies, including comparison of cloud and rain systems and convective-scale processes between model simulations and observations, and incorporation of these and related research findings in at least two refereed journal articles.

  6. Adaptability in the Development of Data Archiving Services at Johns Hopkins University

    NASA Astrophysics Data System (ADS)

    Petters, J.; DiLauro, T.; Fearon, D.; Pralle, B.

    2015-12-01

    Johns Hopkins University (JHU) Data Management Services provides archiving services for institutional researchers through the JHU Data Archive, thereby increasing the access to and use of their research data. From its inception our unit's archiving service has evolved considerably. While some of these changes have been internally driven so that our unit can archive quality data collections more efficiently, we have also developed archiving policies and procedures on the fly in response to researcher needs. Providing our archiving services for JHU research groups from a variety of research disciplines has surfaced different sets of expectations and needs. We have used each interaction to help us refine our services and quickly satisfy the researchers we serve (following the first agile principle). Here we discuss the development of our newest archiving service model, its implementation over the past several months, and the processes by which we have continued to refine and improve our archiving services since its implementation. Through this discussion we will illustrate the benefits of planning, structure and flexibility in the development of archiving services that maximize the potential value of research data. We will describe interactions with research groups, including those from environmental engineering and international health, and how we were able to rapidly modify and develop our archiving services to meet their needs (e.g. in an 'agile' way). For example, our interactions with both of these research groups led first to discussion in regular standing meetings and eventually to the development of new archiving policies and procedures. These policies and procedures centered on limiting access to archived research data while associated manuscripts progress through peer review and publication.

  7. Archive of radar observations of meteors in Tomsk in 1965-1966. (Russian Title: Архив радиолокационных наблюдений метеоров в Томске в 1965-1966 гг.)

    NASA Astrophysics Data System (ADS)

    Ryabova, G. O.

    2010-12-01

    The archive of data from radar observations of the Geminid, Quadrantid, Daytime Arietid, Perseid, Ursid, Lyrid, Orionid and Leonid meteor showers in Tomsk in 1965-1966 is described. In certain cases, registrations of the sporadic background before and after a shower also exist. The primary records of echo registrations contain the time of registration and the distance, duration and amplitude of the echo, allowing the corresponding distributions essential for calculating the incident flux density of meteors to be obtained. Work on digitizing the archive has begun.

  8. The Climate Hazards group InfraRed Precipitation (CHIRP) with Stations (CHIRPS): Development and Validation

    NASA Astrophysics Data System (ADS)

    Peterson, P.; Funk, C. C.; Husak, G. J.; Pedreros, D. H.; Landsfeld, M.; Verdin, J. P.; Shukla, S.

    2013-12-01

    CHIRP and CHIRPS are new quasi-global precipitation products with daily to seasonal time scales, a 0.05° resolution, and a 1981 to near real-time period of record. Developed by the Climate Hazards Group at UCSB and scientists at the U.S. Geological Survey Earth Resources Observation and Science Center specifically for drought early warning and environmental monitoring, CHIRPS provides moderate-latency precipitation estimates that place observed hydrologic extremes in their historic context. Three main types of information are used in CHIRPS: (1) global 0.05° precipitation climatologies, (2) time-varying grids of satellite-based precipitation estimates, and (3) in situ precipitation observations. CHIRP: The global grids of long-term (1980-2009) average precipitation were estimated for each month based on station data, averaged satellite observations, and physiographic parameters. The 1981-present time-varying grids of satellite precipitation were derived from spatially varying regression models based on pentadal cold cloud duration (CCD) values and TRMM V7 training data. The CCD time series were derived from the CPC and NOAA B1 datasets. Pentadal CCD-percent anomaly values were multiplied by pentadal climatology fields to produce low-bias pentadal precipitation estimates. CHIRPS: The CHG station blending procedure uses the satellite-observed spatial covariance structure to assign relative weights to neighboring stations and the CHIRP values. The CHIRPS blending procedure is based on the expected correlation between precipitation at a given target location and precipitation at the locations of the neighboring observation stations. These correlations are estimated using the CHIRP fields. The CHG has developed an extensive archive of in situ daily, pentadal and monthly precipitation totals. The CHG database has over half a billion daily rainfall observations since 1980 and another half billion before 1980.
Most of these observations come from four sets of global climate observations: the monthly Global Historical Climate Network version 2 archive, the daily Global Historical Climate Network archive, the Global Summary of the Day dataset (GSOD), and the daily Global Telecommunication System (GTS) archive provided by NOAA's Climate Prediction Center (CPC). A screening procedure was developed to flag and remove potential false zeros from the daily data, since these potentially spurious data can artificially suppress rainfall totals. Validation: Our validation focused on precipitation products with global coverage, long periods of record and near real-time availability: CHIRP, CHIRPS, CPC-Unified, CFS Reanalysis and ECMWF datasets were compared to GPCC and high quality datasets from Uganda, Colombia and the Sahel. The CHIRP and CHIRPS are shown to have low systematic errors (bias) and low mean absolute errors. Analyses in Uganda, Colombia and the Sahel indicate that the ECMWF, CPC-Unified and CFS-Reanalysis have large inhomogeneities, making them unsuitable for drought monitoring. The CHIRPS performance appears quite similar to research quality products like the GPCC and GPCP, but with higher resolution and lower latency.
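    Two of the processing steps described above, scaling the pentad climatology by the CCD percent anomaly and screening potentially false zeros out of daily station reports, can be sketched roughly as follows. The function names, the neighbor-mean input, and the 10 mm threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

def chirp_pentad_estimate(ccd_percent_anomaly, pentad_climatology):
    """Scale the pentad precipitation climatology by the CCD-derived
    percent anomaly (expressed as a fraction, 1.0 = climatological CCD)."""
    return np.asarray(ccd_percent_anomaly) * np.asarray(pentad_climatology)

def flag_false_zeros(daily_mm, neighbor_mean_mm, threshold_mm=10.0):
    """Mask reported daily zeros whose neighboring stations saw
    substantial rain, since spurious zeros artificially suppress
    rainfall totals. The neighbor-mean input and the 10 mm threshold
    are illustrative assumptions."""
    daily = np.asarray(daily_mm, dtype=float)
    suspect = (daily == 0.0) & (np.asarray(neighbor_mean_mm) > threshold_mm)
    return np.where(suspect, np.nan, daily)
```

    The masked values would then be excluded from the station totals that feed the blending step.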

  9. Constraining the temperature history of the past millennium using early instrumental observations

    NASA Astrophysics Data System (ADS)

    Brohan, P.

    2012-12-01

    The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree rings, ice cores, etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and by systematic differences between proxy reconstructions and model simulations, notably large differences in multi-decadal variability between proxy reconstructions and substantial uncertainties in the effect of volcanic eruptions. Because the difference between the estimates extends into the relatively recent period of the early nineteenth century, it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. By constraining key aspects of the reconstructions and simulations, instrumental observations, inevitably from a limited period, can reduce reconstruction uncertainty throughout the millennium. A considerable quantity of early instrumental observations is preserved in the world's archives. One organisation which systematically made observations and collected the results was the English East India Company (EEIC), and 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure have been preserved in the British Library. Similar records from voyages of exploration and scientific investigation are preserved in the published literature and in national archives. Some of these records have been extracted and digitised, providing hundreds of thousands of new weather records that offer an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. 
The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides a powerful out-of-sample validation for the proxy reconstructions, supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this; such simulations are unlikely to be accurate in this respect.

  10. Observations and Analysis of the GK Persei Nova Shell and its "Jet-like" Feature

    NASA Astrophysics Data System (ADS)

    Harvey, E.; Redman, M. P.; Boumis, P.; Akras, S.

    2015-12-01

    GK Persei (1901, the "Firework Nebula") is an old but bright nova remnant that offers a chance to probe the physics and kinematics of nova shells. The kinematics in new and archival long-slit optical echelle spectra were analysed using the SHAPE software. New imaging from the Aristarchos telescope continues to track the proper motion, extinction and structural evolution of the knots, which have been observed intermittently over several decades. We present, for the first time, kinematical constraints on a large faint "jet" feature that was previously detected beyond the shell boundary. These observational constraints allow for the generation of models for individual knots, interactions within knot complexes, and the "jet" feature. Taken together, and accounting for dwarf-nova-accelerated winds emanating from the central source, these data and models give a deeper insight into the GK Per nova remnant as a whole.

  11. National Space Science Data Center Information Model

    NASA Astrophysics Data System (ADS)

    Bell, E. V.; McCaslin, P.; Grayzeck, E.; McLaughlin, S. A.; Kodis, J. M.; Morgan, T. H.; Williams, D. R.; Russell, J. L.

    2013-12-01

    The National Space Science Data Center (NSSDC) was established by NASA in 1964 to provide for the preservation and dissemination of scientific data from NASA missions. It has evolved to support distributed, active archives that were established in the Planetary, Astrophysics, and Heliophysics disciplines through a series of Memoranda of Understanding. The disciplines took over responsibility for working with new projects to acquire and distribute data for community researchers while the NSSDC remained vital as a deep archive. Since 2000, NSSDC has been using the Archive Information Package to preserve data over the long term. As part of its effort to streamline the ingest of data into the deep archive, the NSSDC developed and implemented a data model of desired and required metadata in XML. This process, in use for roughly five years now, has been successfully used to support the identification and ingest of data into the NSSDC archive, most notably those data from the Planetary Data System (PDS) submitted under PDS3. A series of software packages (X-ware) were developed to handle the submission of data from the PDS nodes utilizing a volume structure. An XML submission manifest is generated at the PDS provider site prior to delivery to NSSDC. The manifest ensures the fidelity of PDS data delivered to NSSDC. Preservation metadata is captured in an XML object when NSSDC archives the data. With the recent adoption by the PDS of the XML-based PDS4 data model, there is an opportunity for the NSSDC to provide additional services to the PDS such as the preservation, tracking, and restoration of individual products (e.g., a specific data file or document), which was unfeasible in the previous PDS3 system. 
The NSSDC is modifying and further streamlining its data ingest process to take advantage of the PDS4 model, an important consideration given the ever-increasing amount of data being generated and archived by orbiting missions at the Moon and Mars, other active projects such as BRRISON, LADEE, MAVEN, InSight and OSIRIS-REx, and ground-based observatories. Streamlining the ingest process also benefits the continued processing of PDS3 data. We will report on our progress and status.

  12. Rich Support for Heterogeneous Polar Data in RAMADDA

    NASA Astrophysics Data System (ADS)

    McWhirter, J.; Crosby, C. J.; Griffith, P. C.; Khalsa, S.; Lazzara, M. A.; Weber, W. J.

    2013-12-01

    Difficult to navigate environments, tenuous logistics, strange forms, deeply rooted cultures - these are all experiences shared by Polar scientists in the field as well as by the developers of the underlying data management systems back in the office. Among the key data management challenges that Polar investigations present are the heterogeneity and complexity of the data that are generated. Polar regions are intensely studied across many science domains through a variety of techniques - satellite and aircraft remote sensing, in-situ observation networks, modeling, sociological investigations, and extensive PI-driven field project data collection. While many data management efforts focus on large homogeneous collections of data targeting specific science domains (e.g., satellite, GPS, modeling), multi-disciplinary efforts that focus on Polar data need to be able to address a wide range of data formats, science domains and user communities. There is growing use of the RAMADDA (Repository for Archiving, Managing and Accessing Diverse Data) system to manage and provide services for Polar data. RAMADDA is a freely available extensible data repository framework that supports a wide range of data types and services to allow the creation, management, discovery and use of data and metadata. The broad range of capabilities provided by RAMADDA and its extensibility make it well-suited as an archive solution for Polar data. RAMADDA can run in a number of diverse contexts - as a centralized archive, at local institutions, and even on an investigator's laptop in the field, providing in-situ metadata and data management services. We are actively developing archives and support for a number of Polar initiatives: - NASA Arctic-Boreal Vulnerability Experiment (ABoVE): ABoVE is a long-term multi-instrument field campaign that will make use of a wide range of data. 
We have developed an extensive ontology of program, project and site metadata in RAMADDA, in support of the ABoVE Science Definition Team and Project Office. See: http://above.nasa.gov - UNAVCO Terrestrial Laser Scanning (TLS): UNAVCO's Polar program provides support for terrestrial laser scanning field projects. We are using RAMADDA to archive these field projects, with over 40 projects ingested to date. - NASA-IceBridge: As part of the NASA LiDAR Access System (NLAS) project, RAMADDA supports numerous airborne and satellite LiDAR data sets - GLAS, LVIS, ATM, Paris, McORDS, etc. - Antarctic Meteorological Research Center (AMRC): Satellite and surface observation network - Support for numerous other data from AON-ACADIS, Greenland GC-Net, NOAA-GMD, AmeriFlux, etc. In this talk we will discuss some of the challenges that Polar data brings to geoinformatics and describe the approaches we have taken to address these challenges in RAMADDA.

  13. The CGM of Massive Galaxies: Where Cold Gas Goes to Die?

    NASA Astrophysics Data System (ADS)

    Howk, Jay

    2017-08-01

    We propose to survey the cold HI content and metallicity of the circumgalactic medium (CGM) around 50 (45 new, 5 archival) z ~ 0.5 Luminous Red Galaxies (LRGs) to directly test a fundamental prediction of galaxy assembly models: that cold, metal-poor accretion does not survive to the inner halos of very massive galaxies. Accretion and feedback through the CGM play key roles in our models of the star formation dichotomy in galaxies. Low-mass galaxies are thought to accrete gas in cold streams, while high-mass galaxies host hot, dense halos that heat incoming gas and prevent its cooling, thereby quenching star formation. HST/COS has provided evidence for cold, metal-poor streams in the halos of star-forming galaxies (consistent with cold accretion). Observations have also demonstrated the presence of cool gas in the halos of passive galaxies, a potential challenge to the cold/hot accretion model. Our proposed observations will target the most massive galaxies and address the origin of the cool CGM gas by measuring its metallicity. This experiment is enabled by our novel approach to deriving metallicities, allowing the use of much fainter QSOs. It cannot be done with archival data, as these rare systems are not often probed along random sight lines. The H I column density (and metallicity) measurements require access to the UV. The large size of our survey is crucial to robustly assess whether the CGM in these galaxies is distinct from that of star-forming systems, a comparison that provides the most stringent test of cold-mode accretion/quenching models to date. Conversely, widespread detections of metal-poor gas in these halos would seriously challenge the prevailing theory.

  14. Understanding Young Exoplanet Analogs with WISE

    NASA Astrophysics Data System (ADS)

    Rice, Emily

    We propose to tackle outstanding questions about the fundamental properties of young brown dwarfs, which are atmospheric analogs to massive gas giant exoplanets, using public archive data from the Wide-field Infrared Survey Explorer (WISE) combined with our extensive dataset of optical and near-infrared observations, including spectra, proper motions, and parallaxes. Using WISE data we will construct color-color diagrams, color-magnitude diagrams, and spectral energy distributions for our sample of candidate young brown dwarfs. We will fully characterize the spectral properties of the candidates and evaluate their membership in nearby young moving groups in order to obtain independent age estimates. The practical outcomes of this project will allow the research community to use observed colors and spectra to reliably constrain the properties - including effective temperature, gravity, and dust/cloud properties - of both brown dwarfs and gas giant exoplanets. We will also search for new young brown dwarfs in the WISE archive using colors and proper motions. The expanded sample of young brown dwarfs will be used to create a self-contained feedback loop to identify and address the shortcomings of cool atmosphere models and low-mass evolutionary tracks, both of which are already being used to infer the properties of massive exoplanets. Disentangling the effects of physical parameters on the observed properties of young brown dwarfs is directly relevant to studies of exoplanets. Direct observations of exoplanets are currently very limited, and young brown dwarfs are the laboratories in which we can solve existing problems before the onslaught of new observations from instruments capable of directly imaging exoplanets, including the Gemini Planet Imager, Project 1640 at the Palomar Observatory, SPHERE on the VLT, and the James Webb Space Telescope. 
This project addresses the goal of the NASA Science Mission Directorate to discover how the universe works; in particular, the results of our work will improve our understanding of objects at the intersection of stars and planets and be directly applicable to understanding the atmospheres of directly-imaged exoplanets. The assembled investigators are the absolute best team to accomplish this work. They have extensive and diverse observational experience in astrometry, photometry, and spectroscopy from the optical through the mid-IR, spanning nearly the entire spectral energy distribution of young brown dwarfs and encompassing their most fundamental observational properties. They have considerable experience mining large photometric catalogs and identifying low-gravity very low mass objects. The team maintains collaborations with two groups actively modelling brown dwarf and exoplanet atmospheres and interior evolution. The proposed research organically combines several ongoing projects into a cohesive program that will efficiently incorporate WISE data to disentangle the ambiguous and interdependent physical properties of young brown dwarfs. As a result of the team's previous observational projects, we have assembled a dataset that positions us to best interpret WISE observations of brown dwarfs and identify new young brown dwarfs in the WISE archive. A significant parallax program is ongoing, and all of the computing resources and many of the analysis tools are already in place, including several well-tested pipelines for data reduction and analysis and model comparisons. The team will incorporate undergraduate students in the project through an existing NSF-funded REU program.

  15. The Massive Star Content of Circumnuclear Star Clusters in M83

    NASA Astrophysics Data System (ADS)

    Wofford, A.; Chandar, R.; Leitherer, C.

    2011-06-01

    The circumnuclear starburst of M83 (NGC 5236), the nearest such example (4.6 Mpc), constitutes an ideal site for studying the massive star IMF at high metallicity (12+log[O/H]=9.1±0.2, Bresolin & Kennicutt 2002). We analyzed archival HST/STIS FUV imaging and spectroscopy of 13 circumnuclear star clusters in M83. We compared the observed spectra with two types of single stellar population (SSP) models: semi-empirical models, which are based on an empirical library of Galactic O and B stars observed with IUE (Robert et al. 1993), and theoretical models, which are based on a new theoretical UV library of hot massive stars described in Leitherer et al. (2010) and computed with WM-Basic (Pauldrach et al. 2001). The models were generated with Starburst99 (Leitherer & Chen 2009). We derived the reddenings, ages, and masses of the clusters from model fits to the FUV spectroscopy, as well as from optical HST/WFC3 photometry.
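    A grid-based fit of SSP models to an observed spectrum, of the kind used above to derive reddenings, ages, and masses, can be sketched as a chi-square minimization over (age, reddening) model pairs with an analytic flux normalization standing in for the mass. This is a generic illustration, not the authors' actual pipeline:

```python
import numpy as np

def fit_cluster(obs_flux, obs_err, model_grid):
    """Pick the SSP model (an (age_myr, ebv) key) that minimizes chi^2
    against the observed spectrum; the best-fit scale factor is a
    proxy for cluster mass. `model_grid` maps parameter tuples to
    model flux arrays on the same wavelength grid. Illustrative only."""
    best, best_chi2, best_scale = None, np.inf, None
    for params, model in model_grid.items():
        # analytic least-squares scale factor for this model
        scale = np.sum(obs_flux * model / obs_err**2) / np.sum(model**2 / obs_err**2)
        chi2 = np.sum(((obs_flux - scale * model) / obs_err) ** 2)
        if chi2 < best_chi2:
            best, best_chi2, best_scale = params, chi2, scale
    return best, best_scale, best_chi2
```

    In practice the model fluxes would first be reddened by each trial E(B-V) before comparison.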

  16. Mega-precovery and data mining of near-Earth asteroids and other Solar System objects

    NASA Astrophysics Data System (ADS)

    Popescu, M.; Vaduvescu, O.; Char, F.; Curelaru, L.; Euronear Team

    2014-07-01

    The vast collection of CCD images and photographic plate archives available from world-wide archives and telescopes is still insufficiently exploited. Within the EURONEAR project we designed two data-mining tools to search very large collections of archives for images which serendipitously include known asteroids or comets in their field, with the main aims of extending the arcs and improving the orbits. To this end, ''Precovery'' (published in 2008, aiming to search for all known NEAs in a few archives via IMCCE's SkyBoT server) and ''Mega-Precovery'' (published in 2010, querying the IMCCE's Miriade server) were made available to the community via the EURONEAR website (euronear.imcce.fr). Briefly, Mega-Precovery aims to search for one or a few known asteroids or comets in a mega-collection including millions of images from some of the largest observatory archives: ESO (15 instruments served by the ESO Archive, including the VLT), NVO (8 instruments served by the U.S. NVO Archive), CADC (11 instruments, including HST and Gemini), plus other important instrument archives: SDSS, CFHTLS, INT-WFC, Subaru-SuprimeCam and AAT-WFI, adding up to 39 instruments and 4.3 million images (Mar 2014), and our Mega-Archive is growing. Here we present some of the most important results obtained with our data-mining software and some newly planned search options of Mega-Precovery. In particular, the following capabilities will be added soon: the ING archive (all imaging cameras) will be included, and new search options will be made available (such as query by orbital elements and by observations) to be able to target new Solar System objects such as Virtual Impactors, bolides, planetary satellites and TNOs (besides the comets added recently). In order to better characterize the archives, we introduce the ''AOmegaA'' factor (archival etendue), proportional to the AOmega (etendue) and the number of images in an archive. 
With the aim of enlarging the Mega-Archive database, we invite observatories (particularly those storing their images online, and also those that own plate archives which could be scanned on request) to contact us in order to add their instrument archives (consisting of an ASCII file with telescope pointings in a simple format) to our Mega-Precovery open project. In the future, we intend to synchronise our service with the Virtual Observatory.
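    The ''AOmegaA'' (archival etendue) factor described above is proportional to the instrument etendue (collecting area A times field of view Omega) and the number of images in the archive. A minimal sketch of ranking archives by it might look like this; all archive names and numbers are hypothetical:

```python
def aomega_a(aperture_m2, fov_deg2, n_images):
    """Archival etendue: instrument etendue (A * Omega) scaled by the
    number of archived images, a rough measure of how much sky an
    archive has covered for serendipitous asteroid searches."""
    return aperture_m2 * fov_deg2 * n_images

# Hypothetical instruments, for illustration only.
archives = {
    "WideCam": aomega_a(1.0, 1.0, 500_000),
    "NarrowCam": aomega_a(8.0, 0.01, 1_000_000),
}
# Rank archives by archival etendue, largest first.
ranked = sorted(archives, key=archives.get, reverse=True)
```

    The point of the factor is visible in the toy numbers: a small wide-field camera can outrank a large narrow-field one once image counts are folded in.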

  17. GORGONA - the characteristic of the software system.

    NASA Astrophysics Data System (ADS)

    Artim, M.; Zejda, M.

    A description of the new software system is given. The GORGONA system was established for the processing, creation and administration of archives of periodic variable star observations, of observers, and of the observed variable stars.

  18. VizieR Online Data Catalog: Swift-UVOT obs. analysis of 29 SNe Ia (Brown+, 2017)

    NASA Astrophysics Data System (ADS)

    Brown, P. J.; Landez, N. J.; Milne, P. A.; Stritzinger, M. D.

    2017-10-01

    Swift/UVOT has observed over 500 SNe of all types in its 12 years of operation (see Brown+ 2015JHEAp...7..111B for a review of the first 10 years). Most of the observations use six UV and optical filters. All photometry comes from the Swift Optical/Ultraviolet Supernova Archive (SOUSA; Brown+ 2014Ap&SS.354...89B) and is available at the Swift SN website and the Open Supernova Catalog (Guillochon+ 2017ApJ...835...64G). (1 data file).

  19. Fault-tolerant back-up archive using an ASP model for disaster recovery

    NASA Astrophysics Data System (ADS)

    Liu, Brent J.; Huang, H. K.; Cao, Fei; Documet, Luis; Sarti, Dennis A.

    2002-05-01

    A single point of failure in PACS during a disaster scenario is the main archive storage and server. When a major disaster occurs, it is possible to lose an entire hospital's PACS data. Few current PACS archives feature disaster recovery, and those designs are limited at best. Their drawbacks include the frequency with which the back-up is physically moved to an offsite facility, the operational costs of maintaining the back-up, and the difficulty of performing the back-up consistently and efficiently and of recovering the PACS image data. This paper describes a novel approach towards a fault-tolerant solution for disaster recovery of short-term PACS image data using an Application Service Provider (ASP) model for service. The ASP back-up archive provides instantaneous, automatic backup of acquired PACS image data and instantaneous recovery of stored PACS image data, all at a low operational cost. A back-up archive server and RAID storage device are implemented offsite from the main PACS archive location. In the example of this particular hospital, it was determined that at least two months' worth of PACS image exams needed to be backed up. Clinical data from the hospital PACS are sent to this ASP storage server in parallel with the exams being archived in the main server. A disaster scenario was simulated and the PACS exams were sent from the offsite ASP storage server back to the hospital PACS. Initially, connectivity between the main archive and the ASP storage server is established via a T-1 connection. In the future, other more cost-effective means of connectivity, such as Internet2, will be researched. A disaster scenario was initiated, and the disaster recovery process using the ASP back-up archive server was successful in repopulating the clinical PACS within a short period of time. The ASP back-up archive was able to recover two months of PACS image data for comparison studies with no complex operational procedures. 
Furthermore, no image data loss was encountered during the recovery.
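    The parallel-store pattern described above (each exam written to the ASP back-up at the same time it is archived in the main server, with recovery repopulating the main archive from the back-up) can be sketched as follows; the class and method names are illustrative, not from the paper:

```python
from concurrent.futures import ThreadPoolExecutor

class DualArchive:
    """Sketch of parallel archiving: every acquired exam is stored in
    the main PACS archive and the offsite ASP back-up simultaneously,
    so the back-up is always current without a separate batch job."""

    def __init__(self):
        self.main = {}    # stands in for the hospital PACS archive
        self.backup = {}  # stands in for the offsite ASP archive

    def store(self, exam_id, image_data):
        # Write to both archives in parallel (e.g. over a T-1 link).
        with ThreadPoolExecutor(max_workers=2) as pool:
            pool.submit(self.main.__setitem__, exam_id, image_data)
            pool.submit(self.backup.__setitem__, exam_id, image_data)

    def recover(self):
        """Disaster recovery: repopulate the main archive from the
        back-up and report how many exams were restored."""
        self.main = dict(self.backup)
        return len(self.main)
```

    In a real deployment the two stores would be DICOM archives on separate sites rather than in-memory dictionaries, but the control flow is the same.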

  20. Data dictionary and formatting standard for dissemination of geotechnical data

    USGS Publications Warehouse

    Benoit, J.; Bobbitt, J.I.; Ponti, D.J.; Shimel, S.A.; ,

    2004-01-01

    A pilot system for archiving and web dissemination of geotechnical data collected and stored by various agencies is currently under development. Part of the scope of this project, sponsored by the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) and by the Pacific Earthquake Engineering Research Center (PEER) Lifelines Program, is the development of a data dictionary and formatting standard. This paper presents the data model along with the basic structure of the data dictionary tables for this pilot system.

  1. New GOES High-Resolution Magnetic Measurements and their Contribution to Understanding Magnetospheric Particle Dynamics

    NASA Astrophysics Data System (ADS)

    Redmon, R. J.; Loto'aniu, P. T. M.; Boudouridis, A.; Chi, P. J.; Singer, H. J.; Kress, B. T.; Rodriguez, J. V.; Abdelqader, A.; Tilton, M.

    2017-12-01

    The era of NOAA observations of the geomagnetic field started with SMS-1 in May 1974 and continues to this day with GOES-13-16 (on-orbit). We describe the development of a new 20+ year archive of science-quality, high-cadence geostationary measurements of the magnetic field from eight NOAA spacecraft (GOES-8 through GOES-15), the status of GOES-16 and new scientific results using these data. GOES magnetic observations provide an early warning of impending space weather, are the core geostationary data set used for the construction of magnetospheric magnetic models, and can be used to estimate electromagnetic wave power in frequency bands important for plasma processes. Many science grade improvements are being made across the GOES archive to unify the format and content from GOES-8 through the new GOES-R series (with the first of that series launched on November 19, 2016). A majority of the 2-Hz magnetic observations from GOES-8-12 have never before been publicly accessible due to processing constraints. Now, a NOAA Big Earth Data Initiative project is underway to process these measurements starting from original telemetry records. Overall the new archive will include vector measurements in geophysically relevant coordinates (EPN, GSM, and VDH), comprehensive documentation, highest temporal cadence, best calibration parameters, recomputed means, updated quality flagging, full spacecraft ephemeris information, a unified standard format and public access. We are also developing spectral characterization tools for estimating power in standard frequency bands (up to 1 Hz for G8-15), and detecting ULF waves related to field-line resonances. We present the project status and findings, including in-situ statistical and extreme ULF event properties, and case studies where the ULF oscillations along the same field line were observed simultaneously by GOES near the equator in the magnetosphere, the ST-5 satellites at low altitudes, and ground magnetometer stations. 
For event studies, we find that the wave amplitude of poloidal oscillations is amplified at low altitudes but attenuated on the ground, confirming the theoretical predictions of wave propagation from the magnetosphere to the ground. We include examples of GOES-16 particle flux and magnetic field observations illustrating complex particle dynamics.
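    As a rough illustration of the spectral characterization described above, integrated power in a ULF band can be estimated from a 2-Hz magnetometer series with a plain periodogram. The function name and the Pc5-range band limits are assumptions for illustration; the actual GOES tools are more sophisticated (windowing, averaging, calibration):

```python
import numpy as np

def band_power(b_nt, fs=2.0, band=(2e-3, 7e-3)):
    """Integrated periodogram power (nT^2) of one magnetic-field
    component within a ULF band; the default limits roughly span
    the Pc5 range. A single unwindowed periodogram is used here
    purely for brevity."""
    b = np.asarray(b_nt, dtype=float)
    b = b - b.mean()                       # remove the background field
    freqs = np.fft.rfftfreq(b.size, d=1.0 / fs)
    psd = (np.abs(np.fft.rfft(b)) ** 2) / (fs * b.size)
    psd[1:-1] *= 2.0                       # fold to a one-sided spectrum
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return psd[sel].sum() * (freqs[1] - freqs[0])
```

    Comparing such band powers along a field line (geostationary orbit, low altitude, ground) is one way to quantify the amplification and attenuation of the poloidal oscillations mentioned above.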

  2. Supporting users through integrated retrieval, processing, and distribution systems at the land processes distributed active archive center

    USGS Publications Warehouse

    Kalvelage, T.; Willems, Jennifer

    2003-01-01

    The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage and distribute Earth observation data to the broadest possible user community is discussed. A number of integrated retrieval, processing and distribution capabilities are explained, the value of these functions to users is described, and potential future improvements are laid out. Users want the retrieval, processing and archiving systems to be integrated so that they can get the data they want in the format and through the delivery mechanism of their choice.

  3. (abstract) Satellite Physical Oceanography Data Available From an EOSDIS Archive

    NASA Technical Reports Server (NTRS)

    Digby, Susan A.; Collins, Donald J.

    1996-01-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory archives and distributes data as part of the Earth Observing System Data and Information System (EOSDIS). Products available from JPL are largely satellite derived and include sea-surface height, surface-wind speed and vectors, integrated water vapor, atmospheric liquid water, sea-surface temperature, heat flux, and in-situ data as they pertain to satellite data. Much of the data is global and spans fourteen years. Email access, a WWW site, product catalogs, and FTP capabilities are provided. Data are free of charge.

  4. Metrically preserving the USGS aerial film archive

    USGS Publications Warehouse

    Moe, Donald; Longhenry, Ryan

    2013-01-01

    Since 1972, the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota, has provided film-based products to the public. EROS is home to an archive of 12 million frames of analog photography ranging from 1937 to the present. The archive contains collections from both aerial and satellite platforms including programs such as the National High Altitude Program (NHAP), National Aerial Photography Program (NAPP), U.S. Antarctic Resource Center (USARC), Declass 1 (CORONA, ARGON, and LANYARD), Declass 2 (KH-7 and KH-9), and Landsat (1972–1992, Landsat 1–5).

  5. Development of public science archive system of Subaru Telescope. 2

    NASA Astrophysics Data System (ADS)

    Yamamoto, Naotaka; Noda, Sachiyo; Taga, Masatoshi; Ozawa, Tomohiko; Horaguchi, Toshihiro; Okumura, Shin-Ichiro; Furusho, Reiko; Baba, Hajime; Yagi, Masafumi; Yasuda, Naoki; Takata, Tadafumi; Ichikawa, Shin-Ichi

    2003-09-01

    We report various improvements in a public science archive system, SMOKA (Subaru-Mitaka-Okayama-Kiso Archive system). We have developed a new interface to search observational data of minor bodies in the solar system. In addition, other improvements are summarized: (1) searching frames by specifying wavelength directly, (2) finding calibration data sets automatically, (3) browsing data on weather, humidity, and temperature, which provide information on image quality, (4) providing quick-look images of OHS/CISCO and IRCS, and (5) including the data from OAO HIDES (HIgh Dispersion Echelle Spectrograph).

  6. Citations to Web pages in scientific articles: the permanence of archived references.

    PubMed

    Thorp, Andrea W; Schriger, David L

    2011-02-01

    We validate the use of archiving Internet references by comparing the accessibility of published uniform resource locators (URLs) with corresponding archived URLs over time. We scanned the "Articles in Press" section in Annals of Emergency Medicine from March 2009 through June 2010 for Internet references in research articles. If an Internet reference produced the authors' expected content, the Web page was archived with WebCite (http://www.webcitation.org). Because the archived Web page does not change, we compared it with the original URL to determine whether the original Web page had changed. We attempted to access each original URL and archived Web site URL at 3-month intervals from the time of online publication during an 18-month study period. Once a URL no longer existed or failed to contain the original authors' expected content, it was excluded from further study. The number of original URLs and archived URLs that remained accessible over time was totaled and compared. A total of 121 articles were reviewed and 144 Internet references were found within 55 articles. Of the original URLs, 15% (21/144; 95% confidence interval [CI] 9% to 21%) were inaccessible at publication. During the 18-month observation period, there was no loss of archived URLs (apart from the 4% [5/123; 95% CI 2% to 9%] that could not be archived), whereas 35% (49/139) of the original URLs were lost (46% loss; 95% CI 33% to 61% by the Kaplan-Meier method; difference between curves P<.0001, log rank test). Archiving a referenced Web page at publication can help preserve the authors' expected information. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
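    The Kaplan-Meier comparison reported above can be illustrated with a minimal survival estimator. The follow-up data below are invented for illustration, not the study's actual 144 references: an "event" is a URL losing its expected content, and a censored URL is one still alive when follow-up ended.

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate.

    durations: follow-up time for each URL (e.g. months until loss or censoring)
    observed:  True if the URL was actually lost at that time (an event),
               False if it was still accessible when follow-up ended (censored)
    Returns [(event_time, survival_probability)] in time order.
    """
    s, curve = 1.0, []
    for t in sorted(set(d for d, e in zip(durations, observed) if e)):
        at_risk = sum(1 for d in durations if d >= t)          # still under study
        events = sum(1 for d, e in zip(durations, observed) if d == t and e)
        s *= 1.0 - events / at_risk                            # KM product term
        curve.append((t, s))
    return curve

# seven hypothetical URLs followed for up to 18 months
durations = [3, 3, 6, 9, 12, 18, 18]
observed = [True, False, True, True, False, True, False]
curve = kaplan_meier(durations, observed)
```

Each event time multiplies the running survival probability by the fraction of at-risk URLs that survived it, which is how a "46% loss by the Kaplan-Meier method" can exceed the raw 35% count when censoring is present.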

  7. Virtual Observatory Interfaces to the Chandra Data Archive

    NASA Astrophysics Data System (ADS)

    Tibbetts, M.; Harbo, P.; Van Stone, D.; Zografou, P.

    2014-05-01

    The Chandra Data Archive (CDA) plays a central role in the operation of the Chandra X-ray Center (CXC) by providing access to Chandra data. Proprietary interfaces have been the backbone of the CDA throughout the Chandra mission. While these interfaces continue to provide the depth and breadth of mission specific access Chandra users expect, the CXC has been adding Virtual Observatory (VO) interfaces to the Chandra proposal catalog and observation catalog. VO interfaces provide standards-based access to Chandra data through simple positional queries or more complex queries using the Astronomical Data Query Language. Recent development at the CDA has generalized our existing VO services to create a suite of services that can be configured to provide VO interfaces to any dataset. This approach uses a thin web service layer for the individual VO interfaces, a middle-tier query component which is shared among the VO interfaces for parsing, scheduling, and executing queries, and existing web services for file and data access. The CXC VO services provide Simple Cone Search (SCS), Simple Image Access (SIA), and Table Access Protocol (TAP) implementations for both the Chandra proposal and observation catalogs within the existing archive architecture. Our work with the Chandra proposal and observation catalogs, as well as additional datasets beyond the CDA, illustrates how we can provide configurable VO services to extend core archive functionality.
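    A positional query through such VO interfaces can be sketched as an ADQL cone search submitted to a TAP synchronous endpoint. The base URL and table name below are placeholders, not the CXC's actual service addresses; only the TAP parameter names (REQUEST, LANG, QUERY) and the ADQL CONTAINS/CIRCLE geometry functions follow the standards.

```python
from urllib.parse import urlencode

# Hypothetical endpoint for illustration; the real archive's TAP URL differs.
TAP_BASE = "https://cda.example.edu/tap/sync"

def cone_search_url(table, ra, dec, radius_deg):
    """Build a TAP sync request selecting rows within radius_deg of (ra, dec).

    The ADQL uses the standard CONTAINS/CIRCLE geometry in the ICRS frame and
    assumes the table exposes `ra` and `dec` columns (an assumption here).
    """
    adql = (
        f"SELECT * FROM {table} "
        f"WHERE 1=CONTAINS(POINT('ICRS', ra, dec), "
        f"CIRCLE('ICRS', {ra}, {dec}, {radius_deg}))"
    )
    params = {"REQUEST": "doQuery", "LANG": "ADQL", "QUERY": adql}
    return TAP_BASE + "?" + urlencode(params)

# hypothetical observation-catalog table, cone around the Crab Nebula
url = cone_search_url("chandra_obs", 83.63, 22.01, 0.1)
```

A Simple Cone Search service accepts the same positional intent with only RA, DEC, and SR parameters; TAP with ADQL is the "more complex queries" path the abstract mentions.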

  8. BAO Plate Archive digitization, creation of electronic database and its scientific usage

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    2015-08-01

    Astronomical plate archives created from numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,500 photographic plates and films obtained with the 2.6-m telescope, the 1-m and 0.5-m Schmidt telescopes, and other smaller ones during 1947-1991. In 2002-2005, the 2000 plates of the famous Markarian Survey (First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS, http://www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. In 2015, we started a project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage. A Science Program Board has been created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 9 astronomers and 3 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization, as well as the Armenian Virtual Observatory (ArVO) database to accommodate all new data. The project will run for three years (2015-2017), and the final result will be an electronic database and an online interactive sky map to be used for further research projects.

  9. SPASE: The Connection Among Solar and Space Physics Data Centers

    NASA Technical Reports Server (NTRS)

    Thieman, James R.; King, Todd A.; Roberts, D. Aaron

    2011-01-01

    The Space Physics Archive Search and Extract (SPASE) project is an international collaboration among Heliophysics (solar and space physics) groups concerned with data acquisition and archiving. Within this community there are a variety of old and new data centers, resident archives, "virtual observatories", etc. acquiring, holding, and distributing data. A researcher interested in finding data of value for his or her study faces a complex data environment. The SPASE group has simplified the search for data through the development of the SPASE Data Model as a common method to describe data sets in the various archives. The data model is an XML-based schema and is now in operational use. There are both positives and negatives to this approach. The advantage is the common metadata language enabling wide-ranging searches across the archives, but it is difficult to inspire the data holders to spend the time necessary to describe their data using the Model. Software tools have helped, but the main motivational factor is wide-ranging use of the standard by the community. The use is expanding, but there are still other groups who could benefit from adopting SPASE. The SPASE Data Model is also being expanded in the sense of providing the means for more detailed description of data sets with the aim of enabling more automated ingestion and use of the data through detailed format descriptions. We will discuss the present state of SPASE usage and how we foresee development in the future. The evolution is based on a number of lessons learned - some unique to Heliophysics, but many common to the various data disciplines.
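    A SPASE-style description of a data set can be sketched with Python's standard library. The element names below follow the general shape of the SPASE Data Model, but the specific resource, the nesting shown, and the set of required fields are illustrative rather than normative; real descriptions carry many more mandatory elements.

```python
import xml.etree.ElementTree as ET

# Build a minimal SPASE-like resource description. The ResourceID and the
# field values are invented; consult the SPASE Data Model for required fields.
spase = ET.Element("Spase")
data = ET.SubElement(spase, "NumericalData")
ET.SubElement(data, "ResourceID").text = "spase://Example/NumericalData/GOES8/MAG/PT2S"
header = ET.SubElement(data, "ResourceHeader")
ET.SubElement(header, "ResourceName").text = "GOES-8 magnetometer, 2-s vectors"
ET.SubElement(data, "MeasurementType").text = "MagneticField"
temporal = ET.SubElement(data, "TemporalDescription")
ET.SubElement(temporal, "Cadence").text = "PT2S"   # ISO 8601 duration: 2 seconds

xml_text = ET.tostring(spase, encoding="unicode")
```

Because every archive emits the same element vocabulary, a registry can index `MeasurementType`, cadence, and time span uniformly, which is what enables the wide-ranging searches the abstract describes.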

  10. First results of MAO NASU SS bodies photographic archive digitizing

    NASA Astrophysics Data System (ADS)

    Pakuliak, L.; Andruk, V.; Shatokhina, S.; Golovnya, V.; Yizhakevych, O.; Kulyk, I.

    2013-05-01

    The MAO NASU glass archive encloses about 1800 photographic plates with planets and their satellites (including nearly 80 images of Uranus, Pluto, and Neptune), about 1700 plates with minor planets, and about 900 plates with comets. The plates were taken during 1949-1999 using 11 telescopes of different focal lengths, mostly the Double Wide-angle Astrograph (F/D=2000/400) and the Double Long-focus Astrograph (F/D=5500/400) of MAO NASU. Observational sites are Kyiv and Lviv (Ukraine), Byurakan (Armenia), Abastumani (Georgia), Mt. Maidanak (Uzbekistan), and Quito (Ecuador). Tables contain data on the most significant numbers of plates, subdivided by years and objects. The database with plate metadata (DBGPA) is openly accessible on the computer cluster of MAO (http://gua.db.ukr-vo.org). The database accumulates the archives of four Ukrainian observatories involved in the UkrVO national project. Together with the archive managing system, the database serves as a test area for the Joint Digital Archive (JDA), the core of the UkrVO.

  11. The Planetary Data System Web Catalog Interface--Another Use of the Planetary Data System Data Model

    NASA Technical Reports Server (NTRS)

    Hughes, S.; Bernath, A.

    1995-01-01

    The Planetary Data System Data Model consists of a set of standardized descriptions of entities within the Planetary Science Community. These can be real entities in the space exploration domain, such as spacecraft, instruments, and targets; conceptual entities, such as data sets, archive volumes, and data dictionaries; or archive data products, such as individual images, spectra, series, and qubes.

  12. Long-Term Preservation and Advanced Access Services to Archived Data: The Approach of a System Integrator

    NASA Astrophysics Data System (ADS)

    Petitjean, Gilles; de Hauteclocque, Bertrand

    2004-06-01

    EADS Defence and Security Systems (EADS DS SA) has developed expertise as an integrator of archive management systems for both commercial and defence customers (ESA, CNES, EC, EUMETSAT, the French MOD, the US DOD, etc.), especially in the Earth Observation and Meteorology fields. Owners of valuable data are concerned not only with long-term preservation but also with integrating the archive into their information system, in particular with efficient access to archived data for their user community. The system integrator answers this requirement with a methodology combining understanding of user needs, exhaustive knowledge of existing hardware and software solutions, and development and integration ability, and completes the facility development with support activities. The long-term preservation of archived data obviously involves a pertinent selection of storage media and archive library. This selection relies on a storage technology survey, but the selection criteria depend on an analysis of user needs. Thanks to its knowledge of, and independence from, the storage market, and through analysis of user requirements, the system integrator will recommend the best compromise for implementing an archive management facility and provide a solution able to evolve with storage technology progress. But preserving data for the long term is not only a question of storage technology: certain functions are required to secure the archive management system against contingency situations, including multiple data set copies maintained with operational procedures, active quality control of the archived data, and a migration policy optimising the cost of ownership.

  13. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated at the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons Framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora repository. With the digital object model, metadata for data description and data provenance can be associated with data content in a formal manner, as can external references and other arbitrary auxiliary information. Changes to an object are formally audited, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed as RDF triples. Data replication, recovery, and metadata export are supported with standard protocols, such as OAI-PMH. 
We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
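    The digital-object abstraction described above, with content, descriptive metadata, checksums, versions, an audit trail, and RDF-style relationships held together per object, can be caricatured in a few lines. This is a toy sketch of the pattern, not the Fedora Commons API; all identifiers and datastream names are invented.

```python
import hashlib
from datetime import datetime, timezone

class DigitalObject:
    """Toy digital object: versioned datastreams with checksums, an audit
    trail of modifications, and relationships stored as (s, p, o) triples."""

    def __init__(self, pid):
        self.pid = pid
        self.datastreams = {}   # name -> list of (version, content, sha256)
        self.triples = []       # (subject, predicate, object)
        self.audit = []         # (timestamp, action)

    def add_datastream(self, name, content: bytes):
        versions = self.datastreams.setdefault(name, [])
        checksum = hashlib.sha256(content).hexdigest()   # integrity check
        versions.append((len(versions) + 1, content, checksum))
        self.audit.append((datetime.now(timezone.utc).isoformat(),
                           f"modified {name}"))

    def relate(self, predicate, other_pid):
        self.triples.append((self.pid, predicate, other_pid))

# hypothetical terrestrial-ecology data set object
obj = DigitalObject("daac:ds-001")
obj.add_datastream("DATA", b"soil_respiration,2010-07-01,4.2\n")
obj.add_datastream("PROVENANCE", b"derived from field campaign X")
obj.relate("isMemberOfCollection", "daac:terrestrial-ecology")
```

The point of the pattern is that readme-style metadata, provenance, and integrity information travel with the content instead of living in separate stores, which is the failure mode the paragraph above describes.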

  14. MRMS Experimental Testbed for Operational Products (METOP)

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2016-12-01

    Accurate high-resolution quantitative precipitation estimation (QPE) at the continental scale is of critical importance to the nation's weather, water, and climate services. To address this need, a Multi-Radar Multi-Sensor (MRMS) system was developed at the National Severe Storms Laboratory of the National Oceanic and Atmospheric Administration that integrates radar, gauge, model, and satellite data and provides a suite of QPE products at 1-km and 2-min resolution. The MRMS system consists of three components: 1) an operational system; 2) a real-time research system; and 3) an archive testbed. The operational system currently provides instantaneous precipitation rate, type, and 1- to 72-hr accumulations for the conterminous United States and southern Canada. The research system has a similar hardware infrastructure and data environment to the operational system but runs newer and more advanced algorithms. The newer algorithms are tested on the research system for robustness and computational efficiency in a pseudo-operational environment before they are transitioned into operations. The archive testbed, also called the MRMS Experimental Testbed for Operational Products (METOP), consists of a large database that encompasses a wide range of hydroclimatological and geographical regimes. METOP is for the testing and refinement of the most advanced radar QPE techniques, which are often developed on specific data from limited times and locations. The archive data include quality-controlled in-situ observations for the validation of the new radar QPE across all seasons and geographic regions. A number of operational QPE products derived from different sensors/models are also included in METOP for the fusion of multiple sources of complementary precipitation information. This paper is an introduction to the METOP system.

  15. Implications of the lack of global dimming and brightening in global climate models

    NASA Astrophysics Data System (ADS)

    Storelvmo, T.

    2017-12-01

    The global temperature trend of the last half-century is widely believed to be the result of two opposing effects: aerosol cooling and greenhouse gas (GHG) warming. While the radiative effect of increasing GHG concentrations is well constrained, that due to anthropogenic aerosols is not, in part because observational constraints on the latter are lacking. However, long-term surface measurements of downward solar radiation (DSRS), an often-used proxy for aerosol radiative forcing, are available worldwide from the Global Energy Balance Archive (GEBA). We compare DSRS changes from 1,300 GEBA stations to those from the Coupled Model Intercomparison Project, phase 5 (CMIP5) simulations, sampled only when/where observations are available. The observed DSRS shows a strong early (1964-1990) downward trend, followed by a weaker regional trend reversal. Regional emission data for aerosols and aerosol precursors suggest that the culprit for both features was changes to the atmospheric aerosol loading. In contrast, the models show weak or negligible DSRS trends, suggesting a too-weak aerosol forcing. We present sensitivity studies with a single model (CESM1.2) that aim to simultaneously reproduce the observed trends in DSRS and surface temperature.

  16. An Update on the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, Carey; Michael, Patrick; Dube, Maurice P.; Pollack, N.

    2012-01-01

    The Crustal Dynamics Data Information System (CDDIS) supports data archiving and distribution activities for the space geodesy and geodynamics community. The main objectives of the system are to store space geodesy and geodynamics related data products in a central data bank, to maintain information about the archival of these data, and to disseminate these data and information in a timely manner to a global scientific research community. The archive consists of GNSS, laser ranging, VLBI, and DORIS data sets and products derived from these data. The CDDIS is one of NASA's Earth Observing System Data and Information System (EOSDIS) distributed data centers; EOSDIS data centers serve a diverse user community and are tasked to provide facilities to search and access science data and products. The CDDIS data system and its archive have become increasingly important to many national and international science communities, in particular several of the operational services within the International Association of Geodesy (IAG) and its project the Global Geodetic Observing System (GGOS), including the International DORIS Service (IDS), the International GNSS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), and the International Earth Rotation Service (IERS). The CDDIS has recently expanded its archive to support the IGS Multi-GNSS Experiment (MGEX). The archive now contains daily and hourly 30-second and subhourly 1-second data from an additional 35+ stations in RINEX V3 format. The CDDIS will soon install an Ntrip broadcast relay to support the activities of the IGS Real-Time Pilot Project (RTPP) and the future Real-Time IGS Service. The CDDIS has also developed a new web-based application to aid users in data discovery, both within the current community and beyond. 
To enable this data discovery application, the CDDIS is currently implementing modifications to the metadata extracted from incoming data and product files pushed to its archive. This poster will include background information about the system and its user communities, archive contents and updates, enhancements for data discovery, new system architecture, and future plans.

  17. The Moving Image in Education Research: Reassembling the Body in Classroom Video Data

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2016-01-01

    While audio recordings and observation might have dominated past decades of classroom research, video data is now the dominant form of data in the field. Ubiquitous videography is standard practice today in archiving the body of both the teacher and the student, and vast amounts of classroom and experiment clips are stored in online archives. Yet…

  18. Shoichi Sakata: His Life and Physics ---Collections of Materials in Sakata Memorial Archival Library---

    NASA Astrophysics Data System (ADS)

    Tanabashi, M.

    Shoichi Sakata and his Nagoya School made many important achievements at the predawn of the particle physics revolution. The ``two-meson'' theory (introduction of the second-generation leptons), the ``C-meson theory'' (a theory which inspired Tomonaga's renormalization theory), the ``Sakata model'' (a precursor to the quark model), and the ``Maki-Nakagawa-Sakata'' theory of neutrino mixing are among them. These outputs are now regarded as essential ingredients of modern particle physics. Sakata also took the leadership in setting up a democratic administration system in his theoretical particle physics group (E-ken). It was in this democratic atmosphere that many excellent physicists were brought up as Sakata's disciples. In this talk, I introduce Sakata and his achievements in physics, showing various materials archived in the Sakata Memorial Archival Library (SMAL), an archival repository of primary material documenting Sakata's activities. These SMAL documents vividly show Sakata's way of thinking in his approach to new physics.

  19. The Regional Climate Model Evaluation System: A Systematic Evaluation Of CORDEX Simulations Using Obs4MIPs

    NASA Astrophysics Data System (ADS)

    Goodman, A.; Lee, H.; Waliser, D. E.; Guttowski, W.

    2017-12-01

    Observation-based evaluations of global climate models (GCMs) have been a key element for identifying systematic model biases that can be targeted for model improvements and for establishing uncertainty associated with projections of global climate change. However, GCMs are limited in their ability to represent physical phenomena which occur on smaller, regional scales, including many types of extreme weather events. In order to help facilitate projections in changes of such phenomena, simulations from regional climate models (RCMs) for 14 different domains around the world are being provided by the Coordinated Regional Climate Downscaling Experiment (CORDEX; www.cordex.org). However, although CORDEX specifies standard simulation and archiving protocols, these simulations are conducted independently by individual research and modeling groups representing each of these domains often with different output requirements and data archiving and exchange capabilities. Thus, with respect to similar efforts using GCMs (e.g., the Coupled Model Intercomparison Project, CMIP), it is more difficult to achieve a standardized, systematic evaluation of the RCMs for each domain and across all the CORDEX domains. Using the Regional Climate Model Evaluation System (RCMES; rcmes.jpl.nasa.gov) developed at JPL, we are developing easy to use templates for performing systematic evaluations of CORDEX simulations. Results from the application of a number of evaluation metrics (e.g., biases, centered RMS, and pattern correlations) will be shown for a variety of physical quantities and CORDEX domains. These evaluations are performed using products from obs4MIPs, an activity initiated by DOE and NASA, and now shepherded by the World Climate Research Program's Data Advisory Council.

  20. Geodetic Seamless Archive Centers Modernization - Information Technology for Exploiting the Data Explosion

    NASA Astrophysics Data System (ADS)

    Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.

    2010-12-01

    Space geodetic science and other disciplines using geodetic products have benefited immensely from the open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO, and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC, and UNAVCO) have been funded to expand the original GSAC capability for multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating it into their daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report on the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example, identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.

  1. Linking multiple biodiversity informatics platforms with Darwin Core Archives

    PubMed Central

    2014-01-01

    Abstract We describe an implementation of the Darwin Core Archive (DwC-A) standard that allows for the exchange of biodiversity information contained within the Scratchpads virtual research environment with external collaborators. Using this single archive file, Scratchpad users can expose taxonomies, specimen records, species descriptions, and a range of other data to a variety of third-party aggregators and tools (currently Encyclopedia of Life, eMonocot Portal, CartoDB, and the Common Data Model) for secondary use. This paper describes our technical approach to dynamically building and validating Darwin Core Archives for the 600+ Scratchpad user communities, which can be used to serve the diverse data needs of all of our content partners. PMID:24723785
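    A Darwin Core Archive is, at its simplest, a zip file pairing a star-schema data file with a meta.xml that maps its columns to Darwin Core terms. The sketch below builds such a minimal archive in memory; the occurrence record is invented, and this is a simplified layout rather than the Scratchpads implementation, though the term URIs are real Darwin Core terms.

```python
import io
import zipfile

# meta.xml maps the tab-separated columns of occurrence.txt to DwC terms.
META = """<archive xmlns="http://rs.tdwg.org/dwc/text/">
  <core encoding="UTF-8" fieldsTerminatedBy="\\t" linesTerminatedBy="\\n"
        ignoreHeaderLines="1" rowType="http://rs.tdwg.org/dwc/terms/Occurrence">
    <files><location>occurrence.txt</location></files>
    <id index="0"/>
    <field index="1" term="http://rs.tdwg.org/dwc/terms/scientificName"/>
    <field index="2" term="http://rs.tdwg.org/dwc/terms/country"/>
  </core>
</archive>"""

ROWS = [
    "occurrenceID\tscientificName\tcountry",   # header row (ignored per meta.xml)
    "occ-1\tQuercus robur\tUnited Kingdom",    # invented example record
]

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("meta.xml", META)
    z.writestr("occurrence.txt", "\n".join(ROWS) + "\n")

archive_bytes = buf.getvalue()   # the DwC-A, ready to hand to an aggregator
```

Because the column semantics live in meta.xml rather than in the data file, an aggregator can consume archives from hundreds of independent communities without per-site parsing code.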

  2. Measurements of CO2 Mole Fractionand δ13C in Archived Air Samples from Cape Meares, Oregon (USA) 1977 - 1998

    NASA Astrophysics Data System (ADS)

    Clark, O.; Rice, A. L.

    2017-12-01

    Carbon dioxide (CO2) is the most abundant anthropogenically forced greenhouse gas (GHG) in the global atmosphere. Emissions of CO2 account for approximately 75% of the world's total GHG emissions. Atmospheric concentrations of CO2 are higher now than they have been at any other time in the past 800,000 years; currently, the global mean concentration exceeds 400 ppm. Today, global networks regularly monitor CO2 concentrations and isotopic composition (δ13C and δ18O), but past data are sparse. Over 200 ambient air samples from Cape Meares, Oregon (45.5°N, 124.0°W), a coastal site in the western United States, were obtained by researchers at the Oregon Graduate Institute of Science and Technology (OGI, now part of Oregon Health & Science University) between 1977 and 1998 as part of a global monitoring program of six sites in the polar, middle, and tropical latitudes of the Northern and Southern Hemispheres. Air liquefaction was used to compress approximately 1000 L of air (STP) to 30 bar in 33-L electropolished (SUMMA) stainless steel canisters. Select archived air samples from the original network are maintained at the Portland State University (PSU) Department of Physics. These archived samples offer a valuable record of changing atmospheric concentrations of CO2 and δ13C, which can contribute to a better understanding of changes in sources during this time. CO2 concentrations and δ13C of CO2 were measured at PSU with a Picarro cavity ringdown spectrometer, model G1101-i. This study presents the analytical methods used, calibration techniques, precision, and reproducibility. Measurements of select samples from the archive show rising CO2 concentrations and falling δ13C over the 1977 to 1998 period, compatible with previous observations and rising anthropogenic sources of CO2. The resulting data set was statistically analyzed in MATLAB. Results of preliminary seasonal and secular trends from the archive samples are presented.
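    The δ13C values discussed above use standard delta notation: the per-mil deviation of the sample's 13C/12C ratio from the VPDB reference. A minimal sketch, using the commonly quoted VPDB ratio and an invented sample ratio:

```python
# 13C/12C ratio of the Vienna Pee Dee Belemnite (VPDB) reference standard
R_VPDB = 0.0112372

def delta13C(r_sample):
    """delta-13C in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# invented sample ratio, chosen to land near modern background air (~ -8 per mil);
# fossil-fuel CO2 is depleted in 13C, so rising emissions drive delta-13C down
background = delta13C(0.0111427)
```

The falling δ13C reported for 1977-1998 corresponds to the sample ratio drifting further below R_VPDB as isotopically light fossil carbon accumulates in the atmosphere.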

  3. WFIRST Science Operations at STScI

    NASA Astrophysics Data System (ADS)

    Gilbert, Karoline; STScI WFIRST Team

    2018-06-01

    With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.

  4. The OrbitOutlook Data Archive

    NASA Astrophysics Data System (ADS)

    Czajkowski, M.; Shilliday, A.; LoFaso, N.; Dipon, A.; Van Brackle, D.

    2016-09-01

    In this paper, we describe and depict the Defense Advanced Research Projects Agency (DARPA)'s OrbitOutlook Data Archive (OODA) architecture. OODA is the infrastructure that DARPA's OrbitOutlook program has developed to integrate diverse data from various academic, commercial, government, and amateur space situational awareness (SSA) telescopes. At the heart of the OODA system is its world model, a distributed data store built to quickly query very large quantities of information spread out across multiple processing nodes and data centers. The world model applies a multi-index approach where each index is a distinct view on the data. This allows analysts and analytics (algorithms) to access information through queries with a variety of terms that may be of interest to them. Our indices include: a structured global-graph view of knowledge, a keyword search of data content, an object-characteristic range search, and a geospatial-temporal orientation of spatially located data. The world model also applies a federated approach by connecting to existing databases and integrating them into one single interface as a "one-stop shopping place" to access SSA information. In addition to the world model, OODA provides a processing platform for various analysts to explore and analytics to execute upon this data. Analytic algorithms can use OODA to take raw data and build information from it. They can store these products back into the world model, allowing analysts to gain situational awareness with this information. Analysts in turn would help decision makers use this knowledge to address a wide range of SSA problems. OODA is designed to make it easy for software developers who build graphical user interfaces (GUIs) and algorithms to get started quickly with this data. 
This is done through a multi-language software development kit that includes multiple application program interfaces (APIs) and a data model with SSA concepts and terms such as: space observation, observable, measurable, metadata, track, space object, catalog, expectation, and maneuver.
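As a sketch of the multi-index idea described above, the same set of records can be exposed through several distinct views: a keyword index, an object-characteristic range index, and a geospatial-temporal filter. The class and method names below are invented for this illustration and are not part of the OODA SDK or its APIs.

```python
from collections import defaultdict

class WorldModel:
    """Toy multi-index store: each index is a distinct view on the same records.
    Illustrative only; not the OODA data model."""

    def __init__(self):
        self.records = {}
        self.keyword_index = defaultdict(set)          # keyword-search view
        self.characteristic_index = defaultdict(list)  # object-characteristic range view

    def add(self, rec_id, text, characteristics, lat, lon, epoch):
        self.records[rec_id] = {"text": text, "chars": characteristics,
                                "lat": lat, "lon": lon, "epoch": epoch}
        for word in text.lower().split():
            self.keyword_index[word].add(rec_id)
        for name, value in characteristics.items():
            self.characteristic_index[name].append((value, rec_id))

    def by_keyword(self, word):
        return self.keyword_index.get(word.lower(), set())

    def by_range(self, name, lo, hi):
        return {rid for value, rid in self.characteristic_index[name]
                if lo <= value <= hi}

    def by_space_time(self, lat_box, lon_box, t0, t1):
        return {rid for rid, r in self.records.items()
                if lat_box[0] <= r["lat"] <= lat_box[1]
                and lon_box[0] <= r["lon"] <= lon_box[1]
                and t0 <= r["epoch"] <= t1}
```

A federated deployment would back each view with existing databases rather than in-memory dictionaries, but the query surface would look similar.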

  5. Weak lensing study of 16 DAFT/FADA clusters: Substructures and filaments

    NASA Astrophysics Data System (ADS)

    Martinet, Nicolas; Clowe, Douglas; Durret, Florence; Adami, Christophe; Acebrón, Ana; Hernandez-García, Lorena; Márquez, Isabel; Guennou, Loic; Sarron, Florian; Ulmer, Mel

    2016-05-01

    While our current cosmological model places galaxy clusters at the nodes of a filament network (the cosmic web), we still struggle to detect these filaments at high redshifts. We perform a weak lensing study for a sample of 16 massive, medium-high redshift (0.4

  6. EOSDIS: Archive and Distribution Systems in the Year 2000

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Lake, Alla

    2000-01-01

    The Earth Science Enterprise (ESE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of ESE. The EOS Data and Information System (EOSDIS) is another component of ESE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at seven Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousand non-science users are expected to access the system. The first major set of data to be archived in the EOSDIS is from Landsat-7. Another EOS satellite, Terra, was launched on December 18, 1999. With the Terra launch, the EOSDIS will be required to handle approximately one terabyte of data into and out of the archives per day. Since EOS is a multi-mission program, including the launch of more satellites and many other missions, the role of the archive systems becomes larger and more critical. In 1995, at the fourth convening of the NASA Mass Storage Systems and Technologies Conference, the development plans for the EOSDIS information system and archive were described. Five years later, many changes have occurred in the effort to field an operational system. It is interesting to reflect on some of the changes driving the archive technology and system development for EOSDIS. This paper principally describes the Data Server subsystem, including how the other subsystems access the archive, the nature of the data repository, and the mass-storage I/O management. The paper reviews the system architecture (both hardware and software) of the basic components of the archive.
It discusses the operations concept, code development, and testing phase of the system. Finally, it describes the future plans for the archive.

  7. Web-based data delivery services in support of disaster-relief applications

    USGS Publications Warehouse

    Jones, Brenda K.; Risty, Ron R.; Buswell, M.

    2003-01-01

    The U.S. Geological Survey Earth Resources Observation Systems Data Center responds to emergencies in support of various government agencies for human-induced and natural disasters. This response consists of satellite tasking and acquisitions, satellite image registration, disaster-extent map analysis and creation, base image provision and support, Web-based mapping services for product delivery, and predisaster and postdisaster data archiving. The emergency response staff are on call 24 hours a day, 7 days a week, and have access to many commercial and government satellite and aerial photography tasking authorities. They have access to value-added data processing and photographic laboratory services for off-hour emergency requests. They work with various Federal agencies for preparedness planning, which includes providing base imagery. These data may include digital elevation models, hydrographic models, base satellite images, vector data layers such as roads, aerial photographs, and other predisaster data. These layers are incorporated into a Web-based browser and data delivery service that is accessible either to the general public or to select customers. As usage declines, the data are moved to a postdisaster nearline archive that is still accessible, but not in real time.

  8. Using the Stereotype Content Model to examine group depictions in Fascism: An Archival Approach.

    PubMed

    Durante, Federica; Volpato, Chiara; Fiske, Susan T

    2010-04-01

    The Stereotype Content Model (SCM) suggests potentially universal intergroup depictions. If universal, they should apply across history in archival data. Bridging this gap, we examined descriptions of social groups during Italy's Fascist era. In Study 1, articles published in a Fascist magazine, La Difesa della Razza, were content analyzed, and the results submitted to correspondence analysis. Admiration prejudice depicted ingroups; envious and contemptuous prejudices depicted specific outgroups, generally in line with SCM predictions. No paternalistic prejudice appeared; historical reasons might explain this finding. The results also fit the recently developed BIAS Map of behavioral consequences. In Study 2, ninety-six undergraduates rated the content-analysis traits on warmth and competence, without knowing their origin. They corroborated the SCM's interpretations of the archival data.

  9. Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations

    NASA Astrophysics Data System (ADS)

    Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.

    2014-12-01

    Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. 
We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
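The requirement identified above, that observational data carry time, location, provenance, methods, and units to be archivable, discoverable, and interpretable, can be made concrete with a minimal record structure. This is a hedged sketch only; the field names below are invented for illustration and are not taken from any published information model.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One spatially discrete, feature-based observation, carrying the common
    characteristics the text lists: time, location, provenance, method, units.
    All field names are illustrative."""
    variable: str        # what was observed (e.g., nitrate concentration)
    value: float
    units: str
    timestamp: str       # ISO 8601 observation time
    latitude: float
    longitude: float
    feature_id: str      # the sampled feature, e.g., a soil pit or sensor site
    method: str          # measurement or analysis method
    provenance: str      # who or what produced the value

# A record like this is self-describing enough to be discovered by time,
# place, or variable, and interpreted (units, method) once retrieved.
obs = Observation(variable="nitrate", value=12.4, units="mg/L",
                  timestamp="2014-06-01T12:00:00Z", latitude=41.33,
                  longitude=-75.79, feature_id="soil-pit-3",
                  method="ion chromatography", provenance="field campaign")
```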

  10. Searching for X-ray emission from AGB stars

    NASA Astrophysics Data System (ADS)

    Ramstedt, S.; Montez, R.; Kastner, J.; Vlemmings, W. H. T.

    2012-07-01

    Context. Magnetic fields have been measured around asymptotic giant branch (AGB) stars of all chemical types using maser polarization observations. If present, a large-scale magnetic field would lead to X-ray emission, which should be observable using current X-ray observatories. Aims: The aim is to search the archival data for AGB stars that are intrinsic X-ray emitters. Methods: We have searched the ROSAT, CXO, and XMM-Newton archives for serendipitous X-ray observations of a sample of ~500 AGB stars. We specifically searched for the AGB stars detected with GALEX. The data are calibrated and analyzed, and the X-ray luminosities and temperatures are estimated as functions of the circumstellar absorption. Results: We identify 13 AGB stars as having either serendipitous or targeted observations in the X-ray data archives; however, for a majority of the sources the detailed analysis shows that the detections are questionable. Two new sources are detected by ROSAT: T Dra and R UMa. The spectral analysis suggests that the emission associated with these sources could be due to coronal activity or interaction across a binary system. Conclusions: Further observations of the detected sources are necessary to clearly determine the origin of the X-ray emission. Moreover, additional objects should be subject to targeted X-ray observations in order to achieve better constraints on the magnetic fields around AGB stars. Appendices are available in electronic form at http://www.aanda.org

  11. Dwarfs in the Deepest Fields at Noon: Studying Size and Shape of Low-mass Galaxies out to z ~ 3 in Five HST Legacy Fields

    NASA Astrophysics Data System (ADS)

    Guo, Yicheng

    2017-08-01

    Galaxies with stellar masses 100-1000 times smaller than our Milky Way's (hereafter dwarf galaxies or DGs) are important for understanding galaxy formation and evolution by being the most sensitive probes of both the macro-physics of dark matter halos and the micro-physics of the different physical mechanisms that regulate star formation and shape galaxies. Currently, however, observations of distant DGs have been hampered by small samples and poor quality due to their faintness. We propose an archival study of the size, morphology, and structures of DGs out to z ~ 3.0 by combining the archived data from five of the deepest regions that HST has ever observed: the eXtreme Deep Field (XDF, updated from the HUDF) and the Hubble Legacy Fields (HLFs). Our program would be the first to advance the morphology studies of DGs to the Cosmic Noon (z ~ 2), and hence place unprecedented constraints on models of galaxy structure formation. Equally important is the data product of our program: multi-wavelength photometry and morphology catalogs for all detected galaxies in these fields. These catalogs would be a timely treasure for the public to prepare for the coming JWST era by providing detailed information on small, faint, but important objects in some of the deepest HST fields for JWST observations.

  12. Generation of High Resolution Global DSM from ALOS PRISM

    NASA Astrophysics Data System (ADS)

    Takaku, J.; Tadono, T.; Tsutsui, K.

    2014-04-01

    The Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM), one of the sensors onboard the Advanced Land Observing Satellite (ALOS), was designed to generate worldwide topographic data with its optical stereoscopic observation. The sensor consists of three independent panchromatic radiometers viewing forward, nadir, and backward at 2.5 m ground resolution, producing a stereoscopic image triplet along its track. The sensor observed a huge volume of stereo imagery all over the world during the mission life of the satellite from 2006 through 2011. We have semi-automatically processed Digital Surface Model (DSM) data from the image archives in some limited areas. The height accuracy of the dataset was estimated at less than 5 m (rms) from the evaluation with ground control points (GCPs) or reference DSMs derived from Light Detection and Ranging (LiDAR). Then, we decided to process global DSM datasets from all available archives of PRISM stereo images by the end of March 2016. This paper briefly reports on the latest processing algorithms for the global DSM datasets as well as preliminary results for some test sites. The accuracies and error characteristics of the datasets are analyzed and discussed for various fields by comparison with existing global datasets such as Ice, Cloud, and land Elevation Satellite (ICESat) data and Shuttle Radar Topography Mission (SRTM) data, as well as with the GCPs and the reference airborne LiDAR/DSM.
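The 5 m (rms) height-accuracy figure quoted above is the standard root-mean-square error of DSM heights against reference heights. A minimal sketch, leaving out the co-registration and outlier screening a real PRISM DSM evaluation would need (the function name is ours):

```python
import math

def rms_height_error(dsm_heights, gcp_heights):
    """Root-mean-square error of DSM heights against ground control point
    (or reference LiDAR/DSM) heights at the same locations."""
    residuals = [d - g for d, g in zip(dsm_heights, gcp_heights)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Four checkpoints with residuals of +3, -1, +1, -3 m give an rms of ~2.24 m.
error = rms_height_error([103.0, 99.0, 101.0, 97.0],
                         [100.0, 100.0, 100.0, 100.0])
```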

  13. A basis for a visual language for describing, archiving and analyzing functional models of complex biological systems

    PubMed Central

    Cook, Daniel L; Farley, Joel F; Tapscott, Stephen J

    2001-01-01

    Background: We propose that a computerized, internet-based graphical description language for systems biology will be essential for describing, archiving and analyzing complex problems of biological function in health and disease. Results: We outline here a conceptual basis for designing such a language and describe BioD, a prototype language that we have used to explore the utility and feasibility of this approach to functional biology. Using example models, we demonstrate that a rather limited lexicon of icons and arrows suffices to describe complex cell-biological systems as discrete models that can be posted and linked on the internet. Conclusions: Given available computer and internet technology, BioD may be implemented as an extensible, multidisciplinary language that can be used to archive functional systems knowledge and be extended to support both qualitative and quantitative functional analysis. PMID:11305940

  14. A Subgrid Approach for Modeling Microtopography Effects on Overland Flow: Application to Polygonal Tundra: Modeling Archive

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad Jan; Ethan Coon; Scott Painter

    This Modeling Archive is in support of an NGEE Arctic manuscript under review. A new subgrid model was implemented in the Advanced Terrestrial Simulator (ATS) to capture microtopography effects on surface flow. Fine-scale simulations of seven individual ice-wedge polygons and a cluster of polygons were compared between the subgrid model and the no-subgrid model. Our findings confirm that the effects of small-scale spatial heterogeneities can be captured in the coarsened models. The dataset contains the meshes, input files, and subgrid parameters used in the simulations. Python scripts for post-processing and files for geometric analyses are also included.

  15. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report, and R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
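The model structure described above, log-transformed daily concentration as a seasonal wave plus a long-term trend plus serially correlated errors, can be illustrated with a small simulation. This is a hedged sketch, not the report's R implementation: the flow-related term is omitted, the AR(1) error is one simple choice of serial correlation, and every parameter value is illustrative rather than fitted.

```python
import math
import random

def simulate_log_concentration(n_days, amplitude=1.0, peak_day=180,
                               trend_per_year=-0.05, ar1=0.9,
                               sigma=0.3, seed=1):
    """Simulate log-transformed daily concentration as
    seasonal wave + long-term trend + AR(1) serially correlated errors."""
    rng = random.Random(seed)
    err = 0.0
    series = []
    for day in range(n_days):
        season = amplitude * math.cos(2 * math.pi * (day - peak_day) / 365.25)
        trend = trend_per_year * day / 365.25
        err = ar1 * err + rng.gauss(0.0, sigma)   # serially correlated noise
        series.append(season + trend + err)
    return series

# Quantities like the annual maximum daily concentration are then simple
# summaries of each simulated year.
annual_max_log = max(simulate_log_concentration(365))
```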

  16. Report of the wwPDB Small-Angle Scattering Task Force: data requirements for biomolecular modeling and the PDB.

    PubMed

    Trewhella, Jill; Hendrickson, Wayne A; Kleywegt, Gerard J; Sali, Andrej; Sato, Mamoru; Schwede, Torsten; Svergun, Dmitri I; Tainer, John A; Westbrook, John; Berman, Helen M

    2013-06-04

    This report presents the conclusions of the July 12-13, 2012 meeting of the Small-Angle Scattering Task Force of the worldwide Protein Data Bank (wwPDB; Berman et al., 2003) at Rutgers University in New Brunswick, New Jersey. The task force includes experts in small-angle scattering (SAS), crystallography, data archiving, and molecular modeling who met to consider questions regarding the contributions of SAS to modern structural biology. Recognizing there is a rapidly growing community of structural biology researchers acquiring and interpreting SAS data in terms of increasingly sophisticated molecular models, the task force recommends that (1) a global repository is needed that holds standard format X-ray and neutron SAS data that is searchable and freely accessible for download; (2) a standard dictionary is required for definitions of terms for data collection and for managing the SAS data repository; (3) options should be provided for including in the repository SAS-derived shape and atomistic models based on rigid-body refinement against SAS data along with specific information regarding the uniqueness and uncertainty of the model, and the protocol used to obtain it; (4) criteria need to be agreed upon for assessment of the quality of deposited SAS data and the accuracy of SAS-derived models, and the extent to which a given model fits the SAS data; (5) with the increasing diversity of structural biology data and models being generated, archiving options for models derived from diverse data will be required; and (6) thought leaders from the various structural biology disciplines should jointly define what to archive in the PDB and what complementary archives might be needed, taking into account both scientific needs and funding. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Modeling the spatio-temporal variability in subsurface thermal regimes across a low-relief polygonal tundra landscape: Modeling Archive

    DOE Data Explorer

    Kumar, Jitendra; Collier, Nathan; Bisht, Gautam; Mills, Richard T.; Thornton, Peter E.; Iversen, Colleen M.; Romanovsky, Vladimir

    2016-01-27

    This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at http://www.the-cryosphere-discuss.net/tc-2016-29/. Vast carbon stocks stored in permafrost soils of Arctic tundra are at risk of release to the atmosphere under a warming climate. Ice-wedge polygons in the low-gradient polygonal tundra create a complex mosaic of microtopographic features. The microtopography plays a critical role in regulating the fine-scale variability in thermal and hydrological regimes in the polygonal tundra landscape underlain by continuous permafrost. Modeling the thermal regimes of this sensitive ecosystem is essential for understanding landscape behaviour under the current as well as a changing climate. We present here an end-to-end effort for high resolution numerical modeling of thermal hydrology at real-world field sites, utilizing the best available data to characterize and parameterize the models. We develop approaches to model the thermal hydrology of polygonal tundra and apply them at four study sites at Barrow, Alaska, spanning low-centered to transitional to high-centered polygons and representative of the broader polygonal tundra landscape. A multi-phase subsurface thermal hydrology model (PFLOTRAN) was developed and applied to study the thermal regimes at the four sites. Using a high resolution LiDAR DEM, microtopographic features of the landscape were characterized and represented in the high resolution model mesh. The best available soil data from field observations and the literature were utilized to represent the complex heterogeneous subsurface in the numerical model. This data collection provides the complete set of input files, forcing data sets, and computational meshes for simulations using PFLOTRAN for the four sites at the Barrow Environmental Observatory. It also documents the complete computational workflow for this modeling study to allow verification, reproducibility, and follow-up studies.

  18. The physical origin of the X-ray emission from SN 1987A

    NASA Astrophysics Data System (ADS)

    Miceli, M.; Orlando, S.; Petruk, O.

    2017-10-01

    We revisit the spectral analysis of the set of archival XMM-Newton observations of SN 1987A through our 3-D hydrodynamic model describing the whole evolution from the onset of the supernova to the full remnant development. For the first time the spectral analysis accounts for the single observations and for the evolution of the system self-consistently. We adopt a forward modeling approach which allows us to directly synthesize, from the model, X-ray spectra and images in different energy bands. We fold the synthetic observables through the XMM-Newton instrumental response and directly compare models and actual data. We find that our simulation provides an excellent fit to the data, reproducing simultaneously the X-ray fluxes, spectral features, and morphology of SN 1987A at all evolutionary stages. Our analysis enables us to obtain deep insight into the physical origin of the observed multi-thermal emission, by revealing the contributions of the shocked surrounding medium, the dense clumps of the circumstellar ring, and the ejecta to the total emission. We finally provide predictions for future observations (to be performed with XMM-Newton in the near future and with the forthcoming Athena X-ray telescope in approximately 10 years), showing the growing contribution of the ejecta to the X-ray emission.

  19. Intelligent Systems Technologies and Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; McConaughy, G. R.; Morse, H. S.

    2004-01-01

    The addition of raw data and derived geophysical parameters from several Earth observing satellites over the last decade to the data held by NASA data centers has created a data-rich environment for the Earth science research and applications communities. The data products are being distributed to a large and diverse community of users. Due to advances in computational hardware, networks and communications, information management, and software technologies, significant progress has been made in the last decade in archiving and providing data to users. However, to realize the full potential of the growing data archives, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas for improving data utilization through the addition of intelligence to the archives in the context of an overall knowledge building system (KBS). Potential Intelligent Archive concepts include: 1) Mining archived data holdings to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services; 3) Recognizing the value of results, indexing and formatting them for easy access; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building; and 5) Being aware of other nodes in the KBS, participating in open systems interfaces and protocols for virtualization, and achieving collaborative interoperability.

  20. The spectral archive of cosmic X-ray sources observed by the Einstein Observatory Focal Plane Crystal Spectrometer

    NASA Technical Reports Server (NTRS)

    Lum, Kenneth S. K.; Canizares, Claude R.; Clark, George W.; Coyne, Joan M.; Markert, Thomas H.; Saez, Pablo J.; Schattenburg, Mark L.; Winkler, P. F.

    1992-01-01

    The Einstein Observatory Focal Plane Crystal Spectrometer (FPCS) used the technique of Bragg spectroscopy to study cosmic X-ray sources in the 0.2-3 keV energy range. The high spectral resolving power (E/Delta-E approximately 100-1000) of this instrument allowed it to resolve closely spaced lines and study the structure of individual features in the spectra of 41 cosmic X-ray sources. An archival summary of the results is presented as a concise record of the FPCS observations and a source of information for future analysis by the general astrophysics community. For each observation, the instrument configuration, background rate, X-ray flux or upper limit within the energy band observed, and spectral histograms are given. Examples of the contributions the FPCS observations have made to the understanding of the objects observed are discussed.

  1. The living publication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terwilliger, Thomas C.

    2012-06-04

    Within the ICSTI Insights Series we offer three articles on the 'living publication' that is already available to practitioners in the important field of crystal structure determination and analysis. While the specific examples are drawn from this particular field, we invite readers to draw parallels in their own fields of interest. The first article describes the present state of the crystallographic living publication, already recognized by an ALPSP (Association of Learned and Professional Society Publishers) Award for Publishing Innovation in 2006. The second article describes the potential impact on the record of science as greater post-publication analysis becomes more common within currently accepted data deposition practices, using processed diffraction data as the starting point. The third article outlines a vision for the further improvement of crystallographic structure reports within potentially achievable enhanced data deposition practices, based upon raw (unprocessed) diffraction data. The IUCr in its Commissions and Journals has for many years emphasized the importance of publications being accompanied by data and the interpretation of the data in terms of atomic models. This has been followed as policy by numerous other journals in the field and its cognate disciplines. This practice has been well served by databases and archiving institutions such as the Protein Data Bank (PDB), the Cambridge Crystallographic Data Centre (CCDC), and the Inorganic Crystal Structure Database (ICSD). Normally the models that are archived are interpretations of the data, consisting of atomic coordinates with their displacement parameters, along with processed diffraction data from X-ray, neutron or electron diffraction studies. In our current online age, a reader can not only consult the printed word, but can display and explore the results with molecular graphics software of exceptional quality.
Furthermore, the routine availability of processed diffraction data allows readers to perform direct calculations of the electron density (using X-rays and electrons as probes) or the nuclear density (using neutrons as probe) on which the molecular models are directly based. This current community practice is described in our first article. There are various ways that these data and tools can be used to further analyze the molecules that have been crystallized. Notably, once a set of results is announced via the publication, the research community can start to interact directly with the data and models. This gives the community the opportunity not only to read about the structure, but to examine it in detail, and even generate subsequent improved models. These improved models could, in principle, be archived along with the original interpretation of the data and can represent a continuously improving set of interpretations of a set of diffraction data. The models could improve both by correction of errors in the original interpretation and by the use of new representations of molecules in crystal structures that more accurately represent the contents of a crystal. These possible developments are described in our second article. A current, significant, thrust for the IUCr is whether it would be advantageous for the crystallographic community to require, rather than only encourage, the archiving of the raw (unprocessed) diffraction data images measured from a crystal, a fibre or a solution. This issue is being evaluated in detail by an IUCr Working Group (see http://forums.iucr.org). Such archived raw data would be linked to and from any associated publications. The archiving of raw diffraction data could allow as yet undeveloped processing methods to have access to the originally measured data. The debate within the community about this much larger proposed archiving effort revolves around the issue of 'cost versus benefit'. 
Costs can be minimized by preserving the raw data in local repositories, either at centralized synchrotron and neutron research institutes, or at research universities. Archiving raw data is also perceived as being more effective than archiving only processed data in countering scientific fraud, which exists in our field, albeit at a tiny level of occurrence. In parallel developments, sensitivities to avoiding research malpractice are encouraging universities to establish their own data repositories for research and academic staff. These various 'raw data archives' would complement the existing processed data archives. These archives could, however, have gaps in their coverage arising from a lack of resources. Nevertheless we believe that a sufficiently large raw data archive, with reasonable global coverage, could be encouraged and would have major benefits. These possible developments, costs, and benefits are described in our third and final article on 'The living publication'.

  2. Multivariate Analyses of Quality Metrics for Crystal Structures in the PDB Archive.

    PubMed

    Shao, Chenghua; Yang, Huanwang; Westbrook, John D; Young, Jasmine Y; Zardecki, Christine; Burley, Stephen K

    2017-03-07

    Following deployment of an augmented validation system by the Worldwide Protein Data Bank (wwPDB) partnership, the quality of crystal structures entering the PDB has improved. Of significance are improvements in quality measures now prominently displayed in the wwPDB validation report. Comparisons of PDB depositions made before and after introduction of the new reporting system show improvements in quality measures relating to pairwise atom-atom clashes, side-chain torsion angle rotamers, and local agreement between the atomic coordinate structure model and experimental electron density data. These improvements are largely independent of resolution limit and sample molecular weight. No significant improvement in the quality of associated ligands was observed. Principal component analysis revealed that structure quality could be summarized with three measures (Rfree, real-space R factor Z score, and a combined molecular geometry quality metric), which can in turn be reduced to a single overall quality metric readily interpretable by all PDB archive users. Copyright © 2017 Elsevier Ltd. All rights reserved.
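The dimension reduction described above, many per-structure quality measures compressed to a few components, is a standard principal component analysis. A generic sketch under the assumption of a (structures x metrics) matrix; this is not the wwPDB's actual procedure, and the function name is ours:

```python
import numpy as np

def principal_components(metrics):
    """PCA on a (structures x metrics) matrix of quality measures.

    Columns might hold Rfree, real-space R factor Z scores, clash scores,
    etc.; here they are just anonymous metrics put on a common scale."""
    x = np.asarray(metrics, dtype=float)
    x = (x - x.mean(axis=0)) / x.std(axis=0)    # standardize each metric
    eigvals, eigvecs = np.linalg.eigh(np.cov(x, rowvar=False))
    order = np.argsort(eigvals)[::-1]           # strongest component first
    explained = eigvals[order] / eigvals.sum()  # variance fraction per component
    scores = x @ eigvecs[:, order]              # per-structure component scores
    return explained, scores
```

With three dominant components, each structure can be summarized by three scores, or collapsed further into one overall metric, as the abstract describes.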

  3. MagIC: Geomagnetic Applications from Earth History to Archeology

    NASA Astrophysics Data System (ADS)

    Constable, C.; Tauxe, L.; Koppers, A.; Minnett, R.; Jarboe, N.

    2016-12-01

    Major scientific challenges increasingly require an interdisciplinary approach, and highlight the need for open archives incorporating visualization and analysis tools that are flexible enough to address novel research problems. Increasingly, modern standards for publication demand (or should demand) direct links to data, data citations, and adequate documentation that allow other researchers direct access to the fundamental measurements and analyses producing the results. Carefully documented metadata are essential, and data models may need considerable complexity to accommodate re-use of observations originally collected with a different purpose in mind. The Magnetics Information Consortium (MagIC) provides an online home for all kinds of paleo-, archeo-magnetic, rock, and environmental magnetic data, from documentation of fieldwork, through lab protocols, to interpretations in terms of geomagnetic history. Examples of their application to understanding geomagnetic field behavior, archeological dating, and voyages of exploration to discover America will be used to highlight best practices and illustrate the unexpected benefits of well-archived data, with the goal of maintaining high standards for reproducibility.

  4. Observations and Modeling of the Transient General Circulation of the North Pacific Basin

    NASA Technical Reports Server (NTRS)

    McWilliams, James C.

    2000-01-01

    Because of recent progress in satellite altimetry and numerical modeling and the accumulation and archiving of long records of hydrographic and meteorological variables, it is becoming feasible to describe and understand the transient general circulation of the ocean (i.e., variations with spatial scales larger than a few hundred kilometers and time scales of seasonal and longer, beyond the mesoscale). We have carried out various studies investigating the transient general circulation of the Pacific Ocean through a coordinated analysis of satellite altimeter data, historical hydrographic gauge data, scatterometer wind observations, reanalyzed operational wind fields, and a variety of ocean circulation models. Broadly stated, our goal was to achieve a phenomenological catalogue of different possible types of large-scale, low-frequency variability, as a context for understanding the observational record. The approach is to identify the simplest possible model from which particular observed phenomena can be isolated and understood dynamically, and then to determine how well these dynamical processes are represented in more complex Oceanic General Circulation Models (OGCMs). Research results have been obtained on Rossby wave propagation and transformation, oceanic intrinsic low-frequency variability, effects of surface gravity waves, Pacific data analyses, OGCM formulation and developments, and OGCM simulations of forced variability.

  5. Sensor Management for Applied Research Technologies (SMART)-On Demand Modeling (ODM) Project

    NASA Technical Reports Server (NTRS)

    Goodman, M.; Blakeslee, R.; Hood, R.; Jedlovec, G.; Botts, M.; Li, X.

    2006-01-01

    NASA requires timely, on-demand data and analysis capabilities to realize the practical benefits of Earth science observations. However, a significant challenge exists in accessing and integrating data from multiple sensors or platforms to address Earth science problems because of the large data volumes, varying sensor scan characteristics, unique orbital coverage, and the steep learning curve associated with each sensor and data type. The development of sensor web capabilities to autonomously process these data streams (whether real-time or archived) provides an opportunity to overcome these obstacles and facilitate the integration and synthesis of Earth science data and weather model output. A three-year project, entitled Sensor Management for Applied Research Technologies (SMART) - On Demand Modeling (ODM), will develop and demonstrate the readiness of Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) capabilities that integrate both Earth observations and forecast model output into new data acquisition and assimilation strategies. The advancement of SWE-enabled systems (i.e., use of SensorML, sensor planning services (SPS), sensor observation services (SOS), sensor alert services (SAS), and common observation model protocols) will have practical and efficient uses in the Earth science community for enhanced data set generation, real-time data assimilation with operational applications, and autonomous sensor tasking for unique data collection.
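    As an illustration of the SWE service style mentioned above, a GetObservation request to an OGC Sensor Observation Service can be built as a key-value-pair URL. The endpoint, offering, and observed-property names here are hypothetical placeholders, not a real deployment.

```python
from urllib.parse import urlencode

# Sketch of an OGC Sensor Observation Service GetObservation request
# in key-value-pair form. The endpoint and identifiers are invented.
endpoint = "https://example.gov/sos"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "lightning_network",        # hypothetical offering id
    "observedProperty": "flash_rate",       # hypothetical property id
}
url = endpoint + "?" + urlencode(params)
print(url)
```

    A client would issue this request over HTTP and parse the returned observation document; the same pattern applies to the SPS and SAS services named in the abstract.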

  6. Identifying human influences on atmospheric temperature

    PubMed Central

    Santer, Benjamin D.; Painter, Jeffrey F.; Mears, Carl A.; Doutriaux, Charles; Caldwell, Peter; Arblaster, Julie M.; Cameron-Smith, Philip J.; Gillett, Nathan P.; Gleckler, Peter J.; Lanzante, John; Perlwitz, Judith; Solomon, Susan; Stott, Peter A.; Taylor, Karl E.; Terray, Laurent; Thorne, Peter W.; Wehner, Michael F.; Wentz, Frank J.; Wigley, Tom M. L.; Wilcox, Laura J.; Zou, Cheng-Zhi

    2013-01-01

    We perform a multimodel detection and attribution study with climate model simulation output and satellite-based measurements of tropospheric and stratospheric temperature change. We use simulation output from 20 climate models participating in phase 5 of the Coupled Model Intercomparison Project. This multimodel archive provides estimates of the signal pattern in response to combined anthropogenic and natural external forcing (the fingerprint) and the noise of internally generated variability. Using these estimates, we calculate signal-to-noise (S/N) ratios to quantify the strength of the fingerprint in the observations relative to fingerprint strength in natural climate noise. For changes in lower stratospheric temperature between 1979 and 2011, S/N ratios vary from 26 to 36, depending on the choice of observational dataset. In the lower troposphere, the fingerprint strength in observations is smaller, but S/N ratios are still significant at the 1% level or better, and range from three to eight. We find no evidence that these ratios are spuriously inflated by model variability errors. After removing all global mean signals, model fingerprints remain identifiable in 70% of the tests involving tropospheric temperature changes. Despite such agreement in the large-scale features of model and observed geographical patterns of atmospheric temperature change, most models do not replicate the size of the observed changes. On average, the models analyzed underestimate the observed cooling of the lower stratosphere and overestimate the warming of the troposphere. Although the precise causes of such differences are unclear, model biases in lower stratospheric temperature trends are likely to be reduced by more realistic treatment of stratospheric ozone depletion and volcanic aerosol forcing. PMID:23197824
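    The signal-to-noise calculation sketched in this abstract (projecting observations onto a model fingerprint and normalizing by the spread of control-run projections) can be illustrated with synthetic data. All numbers below are invented for illustration; this is not the study's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(1)
npts = 144  # hypothetical number of spatial grid points

# Hypothetical model fingerprint (forced-response pattern) and an
# observed temperature-change pattern containing that signal plus noise.
fingerprint = rng.normal(size=npts)
fingerprint /= np.linalg.norm(fingerprint)
observed = 5.0 * fingerprint + rng.normal(scale=1.0, size=npts)

# Unforced control-run segments estimate the noise-projection spread.
control = rng.normal(size=(200, npts))

signal = observed @ fingerprint
noise = control @ fingerprint
sn_ratio = signal / noise.std()
print(round(sn_ratio, 2))
```

    A large ratio means the fingerprint stands out against internally generated variability, which is the sense in which the abstract's S/N values of 26-36 indicate unambiguous detection.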

  7. Model-Observation "Data Cubes" for the DOE Atmospheric Radiation Measurement Program's LES ARM Symbiotic Simulation and Observation (LASSO) Workflow

    NASA Astrophysics Data System (ADS)

    Vogelmann, A. M.; Gustafson, W. I., Jr.; Toto, T.; Endo, S.; Cheng, X.; Li, Z.; Xiao, H.

    2015-12-01

    The Department of Energy's Atmospheric Radiation Measurement (ARM) Climate Research Facility's Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) Workflow is currently being designed to provide output from routine LES to complement the facility's extensive observations. The modeling portion of the LASSO workflow, presented by Gustafson et al., will initially focus on shallow convection over the ARM megasite in Oklahoma, USA. This presentation describes how the LES output will be combined with observations to construct multi-dimensional and dynamically consistent "data cubes", aimed at providing the best description of the atmospheric state for use in analyses by the community. The megasite observations are used to constrain large-eddy simulations that provide complete spatial and temporal coverage of observables; further, the simulations also provide information on processes that cannot be observed. Statistical comparisons of model output with their observables are used to assess the quality of a given simulated realization and its associated uncertainties. A data cube is a model-observation package that provides: (1) metrics of model-observation statistical summaries to assess the simulations and the ensemble spread; (2) statistical summaries of additional model property output that cannot be or is very difficult to observe; and (3) snapshots of the 4-D simulated fields from the integration period. Searchable metrics are provided that characterize the general atmospheric state to assist users in finding cases of interest, such as categorization of daily weather conditions and their specific attributes. The data cubes will be accompanied by tools designed for easy access to cube contents from within the ARM archive and externally, the ability to compare multiple data streams within an event as well as across events, and the ability to use common grids and time sampling, where appropriate.
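    A minimal sketch of the kind of model-observation statistical summary a data cube might store, using synthetic time series. The variable and metric names are illustrative assumptions, not LASSO's actual metric set.

```python
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(loc=0.2, size=240)            # hypothetical observed series
model = obs + rng.normal(scale=0.1, size=240)  # one simulated realization

# Summaries a data cube might package to assess a simulation against
# its observables: mean bias, root-mean-square error, and correlation.
bias = float(np.mean(model - obs))
rmse = float(np.sqrt(np.mean((model - obs) ** 2)))
corr = float(np.corrcoef(model, obs)[0, 1])
summary = {"bias": bias, "rmse": rmse, "corr": corr}
print(summary)
```

    Storing such summaries alongside the raw fields lets users rank ensemble members and search for cases of interest without reprocessing full 4-D output.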

  8. Advantages to Geoscience and Disaster Response from QuakeSim Implementation of Interferometric Radar Maps in a GIS Database System

    NASA Astrophysics Data System (ADS)

    Parker, Jay; Donnellan, Andrea; Glasscoe, Margaret; Fox, Geoffrey; Wang, Jun; Pierce, Marlon; Ma, Yu

    2015-08-01

    High-resolution maps of Earth surface deformation are available in public archives for scientific interpretation, but are primarily available as bulky downloads on the internet. The NASA uninhabited aerial vehicle synthetic aperture radar (UAVSAR) archive of airborne radar interferograms delivers very high resolution images (approximately seven-meter pixels), making remote handling of the files that much more pressing. Exploring these data, which requires data selection and exploratory analysis, has been tedious. QuakeSim has implemented an archive of UAVSAR data in a web service and browser system based on GeoServer (http://geoserver.org). This supports a variety of services that supply consistent maps, raster image data, and geographic information systems (GIS) objects, including standard earthquake faults. Browsing the database is supported by initially displaying GIS-referenced thumbnail images of the radar displacement maps. Access is also provided to image metadata and links for full file downloads. One of the most widely used features is the QuakeSim line-of-sight profile tool, which calculates the radar-observed displacement (from an unwrapped interferogram product) along a line specified through a web browser. Displacement values along a profile are updated on an on-screen plot as the user interactively redefines the endpoints of the line and the sampling density. The profile, along with a plot of the ground height, is available as CSV (text) files for further examination, without any need to download the full radar file. Additional tools allow the user to select a polygon overlapping the radar displacement image, specify a downsampling rate, and extract a modest-sized grid of observations for display or for inversion, for example with the QuakeSim simplex inversion tool, which estimates a consistent fault geometry and slip model.
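    The line-of-sight profile tool described above amounts to sampling a displacement raster along a user-specified line. A minimal sketch follows, using a synthetic raster and nearest-neighbor sampling; the actual QuakeSim tool's interpolation scheme is not specified here.

```python
import numpy as np

# Hypothetical unwrapped-interferogram displacement raster (meters),
# filled with a smooth synthetic deformation signal.
ny, nx = 200, 300
yy, xx = np.mgrid[0:ny, 0:nx]
raster = 0.05 * np.sin(xx / 40.0) * np.cos(yy / 60.0)

def los_profile(grid, p0, p1, n_samples):
    """Sample a raster along the line p0 -> p1 (row, col) with
    nearest-neighbor lookup at a user-chosen sampling density."""
    rows = np.linspace(p0[0], p1[0], n_samples).round().astype(int)
    cols = np.linspace(p0[1], p1[1], n_samples).round().astype(int)
    return grid[rows, cols]

profile = los_profile(raster, (10, 20), (180, 280), 100)
print(profile.shape)
```

    Re-running the function as the user drags an endpoint or changes the sample count reproduces the interactive behavior, without ever shipping the full raster to the client.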

  9. Documenting of Geologic Field Activities in Real-Time in Four Dimensions: Apollo 17 as a Case Study for Terrestrial Analogues and Future Exploration

    NASA Technical Reports Server (NTRS)

    Feist, B.; Bleacher, J. E.; Petro, N. E.; Niles, P. B.

    2018-01-01

    During the Apollo exploration of the lunar surface, thousands of still images, 16 mm videos, TV footage, samples, and surface experiments were captured and collected. In addition, observations and descriptions of what was observed were radioed to Mission Control as part of standard communications and subsequently transcribed. The archive of this material represents perhaps the best recorded set of geologic field campaigns and will serve as the example of how to conduct field work on other planetary bodies for decades to come. However, that archive of material exists in disparate locations and formats with varying levels of completeness, making it difficult to cross-reference. While video and audio exist for the missions, they are not time synchronized, and images taken during the missions are not time or location tagged. Sample data, while robust, are not easily available in the context of where the samples were collected, and are not connected to the astronauts' descriptions of the samples or to the video footage of their collection (if available). An undertaking of more than five years to reconstruct and reconcile the Apollo 17 mission archive, from launch through splashdown, has generated an integrated record of the entire mission, resulting in searchable, synchronized image, voice, and video data, with geologic context provided at the time each sample was collected. Through www.apollo17.org the documentation of the field investigation conducted by the Apollo 17 crew is presented in chronologic sequence, with additional context provided by high-resolution Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images and a corresponding digital terrain model (DTM) of the Taurus-Littrow Valley.

  10. Teacher Evaluation: Archiving Teaching Effectiveness

    ERIC Educational Resources Information Center

    Nielsen, Lance D.

    2014-01-01

    Teacher evaluation is a current hot topic within music education. This article offers strategies for K-12 music educators on how to promote their effectiveness as teachers through archival documentation in a teacher portfolio. Using the Danielson evaluation model (based on four domains of effective teaching practices), examples of music teaching…

  11. Direct probe of the inner accretion flow around the supermassive black hole in NGC 2617

    NASA Astrophysics Data System (ADS)

    Giustini, M.; Costantini, E.; De Marco, B.; Svoboda, J.; Motta, S. E.; Proga, D.; Saxton, R.; Ferrigno, C.; Longinotti, A. L.; Miniutti, G.; Grupe, D.; Mathur, S.; Shappee, B. J.; Prieto, J. L.; Stanek, K.

    2017-01-01

    Aims: NGC 2617 is a nearby (z ~ 0.01) active galaxy that recently switched from being a Seyfert 1.8 to being a Seyfert 1.0. At the same time, its X-ray flux increased by one order of magnitude with respect to archival measurements. We characterise the X-ray spectral and timing properties of NGC 2617 with the aim of studying the physics of a changing-look active galactic nucleus (AGN). Methods: We performed a comprehensive timing and spectral analysis of two XMM-Newton pointed observations spaced by one month, complemented by archival quasi-simultaneous INTEGRAL observations. Results: We found that, to first order, NGC 2617 looks like a type 1 AGN in the X-ray band and, with the addition of a modest reflection component, its continuum can be modelled well with a power law plus a phenomenological blackbody, a partially covered power law, or a double Comptonisation model. Independent of the continuum adopted, in all three cases a column density of a few 10^23 cm^-2 of neutral gas covering 20-40% of the continuum source is required by the data. Most interestingly, absorption structures due to highly ionised iron have been detected in both observations with a redshift of about 0.1c with respect to the systemic redshift of the host galaxy. Conclusions: The redshifted absorber can be ascribed to a failed wind/aborted jet component, to gravitational redshift effects, and/or to matter falling directly towards the central supermassive black hole. In any case, we are probing the innermost accretion flow around the central supermassive black hole of NGC 2617 and might even be watching matter in a direct inflow towards the black hole itself.

  12. Challenges of coordinating global climate observations - Role of satellites in climate monitoring

    NASA Astrophysics Data System (ADS)

    Richter, C.

    2017-12-01

    Global observation of the Earth's atmosphere, ocean and land is essential for identifying climate variability and change, and for understanding their causes. Observation also provides data that are fundamental for evaluating, refining and initializing the models that predict how the climate system will vary over the months and seasons ahead, and that project how climate will change in the longer term under different assumptions concerning greenhouse gas emissions and other human influences. Long-term observational records have enabled the Intergovernmental Panel on Climate Change to deliver the message that warming of the global climate system is unequivocal. As the Earth's climate enters a new era, in which it is forced by human activities as well as natural processes, it is critically important to sustain an observing system capable of detecting and documenting global climate variability and change over long periods of time. High-quality climate observations are required to assess the present state of the ocean, cryosphere, atmosphere and land and place them in context with the past. The global observing system for climate is not a single, centrally managed observing system. Rather, it is a composite "system of systems" comprising a set of climate-relevant observing, data-management, product-generation and data-distribution systems. Data from satellites underpin many of the Essential Climate Variables (ECVs), and their historic and contemporary archives are a key part of the global climate observing system. In general, the ECVs will be provided in the form of climate data records that are created by processing and archiving time series of satellite and in situ measurements. Early satellite data records are very valuable because they provide unique observations in many regions which were not otherwise observed during the 1970s and which can be assimilated in atmospheric reanalyses and so extend the satellite climate data records back in time.

  13. BTA Magnet Field Map Archive and MAD Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glenn,J.W.

    2008-04-01

    This note publishes data and information that have resided in private files. The attached tables were provided by Joseph Skelly from his archives. They show magnetic field measurements versus excitation current for the Booster to AGS transfer line quadrupoles and dipoles, based on field measurements [we believe] done by the Magnet Division. Also given are Ed Blesser's fifth-order fits of field versus current. The results are given in Tesla or T-M/M. These tables are attached to provide an archive of this data. The MAD model of the BTA line has the same values as shown in the attached fits, so the transfer was correct. MAD uses as its 'gradient' for quads Tesla per meter normalized to rigidity [B-rho]. The model of the BTA line in use takes the T-M/M given in the tables, divides it by the length to give T/M, and then normalizes by B-rho. Thus, the input to the model appears to be correct. The original model is also attached as part of a memo by Skelly describing it.
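    The conversion described above can be written out numerically. All values below are invented for illustration; only the arithmetic (integrated gradient divided by magnet length, then normalized by the rigidity B-rho) follows the note.

```python
# Hypothetical numbers illustrating the unit conversion in the note:
# an integrated quadrupole gradient from the field-map fits is divided
# by the magnetic length to get a gradient in T/m, then normalized by
# the beam rigidity B-rho to give the MAD quadrupole strength (1/m^2).
integrated_gradient = 4.2   # T (gradient integrated over length), assumed
magnet_length = 0.5         # m, assumed
brho = 7.5                  # T*m, assumed rigidity

gradient = integrated_gradient / magnet_length   # T/m
k1 = gradient / brho                             # 1/m^2, MAD K1 value
print(round(k1, 3))
```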

  14. A New Search Paradigm for Correlated Neutrino Emission from Discrete GRBs using Antarctic Cherenkov Telescopes in the Swift Era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamatikos, Michael; Band, David L.

    2006-05-19

    We describe the theoretical modeling and analysis techniques associated with a preliminary search for correlated neutrino emission from GRB980703a, which triggered the Burst and Transient Source Experiment (BATSE GRB trigger 6891), using archived data from the Antarctic Muon and Neutrino Detector Array (AMANDA-B10). Under the assumption of associated hadronic acceleration, the expected observed neutrino energy flux is directly derived by confronting the fireball phenomenology with the discrete set of observed electromagnetic parameters of GRB980703a, gleaned from ground-based and satellite observations, for four models, corrected for oscillations. Models 1 and 2, based upon spectral analysis featuring a prompt photon energy fit to the Band function, utilize an observed spectroscopic redshift, for isotropic and anisotropic emission geometry, respectively. Model 3 is based upon averaged burst parameters, assuming isotropic emission. Model 4, based upon a Band fit, features an estimated redshift from the lag-luminosity relation, with isotropic emission. Consistent with our AMANDA-II analysis of GRB030329, which resulted in a flux upper limit of ~0.15 GeV/cm^2/s for model 1, we find differences in excess of an order of magnitude in the response of AMANDA-B10 among the various models for GRB980703a. Implications for future searches in the era of Swift and IceCube are discussed.

  15. Collaborative Metadata Curation in Support of NASA Earth Science Data Stewardship

    NASA Technical Reports Server (NTRS)

    Sisco, Adam W.; Bugbee, Kaylin; le Roux, Jeanne; Staton, Patrick; Freitag, Brian; Dixon, Valerie

    2018-01-01

    A growing collection of NASA Earth science data is archived and distributed by EOSDIS's 12 Distributed Active Archive Centers (DAACs). Each collection and granule is described by a metadata record housed in the Common Metadata Repository (CMR). Multiple metadata standards are in use, and core elements of each are mapped to and from a common model, the Unified Metadata Model (UMM). This work is performed by the Analysis and Review of CMR (ARC) Team.
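    The mapping of core elements between metadata standards and a common model can be sketched as a simple field translation. The standard and element names below are invented placeholders, not the actual UMM schema.

```python
# Sketch of translating native metadata records from two (hypothetical)
# standards into shared common-model keys via per-standard field maps.
def to_common_model(record, field_map):
    """Return a dict keyed by common-model element names."""
    return {common: record[native]
            for native, common in field_map.items()
            if native in record}

# Illustrative field maps; element names are made up for this sketch.
dif_map = {"Entry_Title": "EntryTitle", "Dataset_Citation": "Citation"}
echo_map = {"LongName": "EntryTitle", "CitationInformation": "Citation"}

dif_record = {"Entry_Title": "Global Precipitation L3",
              "Dataset_Citation": "Example citation text"}
print(to_common_model(dif_record, dif_map))
```

    Running the same translation in reverse (common model back to native fields) is what lets a repository serve one record in several standards.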

  16. NADIR: A Flexible Archiving System Current Development

    NASA Astrophysics Data System (ADS)

    Knapic, C.; De Marco, M.; Smareglia, R.; Molinaro, M.

    2014-05-01

    The New Archiving Distributed InfrastructuRe (NADIR) is under development at the Italian center for Astronomical Archives (IA2) to improve the performance of the current archival software tools at the data center. Traditional software tools usually offer simple and robust solutions for data archiving and distribution, but are awkward to adapt and reuse in projects with different purposes. Evolution of data in terms of data model, format, publication policy, version, and metadata content is the main threat to reuse. NADIR, built on stable and mature framework features, addresses these challenging issues. Its main characteristics are a configuration database; a multi-threading, multi-language environment (C++, Java, Python); special features to guarantee high scalability, modularity, robustness, and error tracking; and tools to monitor with confidence the status of each project at each archiving site. In this contribution, the development of the core components is presented, with comments on performance and innovative features (multicast and publisher-subscriber paradigms). NADIR is designed to be as simple as possible, with default configurations for every project, first of all for LBT and other IA2 projects.

  17. Databases and archiving for cryoEM

    PubMed Central

    Patwardhan, Ardan; Lawson, Catherine L.

    2017-01-01

    Cryo-EM in structural biology is currently served by three public archives – EMDB for 3DEM reconstructions, PDB for models built from 3DEM reconstructions and EMPIAR for the raw 2D image data used to obtain the 3DEM reconstructions. These archives play a vital role for both the structural community and the wider biological community in making the data accessible so that results may be reused, reassessed and integrated with other structural and bioinformatics resources. The important role of the archives is underpinned by the fact that many journals mandate the deposition of data to PDB and EMDB on publication. The field is currently undergoing transformative changes where on the one hand high-resolution structures are becoming a routine occurrence while on the other hand electron tomography is enabling the study of macromolecules in the cellular context. Concomitantly the archives are evolving to best serve their stakeholder communities. In this chapter we describe the current state of the archives, resources available for depositing, accessing, searching, visualising and validating data, on-going community-wide initiatives and opportunities and challenges for the future. PMID:27572735

  18. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository.

    PubMed

    Clark, Kenneth; Vendt, Bruce; Smith, Kirk; Freymann, John; Kirby, Justin; Koppel, Paul; Moore, Stephen; Phillips, Stanley; Maffitt, David; Pringle, Michael; Tarbox, Lawrence; Prior, Fred

    2013-12-01

    The National Institutes of Health have placed significant emphasis on sharing of research data to support secondary research. Investigators have been encouraged to publish their clinical and imaging data as part of fulfilling their grant obligations. Realizing it was not sufficient to merely ask investigators to publish their collection of imaging and clinical data, the National Cancer Institute (NCI) created the open source National Biomedical Image Archive software package as a mechanism for centralized hosting of cancer related imaging. NCI has contracted with Washington University in Saint Louis to create The Cancer Imaging Archive (TCIA)-an open-source, open-access information resource to support research, development, and educational initiatives utilizing advanced medical imaging of cancer. In its first year of operation, TCIA accumulated 23 collections (3.3 million images). Operating and maintaining a high-availability image archive is a complex challenge involving varied archive-specific resources and driven by the needs of both image submitters and image consumers. Quality archives of any type (traditional library, PubMed, refereed journals) require management and customer service. This paper describes the management tasks and user support model for TCIA.

  19. The ``One Archive'' for JWST

    NASA Astrophysics Data System (ADS)

    Greene, G.; Kyprianou, M.; Levay, K.; Sienkewicz, M.; Donaldson, T.; Dower, T.; Swam, M.; Bushouse, H.; Greenfield, P.; Kidwell, R.; Wolfe, D.; Gardner, L.; Nieto-Santisteban, M.; Swade, D.; McLean, B.; Abney, F.; Alexov, A.; Binegar, S.; Aloisi, A.; Slowinski, S.; Gousoulin, J.

    2015-09-01

    The next generation for the Space Telescope Science Institute data management system is gearing up to provide a suite of archive system services supporting the operation of the James Webb Space Telescope. We are now completing the initial stage of integration and testing for the preliminary ground system builds of the JWST Science Operations Center which includes multiple components of the Data Management Subsystem (DMS). The vision for astronomical science and research with the JWST archive introduces both solutions to formal mission requirements and innovation derived from our existing mission systems along with the collective shared experience of our global user community. We are building upon the success of the Hubble Space Telescope archive systems, standards developed by the International Virtual Observatory Alliance, and collaborations with our archive data center partners. In proceeding forward, the “one archive” architectural model presented here is designed to balance the objectives for this new and exciting mission. The STScI JWST archive will deliver high quality calibrated science data products, support multi-mission data discovery and analysis, and provide an infrastructure which supports bridges to highly valued community tools and services.

  20. HEASARC - The High Energy Astrophysics Science Archive Research Center

    NASA Technical Reports Server (NTRS)

    Smale, Alan P.

    2011-01-01

    The High Energy Astrophysics Science Archive Research Center (HEASARC) is NASA's archive for high-energy astrophysics and cosmic microwave background (CMB) data, supporting the broad science goals of NASA's Physics of the Cosmos theme. It provides vital scientific infrastructure to the community by standardizing science data formats and analysis programs, providing open access to NASA resources, and implementing powerful archive interfaces. Over the next five years the HEASARC will ingest observations from up to 12 operating missions, while serving data from these and over 30 archival missions to the community. The HEASARC archive presently contains over 37 TB of data, and will contain over 60 TB by the end of 2014. The HEASARC continues to secure major cost savings for NASA missions, providing a reusable mission-independent framework for reducing, analyzing, and archiving data. This approach was recognized in the NRC Portals to the Universe report (2007) as one of the HEASARC's great strengths. This poster describes the past and current activities of the HEASARC and our anticipated developments in coming years. These include preparations to support upcoming high-energy missions (NuSTAR, Astro-H, GEMS) and ground-based and sub-orbital CMB experiments, as well as continued support of missions currently operating (Chandra, Fermi, RXTE, Suzaku, Swift, XMM-Newton and INTEGRAL). In 2012 the HEASARC (which now includes LAMBDA) will support the final nine-year WMAP data release. The HEASARC is also upgrading its archive querying and retrieval software with the new Xamin system, now in early release, and building on opportunities afforded by the growth of the Virtual Observatory and recent developments in virtual environments and cloud computing.

  1. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal-tide-gauge data from the National Ocean Service (NOS) network and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.

  2. Using the Stereotype Content Model to examine group depictions in Fascism: An Archival Approach

    PubMed Central

    Durante, Federica; Volpato, Chiara; Fiske, Susan T.

    2013-01-01

    The Stereotype Content Model (SCM) suggests potentially universal intergroup depictions. If universal, they should apply across history in archival data. Bridging this gap, we examined descriptions of social groups during Italy's Fascist era. In Study 1, articles published in a Fascist magazine, La Difesa della Razza, were content analyzed, and the results were submitted to correspondence analysis. Admiration prejudice depicted ingroups; envious and contemptuous prejudices depicted specific outgroups, generally in line with SCM predictions. No paternalistic prejudice appeared; historical reasons might explain this finding. The results also fit the recently developed BIAS Map of behavioral consequences. In Study 2, ninety-six undergraduates rated the content-analysis traits on warmth and competence, without knowing their origin. They corroborated the SCM's interpretations of the archival data. PMID:24403646

  3. Stewardship of NASA's Earth Science Data and Ensuring Long-Term Active Archives

    NASA Astrophysics Data System (ADS)

    Ramapriyan, H.; Behnke, J.

    2016-12-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since 1994. EOSDIS manages data from pre-EOS missions dating back to the 1960s, EOS missions that started in 1997, and missions from the post-EOS era. Its data holdings come from many different sources: satellite and airborne instruments, in situ measurements, field experiments, science investigations, and more. Since the beginning of the EOS Program, NASA has followed an open data policy, with non-discriminatory access to data and no period of exclusive access. NASA has well-established processes for assigning and/or accepting datasets into one of the 12 Distributed Active Archive Centers (DAACs) that are part of EOSDIS. EOSDIS has evolved through several information technology cycles, adapting to hardware and software changes in the commercial sector. NASA is responsible for maintaining Earth science data as long as users are interested in using them for research and applications, which is well beyond the life of the data-gathering missions. For science data to remain useful over long periods of time, steps must be taken to preserve (1) data bits with no corruption, (2) discoverability and access, (3) readability, (4) understandability, (5) usability, and (6) reproducibility of results. NASA's Earth Science Data and Information System (ESDIS) Project, along with the 12 EOSDIS DAACs, has made significant progress in each of these areas over the last decade, and continues to evolve its active archive capabilities. Particular attention has been paid in recent years to ensuring that datasets are "published" in an easily accessible and citable manner through a unified metadata model, a Common Metadata Repository (CMR), a coherent view through the earthdata.gov website, and the assignment of Digital Object Identifiers (DOIs) with well-designed landing/product-information pages.

  4. The EOSDIS Version 0 Distributed Active Archive Center for physical oceanography and air-sea interaction

    NASA Technical Reports Server (NTRS)

    Hilland, Jeffrey E.; Collins, Donald J.; Nichols, David A.

    1991-01-01

    The Distributed Active Archive Center (DAAC) at the Jet Propulsion Laboratory will support scientists specializing in physical oceanography and air-sea interaction. As part of the NASA Earth Observing System Data and Information System Version 0, the DAAC will build on existing capabilities to provide services for data product generation, archiving, distribution, and management of information about data. To meet scientists' immediate needs for data, existing data sets from missions such as Seasat, Geosat, the NOAA series of satellites, and the Global Positioning System will be distributed to investigators upon request. In 1992, ocean topography, wave, and surface roughness data from the Topex/Poseidon radar altimeter mission will be archived and distributed. New data products will be derived from Topex/Poseidon and other sensor systems based on recommendations of the science community. In 1995, ocean wind field measurements from the NASA Scatterometer will be supported by the DAAC.

  5. Data Management in the Euclid Science Archive System

    NASA Astrophysics Data System (ADS)

    de Teodoro, P.; Nieto, S.; Altieri, B.

    2017-06-01

    Euclid is the ESA M2 mission and a milestone in the understanding of the geometry of the Universe. In total, Euclid will produce up to 26 PB of observations per year. The Science Archive System (SAS) belongs to the Euclid Archive System (EAS), which sits at the core of the Euclid Science Ground Segment (SGS). The SAS is being built at the ESAC Science Data Centre (ESDC), which is responsible for the development and operations of the scientific archives for the Astronomy, Planetary, and Heliophysics missions of ESA. The SAS is focused on the needs of the scientific community and is intended to provide access to the most valuable scientific metadata from the Euclid mission. In this paper we describe the architectural design of the system, the implementation progress, and the main challenges, from the data management point of view, in building the SAS.

  6. The Land Processes Distributed Active Archive Center (LP DAAC)

    USGS Publications Warehouse

    Golon, Danielle K.

    2016-10-03

    The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.

  7. Opening the Landsat Archive

    USGS Publications Warehouse

    ,

    2008-01-01

    The USGS Landsat archive holds an unequaled 36-year record of the Earth's surface that is invaluable to climate change studies, forest and resource management activities, and emergency response operations. An aggressive effort is taking place to provide all Landsat imagery [scenes currently held in the USGS Earth Resources Observation and Science (EROS) Center archive, as well as newly acquired scenes daily] free of charge to users with electronic access via the Web by the end of December 2008. The entire Landsat 7 Enhanced Thematic Mapper Plus (ETM+) archive acquired since 1999, and any newly acquired Landsat 7 ETM+ images that have less than 40 percent cloud cover, are currently available for download. When this endeavor is complete, all Landsat 1-5 data will also be available for download. This includes Landsat 1-5 Multispectral Scanner (MSS) scenes, as well as Landsat 4 and 5 Thematic Mapper (TM) scenes.

  8. U.S. Geological Survey, remote sensing, and geoscience data: Using standards to serve us all

    USGS Publications Warehouse

    Benson, Michael G.; Faundeen, John L.

    2000-01-01

    The U.S. Geological Survey (USGS) advocates the use of standards for geoscience and remotely sensed data and metadata, both for its own purposes and for those of its customers. In activities that range from archiving data to making a product, the incorporation of standards makes these functions repeatable and understandable. More importantly, when accepted standards are followed, data discovery and sharing can be more efficient and the overall value to society can be expanded. The USGS archives many terabytes of digital geoscience and remotely sensed data. Several million photographs are also available to the research community. To manage these vast holdings and ensure that strict preservation and high-usability criteria are observed, the USGS uses standards within the archival, data management, public access and ordering, and data distribution areas. The USGS uses Federal and international standards in performing its role as the U.S. National Satellite Land Remote Sensing Data Archive and in its mission as the long-term archive and production center for aerial photographs and cartographic data covering the United States.

  9. Data management and digital delivery of analog data

    USGS Publications Warehouse

    Miller, W.A.; Longhenry, Ryan; Smith, T.

    2008-01-01

    The U.S. Geological Survey's (USGS) data archive at the Earth Resources Observation and Science (EROS) Center is a comprehensive and impartial record of the Earth's changing land surface. USGS/EROS has been archiving and preserving land remote sensing data for over 35 years, and this archive continues to grow as aircraft and satellites acquire more imagery. As a world leader in data preservation, USGS/EROS has a reputation as a technological innovator in solving challenges and ensuring that these collections remain accessible. Other agencies also call on the USGS to consider their collections for long-term archive support. To improve access to the USGS film archive, each frame on every roll of film is being digitized by automated high-performance digital camera systems. The system robotically captures a digital image from each film frame for the creation of browse and medium-resolution image files. Single-frame metadata records are also created to improve access, which otherwise involves interpreting flight indexes. USGS/EROS is responsible for over 8.6 million frames of aerial photographs and 27.7 million satellite images.

  10. Straddling Interdisciplinary Seams: Working Safely in the Field, Living Dangerously With a Model

    NASA Astrophysics Data System (ADS)

    Light, B.; Roberts, A.

    2016-12-01

    Many excellent proposals for observational work have included language detailing how the proposers will appropriately archive their data and publish their results in peer-reviewed literature so that they may be readily available to the modeling community for parameterization development. While such division of labor may be both practical and inevitable, the assimilation of observational results and the development of observationally based parameterizations of physical processes require care and feeding. Key questions include: (1) Is an existing parameterization accurate, consistent, and general? If not, it may be ripe for additional physics. (2) Do functional working relationships exist between human modeler and human observationalist? If not, one or more may need to be initiated and cultivated. (3) If empirical observation and model development are a chicken-and-egg problem, how, given our lack of foreknowledge, can we better design observational science plans to meet the eventual demands of model parameterization? (4) Will the addition of new physics "break" the model? If so, then the addition may be imperative. In the context of these questions, we will make retrospective and forward-looking assessments of a now decade-old numerical parameterization that treats the partitioning of solar energy at the Earth's surface where sea ice is present. While this so-called "Delta-Eddington albedo parameterization" is currently employed in the widely used Los Alamos Sea Ice Model (CICE) and appears to be standing the tests of accuracy, consistency, and generality, we will highlight some ideas for its ongoing development and improvement.

  11. Polarimetry of 600 pulsars from observations at 1.4 GHz with the Parkes radio telescope

    NASA Astrophysics Data System (ADS)

    Johnston, Simon; Kerr, Matthew

    2018-03-01

    Over the past 13 yr, the Parkes radio telescope has observed a large number of pulsars using digital filter bank backends with high time and frequency resolution and the capability for Stokes recording. Here, we use archival data to present polarimetry at an observing frequency of 1.4 GHz for 600 pulsars with spin periods ranging from 0.036 to 8.5 s. We comment briefly on some of the statistical implications of the data and highlight the differences between pulsars with high and low spin-down energy. The data set, images, and table of properties for all 600 pulsars are made available in a public data archive maintained by the CSIRO.

  12. A far-ultraviolet atlas of symbiotic stars observed with IUE. 1. The SWP range

    NASA Technical Reports Server (NTRS)

    Meier, S. R.; Kafatos, M.; Fahey, R. P.; Michalitsianos, A. G.

    1994-01-01

    This atlas contains sample spectra from the far-ultraviolet observations of 32 symbiotic stars obtained with the International Ultraviolet Explorer (IUE) satellite. In all, 394 low-resolution spectra from the short-wavelength primary (SWP) camera covering the range 1200-2000 A have been extracted from the IUE archive, calibrated, and measured. Absolute line fluxes and wavelengths for the prominent emission lines have been tabulated. Tables of both the general properties of these symbiotics and of features specific to the spectrum of each are included. The spectra shown are representative of the different classes of symbiotic stars that are currently in the IUE archive. These include known eclipsing systems and those that have been observed in outburst (as well as quiescence).

  13. Satellite observed thermodynamics during FGGE

    NASA Technical Reports Server (NTRS)

    Smith, W. L.

    1985-01-01

    During the First Global Atmospheric Research Program (GARP) Global Experiment (FGGE), determinations of temperature and moisture were made from TIROS-N and NOAA-6 satellite infrared and microwave sounding radiance measurements. The data were processed by two methods differing principally in their horizontal resolution. At the National Earth Satellite Service (NESS) in Washington, D.C., the data were produced operationally with a horizontal resolution of 250 km for inclusion in the FGGE Level IIb data sets for application to large-scale numerical analysis and prediction models. High horizontal resolution (75 km) sounding data sets were produced using man-machine interactive methods for the special observing periods of FGGE at the NASA/Goddard Space Flight Center and archived as supplementary Level IIb. The procedures used for sounding retrieval and the characteristics and quality of these thermodynamic observations are given.

  14. Air Quality Forecasts Using the NASA GEOS Model

    NASA Technical Reports Server (NTRS)

    Keller, Christoph A.; Knowland, K. Emma; Nielsen, Jon E.; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Follette-Cook, Melanie; Liu, Junhua; hide

    2018-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). These are the highest-resolution global composition forecasts currently available to the public. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning, and exposure studies using the archived analysis fields.

  15. Development of working hypotheses linking management of the Missouri River to population dynamics of Scaphirhynchus albus (pallid sturgeon)

    USGS Publications Warehouse

    Jacobson, Robert B.; Parsley, Michael J.; Annis, Mandy L.; Colvin, Michael E.; Welker, Timothy L.; James, Daniel A.

    2016-01-20

    The initial set of candidate hypotheses provides a useful starting point for quantitative modeling and adaptive management of the river and species. We anticipate that the set of working management hypotheses will change as adaptive management progresses. Importantly, hypotheses that have been filtered out of our multistep process are not discarded: they are archived, and if existing hypotheses prove inadequate to explain observed population dynamics, new hypotheses can be created or filtered hypotheses can be reinstated.

  16. Exploiting Satellite Archives to Estimate Global Glacier Volume Changes

    NASA Astrophysics Data System (ADS)

    McNabb, R. W.; Nuth, C.; Kääb, A.; Girod, L.

    2017-12-01

    In the past decade, the availability of, and the ability to process, remote sensing data over glaciers has expanded tremendously. Newly opened satellite image archives, combined with new processing techniques and increased computing power and storage capacity, have given the glaciological community the ability to observe and investigate glaciological processes and changes on a truly global scale. In particular, the opening of the ASTER archives provides further opportunities to estimate and monitor glacier elevation and volume changes globally, potentially on sub-annual timescales. With this explosion of data availability, however, comes the challenge of seeing the forest instead of the trees. The high volume of data available means that automated detection and proper handling of errors and biases become critical in order to properly study the processes we wish to see. Error sources include holes and blunders in digital elevation models (DEMs) derived from optical data, and penetration of radar signals that biases DEMs derived from radar data, among others. Here, we highlight new advances in the ability to sift through high-volume datasets, and apply these techniques to estimate recent glacier volume changes in the Caucasus Mountains, Scandinavia, Africa, and South America. By properly estimating and correcting for these biases, we additionally provide a detailed accounting of the uncertainties in these estimates of volume change, leading to more reliable results with applicability beyond the glaciological community.
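    As an illustration of the kind of automated screening described above, the sketch below differences two co-registered DEM rasters, masking nodata holes and flagging implausibly large elevation changes as blunders. The nodata value and the blunder threshold are illustrative assumptions, not values from this study:

```python
import numpy as np

def dem_elevation_change(dem_early, dem_late, nodata=-9999.0, max_abs_change=150.0):
    """Difference two co-registered DEM arrays, masking data holes and
    gross blunders before estimating mean glacier elevation change.

    Returns (mean elevation change in metres, per-pixel dh grid with
    NaN where no valid estimate exists)."""
    early = np.where(dem_early == nodata, np.nan, dem_early).astype(float)
    late = np.where(dem_late == nodata, np.nan, dem_late).astype(float)
    dh = late - early
    # Flag physically implausible changes as blunders.
    dh[np.abs(dh) > max_abs_change] = np.nan
    valid = ~np.isnan(dh)
    mean_dh = dh[valid].mean() if valid.any() else np.nan
    return mean_dh, dh
```

    Real processing chains add further steps (co-registration, slope-dependent bias correction, hypsometric gap filling), but the hole/blunder masking shown here is the common first filter.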

  17. AstroDAbis: Annotations and Cross-Matches for Remote Catalogues

    NASA Astrophysics Data System (ADS)

    Gray, N.; Mann, R. G.; Morris, D.; Holliman, M.; Noddle, K.

    2012-09-01

    Astronomers are good at sharing data, but poorer at sharing knowledge. Almost all astronomical data end up in open archives, and access to these is being simplified by the development of the global Virtual Observatory (VO). This is a great advance, but the fundamental problem remains that these archives contain only basic observational data, whereas all the astrophysical interpretation of those data (which source is a quasar, which a low-mass star, and which an image artefact) is contained in journal papers, with very little linkage back from the literature to the original data archives. It is therefore currently impossible for an astronomer to pose a query like "give me all sources in this data archive that have been identified as quasars", and this limits the effective exploitation of these archives, as the user of an archive has no direct means of taking advantage of the knowledge derived by its previous users. The AstroDAbis service aims to address this with a prototype service enabling astronomers to record annotations and cross-identifications, annotating objects in other catalogues. We have deployed two interfaces to the annotations: an astronomy-specific one using the Table Access Protocol (TAP; Dowler et al. 2011), and a second exploiting generic Linked Open Data (LOD) and RDF techniques.
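    Over TAP, a query like the one quoted above would be expressed in ADQL, the SQL dialect used by VO services. The sketch below builds such a query; the table and column names (`annotations`, `source_id`, `classification`) are hypothetical, not the actual AstroDAbis schema:

```python
def quasar_annotation_query(catalogue_table, annotation_table="annotations"):
    """Build an ADQL query joining a remote catalogue against an
    annotation table, in the spirit of the AstroDAbis TAP interface."""
    return (
        "SELECT cat.* "
        f"FROM {catalogue_table} AS cat "
        f"JOIN {annotation_table} AS ann ON ann.source_id = cat.source_id "
        "WHERE ann.classification = 'quasar'"
    )
```

    The resulting string would be submitted to a TAP service endpoint; the key point is that the join crosses from the archive's catalogue to user-contributed annotations.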

  18. A Model Curriculum for the Education and Training of Archivists in Automation: A RAMP Study.

    ERIC Educational Resources Information Center

    Fishbein, M. H.

    This RAMP (Records and Archives Management Programme) study is intended for people involved in planning and conducting archival and records management training; for individual archivists and records managers interested in professional development through continuing education programs; and for all information professionals interested in learning of…

  19. Exploiting Data Intensive Applications on High Performance Computers to Unlock Australia's Landsat Archive

    NASA Astrophysics Data System (ADS)

    Purss, Matthew; Lewis, Adam; Edberg, Roger; Ip, Alex; Sixsmith, Joshua; Frankish, Glenn; Chan, Tai; Evans, Ben; Hurst, Lachlan

    2013-04-01

    Australia's Earth Observation Program has downlinked and archived satellite data acquired under the NASA Landsat mission for the Australian Government since the establishment of the Australian Landsat Station in 1979. Geoscience Australia maintains this archive and produces image products to aid the delivery of government policy objectives. Because processing these data is labor intensive, few national-scale datasets have been created to date. To compile any Earth Observation product, the historical approach has been to select the required subset of data and process it "scene by scene" on an as-needed basis. As data volumes have increased over time and the demand for processed data has grown, it has become increasingly difficult to produce these products rapidly and achieve satisfactory policy outcomes using these historic processing methods. The result is that we have been "drowning in a sea of uncalibrated data", and scientists, policy makers, and the public have not been able to realize the full potential of the Australian Landsat Archive; its value is therefore significantly diminished. To overcome this critical issue, the Australian Space Research Program funded the "Unlocking the Landsat Archive" (ULA) Project from April 2011 to June 2013 to improve the access and utilization of Australia's archive of Landsat data. The ULA Project is a public-private consortium led by Lockheed Martin Australia (LMA) and involving Geoscience Australia (GA), the Victorian Partnership for Advanced Computing (VPAC), the National Computational Infrastructure (NCI) at the Australian National University (ANU), and the Cooperative Research Centre for Spatial Information (CRC-SI). The outputs from the ULA Project will become a fundamental component of Australia's eResearch infrastructure, with the Australian Landsat Archive hosted on the NCI and made openly available under a Creative Commons license.
    NCI provides access to researchers through significant HPC supercomputers, cloud infrastructure, and data resources, along with a large catalogue of software tools that make it possible to fully explore the potential of these data. Under the ULA Project, Geoscience Australia has developed a data-intensive processing workflow on the NCI. This system has allowed us to successfully process 11 years of the Australian Landsat Archive (2000 to 2010 inclusive) into standardized, well-calibrated, sensor-independent data products at a rate that allows for both bulk processing of the archive and near-real-time processing of newly acquired satellite data. These products are available as Optical Surface Reflectance 25 m (OSR25) and other derived products, such as Fractional Cover.

  20. CHEERS: Chemical enrichment of clusters of galaxies measured using a large XMM-Newton sample

    NASA Astrophysics Data System (ADS)

    de Plaa, J.; Mernier, F.; Kaastra, J.; Pinto, C.

    2017-10-01

    The Chemical Enrichment RGS Sample (CHEERS) is designed to comprise the clusters of galaxies best suited to observation with the Reflection Grating Spectrometer (RGS) aboard XMM-Newton. It consists of 5 Ms of deep cluster observations of 44 objects obtained through a very large program and archival observations. The main goal is to measure chemical abundances in the hot Intra-Cluster Medium (ICM) of clusters to provide constraints on chemical evolution models. The origin and evolution of Type Ia supernovae (SNIa), in particular, are still poorly known, and X-ray observations can help constrain models of the SNIa explosion mechanism. Due to the high quality of the data, the uncertainties on the abundances are dominated by systematic effects. By carefully treating each systematic effect, we increase the accuracy of, or estimate the remaining uncertainty on, the measurement. The resulting abundances are then compared to supernova models. Radial abundance profiles are also derived. In this talk, we present an overview of the results obtained by the CHEERS collaboration, focusing on the abundance measurements. The other topics range from turbulence measurements through line broadening to cool gas in groups.

  1. BIOME: A scientific data archive search-and-order system using browser-aware, dynamic pages.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jennings, S.V.; Yow, T.G.; Ng, V.W.

    1997-08-01

    The Oak Ridge National Laboratory's (ORNL) Distributed Active Archive Center (DAAC) is a data archive and distribution center for the National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS). Both the Earth Observing System (EOS) and EOSDIS are components of NASA's contribution to the US Global Change Research Program through its Mission to Planet Earth Program. The ORNL DAAC provides access to data used in ecological and environmental research such as global change, global warming, and terrestrial ecology. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC has developed the Biogeochemical Information Ordering Management Environment (BIOME), a customized search and order system for the World Wide Web (WWW). BIOME is a public system located at http://www-eosdis.ornl.gov/BIOME/biome.html.

  2. BIOME: A scientific data archive search-and-order system using browser-aware, dynamic pages

    NASA Technical Reports Server (NTRS)

    Jennings, S. V.; Yow, T. G.; Ng, V. W.

    1997-01-01

    The Oak Ridge National Laboratory's (ORNL) Distributed Active Archive Center (DAAC) is a data archive and distribution center for the National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS). Both the Earth Observing System (EOS) and EOSDIS are components of NASA's contribution to the US Global Change Research Program through its Mission to Planet Earth Program. The ORNL DAAC provides access to data used in ecological and environmental research such as global change, global warming, and terrestrial ecology. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC has developed the Biogeochemical Information Ordering Management Environment (BIOME), a customized search and order system for the World Wide Web (WWW). BIOME is a public system located at http://www-eosdis.ornl.gov/BIOME/biome.html.

  3. Content Platforms Meet Data Storage, Retrieval Needs

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Earth is under a constant barrage of information from space. Whether from satellites orbiting our planet, spacecraft circling Mars, or probes streaking toward the far reaches of the Solar System, NASA collects massive amounts of data from its spacefaring missions each day. NASA's Earth Observing System (EOS) satellites, for example, provide daily imagery and measurements of Earth's atmosphere, oceans, vegetation, and more. The Earth Observing System Data and Information System (EOSDIS) collects all of that science data and processes, archives, and distributes it to researchers around the globe; EOSDIS recently reached a total archive volume of 4.5 petabytes. Try to store that amount of information in your standard, four-drawer file cabinet, and you would need 90 million to get the job done. To manage the flood of information, NASA has explored technologies to efficiently collect, archive, and provide access to EOS data for scientists today and for years to come. One such technology is now providing similar capabilities to businesses and organizations worldwide.
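    The file-cabinet comparison is easy to sanity-check. Assuming a four-drawer cabinet holds roughly 50 MB worth of printed pages (an illustrative figure, not one stated in the article), the arithmetic works out:

```python
archive_bytes = 4.5e15   # 4.5 petabytes, the EOSDIS archive volume cited above
cabinet_bytes = 50e6     # ~50 MB of printed pages per cabinet (assumption)
cabinets_needed = archive_bytes / cabinet_bytes
print(int(cabinets_needed))  # 90000000, i.e. 90 million cabinets
```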

  4. NASA Earth Observing System Data and Information System (EOSDIS): A U.S. Network of Data Centers Serving Earth Science Data: A Network Member of ICSU WDS

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Ramapriyan, H. K. "Rama"

    2016-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been in operation since August 1994, serving a diverse user community around the world with Earth science data from satellites, aircraft, field campaigns, and research investigations. The ESDIS Project, responsible for EOSDIS, is a Network Member of the International Council for Science (ICSU) World Data System (WDS). Nine of the 12 Distributed Active Archive Centers (DAACs), which are part of EOSDIS, are Regular Members of the ICSU WDS. This poster presents the EOSDIS mission objectives; the key characteristics of the DAACs that make them world-class Earth science data centers; and the successes, challenges, and best practices of EOSDIS, focusing on the years 2014-2016, and illustrates some highlights of EOSDIS accomplishments. The highlights include high customer satisfaction, growing archive and distribution volumes, exponential growth in the number of products distributed to users around the world, a unified metadata model and Common Metadata Repository, the flexibility provided to users by supporting data transformations to suit their applications, near-real-time capabilities to support various operational and research applications, and full-resolution image browse capabilities to help users select data of interest. The poster also illustrates how the ESDIS Project is actively involved in several US and international data system organizations.

  5. Development of landsat-5 thematic mapper internal calibrator gain and offset table

    USGS Publications Warehouse

    Barsi, J.A.; Chander, G.; Micijevic, E.; Markham, B.L.; Haque, Md. O.

    2008-01-01

    The National Landsat Archive Production System (NLAPS) has been the primary processing system for Landsat data since the U.S. Geological Survey (USGS) Earth Resources Observation and Science Center (EROS) started archiving Landsat data. NLAPS converts raw satellite data into radiometrically and geometrically calibrated products. NLAPS historically used the Internal Calibrator (IC) to calibrate the reflective bands of the Landsat-5 Thematic Mapper (TM), even though the lamps in the IC were less stable than the TM detectors, as evidenced by vicarious calibration results. In 2003, a major effort was made to model the actual TM gain change and to update NLAPS to use this model, rather than the unstable IC data, for radiometric calibration. The model coefficients were revised in 2007 to reflect greater understanding of the changes in the TM responsivity. While the calibration updates are important to users with recently processed data, the processing system no longer calculates the original IC gain or offset. For specific applications, it is useful to have a record of the gain and offset actually applied to the older data. Thus, the NLAPS calibration database was used to generate estimated daily values for the radiometric gain and offset that might have been applied to TM data. This paper discusses the need for and the generation of the NLAPS IC gain and offset tables. A companion paper covers the application of, and errors associated with using, these tables.
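    The gain and offset values in such tables enter the standard linear radiometric calibration, which converts a raw digital number (DN) to at-sensor spectral radiance. A minimal sketch of that relation (illustrative, not the NLAPS implementation itself):

```python
def dn_to_radiance(dn, gain, offset):
    """Linear radiometric calibration: L = gain * DN + offset.

    dn      raw quantized pixel value (digital number)
    gain    band-specific gain (radiance per DN count)
    offset  band-specific offset/bias (radiance at DN = 0)
    """
    return gain * dn + offset
```

    Applying the per-band daily gain and offset estimates from the tables in this way would reproduce the radiance an older NLAPS product would have contained; units depend on the product specification.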

  6. National Satellite Land Remote Sensing Data Archive

    USGS Publications Warehouse

    Faundeen, John L.; Longhenry, Ryan

    2018-06-13

    The National Satellite Land Remote Sensing Data Archive is managed on behalf of the Secretary of the Interior by the U.S. Geological Survey’s Earth Resources Observation and Science Center. The Land Remote Sensing Policy Act of 1992 (51 U.S.C. §601) directed the U.S. Department of the Interior to establish a permanent global archive consisting of imagery over land areas obtained from satellites orbiting the Earth. The law also directed the U.S. Department of the Interior, delegated to the U.S. Geological Survey, to ensure proper storage and preservation of imagery, and timely access for all parties. Since 2008, these images have been available at no cost to the user.

  7. Extracting DNA from 'jaws': high yield and quality from archived tiger shark (Galeocerdo cuvier) skeletal material.

    PubMed

    Nielsen, E E; Morgan, J A T; Maher, S L; Edson, J; Gauthier, M; Pepperell, J; Holmes, B J; Bennett, M B; Ovenden, J R

    2017-05-01

    Archived specimens are highly valuable sources of DNA for retrospective genetic/genomic analysis. However, often limited effort has been made to evaluate and optimize extraction methods, which may be crucial for downstream applications. Here, we assessed and optimized the usefulness of abundant archived skeletal material from sharks as a source of DNA for temporal genomic studies. Six different methods for DNA extraction, encompassing two different commercial kits and three different protocols, were applied to material, so-called bio-swarf, from contemporary and archived jaws and vertebrae of tiger sharks (Galeocerdo cuvier). Protocols were compared for DNA yield and quality using a qPCR approach. For jaw swarf, all methods provided relatively high DNA yield and quality, while large differences in yield between protocols were observed for vertebrae. Similar results were obtained from samples of white shark (Carcharodon carcharias). Application of the optimized methods to 38 museum and private angler trophy specimens dating back to 1912 yielded sufficient DNA for downstream genomic analysis for 68% of the samples. No clear relationships between age of samples, DNA quality and quantity were observed, likely reflecting different preparation and storage methods for the trophies. Trial sequencing of DNA capture genomic libraries using 20 000 baits revealed that a significant proportion of captured sequences were derived from tiger sharks. This study demonstrates that archived shark jaws and vertebrae are potential high-yield sources of DNA for genomic-scale analysis. It also highlights that even for similar tissue types, a careful evaluation of extraction protocols can vastly improve DNA yield. © 2016 John Wiley & Sons Ltd.

  8. Internet Services for Professional Astronomy

    NASA Astrophysics Data System (ADS)

    Andernach, H.

    A (subjective) overview of Internet resources relevant to professional astronomers is given. Special emphasis is put on databases of astronomical objects and servers providing general information, e.g. on astronomical catalogues, finding charts from sky surveys, bibliographies, directories, browsers through multi-wavelength observational archives, etc. Archives of specific observational data will be discussed in more detail in other chapters of this book, dealing with the corresponding part of the electromagnetic spectrum. About 200 different links are mentioned, and every attempt was made to make this report as up-to-date as possible. As the field is rapidly growing with improved network technology, it will be just a snapshot of the situation in mid-1998.

  9. Testing for Lorentz violation: constraints on standard-model-extension parameters via lunar laser ranging.

    PubMed

    Battat, James B R; Chandler, John F; Stubbs, Christopher W

    2007-12-14

    We present constraints on violations of Lorentz invariance based on archival lunar laser-ranging (LLR) data. LLR measures the Earth-Moon separation by timing the round-trip travel of light between the two bodies and is currently accurate to the equivalent of a few centimeters (parts in 10^11 of the total distance). By analyzing these LLR data under the standard-model extension (SME) framework, we derived six observational constraints on dimensionless SME parameters that describe potential Lorentz violation. We found no evidence for Lorentz violation at the 10^-6 to 10^-11 level in these parameters. This work constitutes the first LLR constraints on SME parameters.
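
    The round-trip timing behind these constraints converts directly to distance via d = c t / 2, and a few-centimeter uncertainty on the roughly 384,000 km Earth-Moon separation is indeed parts in 10^11. A quick check (the round-trip time used here is illustrative):

```python
# LLR distance from round-trip light time, d = c * t / 2, and a few-cm
# uncertainty expressed as a fraction of the Earth-Moon separation.
C = 299_792_458.0        # speed of light, m/s
ROUND_TRIP_S = 2.5639    # illustrative round-trip time, ~2.56 s

distance_m = C * ROUND_TRIP_S / 2.0   # ~3.84e8 m, the Earth-Moon separation
fractional = 0.03 / distance_m        # a 3 cm uncertainty as a fraction
```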

  10. GLAS Long-Term Archive: Preservation and Stewardship for a Vital Earth Observing Mission

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Zwally, J.; Schutz, B. E.; Hancock, D.; McAllister, M.; Webster, D.; Bond, C.

    2012-12-01

    Data stewardship, preservation, and reproducibility are fast becoming principal parts of a data manager's work. In an era of distributed data and information systems, it is of vital importance that organizations make a commitment to both the current and long-term goals of data management and the preservation of scientific data. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, additional information from the product generation and science teams is needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. NASA has developed a set of Earth science data and information content requirements for long-term preservation that is being used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission was one of the first to end, NASA and NSIDC, in collaboration with the science team, are collecting data, software, and documentation in preparation for long-term support of the ICESat mission. For a long-term archive, it is imperative to preserve sufficient information about how products were prepared in order to assure future researchers that the scientific results are accurate, understandable, and usable. Our experience suggests data centers know what to preserve in most cases: the processing algorithms, along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products, will be archived and made available to users. In other cases, such as pre-launch, calibration/validation, and test data, the data centers must seek guidance from the science team. All these data are essential for product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe the gathering of information with guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and the Science Investigator-led Processing System to the NSIDC Distributed Active Archive Center. The presentation will also cover how we envision user support through the years of the Long-Term Archive.

  11. Can we use Earth Observations to improve monthly water level forecasts?

    NASA Astrophysics Data System (ADS)

    Slater, L. J.; Villarini, G.

    2017-12-01

    Dynamical-statistical hydrologic forecasting approaches offer distinct strengths compared with traditional hydrologic forecasting systems: they are computationally efficient, can integrate and 'learn' from a broad selection of input data (e.g., General Circulation Model (GCM) forecasts, Earth Observation time series, teleconnection patterns), and can take advantage of recent progress in machine learning (e.g. multi-model blending, post-processing and ensembling techniques). Recent efforts to develop a dynamical-statistical ensemble approach for forecasting seasonal streamflow using both GCM forecasts and changing land cover have shown promising results over the U.S. Midwest. Here, we use climate forecasts from several GCMs of the North American Multi-Model Ensemble (NMME) alongside 15-minute stage time series from the National River Flow Archive (NRFA) and land cover classes extracted from the European Space Agency's Climate Change Initiative 300 m annual Global Land Cover time series. With these data, we conduct systematic long-range probabilistic forecasting of monthly water levels in UK catchments over timescales ranging from one to twelve months ahead. We evaluate the improvement in model fit and model forecasting skill that comes from using land cover classes as predictors in the models. This work opens up new possibilities for combining Earth Observation time series with GCM forecasts to predict a variety of hazards from space using data science techniques.
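
    A dynamical-statistical forecast of this kind can be as simple as regressing observed monthly water levels on a GCM precipitation forecast plus a land-cover predictor and scoring the explained variance. The sketch below uses synthetic data and ordinary least squares; the predictor names are illustrative, not the study's actual pipeline:

```python
import numpy as np

# Sketch of a statistical monthly water-level model: regress observed levels
# on a GCM precipitation forecast and an annual land-cover predictor.
# All data below are synthetic; variable names are illustrative.
rng = np.random.default_rng(0)
n = 120                                   # ten years of monthly records
precip_fcst = rng.gamma(2.0, 30.0, n)     # hypothetical NMME precip forecast
urban_frac = np.linspace(0.10, 0.16, n)   # hypothetical CCI land-cover trend
level = 0.8 + 0.01 * precip_fcst + 5.0 * urban_frac + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), precip_fcst, urban_frac])
beta, *_ = np.linalg.lstsq(X, level, rcond=None)  # OLS fit of the predictors
pred = X @ beta
skill = 1 - np.var(level - pred) / np.var(level)  # explained-variance skill
```

    In practice the skill gain from the land-cover predictor would be judged by refitting without it and comparing scores on held-out months.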

  12. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research on climate change is presented. The system allows processing and analysis of large archives of geophysical data obtained both from observations and from modeling. Experience accumulated in developing information-computational web systems for computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al., 2007; Okladnikov et al., 2008; Titov et al., 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing and visualization of data. At present, five data archives are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES 20th Century Global Reanalysis Version I. To provide data-processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. A set of computational modules for the climate change indices approved by the WMO is currently available, along with a module for visualizing results and writing them to Encapsulated PostScript, GeoTIFF and ESRI shape files. The GeoServer software, conforming to OpenGIS standards, serves as the technological basis for presenting cartographic information on the Internet. GIS functionality has been integrated with web-portal software to provide a basis for developing a web portal as part of the geoinformation web system. Such a geoinformation web system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system would be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.

  13. Temporal variability of total cloud cover at a Mediterranean megacity in the 20th century: Evidence from visual observations and climate models

    NASA Astrophysics Data System (ADS)

    Founda, Dimitra; Giannakopoulos, Christos; Pierros, Fragiskos

    2013-04-01

    Cloud cover is one of the major factors determining the radiation budget and the climate system of the Earth. Moreover, the response of clouds has always been an important source of uncertainty in global climate models. Visual surface observations of clouds have been conducted at the National Observatory of Athens (NOA) since the mid 19th century. The historical archive of cloud reports at NOA since 1860 has been digitized and updated, and now spans a century and a half. Mean monthly values of total cloud cover were derived by averaging subdaily observations of cloud cover (3 observations/day). Changes in observational practice (e.g. from 1/10 to 1/8 units) were accounted for; however, subjective estimates of cloud cover by trained observers introduce some uncertainty into the time series. Data before 1884 were considered unreliable, so the analysis was restricted to the series from 1884 to 2012. The time series of total cloud cover at NOA is validated against, and correlated with, independently measured historical time series of other physically related variables such as total sunshine duration and DTR (Diurnal Temperature Range). Trend analysis was performed on the mean annual and seasonal series of total cloud cover from 1884 to 2012. The mean annual values show marked temporal variability with sub-periods of decreasing and increasing tendencies; however, the overall linear trend is positive and statistically significant (p < 0.001), amounting to +2% per decade and implying a total increase of almost 25% over the whole analysed period. These results agree qualitatively with the trends reported in other studies worldwide, especially for the period before the mid 20th century. On a seasonal basis, the spring and summer series present outstanding positive long-term trends, while in winter and autumn total cloud cover also shows positive but less pronounced long-term trends. Additionally, an evaluation of cloud cover and/or sunshine duration/diurnal temperature range as depicted by regional climate models over Athens will be performed. Regional climate models are valuable tools for projections of future climate change, but their performance is typically assessed only in terms of temperature and precipitation. The representation of non-standard parameters such as cloud cover and/or sunshine duration/diurnal temperature range has so far seen little or no evaluation in the models and can therefore be prone to large uncertainties. Regional climate models developed in the framework of recent EU projects, such as ENSEMBLES (www.ensembles-eu.org) and CIRCE (www.circeproject.eu), will be used, and an initial validation of these parameters against the historical archive of NOA will be performed.
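
    A trend result such as "+2% per decade, p < 0.001" comes from ordinary least squares on the annual means with a t-test on the slope. A minimal sketch on a synthetic series built with a 2-per-decade trend (not the NOA record itself):

```python
import numpy as np

# OLS trend on an annual cloud-cover series plus a t-statistic on the slope.
# The series is synthetic: a 2% (absolute) per decade trend plus noise.
rng = np.random.default_rng(1)
years = np.arange(1884, 2013)
cloud = 45.0 + 0.2 * (years - 1884) + rng.normal(0.0, 1.0, years.size)

x = years - years.mean()                   # center the predictor
slope, intercept = np.polyfit(x, cloud, 1)
resid = cloud - (intercept + slope * x)
se = np.sqrt(resid @ resid / (x.size - 2) / (x @ x))  # slope standard error
t_stat = slope / se                        # |t| >> 3 implies p << 0.001
trend_per_decade = slope * 10.0
```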

  14. Albedos of Small Hilda Asteroids

    NASA Astrophysics Data System (ADS)

    Ryan, Erin L.; Woodward, C. E.

    2010-10-01

    We present albedo results for 70 small Hilda dynamical family members detected by the Spitzer Space Telescope in multiple archival programs. These Spitzer data sample Hildas with diameters between 2 and 11 kilometers. Our preliminary analysis reveals that the mean geometric albedo for this sample is pv = 0.05, matching the mean albedo derived for large (20 to 160 km) Hilda asteroids observed by IRAS (Ryan and Woodward 2010). This mean albedo is significantly darker than the mean albedo of asteroids in the outer main belt (2.8 AU < a < 3.5 AU), possibly suggesting that these asteroids did not originate from the outer main belt. This is in direct conflict with some dynamical models which suggest that the Hildas are field asteroids trapped during an inward migration of Jupiter (Franklin et al. 2004), and may provide additional observational support for delivery of dark Kuiper Belt contaminants to the inner solar system as per the Nice Model (Levison et al. 2009).

  15. Stationarity of the Tropical Pacific Teleconnection to North America in CMIP5 PMIP3 Model Simulations

    NASA Technical Reports Server (NTRS)

    Coats, Sloan; Smerdon, Jason E.; Cook, Benjamin I.; Seager, Richard

    2013-01-01

    The temporal stationarity of the teleconnection between the tropical Pacific Ocean and North America (NA) is analyzed in atmosphere-only simulations and in coupled last-millennium, historical, and control runs from the Coupled Model Intercomparison Project Phase 5 data archive. The teleconnection, defined as the correlation between December-January-February (DJF) tropical Pacific sea surface temperatures (SSTs) and DJF 200 mb geopotential height, is found to be nonstationary on multidecadal timescales. There are significant changes in the spatial features of the teleconnection over NA in continuous 56-year segments of the last-millennium and control simulations. Analysis of atmosphere-only simulations forced with observed SSTs indicates that atmospheric noise cannot account for the temporal variability of the teleconnection, which instead is likely explained by the strength of, and multidecadal changes in, tropical Pacific Ocean variability. These results have implications for teleconnection-based analyses of model fidelity in simulating precipitation, as well as for any reconstruction and forecasting efforts that assume stationarity of the observed teleconnection.
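
    The stationarity test described here amounts to correlating an SST index with a height series in consecutive 56-year windows and checking how much the correlation wanders. A toy version with synthetic series and a slowly varying coupling (illustrative, not CMIP5 output):

```python
import numpy as np

# Teleconnection stationarity check: correlate a tropical-Pacific SST index
# with a geopotential-height series in consecutive 56-year windows and look
# at the spread of the window correlations. All data here are synthetic.
rng = np.random.default_rng(2)
n = 1000                                  # a "last millennium" of DJF seasons
sst = rng.normal(0, 1, n)
coupling = 0.9 * np.sin(np.linspace(0, 6 * np.pi, n)) ** 2  # slowly varying
z200 = coupling * sst + rng.normal(0, 1, n)

window = 56
corrs = [np.corrcoef(sst[i:i + window], z200[i:i + window])[0, 1]
         for i in range(0, n - window + 1, window)]
spread = max(corrs) - min(corrs)          # large spread => nonstationary
```

    A pure-noise null (constant coupling) would give a much smaller spread, which is essentially the comparison the atmosphere-only runs provide.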

  16. Observed temperature trends in the Indian Ocean over 1960-1999 and associated mechanisms

    NASA Astrophysics Data System (ADS)

    Alory, Gaël; Wijffels, Susan; Meyers, Gary

    2007-01-01

    The linear trends in oceanic temperature from 1960 to 1999 are estimated using the new Indian Ocean Thermal Archive (IOTA), a compilation of historical temperature profiles. Widespread surface warming is found, as in other data sets, and reproduced in IPCC climate model simulations for the 20th century. This warming is particularly large in the subtropics, and extends down to 800 m around 40-50°S. Models suggest the deep-reaching subtropical warming is related to a 0.5° southward shift of the subtropical gyre driven by a strengthening of the westerly winds, and associated with an upward trend in the Southern Annular Mode index. In the tropics, IOTA shows a subsurface cooling corresponding to a shoaling of the thermocline and increasing vertical stratification. Most models suggest this trend in the tropical Indian thermocline is likely associated with the observed weakening of the Pacific trade winds and transmitted to the Indian Ocean by the Indonesian throughflow.

  17. New Archiving Distributed InfrastructuRe (NADIR): Status and Evolution

    NASA Astrophysics Data System (ADS)

    De Marco, M.; Knapic, C.; Smareglia, R.

    2015-09-01

    The New Archiving Distributed InfrastructuRe (NADIR) has been developed at INAF-OATs IA2 (Italian National Institute for Astrophysics - Astronomical Observatory of Trieste, Italian center of Astronomical Archives) as an evolution of the previous archiving and distribution system used at several telescopes (LBT, TNG, Asiago, etc.), to improve performance, efficiency and reliability. At present, NADIR is running at the LBT telescope and at Vespa (the Italian telescope network for outreach; Ramella et al. 2014), and will be used for the TNG, Asiago and IRA (Istituto di Radioastronomia) archives of the Medicina, Noto and SRT radio telescopes (Zanichelli et al. 2014) as soon as the data models for radio data are ready. This paper discusses the progress, the architectural choices and the solutions adopted during the development and commissioning phases of the project. Special attention is given to the LBT case, owing to some critical aspects of the data flow and to the policy and standards compliance adopted by the LBT organization.

  18. Interactive Webmap-Based Science Planning for BepiColombo

    NASA Astrophysics Data System (ADS)

    McAuliffe, J.; Martinez, S.; Ortiz de Landaluce, I.; de la Fuente, S.

    2015-10-01

    For BepiColombo, ESA's mission to Mercury, we will build a web-based, map-based interface to the Science Planning System. This interface will allow the mission's science teams to visually define targets for observations and interactively specify which operations will make up a given observation. This will be a radical departure from previous ESA mission planning methods. Such an interface will rely heavily on GIS technologies. It will provide footprint coverage of all existing archived data for Mercury, including a set of built-in basemaps, allowing the science teams to analyse their planned observations and operational constraints with relevant contextual information from their own instrument, other BepiColombo instruments or previous missions. The interface will allow users to import and export data in commonly used GIS formats, so that it can be visualised together with the latest planning information (e.g. imported custom basemaps) or analysed in other GIS software. The interface will work with an object-oriented concept of an observation that will be a key characteristic of the overall BepiColombo science planning concept. Observation templates or classes will be tracked right through the planning-execution-processing-archiving cycle to the final archived science products. By using an interface that synthesises all relevant available information, the science teams will gain a better understanding of the operational environment; it will enhance their ability to plan efficiently, minimising or removing manual planning. Interactive 3D visualisation of the planned, scheduled and executed observations, simulation of the viewing conditions, and interactive modification of the observation parameters are also being considered.

  19. Impacts of the Detection of Cassiopeia A Point Source.

    PubMed

    Umeda; Nomoto; Tsuruta; Mineshige

    2000-05-10

    Very recently the Chandra first light observation discovered a point-like source in the Cassiopeia A supernova remnant. This detection was subsequently confirmed by the analyses of the archival data from both ROSAT and Einstein observations. Here we compare the results from these observations with the scenarios involving both black holes (BHs) and neutron stars (NSs). If this point source is a BH, we offer as a promising model a disk-corona type model with a low accretion rate in which a soft photon source at approximately 0.1 keV is Comptonized by higher energy electrons in the corona. If it is an NS, the dominant radiation observed by Chandra most likely originates from smaller, hotter regions of the stellar surface, but we argue that it is still worthwhile to compare the cooler component from the rest of the surface with cooling theories. We emphasize that the detection of this point source itself should potentially provide enormous impacts on the theories of supernova explosion, progenitor scenario, compact remnant formation, accretion to compact objects, and NS thermal evolution.

  20. GES DISC Greenhouse Gas Data Sets and Associated Services

    NASA Technical Reports Server (NTRS)

    Sherman, Elliot; Wei, Jennifer; Vollmer, Bruce; Meyer, David

    2017-01-01

    NASA's Goddard Earth Sciences (GES) Data and Information Services Center (DISC) archives and distributes rich collections of data on atmospheric greenhouse gases from multiple missions. Hosted data include those from the Atmospheric Infrared Sounder (AIRS) mission (which has observed CO2, CH4, ozone, and water vapor since 2002); legacy water vapor and ozone retrievals from the TIROS Operational Vertical Sounder (TOVS); and the Upper Atmosphere Research Satellite (UARS), going back to the early 1980s. GES DISC also archives and supports data from seven projects of the Making Earth System Data Records for Use in Research Environments (MEaSUREs) program that have ozone and water vapor records. Greenhouse gas data from the A-Train satellite constellation are also available: (1) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) ozone, nitrous oxide, and water vapor since 2004; (2) Greenhouse Gases Observing Satellite (GOSAT) CO2 observations since 2009 from the Atmospheric CO2 Observations from Space (ACOS) task; and (3) Orbiting Carbon Observatory-2 (OCO-2) CO2 data since 2014. The most recent related data set that the GES DISC archives is methane flux for North America, part of NASA's Carbon Monitoring System (CMS) project. This dataset contains estimates of methane emission in North America based on an inversion of the GEOS-Chem chemical transport model constrained by GOSAT observations (Turner et al., 2015). Along with data stewardship, an important focus area of the GES DISC is to enhance the usability of its data and broaden its user base. Users have unrestricted access to a new user-friendly search interface, which includes many services such as variable subsetting, format conversion, quality screening, and quick browse. The majority of the GES DISC data sets are also accessible through the Open-source Project for a Network Data Access Protocol (OPeNDAP) and Web Coverage Service (WCS). The latter two services provide more options for specialized subsetting, format conversion, and image viewing. Additional data exploration, data preview, and preliminary analysis capabilities are available via NASA Giovanni, which obviates the need for users to download the data (Acker and Leptoukh, 2007). Giovanni provides a bridge between the data and science and has been very successful in extending GES DISC data to educational users and to users with limited resources.

  1. Clinical experience with a high-performance ATM-connected DICOM archive for cardiology

    NASA Astrophysics Data System (ADS)

    Solomon, Harry P.

    1997-05-01

    A system to archive large image sets, such as cardiac cine runs, with near-realtime response must address several functional and performance issues, including efficient use of a high-performance network connection with standard protocols, an architecture which effectively integrates both short- and long-term mass storage devices, and a flexible data management policy which allows optimization of image distribution and retrieval strategies based on modality and site-specific operational use. Clinical experience with such an archive has allowed evaluation of these systems issues and refinement of a traffic model for cardiac angiography.

  2. Cooperative observation data center for planets: starting with the Mars 2009-2010 observation

    NASA Astrophysics Data System (ADS)

    Nakakushi, T.; Okyudo, M.; Tomita, A.

    2009-12-01

    We propose in this paper a plan to construct a planetary image data center on the internet, linking professional researchers and amateur observers all over the world. Such data archive projects have worked, at least for Mars. Since 2003, one of the authors (T. N.) has run a project to summarize Mars observations using such cooperative network observation data archives and to publish the summaries as professional research papers (Nakakushi et al., 2004, 2005, and 2008). Planetary atmospheres vary on various timescales, which requires temporally continuous observations. Cooperative observation in which amateur observers participate can keep observations continuous and sustainable, making it a powerful tool for revealing planetary climate and meteorology. For the outer planets in particular, we do not know the synoptic "seasonal" variations because of their long periods of revolution, so steady and persistent effort is needed to accumulate observations. That is why we need amateur observers' high-level observation techniques. We also need systems to provide (and reproduce) data for users in an appropriate manner. We start with Mars and our own new data archive website, because we have the most experience with Mars; next, we will expand the system to all the planets. Roughly speaking, there are three steps to expanding the project to all the planets: (1) construct our own Mars cooperative observation data center; (2) link it with professional studies; (3) construct a cooperative observation data center for all planets. There are four problems to tackle: (1) develop web interfaces for users to submit data; (2) develop interfaces for managers; (3) secure finances; (4) secure professional researchers. 2009 and 2010 offer a good apparition for Mars observation. We will manage the Mars image data website, identify problems and solutions in detail, and search for ways to expand it to all the planets and to enable sustainable management.

  3. Human and Machine Entanglement in the Digital Archive: Academic Libraries and Socio-Technical Change

    ERIC Educational Resources Information Center

    Manoff, Marlene

    2015-01-01

    This essay urges a broadening of the discourse of library and information science (LIS) to address the convergence of forces shaping the information environment. It proposes adopting a model from the field of science studies that acknowledges the interdependence and coevolution of social, cultural, and material phenomena. Digital archives and…

  4. Connecting to Collections in Florida: Current Conditions and Critical Needs in Libraries, Archives, and Museums

    ERIC Educational Resources Information Center

    Jorgensen, Corinne; Marty, Paul F.; Braun, Kathy

    2012-01-01

    This article presents results from an IMLS-funded project to evaluate the current state of collections in Florida's libraries, archives, and museums, current practices to preserve and conserve these collections, and perceived needs to maintain and improve these collections for future generations. The survey, modeled after the Heritage Health Index…

  5. Global and regional emissions of HFC-125 (CHF2CF3) from in situ and air archive atmospheric observations at AGAGE and SOGE observatories

    NASA Astrophysics Data System (ADS)

    O'Doherty, S.; Cunnold, D. M.; Miller, B. R.; Mühle, J.; McCulloch, A.; Simmonds, P. G.; Manning, A. J.; Reimann, S.; Vollmer, M. K.; Greally, B. R.; Prinn, R. G.; Fraser, P. J.; Steele, L. P.; Krummel, P. B.; Dunse, B. L.; Porter, L. W.; Lunder, C. R.; Schmidbauer, N.; Hermansen, O.; Salameh, P. K.; Harth, C. M.; Wang, R. H. J.; Weiss, R. F.

    2009-12-01

    High-frequency, in situ observations from the Advanced Global Atmospheric Gases Experiment (AGAGE) and System for Observation of halogenated Greenhouse gases in Europe (SOGE) networks for the period 1998 to 2008, combined with archive flask measurements dating back to 1978, have been used to capture the rapid growth of HFC-125 (CHF2CF3) in the atmosphere. HFC-125 is the fifth most abundant HFC, and it currently makes the third largest contribution of the HFCs to atmospheric radiative forcing. At the beginning of 2008 the global average was 5.6 ppt in the lower troposphere and the growth rate was 16% yr^-1. The extensive observations have been combined with a range of modeling techniques to derive global emission estimates in a top-down approach. It is estimated that 21 kt were emitted globally in 2007, and emissions are estimated to have increased by 15% yr^-1 since 2000. These estimates agree within approximately 20% with values reported to the United Nations Framework Convention on Climate Change (UNFCCC), provided that estimated emissions from East Asia are included. Observations of regionally polluted air masses at individual AGAGE sites have been used to produce emission estimates for Europe (the EU-15 countries), the United States, and Australia. Comparisons between these top-down estimates and bottom-up estimates based on reports by individual countries to the UNFCCC show differences of up to approximately a factor of four. This process of independent verification of emissions, and an understanding of the differences, is vital for assessing the effectiveness of international treaties such as the Kyoto Protocol.
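
    A growth rate like 16% yr^-1 follows from a log-linear fit to the mole-fraction series: the slope of ln(concentration) against time is the fractional growth rate. A sketch on a synthetic series constructed to end at 5.6 ppt in 2008 (illustrative values, not the AGAGE data):

```python
import numpy as np

# Fractional growth rate from a log-linear fit: the slope of ln(mole
# fraction) vs. time. The series is synthetic, built with 16% yr^-1 growth
# reaching 5.6 ppt in 2008 to mimic the reported HFC-125 numbers.
years = np.arange(1998.0, 2009.0)
ppt = 5.6 * 1.16 ** (years - 2008.0)      # exponential growth, 5.6 ppt in 2008

slope = np.polyfit(years, np.log(ppt), 1)[0]
growth_pct = 100.0 * np.expm1(slope)      # back-transform to percent per year
```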

  6. Back to the Future: Long-Term Seismic Archives Revisited

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.

    2007-12-01

Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades, helped by the deployment of seismic stations and their continued operation within the framework of monitoring seismic activity. These archives typically consist of waveforms of seismic events and associated parametric data such as phase arrival time picks and the locations of hypocenters. Catalogs of earthquake locations are fundamental data in seismology, and indeed in the Earth sciences in general. Yet these locations have notoriously low spatial resolution because of errors in both the picks and the models commonly used to locate events one at a time. This limits their potential to address fundamental questions concerning the physics of earthquakes, the structure and composition of the Earth's interior, and the seismic hazards associated with active faults. We report on the comprehensive use of modern waveform cross-correlation based methodologies for high-resolution earthquake location, as applied to regional and global long-term seismic databases. By simultaneous re-analysis of two decades of the digital seismic archive of Northern California, reducing pick errors via cross-correlation and model errors via double-differencing, we achieve up to three orders of magnitude resolution improvement over existing hypocenter locations. The relocated events image networks of discrete faults at seismogenic depths across various tectonic settings that until now have been hidden in location uncertainties. Similar location improvements are obtained for earthquakes recorded at global networks by re-processing 40 years of parametric data from the ISC and corresponding waveforms archived at IRIS. Since our methods are scalable and run on inexpensive Beowulf clusters, periodic re-analysis of entire archives may become a routine procedure to continuously improve resolution in existing catalogs.
We demonstrate the role of seismic archives in obtaining the precise location of new events in real-time. Such information has considerable social and economic impact in the evaluation and mitigation of seismic hazards, for example, and highlights the need for consistent long-term seismic monitoring and archiving of records.
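The pick-error reduction step described above rests on measuring differential arrival times between similar waveforms by cross-correlation. A minimal pure-Python sketch of that lag-picking idea, on synthetic discrete signals (real workflows operate on band-passed seismograms; this only shows the concept):

```python
# Minimal sketch of the cross-correlation step used to refine
# differential arrival times between two similar waveforms.

def best_lag(a, b, max_lag):
    """Return the integer lag (in samples) of b relative to a that
    maximizes the zero-padded cross-correlation."""
    def xcorr(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a))
                   if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

if __name__ == "__main__":
    pulse = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, 0.0]
    a = [0.0] * 3 + pulse + [0.0] * 10   # pulse starts at sample 3
    b = [0.0] * 5 + pulse + [0.0] * 8    # same pulse, delayed 2 samples
    print(best_lag(a, b, max_lag=5))
```

In double-difference relocation, such correlation-measured lags replace the noisier differences between analyst picks, which is what drives the resolution gain.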

  7. Searches for 3.5 keV Absorption Features in Cluster AGN Spectra

    NASA Astrophysics Data System (ADS)

    Conlon, Joseph P.

    2018-06-01

    We investigate possible evidence for a spectral dip around 3.5 keV in central cluster AGNs, motivated by previous results for archival Chandra observations of the Perseus cluster and the general interest in novel spectral features around 3.5 keV that may arise from dark matter physics. We use two deep Chandra observations of the Perseus and Virgo clusters that have recently been made public. In both cases, mild improvements in the fit (Δχ2 = 4.2 and Δχ2 = 2.5) are found by including such a dip at 3.5 keV into the spectrum. A comparable result (Δχ2 = 6.5) is found re-analysing archival on-axis Chandra ACIS-S observations of the centre of the Perseus cluster.

  8. OGLE-2002-BLG-360: from a gravitational microlensing candidate to an overlooked red transient

    NASA Astrophysics Data System (ADS)

    Tylenda, R.; Kamiński, T.; Udalski, A.; Soszyński, I.; Poleski, R.; Szymański, M. K.; Kubiak, M.; Pietrzyński, G.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Wyrzykowski, Ł.

    2013-07-01

    Context. OGLE-2002-BLG-360 was discovered as a microlensing candidate by the OGLE-III project. The subsequent light curve, however, clearly showed that the brightening of the object could not have resulted from the gravitational microlensing phenomenon. Aims: We aim to explain the nature of OGLE-2002-BLG-360 and its eruption observed in 2002-2006. Methods: The observational data primarily come from the archives of the OGLE project, which monitored the object in 2001-2009. The archives of the MACHO and MOA projects also provided us with additional data obtained in 1995-99 and 2000-2005, respectively. These data allowed us to analyse the light curve of the object during its eruption, as well as the potential variability of its progenitor. In the archives of several infrared surveys, namely 2MASS, MSX, Spitzer, AKARI, WISE, and VVV, we found measurements of the object, which allowed us to study the spectral energy distribution (SED) of the object. We constructed a simple model of a star surrounded by a dusty envelope, which was used to interpret the observed SED. Results: Our analysis of the data clearly shows that OGLE-2002-BLG-360 was most probably a red transient, i.e. an object similar in nature to V838 Mon, whose eruption was observed in 2002. The SED in all phases, i.e. progenitor, eruption, and remnant, was dominated by infrared emission, which we interpret as evidence of dust formation in an intense mass outflow. Since 2009 the object has been completely embedded in dust. Conclusions: We suggest that the progenitor of OGLE-2002-BLG-360 was a binary, which had entered the common-envelope phase a long time (at least decades) before the observed eruption, and that the eruption resulted from the final merger of the binary components. We point out similarities between OGLE-2002-BLG-360 and CK Vul, whose eruption was observed in 1670-72, and this strengthens the hypothesis that CK Vul was also a red transient. 
Based on observations obtained with the 1.3-m Warsaw telescope at the Las Campanas Observatory of the Carnegie Institution for Science. We dedicate this paper to the memory of the late Professor Bohdan Paczyński who, from its discovery, traced the strange behaviour of OGLE-2002-BLG-360 with great interest. The analysis of the light curve of this unusual object maintained his excitement for science despite serious illness.

  9. Improving the photometric precision of IRAC Channel 1

    NASA Astrophysics Data System (ADS)

    Mighell, Kenneth J.; Glaccum, William; Hoffmann, William

    2008-07-01

Planning is underway for a possible post-cryogenic mission with the Spitzer Space Telescope. Only Channels 1 and 2 (3.6 and 4.5 μm) of the Infrared Array Camera (IRAC) will be operational; they will have unmatched sensitivity from 3 to 5 microns until the James Webb Space Telescope is launched. At SPIE Orlando, Mighell described his NASA-funded MATPHOT algorithm for precision stellar photometry and astrometry and presented MATPHOT-based simulations suggesting that Channel 1 stellar photometry may be significantly improved by modeling the nonuniform intrapixel quantum efficiency (RQE), which, when not taken into account in aperture photometry, causes the derived flux to vary according to where the centroid falls within a single pixel (the pixel-phase effect). We analyze archival observations of calibration stars and compare the precision of stellar aperture photometry, with the recommended 1-dimensional and a new 2-dimensional pixel-phase aperture-flux correction, against MATPHOT-based PSF-fitting photometry, which accounts for the observed loss of stellar flux due to the nonuniform intrapixel quantum efficiency. We show how the precision of aperture photometry of bright isolated stars corrected with the new 2-dimensional aperture-flux correction function can yield photometry that is almost as precise as that produced by PSF-fitting procedures. This timely research effort is intended to enhance the science return not only of observations already in the Spitzer data archive but also of those that would be made during the Spitzer Warm Mission.
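A pixel-phase correction of the kind described above can be sketched generically: the measured aperture flux is divided by a factor that depends on the centroid's radial distance from its pixel center. The functional form and the coefficient K below are illustrative placeholders, NOT the calibrated IRAC Channel 1 values (those are given in the instrument documentation):

```python
# Illustrative 1-D radial pixel-phase correction. K is a made-up
# amplitude for the pixel-phase modulation, not the IRAC value.
import math

K = 0.05   # hypothetical modulation amplitude

def pixel_phase(x, y):
    """Radial distance of the centroid from the center of its pixel."""
    dx = x - math.floor(x) - 0.5
    dy = y - math.floor(y) - 0.5
    return math.hypot(dx, dy)

def correct_flux(raw_flux, x, y):
    """Divide out an assumed linear dependence of throughput on
    pixel phase, normalized at the mean phase 1/sqrt(2*pi)."""
    p = pixel_phase(x, y)
    return raw_flux / (1.0 + K * (1.0 / math.sqrt(2.0 * math.pi) - p))

if __name__ == "__main__":
    print(correct_flux(1000.0, 100.5, 200.5))  # centroid at pixel center
    print(correct_flux(1000.0, 100.0, 200.0))  # centroid at pixel corner
```

The 2-dimensional correction discussed in the abstract generalizes this idea by letting the correction depend on both intra-pixel coordinates rather than on the radial phase alone.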

  10. The IRIS Data Management Center: Enabling Access to Observational Time Series Spanning Decades

    NASA Astrophysics Data System (ADS)

    Ahern, T.; Benson, R.; Trabant, C.

    2009-04-01

The Incorporated Research Institutions for Seismology (IRIS) is funded by the National Science Foundation (NSF) to operate the facilities to generate, archive, and distribute seismological data to research communities in the United States and internationally. The IRIS Data Management System (DMS) is responsible for the ingestion, archiving, curation, and distribution of these data. The IRIS Data Management Center (DMC) manages data from more than 100 permanent seismic networks and hundreds of temporary seismic deployments, as well as data from other geophysical observing networks such as magnetotelluric sensors, ocean bottom sensors, superconducting gravimeters, strainmeters, surface meteorological measurements, and in-situ atmospheric pressure measurements. The IRIS DMC has data from more than 20 different types of sensors. The IRIS DMC manages approximately 100 terabytes of primary observational data. These data are archived in multiple distributed storage systems that ensure data availability independent of any single catastrophic failure. Storage systems include both RAID systems of greater than 100 terabytes and robotic tape libraries of petabyte capacity. IRIS performs routine transcription of the data to new media and storage systems to ensure the long-term viability of the scientific data. IRIS adheres to the OAIS Data Preservation Model in most cases. The IRIS data model requires the availability of metadata describing the characteristics and geographic location of sensors before data can be fully archived. IRIS works with the International Federation of Digital Seismographic Networks (FDSN) in the definition and evolution of the metadata. The metadata ensure that the data remain useful to both current and future generations of earth scientists. Curation of the metadata and time series is one of the most important activities at the IRIS DMC. Data analysts and an automated quality assurance system monitor the quality of the incoming data.
This ensures the data are of acceptably high quality. The formats and data structures used by the seismological community are esoteric. IRIS and its FDSN partners are developing web services that can transform the data holdings into structures that are more easily used by broader scientific communities. For instance, atmospheric scientists are interested in using global observations of microbarograph data, but that community does not understand the methods of applying instrument corrections to the observations. Web processing services under development at IRIS will transform these data in a manner that allows direct use within analysis tools such as MATLAB® already in use by that community. By continuing to develop web-service based methods of data discovery and access, IRIS is enabling broader access to its data holdings. We currently support data discovery using many of the Open Geospatial Consortium (OGC) web mapping services. We are involved in portal technologies to support data discovery and distribution for all data from the EarthScope project. We are working with computer scientists at several universities, including the University of Washington, as part of a DataNet proposal, and we intend to enhance metadata, further develop ontologies, develop a Registry Service to aid in the discovery of data sets and services, and in general improve the semantic interoperability of the data managed at the IRIS DMC. Finally, IRIS has been identified as one of four scientific organizations that the External Research Division of Microsoft wants to work with in the development of web services, specifically the development of a scientific workflow engine. More specific details of current and future developments at the IRIS DMC will be included in this presentation.

  11. Radioactive Pollution Estimate for Fukushima Nuclear Power Plant by a Particle Model

    NASA Astrophysics Data System (ADS)

    Saito, Keisuke; Ogawa, Susumu

    2016-06-01

On Mar 12, 2011, widespread radioactive pollution was caused by a hydrogen explosion at the Fukushima Nuclear Power Plant. A large amount of radioisotopes was released in four explosions. Traditional atmospheric diffusion models could not reconstruct the radioactive pollution in Fukushima. Therefore, the accident was reconstructed with a particle model using a meteorological archive and Radar-AMeDAS data. Calculations with the particle model were carried out for Mar 12, 15, 18, and 20, when east-southeast winds blew continuously for five hours. The meteorological archive provides wind speeds and directions on a five-km grid every hour, in eight height classes up to 3000 m. Radar-AMeDAS provides precipitation data on a one-km grid every thirty minutes. The particles span ten size classes from 0.01 to 0.1 mm in diameter, with a specific gravity of 2.65 and vertical speeds given by the Stokes equation. On Mar 15, however, rain began at 16:30, and in the calculation the particles then fell out immediately as wet deposition. Ground altitudes were given by a DEM on a 1-km grid. The spatial dose from emitted radioisotopes was referenced to the observation data at monitoring posts of the Tokyo Electric Power Company. The falling points of the radioisotopes were mapped using the particle model. As a result, the simulated distributions matched the surface spatial dose of radioisotopes from the aero-monitoring by the Ministry of Education, Culture, Sports, Science and Technology. In particular, on Mar 15 the simulated pollution fitted the observations, extending to the northwest of the Fukushima Daiichi Nuclear Power Plant, where the most severe pollution occurred. With the particle model, the falling positions on the ground were estimated for each particle size. Particles larger than 0.05 mm were affected by the topography and blocked by mountains with altitudes above 700 m. The particle model does not include atmospheric stability, the source height, or deposition speeds.
The remaining task is to express the differences in deposition for each nuclide.
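The vertical speeds the abstract assigns via the Stokes equation can be sketched directly. The air density and viscosity below are assumed sea-level values, not figures from the paper:

```python
# Sketch of the Stokes terminal settling speed used to assign the
# particles' vertical velocities. Air properties are assumed values.
# Note: for the largest sizes (~0.1 mm) the particle Reynolds number
# exceeds the strict Stokes regime, so this overestimates somewhat.

G = 9.81          # gravitational acceleration, m/s^2
RHO_P = 2650.0    # particle density, kg/m^3 (specific gravity 2.65)
RHO_AIR = 1.2     # assumed air density, kg/m^3
MU_AIR = 1.8e-5   # assumed dynamic viscosity of air, Pa*s

def stokes_settling_speed(diameter_m):
    """v = (rho_p - rho_f) * g * d**2 / (18 * mu), in m/s."""
    return (RHO_P - RHO_AIR) * G * diameter_m ** 2 / (18.0 * MU_AIR)

if __name__ == "__main__":
    for d_mm in (0.01, 0.05, 0.1):
        v = stokes_settling_speed(d_mm * 1e-3)
        print(f"d = {d_mm:.2f} mm -> v = {v:.3f} m/s")
```

Because the speed scales as the diameter squared, the ten size classes span a factor of one hundred in settling speed, which is why the larger particles fall out near the mountains while the smallest travel far downwind.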

  12. Purple spot damage dynamics investigated by an integrated approach on a 1244 A.D. parchment roll from the Secret Vatican Archive.

    PubMed

    Migliore, Luciana; Thaller, Maria Cristina; Vendittozzi, Giulia; Mejia, Astrid Yazmine; Mercuri, Fulvio; Orlanducci, Silvia; Rubechini, Alessandro

    2017-09-07

Ancient parchments are commonly attacked by microbes, producing purple spots and detachment of the superficial layer. Neither standard cultivation nor molecular methods (DGGE) have solved the issue: the causative agents and colonization model remain unknown. To identify the putative causal agents, we describe the 16S rRNA gene analysis (454-pyrosequencing) of the microbial communities colonizing a damaged parchment roll dated 1244 A.D. (A.A. Arm. I-XVIII 3328, Vatican Secret Archives). The taxa in damaged and undamaged areas of the same document were different. In the purple spots, marine halotolerant Gammaproteobacteria, mainly Vibrio, were found; these microorganisms are rare or absent in the undamaged areas. Ubiquitous and environmental microorganisms were observed in samples from both damaged and undamaged areas. Pseudonocardiales were the most common, representing the main colonizers of undamaged areas. We hypothesize a successional model of biodeterioration, based on metagenomic data and spectroscopic analysis of pigments, which helps to relate the damage to a microbial agent. Furthermore, a new method (Light Transmitted Analysis) was utilized to evaluate the kind and extent of the damage to native collagen. These data significantly advance knowledge in the field and open new perspectives for remediation activities on a vast number of ancient documents.

  13. A Comparison Between Spectral Properties of ULXs and Luminous X-ray Binaries

    NASA Astrophysics Data System (ADS)

    Berghea, C. T.; Colbert, E. J. M.; Roberts, T. P.

    2004-05-01

What is special about the 10^39 erg s^-1 limit that is used to define the ULX class? We investigate this question by analyzing Chandra X-ray spectra of 71 X-ray bright point sources from nearby galaxies. Fifty-one of these sources are ULXs (L_X(0.3-8.0 keV) ≥ 10^39 erg s^-1), and 20 sources (our comparison sample) are less-luminous X-ray binaries with L_X(0.3-8.0 keV) = 10^38-10^39 erg s^-1. Our sample objects were selected from the Chandra archive to have ≥1000 counts and thus represent the highest quality spectra in the Chandra archives for extragalactic X-ray binaries and ULXs. We fit the spectra with one-component models (e.g., cold absorption with a power law, or cold absorption with a multi-colored disk blackbody) and two-component models (e.g., absorption with both a power law and a multi-colored disk blackbody). A crude measure of the spectral state of each source is determined observationally by calibrating the strength of the disk (blackbody) and coronal (power-law) components. These results are then used to determine whether the spectral properties of the ULXs are statistically distinct from those of the comparison objects, which are assumed to be ``normal'' black-hole X-ray binaries.

  14. On the Structure of Earth Science Data Collections

    NASA Astrophysics Data System (ADS)

    Barkstrom, B. R.

    2009-12-01

While there has been substantial work in the IT community regarding metadata and file identifier schemas, there appears to be relatively little work on the organization of the file collections that constitute the preponderance of Earth science data. One symptom of this difficulty appears in nomenclature describing collections: the terms `Data Product,' `Data Set,' and `Version' are overlaid with multiple meanings between communities. A particularly important aspect of this lack of standardization appears when the community attempts to develop a schema for data file identifiers. There are four candidate families of identifiers: ● Randomly assigned identifiers, such as GUIDs or UUIDs, ● Segmented numerical identifiers, such as OIDs or the prefixes for DOIs, ● Extensible URL-based identifiers, such as URNs, PURL, ARK, and similar schemas, ● Text-based identifiers based on citations for papers and books, such as those suggested for the International Polar Year (IPY) citations. Unfortunately, these schema families appear to be devoid of content based on the actual structures of Earth science data collections. In this paper, we consider an organization based on an industrial production paradigm that appears to provide the preponderance of Earth science data from satellites and in situ observations. This paradigm produces a hierarchical collection structure, similar to one discussed in Barkstrom [2003: Lecture Notes in Computer Science, 2649, pp. 118-133]. In this organization, three key collection types are ● a Data Product, which is a collection of files that have similar key parameters and included data time interval, ● a Data Set, which is a collection of files within a Data Product that comes from a specified set of Data Sources, ● a Data Set Version, which is a collection of files within a Data Set for which the data producer has attempted to ensure error homogeneity.
Within a Data Set Version, files appear as a time series of instances that may be identified by the starting time of the data in the file. For data intended for climate uses, it seems appropriate to state this time in terms of Astronomical Julian Date, which is a long-standing international standard that provides continuity between current observations and paleo-climatic observations. Because this collection structure is hierarchical, it could be used by either of the two hierarchical identifier schema families, although it is probably easier to use with the OID/DOI family. This hierarchical collection structure fits into the hierarchical structure of Archival Information Packages (AIPs) identified in the Open Archival Information Systems (OAIS) Reference Model. In that model, AIPs are subdivided into Archival Information Units (AIUs), which describe individual files, or Archival Information Collections (AICs). The latter can be hierarchically nested, leading to an OAIS RM-consistent collection structure that does not appear clearly in other metadata standards. This paper will also discuss the connection between these collection categories and other metadata, as well as the possible need for other organizational schemas to capture the full range of Earth science data collection structures.
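Identifying each file by the Julian Date of its start time, as proposed above, reduces to the standard Gregorian-to-Julian-Day conversion. A sketch using the well-known Fliegel and Van Flandern integer algorithm (the function name is ours; fractional days for the time of day would be added on top of the integer day number, which refers to noon):

```python
# Gregorian calendar date -> Julian Day Number (noon of that civil
# date), via the Fliegel & Van Flandern (1968) integer algorithm.

def julian_day_number(year, month, day):
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    return (day + (153 * m + 2) // 5 + 365 * y
            + y // 4 - y // 100 + y // 400 - 32045)

if __name__ == "__main__":
    print(julian_day_number(2000, 1, 1))   # -> 2451545 (J2000.0 epoch)
```

Because the Julian Day is a single monotonically increasing integer, files within a Data Set Version sort chronologically by identifier with no calendar logic, which is part of the continuity argument made above.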

  15. SURA-IOOS Coastal Inundation Testbed Inter-Model Evaluation of Tides, Waves, and Hurricane Surge in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Kerr, P. C.; Donahue, A.; Westerink, J. J.; Luettich, R.; Zheng, L.; Weisberg, R. H.; Wang, H. V.; Slinn, D. N.; Davis, J. R.; Huang, Y.; Teng, Y.; Forrest, D.; Haase, A.; Kramer, A.; Rhome, J.; Feyen, J. C.; Signell, R. P.; Hanson, J. L.; Taylor, A.; Hope, M.; Kennedy, A. B.; Smith, J. M.; Powell, M. D.; Cardone, V. J.; Cox, A. T.

    2012-12-01

The Southeastern Universities Research Association (SURA), in collaboration with the NOAA Integrated Ocean Observing System program and other federal partners, developed a testbed to help accelerate progress in both research and the transition to operational use of models for coastal and estuarine prediction. This testbed facilitates cyber-based sharing of data and tools, archival of observation data, and the development of cross-platform tools to efficiently access, visualize, skill-assess, and evaluate model results. In addition, this testbed enables the modeling community to quantitatively assess the behavior (e.g., skill, robustness, execution speed) and implementation requirements (e.g., resolution, parameterization, computer capacity) that characterize the suitability and performance of selected models from both operational and fundamental science perspectives. This presentation focuses on the tropical coastal inundation component of the testbed and compares a variety of model platforms and grids in simulating tides and the wave and surge environments for two extremely well documented historical hurricanes, Hurricanes Rita (2005) and Ike (2008). Model platforms included are ADCIRC, FVCOM, SELFE, SLOSH, SWAN, and WWMII. Model validation assessments were performed on simulation results using numerous station observation data in the form of decomposed harmonic constituents, water level high water marks, and hydrographs of water level and wave data. In addition, execution speed, inundation extents defined by differences in wetting/drying schemes, and resolution and parameterization sensitivities are also explored.

  16. Models Archive and ModelWeb at NSSDC

    NASA Astrophysics Data System (ADS)

    Bilitza, D.; Papitashvili, N.; King, J. H.

    2002-05-01

In addition to its large data holdings, NASA's National Space Science Data Center (NSSDC) also maintains an archive of space physics models for public use (ftp://nssdcftp.gsfc.nasa.gov/models/). The more than 60 model entries cover a wide range of parameters, from the atmosphere, to the ionosphere, to the magnetosphere, to the heliosphere. The models are primarily empirical models developed by the respective model authors based on long data records from ground and space experiments. An online model catalog (http://nssdc.gsfc.nasa.gov/space/model/) provides information about these and other models and links to the model software where available. We will briefly review the existing model holdings and highlight some of their usage and users. In response to a growing need in the user community, NSSDC began to develop web interfaces for the most frequently requested models. These interfaces enable users to compute and plot model parameters online for the specific conditions they are interested in. Currently included in the ModelWeb system (http://nssdc.gsfc.nasa.gov/space/model/) are the following models: the International Reference Ionosphere (IRI) model, the Mass Spectrometer Incoherent Scatter (MSIS) E90 model, the International Geomagnetic Reference Field (IGRF), and the AP/AE-8 models for the radiation belt electrons and protons. User accesses to both systems have been steadily increasing over recent years, with occasional spikes prior to large scientific meetings. The current monthly rate is between 5,000 and 10,000 accesses for either system; in February 2002, 13,872 accesses were recorded to the ModelWeb and 7,092 accesses to the models archive.

  17. Human and natural influences on the changing thermal structure of the atmosphere

    PubMed Central

    Santer, Benjamin D.; Painter, Jeffrey F.; Bonfils, Céline; Mears, Carl A.; Solomon, Susan; Wigley, Tom M. L.; Gleckler, Peter J.; Schmidt, Gavin A.; Doutriaux, Charles; Gillett, Nathan P.; Taylor, Karl E.; Thorne, Peter W.; Wentz, Frank J.

    2013-01-01

    Since the late 1970s, satellite-based instruments have monitored global changes in atmospheric temperature. These measurements reveal multidecadal tropospheric warming and stratospheric cooling, punctuated by short-term volcanic signals of reverse sign. Similar long- and short-term temperature signals occur in model simulations driven by human-caused changes in atmospheric composition and natural variations in volcanic aerosols. Most previous comparisons of modeled and observed atmospheric temperature changes have used results from individual models and individual observational records. In contrast, we rely on a large multimodel archive and multiple observational datasets. We show that a human-caused latitude/altitude pattern of atmospheric temperature change can be identified with high statistical confidence in satellite data. Results are robust to current uncertainties in models and observations. Virtually all previous research in this area has attempted to discriminate an anthropogenic signal from internal variability. Here, we present evidence that a human-caused signal can also be identified relative to the larger “total” natural variability arising from sources internal to the climate system, solar irradiance changes, and volcanic forcing. Consistent signal identification occurs because both internal and total natural variability (as simulated by state-of-the-art models) cannot produce sustained global-scale tropospheric warming and stratospheric cooling. Our results provide clear evidence for a discernible human influence on the thermal structure of the atmosphere. PMID:24043789

  18. VizieR Online Data Catalog: Proper motions and photometry of stars in NGC 3201 (Sariya+, 2017)

    NASA Astrophysics Data System (ADS)

    Sariya, D. P.; Jiang, I.-G.; Yadav, R. K. S.

    2017-07-01

To determine the PMs of the stars in this work, we used archive images (http://archive.eso.org/eso/esoarchivemain.html) from observations made with the 2.2m ESO/MPI telescope at La Silla, Chile. This telescope hosts a mosaic camera called the Wide-Field Imager (WFI), consisting of a 4x2 array of eight CCD chips. Since each CCD has an array of 2048x4096 pixels, the WFI produces images with a 34x33 arcmin^2 field of view. The observational run of the first epoch contains two images in the B, V, and I bands, each with a 240 s exposure time, observed on 1999 December 05. In the second epoch, we have 35 images with a 40 s exposure time each in the V filter, observed during the period 2014 April 02-05. Thus the epoch gap between the data is ~14.3 years. (2 data files).
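The role of the ~14.3-year epoch gap can be sketched with the basic proper-motion arithmetic: a positional displacement between epochs, divided by the gap, gives a motion in mas/yr. The WFI plate scale of ~0.238 arcsec/pixel is an assumed value for illustration, not taken from the catalog description:

```python
# Sketch: epoch gap -> proper motion. Pixel scale is an assumption.

PIXEL_SCALE_ARCSEC = 0.238   # assumed WFI plate scale, arcsec/pixel
EPOCH_GAP_YR = 14.3          # epoch gap quoted in the record

def proper_motion_mas_per_yr(displacement_pixels):
    """Convert a displacement between the two epochs into mas/yr."""
    return displacement_pixels * PIXEL_SCALE_ARCSEC * 1000.0 / EPOCH_GAP_YR

if __name__ == "__main__":
    # A star shifted by 0.3 pixel between the 1999 and 2014 epochs:
    print(f"{proper_motion_mas_per_yr(0.3):.2f} mas/yr")
```

The long baseline is what makes small centroid shifts measurable: even a few hundredths of a pixel of displacement corresponds to a proper motion of order 1 mas/yr.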

  19. Studies of Global Solar Magnetic Field Patterns Using a Newly Digitized Archive

    NASA Astrophysics Data System (ADS)

    Hewins, I.; Webb, D. F.; Gibson, S. E.; McFadden, R.; Emery, B. A.; Malanushenko, A. V.

    2017-12-01

The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly Hα, He 10830 Å, and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced polarity inversion lines (PILs), filaments, sunspots, and plage and, later, coronal holes, yielding a unique 45-year record of features associated with the large-scale organization of the solar magnetic field. We discuss our efforts to preserve and digitize this archive: the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed, and a website and an archival repository at NOAA's National Centers for Environmental Information (NCEI) have been created. The archive is complete for SC 23 and partially complete for SCs 21 and 22. In this paper we show examples of how the database can be utilized for scientific applications. We compare the evolution of the areas and boundaries of CHs with other recent results, and we use the maps to track the global, SC-evolution of filaments, large-scale positive and negative polarity regions, PILs, and sunspots.

  20. Scalable Data Mining and Archiving for the Square Kilometre Array

    NASA Astrophysics Data System (ADS)

    Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.

    2011-12-01

As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks like cataloging both the data itself and contextual meta-data, creating and maintaining scalable permanent archives, and making data available on demand for research present significant software engineering challenges when considered at the scales of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges with a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO), and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata including temporal bounds and job-relevant/processing information. A remote content acquisition (push/pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as Continuum Imaging and Spectral Line Cube generation, executes the algorithm, and ingests its output (along with relevant extracted metadata).
In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7, and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open source software, represent a concerted effort to seek a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, and at the same time ensuring that the data, with its richly annotated catalog of meta-data, remains a viable resource for research into the future.
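The crawler concept described above (walk a staging area, identify each file's MIME type, collect metadata for ingestion) can be mimicked in a few lines of stdlib Python. This is a toy sketch of the idea only; it does not use the Apache OODT APIs:

```python
# Toy sketch of a metadata-extracting ingest crawler: walk a staging
# directory, guess each file's MIME type, and collect simple metadata.

import mimetypes
import os

def crawl(staging_dir):
    """Yield (path, mime_type, size_bytes) records for every file."""
    for root, _dirs, files in os.walk(staging_dir):
        for name in files:
            path = os.path.join(root, name)
            mime, _encoding = mimetypes.guess_type(name)
            yield path, mime or "application/octet-stream", os.path.getsize(path)

if __name__ == "__main__":
    import tempfile
    with tempfile.TemporaryDirectory() as d:
        with open(os.path.join(d, "obs.txt"), "w") as f:
            f.write("raw visibilities\n")
        for record in crawl(d):
            print(record)
```

A production framework like OODT layers workflow triggering, richer metadata extractors, and catalog registration on top of this basic walk-and-classify loop.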

  1. Exploring the blazar zone in high-energy flares of FSRQs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pacciani, L.; Donnarumma, I.; Tavecchio, F.

    2014-07-20

The gamma-ray emission offers a powerful diagnostic tool to probe jets and their surroundings in flat-spectrum radio quasars (FSRQs). In particular, sources emitting at high energies (>10 GeV) give us the strongest constraints. This motivates us to start a systematic study of flares with bright emission above 10 GeV, examining archival data of the Fermi-LAT gamma-ray telescope. At the same time, we began to trigger Target of Opportunity observations with the Swift observatory at the occurrence of high-energy flares, obtaining a wide coverage of the spectral energy distributions (SEDs) for several FSRQs during flares. Among others, we investigate the SED of a peculiar flare of 3C 454.3, showing a remarkably hard gamma-ray spectrum, quite different from the brightest flares of this source, and a bright flare of CTA 102. We modeled the SED in the framework of the one-zone leptonic model, also using archival optical spectroscopic data to derive the luminosity of the broad lines and thus estimate the disk luminosity, from which the structural parameters of the FSRQ nucleus can be inferred. The model allowed us to evaluate the magnetic field intensity in the blazar zone and to locate the emitting region of gamma-rays in the particular case in which gamma-ray spectra show neither absorption from the broad-line region (BLR) nor the Klein-Nishina curvature expected in leptonic models assuming the BLR as the source of seed photons for the External Compton scenario. For FSRQs bright above 10 GeV, we were able to identify short periods lasting less than one day characterized by a high rate of high-energy gamma-rays and hard gamma-ray spectra. We discuss the observed spectra and variability timescales in terms of injection and cooling of energetic particles, arguing that these flares could be triggered by magnetic reconnection events or turbulence in the flow.

  2. First X-ray Statistical Tests for Clumpy-Torus Models: Constraints from RXTE Monitoring of Seyfert AGN

    NASA Astrophysics Data System (ADS)

    Markowitz, Alex; Krumpe, Mirko; Nikutta, R.

    2016-06-01

    In two papers (Markowitz, Krumpe, & Nikutta 2014, and Nikutta et al., in prep.), we derive the first X-ray statistical constraints for clumpy-torus models in Seyfert AGN by quantifying multi-timescale variability in line-of-sight X-ray absorbing gas as a function of optical classification. We systematically search for discrete absorption events in the vast archive of RXTE monitoring of 55 nearby type Is and Compton-thin type IIs. We are sensitive to discrete absorption events due to clouds of full-covering, neutral/mildly ionized gas transiting the line of sight. Our results apply to both dusty and non-dusty clumpy media, and probe model parameter space complementary to that for eclipses observed with XMM-Newton, Suzaku, and Chandra. We detect twelve eclipse events in eight Seyferts, roughly tripling the number previously published from this archive. Event durations span hours to years. Most of our detected clouds are Compton-thin, and most clouds' distances from the black hole are inferred to be commensurate with the outer portions of the BLR or the inner regions of infrared-emitting dusty tori. We present the density profiles of the highest-quality eclipse events; the column density profile for an eclipsing cloud in NGC 3783 is doubly spiked, possibly indicating a cloud that is being tidally sheared. We discuss implications for cloud distributions in the context of clumpy-torus models. We calculate eclipse probabilities for orientation-dependent Type I/II unification schemes. We present constraints on cloud sizes, stability, and radial distribution. We infer that clouds' small angular sizes as seen from the SMBH imply that ~10⁷ clouds are required across the BLR + torus. Cloud size is roughly proportional to distance from the black hole, hinting at the formation processes (e.g., disk fragmentation). All observed clouds are sub-critical with respect to tidal disruption; self-gravity alone cannot contain them. External forces, such as magnetic fields or ambient pressure, are needed to contain them; otherwise, clouds must be short-lived.

  3. Chandra Grating Spectroscopy of Three Hot White Dwarfs

    NASA Technical Reports Server (NTRS)

    Adamczak, J.; Werner, K.; Rauch, T.; Schuh, S.; Drake, J. J.; Kruk, J. W.

    2013-01-01

    High-resolution soft X-ray spectroscopic observations of single hot white dwarfs are scarce. With the Chandra Low-Energy Transmission Grating, we have observed two white dwarfs, one of spectral type DA (LB1919) and the other a non-DA of spectral type PG1159 (PG1520+525). The spectra of both stars are analyzed, together with an archival Chandra spectrum of another DA white dwarf (GD246). Aims. The soft X-ray spectra of the two DA white dwarfs are investigated in order to study the effect of gravitational settling and radiative levitation of metals in their photospheres. LB1919 is of interest because it has a significantly lower metallicity than DAs with otherwise similar atmospheric parameters. GD246 is the only white dwarf known to show identifiable individual iron lines in the soft X-ray range. For the PG1159 star, a precise effective-temperature determination is performed in order to confine the position of the blue edge of the GW Vir instability region in the HRD. Methods. The Chandra spectra are analyzed with chemically homogeneous as well as stratified NLTE model atmospheres that assume equilibrium between gravitational settling and radiative acceleration of chemical elements. Archival EUV and UV spectra obtained with EUVE, FUSE, and HST are utilized to support the analysis. Results. No metals could be identified in LB1919. All observations are compatible with a pure hydrogen atmosphere. This is in stark contrast to the vast majority of hot DA white dwarfs, which exhibit light and heavy metals, and to the stratified models that predict significant metal abundances in the atmosphere. For GD246 we find that neither stratified nor homogeneous models can fit the Chandra spectrum. The Chandra spectrum of PG1520+525 constrains the effective temperature to T_eff = 150,000 ± 10,000 K. Therefore, this nonpulsating star together with the pulsating prototype of the GW Vir class (PG1159-035) defines the location of the blue edge of the GW Vir instability region. The result is in accordance with predictions from nonadiabatic stellar pulsation models. Such models are therefore reliable tools to investigate the interior structure of GW Vir variables. Conclusions. Our soft X-ray study reveals that the understanding of metal abundances in hot DA white dwarf atmospheres is still incomplete. On the other hand, model atmospheres of hydrogen-deficient PG1159-type stars are reliable and reproduce well the observed spectra from soft X-ray to optical wavelengths.

  4. 10Be in ice at high resolution: Solar activity and climate signals observed and GCM-modeled in Law Dome ice cores

    NASA Astrophysics Data System (ADS)

    Pedro, Joel; Heikkilä, Ulla; van Ommen, T. D.; Smith, A. M.

    2010-05-01

    Changes in solar activity modulate the galactic cosmic ray flux and, in turn, the production rate of 10Be in the Earth's atmosphere. The best archives of past changes in the 10Be production rate are the polar ice cores. Key challenges in interpreting these archives as proxies for past solar activity lie in separating the useful solar activity (or production) signal from the interfering meteorological (or climate) signal, and furthermore, in determining the atmospheric source regions of 10Be deposited at the ice core site. In this study we use a new monthly resolution composite 10Be record, which spans the past decade, and a general circulation model (ECHAM5-HAM) to constrain both the production and climate signals in 10Be concentrations at the Law Dome ice core site, East Antarctica. This study differs from most previous work on 10Be in Antarctica due to the very high sample resolution achieved. This high resolution, through a time period where accurate instrumental measurements of solar activity and climate are available, allows us to examine the response of 10Be concentrations in ice to short-term (monthly to annual) variations in solar activity, and to short-term variations in climate, including seasonality. We find a significant correlation (r² = 0.56, P < 0.005, n = 92) between observed 10Be concentrations and solar activity (represented by the neutron counting rate). The most pervasive climate influence is a seasonal cycle, which shows maximum concentrations in mid-to-late summer and minimum concentrations in winter. Model results show reasonable agreement with observations; both a solar activity signal and a seasonal cycle in 10Be are captured. However, the modeled snow accumulation rate is too high by approximately 60%. According to the model, the main atmospheric source region of 10Be deposited at Law Dome is the 30-90°S stratosphere (~50%), followed by the 30-90°S troposphere (~30%). An enhancement in the fraction of 10Be arriving at Law Dome from the stratosphere is found in the model during mid-to-late summer; we suggest this pattern is implicated in the seasonality of observed 10Be concentrations in ice. Our results have implications for the interpretation of longer-term 10Be records from ice cores. Firstly, the strong production signal supports the use of 10Be as a solar proxy. Secondly, the short-term climate processes operating here may provide clues to how longer-term shifts in climate impact ice core 10Be records.
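
    The quoted correlation statistic can be illustrated with a toy test. This is a hedged sketch only: the synthetic monthly series below stand in for the real 10Be and neutron-monitor records, and the coefficients linking them are invented for the example.

```python
import numpy as np

# Toy version of the correlation behind "r2 = 0.56, n = 92": synthetic
# monthly series stand in for the real 10Be and neutron-monitor records.
rng = np.random.default_rng(7)
n = 92                                             # monthly samples, as in the text
neutron_rate = 100.0 + rng.normal(0.0, 5.0, n)     # solar-activity proxy
be10 = 0.04 * neutron_rate + rng.normal(0.0, 0.12, n)  # production signal + climate noise

r = np.corrcoef(neutron_rate, be10)[0, 1]
print(r**2)   # coefficient of determination of the linear relation
```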

  5. Modifying the Heliophysics Data Policy to Better Enable Heliophysics Research

    NASA Technical Reports Server (NTRS)

    Hayes, Jeffrey; Roberts, D. Aaron; Bredekamp, Joseph

    2008-01-01

    The Heliophysics (HP) Science Data Management Policy, adopted by HP in June 2007, has helped to provide a structure for the HP data lifecycle. It provides guidelines for Project Data Management Plans and related documents, initiates Resident Archives (RAs) to maintain data services after a mission ends, and outlines a route to the unification of data finding, access, and distribution through Virtual Observatories. Recently we have filled in missing pieces that assure more coherence and a home for the VxOs (through the 'Heliophysics Data and Model Consortium'), and provide greater clarity with respect to long-term archiving. In particular, the new policy, which has been vetted with many community members, details the 'Final Archives' that are to provide long-term data access. These are distinguished from RAs in that they provide little additional service beyond serving data, but critical to their success is that the final archival materials include calibrated data in useful formats such as one finds in CDAWeb and various ASCII or FITS archives. Having a clear goal for legacy products, to be detailed as part of the Mission Archives Plans presented at Senior Reviews, will help to avoid the situation, so common in the past, of having archival products that preserve bits well but not readily usable information. We hope to avoid the need for the large numbers of 'data upgrade' projects that have been necessary in recent years.

  6. Anatomy of the AGN in NGC 5548. VI. Long-term variability of the warm absorber

    NASA Astrophysics Data System (ADS)

    Ebrero, J.; Kaastra, J. S.; Kriss, G. A.; Di Gesu, L.; Costantini, E.; Mehdipour, M.; Bianchi, S.; Cappi, M.; Boissay, R.; Branduardi-Raymont, G.; Petrucci, P.-O.; Ponti, G.; Pozo Núñez, F.; Seta, H.; Steenbrugge, K. C.; Whewell, M.

    2016-03-01

    Context. We observed the archetypal Seyfert 1 galaxy NGC 5548 in 2013-2014 in the context of an extensive multiwavelength campaign involving several satellites, which revealed the source to be in an extraordinary state of persistent heavy obscuration. Aims: We re-analyzed the archival grating spectra obtained by XMM-Newton and Chandra between 1999 and 2007 in order to characterize the classic warm absorber (WA) using consistent models and up-to-date photoionization codes and atomic physics databases and to construct a baseline model that can be used as a template for the physical state of the WA in the 2013 observations. Methods: We used the latest version of the photoionization code CLOUDY and the SPEX fitting package to model the X-ray grating spectra of the different archival observations of NGC 5548. Results: We find that the WA in NGC 5548 is composed of six distinct ionization phases outflowing in four kinematic regimes. The components seem to be in the form of a stratified wind with several layers intersected by our line of sight. Assuming that the changes in the WA are solely due to ionization or recombination processes in response to variations in the ionizing flux among the different observations, we are able to estimate lower limits on the density of the absorbing gas, finding that the farthest components are less dense and have a lower ionization. These limits are used to put stringent upper limits on the distance of the WA components from the central ionizing source, with the lowest ionization phases at several pc distances (<50, <20, and <5 pc, respectively), while the intermediately ionized components lie at pc-scale distances from the center (<3.6 and <2.2 pc, respectively). The highest ionization component is located at ~0.6 pc or closer to the AGN central engine. The mass outflow rate summed over all WA components is ~0.3 M⊙ yr-1, about six times the nominal accretion rate of the source. 
The total kinetic luminosity injected into the surrounding medium is a small fraction (~0.03%) of the bolometric luminosity of the source. After adding the contribution of the UV absorbers, this value rises to ~0.2% of the bolometric luminosity, well below the minimum amount of energy required by current feedback models to regulate galaxy evolution.
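
    The distance limits quoted above follow from the definition of the ionization parameter, ξ = L / (n r²): a lower limit on the density n gives an upper limit on the distance r. A back-of-envelope sketch with invented round numbers (the luminosity, ξ, and density below are illustrative, not the paper's fitted values):

```python
import math

# Ionization parameter xi = L / (n * r**2)  =>  r_max = sqrt(L / (n_min * xi)).
# All inputs here are assumed round numbers, not the NGC 5548 fit results.
L_ion = 1e44          # erg/s, assumed ionizing luminosity
xi = 10.0**1.5        # erg cm/s, assumed ionization parameter
n_min = 1e3           # cm^-3, assumed density lower limit

r_max_cm = math.sqrt(L_ion / (n_min * xi))
r_max_pc = r_max_cm / 3.086e18      # cm per parsec
print(r_max_pc)                     # pc-scale upper limit on the distance
```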

  7. The HARPS-N archive through a Cassandra, NoSQL database suite?

    NASA Astrophysics Data System (ADS)

    Molinari, Emilio; Guerra, Jose; Harutyunyan, Avet; Lodi, Marcello; Martin, Adrian

    2016-07-01

    The TNG-INAF is developing the science archive for the WEAVE instrument. The underlying architecture of the archive is based on a non-relational database, more precisely on an Apache Cassandra cluster, which uses NoSQL technology. In order to test and validate this architecture, we created a local archive which we populated with all the HARPS-N spectra collected at the TNG since the instrument's start of operations in mid-2012, and we developed tools for the analysis of this data set. The HARPS-N data set is two orders of magnitude smaller than WEAVE's, but we want to demonstrate the ability to walk through a complete data set and produce scientific output as valuable as that produced by an ordinary pipeline, without directly accessing the FITS files. The analytics are done with Apache Solr and Spark and on a relational PostgreSQL database. As an example, we produce observables such as metallicity indexes for the targets in the archive and compare the results with those coming from the HARPS-N regular data reduction software. The aim of this experiment is to explore the viability of a high-availability cluster and distributed NoSQL database as a platform for complex scientific analytics on a large data set, which will then be ported to the WEAVE Archive System (WAS) that we are developing for the WEAVE multi-object fiber spectrograph.
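
    The "analytics without opening the FITS files" pattern can be sketched in miniature: spectra stored as arrays keyed by observation ID, with an observable mapped over the whole set. A plain dict stands in for the Cassandra column store here, and the line list and index definition are invented for the example; they are not the HARPS-N pipeline's observables.

```python
import numpy as np

def line_depth_index(wave, flux, centers, half_width=0.5):
    """Mean normalized depth of a set of absorption lines (toy observable)."""
    continuum = np.median(flux)
    return float(np.mean([1.0 - flux[np.abs(wave - c) < half_width].min() / continuum
                          for c in centers]))

# Synthetic "archive": one spectrum with two fake absorption lines of depth 0.4.
wave = np.linspace(5000.0, 5100.0, 5001)
flux = np.ones_like(wave)
for c in (5020.0, 5060.0):
    flux -= 0.4 * np.exp(-0.5 * ((wave - c) / 0.1) ** 2)
archive = {"obs-0001": (wave, flux)}    # obs ID -> (wavelength, flux) arrays

# Map the observable over the whole data set, never opening a FITS file.
indexes = {obs_id: line_depth_index(w, f, centers=(5020.0, 5060.0))
           for obs_id, (w, f) in archive.items()}
print(indexes)
```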

  8. Protein Data Bank (PDB): The Single Global Macromolecular Structure Archive

    PubMed Central

    Burley, Stephen K.; Berman, Helen M.; Kleywegt, Gerard J.; Markley, John L.; Nakamura, Haruki; Velankar, Sameer

    2018-01-01

    The Protein Data Bank (PDB)—the single global repository of experimentally determined 3D structures of biological macromolecules and their complexes—was established in 1971, becoming the first open-access digital resource in the biological sciences. The PDB archive currently houses ~130,000 entries (May 2017). It is managed by the Worldwide Protein Data Bank organization (wwPDB; wwpdb.org), which includes the RCSB Protein Data Bank (RCSB PDB; rcsb.org), the Protein Data Bank Japan (PDBj; pdbj.org), the Protein Data Bank in Europe (PDBe; pdbe.org), and BioMagResBank (BMRB; www.bmrb.wisc.edu). The four wwPDB partners operate a unified global software system that enforces community-agreed data standards and supports data Deposition, Biocuration, and Validation of ~11,000 new PDB entries annually (deposit.wwpdb.org). The RCSB PDB currently acts as the archive keeper, ensuring disaster recovery of PDB data and coordinating weekly updates. wwPDB partners disseminate the same archival data from multiple FTP sites, while operating complementary websites that provide their own views of PDB data with selected value-added information and links to related data resources. At present, the PDB archives experimental data, associated metadata, and 3D atomic-level structural models derived from three well-established methods: crystallography, nuclear magnetic resonance spectroscopy (NMR), and electron microscopy (3DEM). wwPDB partners are working closely with experts in related experimental areas (small-angle scattering, chemical cross-linking/mass spectrometry, Förster resonance energy transfer or FRET, etc.) to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods. PMID:28573592

  9. Protein Data Bank (PDB): The Single Global Macromolecular Structure Archive.

    PubMed

    Burley, Stephen K; Berman, Helen M; Kleywegt, Gerard J; Markley, John L; Nakamura, Haruki; Velankar, Sameer

    2017-01-01

    The Protein Data Bank (PDB)--the single global repository of experimentally determined 3D structures of biological macromolecules and their complexes--was established in 1971, becoming the first open-access digital resource in the biological sciences. The PDB archive currently houses ~130,000 entries (May 2017). It is managed by the Worldwide Protein Data Bank organization (wwPDB; wwpdb.org), which includes the RCSB Protein Data Bank (RCSB PDB; rcsb.org), the Protein Data Bank Japan (PDBj; pdbj.org), the Protein Data Bank in Europe (PDBe; pdbe.org), and BioMagResBank (BMRB; www.bmrb.wisc.edu). The four wwPDB partners operate a unified global software system that enforces community-agreed data standards and supports data Deposition, Biocuration, and Validation of ~11,000 new PDB entries annually (deposit.wwpdb.org). The RCSB PDB currently acts as the archive keeper, ensuring disaster recovery of PDB data and coordinating weekly updates. wwPDB partners disseminate the same archival data from multiple FTP sites, while operating complementary websites that provide their own views of PDB data with selected value-added information and links to related data resources. At present, the PDB archives experimental data, associated metadata, and 3D atomic-level structural models derived from three well-established methods: crystallography, nuclear magnetic resonance spectroscopy (NMR), and electron microscopy (3DEM). wwPDB partners are working closely with experts in related experimental areas (small-angle scattering, chemical cross-linking/mass spectrometry, Förster resonance energy transfer or FRET, etc.) to establish a federation of data resources that will support sustainable archiving and validation of 3D structural models and experimental data derived from integrative or hybrid methods.

  10. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
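
    The "standard SEG-Y format" mentioned above places a 3200-byte textual header ahead of a 400-byte big-endian binary header. A minimal sketch of pulling three rev-0 binary-header fields follows; the byte offsets are from the SEG-Y standard, while the file image and its values are synthetic, not data from this cruise.

```python
import struct

def read_segy_binary_header(buf: bytes) -> dict:
    """Parse three SEG-Y rev-0 binary-header fields (big-endian)."""
    bh = buf[3200:3600]   # binary header follows the 3200-byte textual header
    interval_us, = struct.unpack(">H", bh[16:18])   # file bytes 3217-3218
    n_samples, = struct.unpack(">H", bh[20:22])     # file bytes 3221-3222
    format_code, = struct.unpack(">H", bh[24:26])   # file bytes 3225-3226
    return {"sample_interval_us": interval_us,
            "samples_per_trace": n_samples,
            "format_code": format_code}

# Synthetic file image for demonstration only (no real cruise data).
header = bytearray(3600)
header[3216:3218] = struct.pack(">H", 1000)   # 1000 us = 1 ms sampling
header[3220:3222] = struct.pack(">H", 3000)   # 3000 samples per trace
header[3224:3226] = struct.pack(">H", 1)      # code 1 = 4-byte IBM float

print(read_segy_binary_header(bytes(header)))
```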

  11. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  12. VizieR Online Data Catalog: James Clerk Maxwell Telescope Science Archive (CADC, 2003)

    NASA Astrophysics Data System (ADS)

    Canadian Astronomy Data Centre

    2018-01-01

    The JCMT Science Archive (JSA), a collaboration between the CADC and the EAO, is the official distribution site for observational data obtained with the James Clerk Maxwell Telescope (JCMT) on Mauna Kea, Hawaii. The JSA search interface is provided by the CADC Search tool, which provides generic access to the complete set of telescopic data archived at the CADC. Help on the use of this tool is provided via tooltips. For additional information on instrument capabilities and data reduction, please consult the SCUBA-2 and ACSIS instrument pages provided on the JAC-maintained JCMT pages. JCMT-specific help related to the use of the CADC AdvancedSearch tool is available from the JAC. (1 data file).

  13. Characterizing the X-ray Emission From Stellar Bow Shocks and Their Driving Stars with the Chandra Archive

    NASA Astrophysics Data System (ADS)

    Binder, Breanna

    2017-09-01

    We propose an archival study of 2.8 Msec of ACIS images to search for X-ray emission from stellar-wind bow shocks and to characterize the X-ray properties of their driving stars. Bow shocks, particularly those produced by runaway OB stars, are theorized to up-scatter IR photons via inverse Compton scattering, and may produce a significant fraction of high-energy photons in our Galaxy. However, their low X-ray luminosity makes direct detection difficult. By stacking 106 archival observations containing >100 bow shocks, we will create the deepest X-ray exposure of bow shocks to date. We will perform the first detailed comparison of bow shock driving stars to the general massive star population.
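
    The stacking step described above amounts, in its simplest form, to an exposure-weighted co-add: sum the counts from many aligned cutouts and divide by the total exposure. The sketch below is a generic illustration with synthetic Poisson images, not the proposal's actual ACIS pipeline.

```python
import numpy as np

def stack_cutouts(count_images, exposures_s):
    """Exposure-weighted co-add of aligned count images -> mean count rate."""
    total_counts = np.sum(count_images, axis=0)     # counts per pixel
    return total_counts / np.sum(exposures_s)       # counts/s per pixel

# Synthetic faint source: individual cutouts are noise-dominated, but the
# stack recovers the underlying uniform rate of 1e-3 counts/s per pixel.
rng = np.random.default_rng(1)
rate = 1e-3 * np.ones((32, 32))
exposures = (20e3, 50e3, 30e3)                      # seconds per observation
cutouts = [rng.poisson(rate * t) for t in exposures]

stacked = stack_cutouts(cutouts, exposures)
print(stacked.mean())    # close to the input rate of 1e-3
```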

  14. CARDS: A blueprint and environment for domain-specific software reuse

    NASA Technical Reports Server (NTRS)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  15. The preservation of LANDSAT data by the National Land Remote Sensing Archive

    NASA Technical Reports Server (NTRS)

    Boyd, John E.

    1992-01-01

    Digital data, acquired by the National Landsat Remote Sensing Program, document nearly two decades of global agricultural, environmental, and sociological change. The data were widely applied and continue to be essential to a variety of geologic, hydrologic, agronomic, and strategic programs and studies by governmental, academic, and commercial researchers. Landsat data were acquired by five observatories that use primarily two digital sensor systems. The Multispectral Scanner (MSS) was onboard all five Landsats, which have orbited over 19 years; the higher-resolution Thematic Mapper (TM) sensor acquired data for the last 9 years on Landsats 4 and 5 only. The National Land Remote Sensing Archive preserves the 800,000 scenes, which total more than 60 terabytes of data, on master tapes that are steadily deteriorating. Data are stored at two locations (Sioux Falls, South Dakota and Landover, Maryland), in three archive formats. The U.S. Geological Survey's EROS Data Center has initiated a project to consolidate and convert, over the next 4 years, two of the archive formats from antiquated instrumentation tape to rotary-recorded cassette magnetic tape. The third archive format, consisting of 300,000 scenes of MSS data acquired from 1972 through 1978, will not be converted because of budgetary constraints. This data preservation project augments EDC's experience in data archiving and information management, expertise that is critical to EDC's role as a Distributed Active Archive Center for the Earth Observing System, a new and much larger national earth science program.

  16. ALICE Data Release: A Revaluation of HST-NICMOS Coronagraphic Images

    NASA Astrophysics Data System (ADS)

    Hagan, J. Brendan; Choquet, Élodie; Soummer, Rémi; Vigan, Arthur

    2018-04-01

    The Hubble Space Telescope NICMOS instrument was used from 1997 to 2008 to perform coronagraphic observations of about 400 targets. Most of them were part of surveys looking for substellar companions or resolved circumstellar disks around young nearby stars, making the NICMOS coronagraphic archive a valuable database for exoplanet and disk studies. As part of the Archival Legacy Investigations of Circumstellar Environments program, we have consistently reprocessed a large fraction of the NICMOS coronagraphic archive using advanced starlight subtraction methods. We present here the high-level science products of these re-analyzed data, which we delivered back to the community through the Mikulski Archive for Space Telescopes: doi:10.17909/T9W89V. We also present the second version of the HCI-FITS format (High-Contrast Imaging FITS format), which we developed as a standard format for the exchange of reduced imaging science products. These re-analyzed products are openly available for population statistics studies, characterization of specific targets, or identification of detected point sources.

  17. Detection of the YORP effect in asteroid (161989) Cacus

    NASA Astrophysics Data System (ADS)

    Durech, Josef; Vokrouhlicky, David; Pravec, Petr; Hanus, Josef; Kusnirak, Peter; Hornoch, Kamil; Galad, Adrian; Masi, Gianluca

    2016-10-01

    The rotation state of small asteroids is affected by the thermal Yarkovsky-O'Keefe-Radzievskii-Paddack (YORP) torque. The directly observable consequence of YORP is the secular change of the asteroid's rotational period in time. We carried out new photometric observations of asteroid (161989) Cacus during its apparitions in 2014-2016. Using the new lightcurves together with archived data going back to 1978, we were able to detect a tiny deviation from constant-period rotation. This deviation caused an observable shift between the observed lightcurves and those predicted by the best constant-period model. We used the lightcurve inversion method to derive the shape/spin solution that best fitted the data. We assumed that the rotation rate evolved linearly in time and derived an acceleration of the rotation rate of dω/dt = (1.9 ± 0.3) × 10⁻⁸ rad day⁻². The accelerating model provides a significantly better fit than the constant-period model. By applying a thermophysical model to WISE thermal infrared data, we estimated the thermal inertia of the surface to be Γ = 250-2000 J m⁻² s⁻⁰·⁵ K⁻¹ and the volume-equivalent diameter to be 0.8-1.2 km (1σ intervals). The value of dω/dt derived from observations is in agreement with the theoretical value computed numerically from the lightcurve inversion shape model and its spin axis orientation. Cacus has become the sixth asteroid with a YORP detection. Surprisingly, in all six cases the rotation rate accelerates.
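
    The size of the detected deviation can be checked on the back of an envelope: with a linearly increasing rotation rate, the phase offset relative to a constant-period model grows as Δφ = ½ (dω/dt) T². Using the quoted dω/dt over the 1978-2016 data span (the baseline length here is an approximation):

```python
import math

# Phase offset accumulated by a linear spin-up relative to a constant-period
# model: delta_phi = 0.5 * (domega/dt) * T**2.
domega_dt = 1.9e-8                        # rad/day^2, value quoted in the abstract
baseline_days = (2016 - 1978) * 365.25    # approximate 1978-2016 data span

delta_phi = 0.5 * domega_dt * baseline_days**2
print(math.degrees(delta_phi))    # on the order of 100 degrees of rotational phase
```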

  18. Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales

    NASA Technical Reports Server (NTRS)

    Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua; et al.

    2017-01-01

    We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The resolution of these forecasts is the highest among current, publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning, and exposure studies using the archived analysis fields.

  19. Applying the Technology Acceptance Model in a Study of the Factors Affecting Usage of the Taiwan Digital Archives System

    ERIC Educational Resources Information Center

    Hong, Jon-Chao; Hwang, Ming-Yueh; Hsu, Hsuan-Fang; Wong, Wan-Tzu; Chen, Mei-Yung

    2011-01-01

    The rapid development of information and communication technology and the popularization of the Internet have given a boost to digitization technologies. Since 2001, The National Science Council (NSC) of Taiwan has invested a large amount of funding in the National Digital Archives Program (NDAP) to develop digital content. Some studies have…

  20. Land Processes Distributed Active Archive Center (LP DAAC) 25th Anniversary Recognition "A Model for Government Partnerships". LP DAAC "History and a Look Forward"

    NASA Technical Reports Server (NTRS)

    Behnke, Jeanne; Doescher, Chris

    2015-01-01

    This presentation discusses 25 years of interactions between NASA and the USGS to manage the Land Processes Distributed Active Archive Center (LP DAAC) for the purpose of providing users access to NASA's rich collection of Earth science data. The presentation addresses challenges, efforts, and performance metrics.

  1. Technical note: The US Dobson station network data record prior to 2015, re-evaluation of NDACC and WOUDC archived records with WinDobson processing software

    NASA Astrophysics Data System (ADS)

    Evans, Robert D.; Petropavlovskikh, Irina; McClure-Begley, Audra; McConville, Glen; Quincy, Dorothy; Miyagawa, Koji

    2017-10-01

    The United States government has operated Dobson ozone spectrophotometers at various sites, starting during the International Geophysical Year (1 July 1957 to 31 December 1958). A network of stations for long-term monitoring of the total column content of ozone in the atmosphere (the thickness of the ozone layer) was established in the early 1960s and eventually grew to 16 stations, 14 of which are still operational and submit data to the United States National Oceanic and Atmospheric Administration (NOAA). Seven of these sites are also part of the Network for the Detection of Atmospheric Composition Change (NDACC), an organization that maintains its own data archive. Due to recent changes in data-processing software, the entire dataset was re-evaluated for possible changes. To evaluate and minimize potential changes caused by the new processing software, the reprocessed data record was compared to the original data record archived in the World Ozone and UV Data Center (WOUDC) in Toronto, Canada. The history of observations at the individual stations, the instruments used for NOAA network monitoring at each station, the method for reducing zenith-sky observations to total ozone, and the calibration procedures were re-evaluated using data quality control tools built into the new software. At the completion of the evaluation, the new datasets are to be published as an update to the WOUDC and NDACC archives, and the entire dataset is to be made available to the scientific community. The procedure for reprocessing Dobson data and the results of the reanalysis on the archived record are presented in this paper. A summary of historical changes to the 14 station records is also provided.

  2. Energy-dependent intensity variation of the persistent X-ray emission of magnetars observed with Suzaku

    NASA Astrophysics Data System (ADS)

    Nakagawa, Yujin; Ebisawa, Ken; Enoto, Teruaki

    2018-03-01

    The emission mechanism of magnetars is still controversial even though various observational and theoretical studies have been made. In order to investigate mechanisms of both the persistent X-ray emission and the burst emission of the magnetars, we propose a model in which the persistent X-ray emission consists of numerous micro-bursts of various sizes. If this model is correct, root mean square (rms) intensity variations of the persistent emission would exceed the values expected from the Poisson distribution. Using Suzaku archive data of 11 magnetars (22 observations), the rms intensity variations were calculated from 0.2 keV to 70 keV. As a result, we found significant excess rms intensity variations from all 11 magnetars. We suppose that numerous micro-bursts constituting the persistent X-ray emission cause the observed variations, suggesting that the persistent X-ray emission and the burst emission have identical emission mechanisms. In addition, we found that the rms intensity variations clearly increase toward higher energy bands for four magnetars (six observations). The energy-dependent rms intensity variations imply that the soft thermal component and the hard X-ray component are emitted from different regions far apart from each other.
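The test statistic described above, an rms intensity variation in excess of the Poisson expectation, can be sketched as a calculation on binned counts. This is an illustrative formulation, not the authors' Suzaku analysis pipeline; the function name and the binning convention are assumptions.

```python
import numpy as np

def excess_rms_variation(counts):
    """Fractional rms variation in excess of the Poisson expectation.

    For a pure Poisson process the variance of binned counts equals the
    mean, so any variance above the mean is attributed to intrinsic
    variability (e.g. micro-bursts). Returns sigma_excess / mean, or 0.0
    if the measured variance does not exceed the Poisson level.
    """
    counts = np.asarray(counts, dtype=float)
    mean = counts.mean()
    var = counts.var(ddof=1)   # sample variance of the binned counts
    excess = var - mean        # Poisson contributes variance == mean
    if excess <= 0 or mean <= 0:
        return 0.0
    return float(np.sqrt(excess) / mean)
```

Computing this statistic per energy band would reproduce the kind of energy-dependent comparison the abstract describes.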

  3. The Ultra-Soft X-ray Background: A Probe of the Hot Interstellar Medium and the Local Bubble - ADP-99

    NASA Technical Reports Server (NTRS)

    Smith, Randall

    2003-01-01

    The Ultra-Soft X-ray Telescope (UXT) was a sounding rocket mission flown three times in 1984-1986. At the beginning of the project, the data existed solely in the form of raw telemetry data stored on 9-track tapes. The primary goal of this proposal has been to extract event files from the raw telemetry stream and to create instrument response models and calibrated spectra from them. We have completed this project, and the data will soon be available to all via the HEASARC archive of high-energy data at Goddard Space Flight Center. We are in the process of combining the results with the ALEXIS and DXS observations of the Local Bubble in modelling the 72 eV iron line (recently observed by the X-ray Quantum Calorimeter) and the carbon emission lines that are uniquely visible in this dataset. Our results agree with the XQC observation, which predicts a maximum emission in the 72 eV iron lines that is below the limit observable with UXT. However, this leaves an open question as to what lines were responsible for the observed Be-band emission. The answer to this question will likely require more observations of soft X-rays with the Chandra LETGS and new atomic data models of potentially emitting ions.

  4. The 1996 Leonid shower as studied with a potassium lidar: Observations and inferred meteoroid sizes

    NASA Astrophysics Data System (ADS)

    Höffner, Josef; von Zahn, Ulf; McNeil, William J.; Murad, Edmond

    1999-02-01

    We report on the observation and analysis of meteor trails detected by a ground-based lidar tuned to the D1 fine-structure line of K. The lidar is located at Kühlungsborn, Germany. The echo profiles are analyzed with a temporal resolution of about 1 s and an altitude resolution of 200 m. Identification of meteor trails in the large archive of raw data is performed with the help of an automated computer search code. During the peak of the Leonid meteor shower on the morning of November 17, 1996, we observed seven meteor trails between 0245 and 0445 UT. Their mean altitude was 89.0 km. The duration of observation of individual trails ranges from 3 s to ~30 min. We model the probability of observing a meteor trail by ground-based lidar as a function of both the altitude distribution and the duration of the trails. These distributions depend on the mass distribution, entry velocity, and entry angle of the meteoroids, on the altitude-dependent chemical and dynamical lifetimes of the released K atoms, and on the absolute detection sensitivity of our lidar experiment. From the modeling, we derive the statistical likelihood of detection of trails from meteoroids of a particular size. These bracket quite well the observed trails. The model also gives estimates of the probable size of the meteoroids based on characteristics of individual trails.

  5. Archaeological Feature Detection from Archive Aerial Photography with a Sfm-Mvs and Image Enhancement Pipeline

    NASA Astrophysics Data System (ADS)

    Peppa, M. V.; Mills, J. P.; Fieber, K. D.; Haynes, I.; Turner, S.; Turner, A.; Douglas, M.; Bryan, P. G.

    2018-05-01

    Understanding and protecting cultural heritage involves the detection and long-term documentation of archaeological remains alongside the spatio-temporal analysis of their landscape evolution. Archive aerial photography can illuminate traces of ancient features, which typically appear with different brightness values from their surrounding environment but are not always well defined. This research investigates the implementation of the Structure-from-Motion - Multi-View Stereo image matching approach with an image enhancement algorithm to derive three epochs of orthomosaics and digital surface models from visible and near-infrared historic aerial photography. The enhancement algorithm uses decorrelation stretching to improve the contrast of the orthomosaics so that archaeological features are better detected. Results include 2D / 3D locations of detected archaeological traces stored in a geodatabase for further archaeological interpretation and correlation with benchmark observations. The study also discusses the merits and difficulties of the process involved. This research is based on a European-wide project, entitled "Cultural Heritage Through Time", and the case study research was carried out as a component of the project in the UK.
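The decorrelation stretch used for contrast enhancement can be sketched with NumPy: rotate the bands into their principal components, equalize the variance of each component, and rotate back. This is the generic textbook formulation, not the specific algorithm in the paper's pipeline, and the `target_sigma` parameter is an assumption.

```python
import numpy as np

def decorrelation_stretch(bands, target_sigma=50.0):
    """Decorrelation stretch of a multi-band image (bands: H x W x B).

    Whitens the bands along the eigenvectors of their covariance matrix
    and rescales each direction to the same variance, which exaggerates
    the contrast between correlated bands. target_sigma is the per-band
    standard deviation of the stretched output.
    """
    h, w, b = bands.shape
    x = bands.reshape(-1, b).astype(float)
    mean = x.mean(axis=0)
    cov = np.cov(x - mean, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    # whiten along the eigenvectors, then rescale to the target variance
    t = evecs @ np.diag(target_sigma / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    y = (x - mean) @ t + mean
    return y.reshape(h, w, b)
```

By construction the output bands are mutually decorrelated with equal variance, which is what makes subtle brightness anomalies such as crop marks stand out.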

  6. Analysis of Scattering from Archival Pulsar Data using a CLEAN-based Method

    NASA Astrophysics Data System (ADS)

    Tsai, Jr-Wei; Simonetti, John H.; Kavic, Michael

    2017-02-01

    In this work, we adopted a CLEAN-based method to determine the scatter time, τ, from archived pulsar profiles under both the thin-screen and uniform-medium scattering models, and to calculate the scatter-time frequency scaling index α, where τ ∝ ν^α. The value of α is -4.4 if a Kolmogorov spectrum of interstellar medium turbulence is assumed. We deconvolved 1342 profiles from 347 pulsars over a broad range of frequencies and dispersion measures. In the majority of cases in our survey, the scattering effect was not significant compared to the pulse profile widths. For a subset of 21 pulsars, scattering at the lowest frequencies was large enough to be measured. Because reliable scatter-time measurements were obtained only at the lowest frequency, we were limited to using upper limits on scatter times at higher frequencies when estimating the scatter-time frequency slope. We scaled the deconvolved scatter times to 1 GHz assuming α = -4.4 and considered our results in the context of other observations, which yielded a broad relation between scatter time and dispersion measure.
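The frequency scaling used to refer scatter times to 1 GHz follows directly from τ ∝ ν^α: if τ(ν) = τ₁GHz · ν^α with ν in GHz, then τ₁GHz = τ(ν) · ν^(−α). A minimal sketch, with the Kolmogorov value α = −4.4 as the default (the function name is illustrative):

```python
def scale_scatter_time(tau_obs, nu_obs_ghz, alpha=-4.4):
    """Scale a measured scatter time to 1 GHz assuming tau ∝ nu**alpha.

    tau(nu) = tau_1GHz * nu**alpha (nu in GHz), so
    tau_1GHz = tau_obs * nu_obs**(-alpha).
    alpha = -4.4 corresponds to a Kolmogorov turbulence spectrum.
    """
    return tau_obs * nu_obs_ghz ** (-alpha)
```

With α negative, a scatter time measured at a frequency below 1 GHz scales down when referred to 1 GHz, which is why measurements are most feasible at the lowest observing frequencies.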

  7. Swift UVOT Observations of SN2018apl/ASASSN-18gq

    NASA Astrophysics Data System (ADS)

    Brown, Peter J.

    2018-04-01

    SN2018apl/ASASSN-18gq (ATEL #11500) was observed by the Neil Gehrels Swift Observatory beginning 2018-04-03 15:28:06. We measured the magnitudes below from summed images from the first orbit of observations using the Swift Optical Ultraviolet Supernova Archive (SOUSA; Brown et al. 2014).

  8. Rapid, High-Resolution Detection of Environmental Change over Continental Scales from Satellite Data - the Earth Observation Data Cube

    NASA Technical Reports Server (NTRS)

    Lewis, Adam; Lymburner, Leo; Purss, Matthew B. J.; Brooke, Brendan; Evans, Ben; Ip, Alex; Dekker, Arnold G.; Irons, James R.; Minchin, Stuart; Mueller, Norman

    2015-01-01

    The effort and cost required to convert satellite Earth Observation (EO) data into meaningful geophysical variables has prevented the systematic analysis of all available observations. To overcome these problems, we utilise an integrated High Performance Computing and Data environment to rapidly process, restructure and analyse the Australian Landsat data archive. In this approach, the EO data are assigned to a common grid framework that spans the full geospatial and temporal extent of the observations - the EO Data Cube. This approach is pixel-based and incorporates geometric and spectral calibration and quality assurance of each Earth surface reflectance measurement. We demonstrate the utility of the approach with rapid time-series mapping of surface water across the entire Australian continent using 27 years of continuous, 25 m resolution observations. Our preliminary analysis of the Landsat archive shows how the EO Data Cube can effectively liberate high-resolution EO data from their complex sensor-specific data structures and revolutionise our ability to measure environmental change.
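Assigning every surface-reflectance measurement to a fixed common grid, as the Data Cube approach requires, can be sketched as an index calculation. The grid origin and the ~25 m cell size expressed in degrees are hypothetical illustration values, not the actual Australian Data Cube parameters.

```python
def grid_cell(lon, lat, lon0=110.0, lat0=-10.0, res=0.00025):
    """Map a (lon, lat) observation onto a fixed common grid.

    Each measurement is assigned to the cell whose (row, col) index is
    returned here, so observations from different acquisitions stack on
    the same pixels. lon0/lat0 (the grid's north-west corner) and res
    (~25 m in degrees) are illustrative, not the real cube parameters.
    """
    col = int((lon - lon0) // res)
    row = int((lat0 - lat) // res)  # rows increase southwards
    return row, col
```

Once every observation lands on the same indices, a per-pixel time series (e.g. for surface-water mapping) is just a lookup across acquisition dates.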

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashgarian, M; Guilderson, T P

    We utilize monthly {sup 14}C data derived from coral archives in conjunction with ocean circulation models to address two questions: (1) how does the shallow circulation of the tropical Pacific vary on seasonal to decadal time scales, and (2) which dynamic processes determine the mean vertical structure of the equatorial Pacific thermocline? Our results directly impact the understanding of global climate events such as the El Nino-Southern Oscillation (ENSO). To study changes in ocean circulation and water mass distribution involved in the genesis and evolution of ENSO and decadal climate variability, it is necessary to have records of climate variables several decades in length. Continuous instrumental records are limited because the technology for continuous monitoring of ocean currents (e.g. satellites and moored arrays) has only recently become available, and ships-of-opportunity archives such as COADS contain large spatial and temporal biases. In addition, temperature and salinity in surface waters are not conservative and thus cannot be independently relied upon to trace water masses, reducing the utility of historical observations. Radiocarbon in sea water is a quasi-conservative water mass tracer and is incorporated into coral skeletal material; thus coral {sup 14}C records can be used to reconstruct changes in shallow circulation that would be difficult to characterize using instrumental data. High-resolution {Delta}{sup 14}C timeseries such as ours provide a powerful constraint on the rate of surface ocean mixing and hold great promise to augment one-time oceanographic surveys. {Delta}{sup 14}C timeseries such as these not only provide fundamental information about the shallow circulation of the Pacific, but can also be directly used as a benchmark for the next generation of high-resolution ocean models used in prognosticating climate.
The measurement of {Delta}{sup 14}C in biological archives such as tree rings and coral growth bands is a direct record of the invasion of fossil fuel CO{sub 2} and bomb {sup 14}C into the atmosphere and surface oceans. Therefore the {Delta}{sup 14}C data produced in this study can be used to validate the ocean uptake of fossil fuel CO{sub 2} in coupled ocean-atmosphere models. This study takes advantage of the quasi-conservative nature of {sup 14}C as a water mass tracer by using {Delta}{sup 14}C time series in corals to identify changes in the shallow circulation of the Pacific. Although the data themselves provide fundamental information on surface water mass movement, the true strength lies in a combined approach that is greater than the sum of its parts: the data help uncover deficiencies in ocean circulation models, and the model results place long {Delta}{sup 14}C time series in a dynamic framework that helps to identify those locations where additional observations are most needed.

  10. Low Luminosity States of the Black Hole Candidate GX 339-4. 1; ASCA and Simultaneous Radio/RXTE Observations

    NASA Technical Reports Server (NTRS)

    Wilms, Joern; Nowak, Michael A.; Dove, James B.; Fender, Robert P.; DiMatteo, Tiziana

    1998-01-01

    We discuss a series of observations of the black hole candidate GX 339-4 in low luminosity, spectrally hard states. We present spectral analysis of three separate archival Advanced Satellite for Cosmology and Astrophysics (ASCA) data sets and eight separate Rossi X-ray Timing Explorer (RXTE) data sets. Three of the RXTE observations were strictly simultaneous with 843 MHz and 8.3-9.1 GHz radio observations. All of these observations have (3-9 keV) flux approximately less than 10^-9 ergs s^-1 cm^-2. The ASCA data show evidence for an approximately 6.4 keV Fe line with equivalent width approximately 40 eV, as well as evidence for a soft excess that is well modeled by a power law plus a multicolor blackbody spectrum with peak temperature approximately 150-200 eV. The RXTE data sets also show evidence of an Fe line with equivalent widths approximately 20-100 eV. Reflection models show a hardening of the RXTE spectra with decreasing X-ray flux; however, these models do not exhibit evidence of a correlation between the photon index of the incident power-law flux and the solid angle subtended by the reflector. 'Sphere+disk' Comptonization models and Advection Dominated Accretion Flow (ADAF) models also provide reasonable descriptions of the RXTE data. The former models yield coronal temperatures in the range 20-50 keV and optical depths of τ approximately equal to 3. The model fits to the X-ray data, however, do not simultaneously explain the observed radio properties. The most likely source of the radio flux is synchrotron emission from an extended outflow of extent greater than of order 10^7 GM/c^2.

  11. The Heliophysics Data Environment, Virtual Observatories, NSSDC, and SPASE

    NASA Technical Reports Server (NTRS)

    Thieman, James; Grayzeck, Edwin; Roberts, Aaron; King, Todd

    2010-01-01

    Heliophysics (the study of the Sun and its effects on the Solar System, especially the Earth) has an interesting data environment in that the data are often to be found in relatively small data sets widely scattered in archives around the world. Within the last decade there have been more concentrated efforts to organize the data access methods and create a Heliophysics Data and Model Consortium (HDMC). To provide data search and access capability, a number of Virtual Observatories (VOs) have been established, both via funding from the U.S. National Aeronautics and Space Administration (NASA) and through other funding agencies in the U.S. and worldwide. At least 15 systems can be labeled as Heliophysics Virtual Observatories, 9 of them funded by NASA. Other parts of this data environment include Resident Archives and the final, or "deep", archive at the National Space Science Data Center (NSSDC). The problem is that different data search and access approaches are used by all of these elements of the HDMC, and a search for data relevant to a particular research question can involve consulting multiple VOs - needing to learn a different approach for finding and acquiring data from each. The Space Physics Archive Search and Extract (SPASE) project is intended to provide a common data model for Heliophysics data and therefore a common set of metadata for searches of the VOs and other data environment elements. The SPASE Data Model has been developed through the common efforts of the HDMC representatives over a number of years. We have currently released Version 2.1 of the Data Model. The advantages and disadvantages of the Data Model will be discussed, along with plans for the future. Recent changes requested by new members of the SPASE community indicate some of the directions for further development.

  12. Pervaya uchebnaya astronomicheskaya observatoriya Moskovskogo universiteta (The first educational astronomical observatory of Moscow University)

    NASA Astrophysics Data System (ADS)

    Ponomareva, G. A.; Shcheglov, P. V.

    Using archive materials found in the Central Historical Archive of Moscow and early publications in the Russian and German press, we follow the history of the struggle for the foundation of the University's astronomical observatory by M. N. Muravyov, the University Warden in 1803-1807. Though F. Goldbach, the astronomy professor in 1804-1811, prepared the observatory's plan and budget, it was not possible to begin construction work. Nevertheless, a wooden dome was built in 1804 on the roof of the University's main building, referred to as "the astronomical belvedere" by Muravyov. This first educational astronomical observatory was used for practical studies and for the students' observations. F. Goldbach himself observed from the window of a room in his apartment, so his colleagues called that room "Goldbach's observatory". Later this fact was a source of confusion for the University's historiographers. The educational observatory was destroyed, with the whole University, by the fire in September 1812. The existing archive documents claim that the Moscow University's Presnya observatory was built as a replacement for the one destroyed by the fire in 1812.

  13. ESA's Planetary Science Archive: Preserve and present reliable scientific data sets

    NASA Astrophysics Data System (ADS)

    Besse, S.; Vallat, C.; Barthelemy, M.; Coia, D.; Costa, M.; De Marchi, G.; Fraga, D.; Grotheer, E.; Heather, D.; Lim, T.; Martinez, S.; Arviset, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A.; Rios, C.; Saiz, J.; Vallejo, F.

    2018-01-01

    The European Space Agency (ESA) Planetary Science Archive (PSA) is undergoing a significant refactoring of all its components to improve the services provided to the scientific community and the public. The PSA supports ESA's missions exploring the Solar System by archiving scientific peer-reviewed observations as well as engineering data sets. This includes the Giotto, SMART-1, Huygens, Venus Express, Mars Express, Rosetta, Exomars 2016, Exomars RSP, BepiColombo, and JUICE missions. The PSA is offering a newly designed graphical user interface, intended both to maximize interaction with the scientific observations and to minimize the effort needed to download them. The PSA still offers the same services as before (i.e., FTP, documentation, helpdesk, etc.). In addition, it will support both formats of the Planetary Data System (i.e., PDS3 and PDS4), as well as providing new ways of searching the data products with specific metadata and geometrical parameters. Beyond these enhanced services, the PSA will also provide new services to improve the visualization of data products and scientific content (e.g., spectra). Together with improved access to the spacecraft engineering data sets, the PSA will provide easier access to scientific data products that will help to maximize the science return of ESA's space missions.

  14. Intelligent Systems Technologies to Assist in Utilization of Earth Observation Data

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram K.; McConaughy, Gail; Lynnes, Christopher; McDonald, Kenneth; Kempler, Steven

    2003-01-01

    With the launch of several Earth observing satellites over the last decade, we are now in a data-rich environment. From NASA's Earth Observing System (EOS) satellites alone, we are accumulating more than 3 TB per day of raw data and derived geophysical parameters. The data products are being distributed to a large user community comprising scientific researchers, educators and operational government agencies. Notable progress has been made in the last decade in facilitating access to data. However, to realize the full potential of the growing archives of valuable scientific data, further progress is necessary in the transformation of data into information, and information into knowledge that can be used in particular applications. Sponsored by NASA's Intelligent Systems Project within the Computing, Information and Communication Technology (CICT) Program, a conceptual architecture study has been conducted to examine ideas for improving data utilization through the addition of intelligence into the archives in the context of an overall knowledge building system. Potential Intelligent Archive concepts include: 1) Mining archived data holdings using Intelligent Data Understanding algorithms to improve metadata to facilitate data access and usability; 2) Building intelligence about transformations on data, information, knowledge, and accompanying services involved in a scientific enterprise; 3) Recognizing the value of results, indexing and formatting them for easy access, and delivering them to concerned individuals; 4) Interacting as a cooperative node in a web of distributed systems to perform knowledge building (i.e., the transformations from data to information to knowledge) instead of just data pipelining; and 5) Being aware of other nodes in the knowledge building system, participating in open systems interfaces and protocols for virtualization, and collaborative interoperability.
This paper presents some of these concepts and identifies issues to be addressed by research in future intelligent systems technology.

  15. Preservation and Enhancement of the Spacewatch Data Archives

    NASA Technical Reports Server (NTRS)

    Larsen, Jeffrey A.

    2003-01-01

    In March of 1998, the asteroid 1997 XF11 was announced to be potentially hazardous after being tracked over 90 days. A potential two-year wait for confirming observations was shortened to under 24 hours because of the existence of archived photographic prediscovery images. Spacewatch was a pioneer in using CCD scanning and possesses a valuable digital archive of its scans. Unfortunately these data are aging on magnetic tape and will soon be lost. Since 1990, the Spacewatch project has gathered some 1.5 Terabytes of scan data covering roughly 75,000 square degrees of sky to a limiting magnitude of V = 21.5. The data have not yet been mined for all of their asteroids for scientific studies and orbit determination. Spacewatch's real-time motion detection program MODP was constrained by the computers of the era to use simplified image processing algorithms at reduced efficiency. Jedicke and Herron estimated MODP's efficiency at finding asteroids to be approximately 60 percent down to V = 18, improving somewhat thereafter. This led to a substantial bias correction in their analyses. Larsen has developed a MODP replacement capable of better than 90 percent efficiency over the same range and able to push completeness a magnitude fainter. We propose a program of post-processing and re-archiving Spacewatch data. Our scans would be transferred from tape to CD-ROMs and converted to FITS images -- establishing a consistent data format and media for both past and future Spacewatch observations. Larsen's MODP replacement would mine these data for previously undetected motions, which would be made available to the Minor Planet Center and our ongoing asteroid population studies. A searchable observation record would be made generally available for prediscovery work. We estimate that the net asteroid yield of this proposal is equivalent to three full years of Spacewatch operations.

  16. Applying modern imaging techniques to old HST data: a summary of the ALICE program.

    NASA Astrophysics Data System (ADS)

    Choquet, Elodie; Soummer, Remi; Perrin, Marshall; Pueyo, Laurent; Hagan, James Brendan; Zimmerman, Neil; Debes, John Henry; Schneider, Glenn; Ren, Bin; Milli, Julien; Wolff, Schuyler; Stark, Chris; Mawet, Dimitri; Golimowski, David A.; Hines, Dean C.; Roberge, Aki; Serabyn, Eugene

    2018-01-01

    Direct imaging of extrasolar systems is a powerful technique to study the physical properties of exoplanetary systems and understand their formation and evolution mechanisms. The detection and characterization of these objects are challenged by their high contrast with their host star. Several observing strategies and post-processing algorithms have been developed for ground-based high-contrast imaging instruments, enabling the discovery of directly imaged and spectrally characterized exoplanets. The Hubble Space Telescope (HST), a pioneer in directly imaging extrasolar systems, has nevertheless often been limited to the detection of bright debris disk systems, with sensitivity limited by the difficulty of implementing an optimal PSF subtraction strategy, which is readily offered on ground-based telescopes in pupil tracking mode. The Archival Legacy Investigations of Circumstellar Environments (ALICE) program is a consistent re-analysis of the 10-year-old coronagraphic archive of HST's NICMOS infrared imager. Using post-processing methods developed for ground-based observations, we used the whole archive to calibrate PSF temporal variations and improve NICMOS's detection limits. We have now delivered ALICE-reprocessed science products for the whole NICMOS archival data back to the community. These science products, as well as the ALICE pipeline, were used to prototype the JWST coronagraphic data and reduction pipeline. The ALICE program has enabled the detection of 10 faint debris disk systems never imaged before in the near-infrared and several substellar companion candidates, all of which we are in the process of characterizing through follow-up observations with both ground-based facilities and HST-STIS coronagraphy. In this publication, we provide a summary of the results of the ALICE program, advertise its science products, and discuss the prospects of the program.

  17. Moving beyond the age-depth model paradigm in deep-sea palaeoclimate archives: dual radiocarbon and stable isotope analysis on single foraminifera

    NASA Astrophysics Data System (ADS)

    Lougheed, Bryan C.; Metcalfe, Brett; Ninnemann, Ulysses S.; Wacker, Lukas

    2018-04-01

    Late-glacial palaeoclimate reconstructions from deep-sea sediment archives provide valuable insight into past rapid changes in ocean chemistry. Unfortunately, only a small proportion of the ocean floor with sufficiently high sediment accumulation rate (SAR) is suitable for such reconstructions using the long-standing age-depth model approach. We employ ultra-small radiocarbon (14C) dating on single microscopic foraminifera to demonstrate that the long-standing age-depth model method conceals large age uncertainties caused by post-depositional sediment mixing, meaning that existing studies may underestimate total geochronological error. We find that the age-depth distribution of our 14C-dated single foraminifera is in good agreement with existing bioturbation models only after one takes the possibility of Zoophycos burrowing into account. To overcome the problems associated with the age-depth paradigm, we use the first ever dual 14C and stable isotope (δ18O and δ13C) analysis on single microscopic foraminifera to produce a palaeoclimate time series independent of the age-depth paradigm. This new state of the art essentially decouples single foraminifera from the age-depth paradigm to provide multiple floating, temporal snapshots of ocean chemistry, thus allowing for the successful extraction of temporally accurate palaeoclimate data from low-SAR deep-sea archives. This new method can address large geographical gaps in late-glacial benthic palaeoceanographic reconstructions by opening up vast areas of previously disregarded, low-SAR deep-sea archives to research, which will lead to an improved understanding of the global interaction between oceans and climate.

  18. Data rescue of NASA First ISLSCP (International Satellite Land Surface Climatology Project) Field Experiment (FIFE) aerial observations

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Boyer, A.; Deb, D.; Beaty, T.; Wei, Y.; Wei, Z.

    2017-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics is one of the NASA Earth Observing System Data and Information System (EOSDIS) data centers. The ORNL DAAC (https://daac.ornl.gov) is responsible for data archival, product development and distribution, and user support for biogeochemical and ecological data and models. In particular, the ORNL DAAC has provided data management support for NASA's terrestrial ecology field campaign programs for the last several decades. Field campaigns combine ground, aircraft, and satellite-based measurements in specific ecosystems over multi-year time periods. The data collected during NASA field campaigns are archived at the ORNL DAAC (https://daac.ornl.gov/get_data/). This paper describes the effort of the ORNL DAAC team to rescue a First ISLSCP Field Experiment (FIFE) dataset containing airborne and satellite observations from the 1980s. The data collected during the FIFE campaign include high-resolution aerial imagery collected over Kansas. A data rescue workflow was prepared to test for successful recovery of the data from a CD-ROM and to ensure that the data are usable and preserved for the future. The images contain spectral reflectance data that can be used as a historical benchmark to examine climatological and ecological changes in the Kansas region since the 1980s. The key steps taken to convert the files to modern standards were: (1) decompress the images using the custom compression software provided with the data (the compression algorithm, created for MS-DOS in the 1980s, had to be set up to run on modern computer systems); (2) geo-reference the decompressed files using metadata information stored in separate compressed header files; (3) apply standardized file names (file names and details are described in separate readme documents); (4) convert the image files to GeoTIFF format with embedded georeferencing information; and (5) leverage Open Geospatial Consortium (OGC) Web services to provide dynamic data transformation and visualization. We will describe the steps in detail and share lessons learned during the AGU session.
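The georeferencing step in a rescue workflow like this can be illustrated with a sidecar world file, one common lightweight way to attach map coordinates to a raster image. This is a sketch of the general ESRI world file convention, not the ORNL DAAC workflow; the coordinate values in the usage example are hypothetical.

```python
def write_world_file(path, ulx, uly, pixel_size):
    """Write a six-line ESRI world file (e.g. .tfw) for a north-up image.

    Lines, in order: x pixel size; row rotation; column rotation;
    negative y pixel size; then the x and y map coordinates of the
    CENTER of the upper-left pixel (ulx/uly give its upper-left corner).
    """
    lines = [
        pixel_size,            # pixel width in map units
        0.0,                   # rotation term (0 for north-up imagery)
        0.0,                   # rotation term (0 for north-up imagery)
        -pixel_size,           # negative pixel height (rows go south)
        ulx + pixel_size / 2,  # x of the upper-left pixel center
        uly - pixel_size / 2,  # y of the upper-left pixel center
    ]
    with open(path, "w") as f:
        f.write("\n".join(f"{v:.10f}" for v in lines) + "\n")
```

Embedding the same affine transform directly in a GeoTIFF, as the workflow ultimately did, carries identical information inside the file itself.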

  19. Archive & Data Management Activities for ISRO Science Archives

    NASA Astrophysics Data System (ADS)

    Thakkar, Navita; Moorthi, Manthira; Gopala Krishna, Barla; Prashar, Ajay; Srinivasan, T. P.

    2012-07-01

    ISRO has moved a step ahead by extending remote sensing missions to planetary and astronomical exploration. It started with Chandrayaan-1, which successfully completed imaging the Moon during its lifetime in orbit. ISRO is now planning to launch Chandrayaan-2 (the next moon mission), a Mars mission, and the astronomical mission ASTROSAT. All these missions are characterized by the need to receive, process, archive and disseminate the acquired science data to the user community for analysis and scientific use. These science missions will last from a few months to a few years, but the data received must be archived, interoperable, and seamlessly accessible to the user community for the future. ISRO has laid out definite plans to archive these data sets in specified standards and develop relevant access tools to be able to serve the user community. To achieve this goal, a data center called the Indian Space Science Data Center (ISSDC) has been set up at Bangalore. This is the custodian of all the data sets of the current and future science missions of ISRO. Chandrayaan-1 is the first among the planetary missions launched or to be launched by ISRO, and we took up the challenge of developing a system for the archival and dissemination of the payload data received. For Chandrayaan-1, the data collected from all the instruments are processed and archived in the archive layer in the Planetary Data System (PDS 3.0) standards, through an automated pipeline. But a dataset, once stored, is of no use unless it is made public, which requires a Web-based dissemination system accessible to all planetary scientists and data users working in this field. Towards this, a Web-based Browse and Dissemination system has been developed, wherein users can register and search for their area of interest and view the data archived for TMC & HYSI with relevant browse chips and metadata.
Users can also order the data and get it on their desktop in the PDS. For other AO payloads users can view the metadata and the data is available through FTP site. This same archival and dissemination strategy will be extended for the next moon mission Chandrayaan-2. ASTROSAT is going to be the first multi-wavelength astronomical mission for which the data is archived at ISSDC. It consists of five astronomical payloads that would allow simultaneous multi-wavelengths observations from X-ray to Ultra-Violet (UV) of astronomical objects. It is planned to archive the data sets in FITS. The archive of the ASTROSAT will be done in the Archive Layer at ISSDC. The Browse of the Archive will be available through the ISDA (Indian Science Data Archive) web site. The Browse will be IVOA compliant with a search mechanism using VOTable. The data will be available to the users only on request basis via a FTP site after the lock in period is over. It is planned that the Level2 pipeline software and various modules for processing the data sets will be also available on the web site. This paper, describes the archival procedure of Chandrayaan-1 and archive plan for the ASTROSAT, Chandrayaan-2 and other future mission of ISRO including the discussion on data management activities.

  20. The North American Carbon Program Multi-scale Synthesis and Terrestrial Model Intercomparison Project – Part 2: Environmental driver data

    DOE PAGES

    Wei, Yaxing; Liu, Shishi; Huntzinger, Deborah N.; ...

    2014-12-05

    Ecosystems are important and dynamic components of the global carbon cycle, and terrestrial biospheric models (TBMs) are crucial tools for furthering our understanding of how terrestrial carbon is stored and exchanged with the atmosphere across a variety of spatial and temporal scales. Improving TBM skill, and quantifying and reducing their estimation uncertainties, pose significant challenges. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) is a formal multi-scale and multi-model intercomparison effort set up to tackle these challenges. The MsTMIP protocol prescribes standardized environmental driver data that are shared among model teams to facilitate model-model and model-observation comparisons. In this article, we describe the global and North American environmental driver data sets prepared for the MsTMIP activity, both to support their use in MsTMIP and to make these data, along with the processes used in selecting and processing them, accessible to a broader audience. Based on project needs and lessons learned from past model intercomparison activities, we compiled climate, atmospheric CO2 concentration, nitrogen deposition, land use and land cover change (LULCC), C3/C4 grass fraction, major crop, phenology, and soil data into a standard format for global (0.5° × 0.5° resolution) and regional (North American: 0.25° × 0.25° resolution) simulations. To meet the needs of MsTMIP, several of the original environmental data sets were improved in quality and/or in their spatial and temporal coverage and resolution. The resulting standardized model driver data sets are being used by over 20 different models participating in MsTMIP. The data are archived at the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC, http://daac.ornl.gov) to provide long-term data management and distribution.
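As a hedged illustration of one harmonization step such a protocol requires (not the actual MsTMIP processing code), the sketch below coarsens a hypothetical 0.25° regional driver field to a 0.5° grid by 2×2 block averaging:

```python
import numpy as np

def block_average(field, factor=2):
    """Coarsen a 2-D grid by averaging factor x factor blocks."""
    ny, nx = field.shape
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

fine = np.arange(16, dtype=float).reshape(4, 4)   # toy 0.25-degree field
coarse = block_average(fine)                      # 2 x 2 field at 0.5 degree
```

Real driver processing must additionally handle masks, missing values, and area weighting near the poles; this sketch shows only the resolution change.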

  1. Modelling Prehistoric Terrain Models Using LiDAR Data: A Geomorphological Approach

    NASA Astrophysics Data System (ADS)

    Höfler, Veit; Wessollek, Christine; Karrasch, Pierre

    2015-10-01

    Terrain surfaces conserve human activities in terms of textures and structures. With reference to archaeological questions, the geological archive is investigated by means of models regarding anthropogenic traces. In doing so, the high-resolution digital terrain model is of inestimable value for decoding the archive. The evaluation of these terrain models and the reconstruction of historical surfaces remain challenging. Despite data collection by LiDAR systems (light detection and ranging) and subsequent pre-processing and filtering, anthropogenic artefacts are still present in the digital terrain model. Analyses have shown that elements such as contour lines and channels can be extracted well from a high-resolution digital terrain model. Channels in settlement areas show a clear anthropogenic character, and the same can be observed for contour lines: some contour lines represent a presumably natural ground surface and avoid anthropogenic artefacts, while, comparable to channels, noticeable patterns of contour lines become visible in areas with anthropogenic artefacts. The presented workflow uses functionalities of ArcGIS and the programming language R. The method starts with the extraction of contour lines from the digital terrain model. Through macroscopic analyses based on geomorphological expert knowledge, contour lines are selected that represent the natural geomorphological character of the surface. In a first step, points are determined along each contour line at regular intervals. These points, together with the corresponding height information taken from the original digital terrain model, are saved as a point cloud. 
Using the program library gstat, a variographic analysis is then carried out, followed by a kriging procedure based on it. The result is a digital terrain model, filtered according to geomorphological expert knowledge, that shows no human degradation in terms of artefacts, preserves the landscape-genetic character, and can be called a prehistoric terrain model.

  2. 3D Modelling of the Lusatian Borough in Biskupin Using Archival Data

    NASA Astrophysics Data System (ADS)

    Zawieska, D.; Markiewicz, J. S.; Kopiasz, J.; Tazbir, J.; Tobiasz, A.

    2017-02-01

    The paper presents the results of 3D modelling of the Lusatian Borough, Biskupin, using archival data. Pre-war photographs were acquired from different heights, e.g., from a captive balloon (maximum height up to 150 m), from a blimp (at heights of 50-110 m), and from an aeroplane (at heights of 200 m, 300 m, and up to 3 km). To generate the 3D models, AgiSoft tools were applied, as they allow shapes to be restored using triangular meshes. Individual photographs were processed using Google SketchUp software and the "shape from shadow" method. The usefulness of these models in archaeological research was also analysed.

  3. Addressing numerical challenges in introducing a reactive transport code into a land surface model: A biogeochemical modeling proof-of-concept with CLM-PFLOTRAN 1.0: Modeling Archive

    DOE Data Explorer

    Tang, G.; Andre, B.; Hoffman, F. M.; Painter, S. L.; Thornton, P. E.; Yuan, F.; Bisht, G.; Hammond, G. E.; Lichtner, P. C.; Kumar, J.; Mills, R. T.; Xu, X.

    2016-04-19

    This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at doi:10.5194/gmd-9-927-2016. The purpose is to document the simulations to allow verification, reproducibility, and follow-up studies. This dataset contains shell scripts to create the CLM-PFLOTRAN cases, specific input files for PFLOTRAN and CLM, outputs, and Python scripts that make the figures in the publication from those outputs. Through these results, we demonstrate that CLM-PFLOTRAN can approximately reproduce CLM results in selected cases for Arctic, temperate, and tropical sites. In addition, the new framework facilitates mechanistic representations of soil biogeochemistry processes in the land surface model.

  4. Development of public science archive system of Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatochi; Watanabe, Masaru; Okumura, Shin-Ichiro; Ozawa, Tomohiko; Yamamoto, Naotaka; Hamabe, Masaru

    2002-09-01

    We have developed a public science archive system, the Subaru-Mitaka-Okayama-Kiso Archive system (SMOKA), as a successor to the Mitaka-Okayama-Kiso Archive (MOKA) system. SMOKA provides access to the public data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at the Kiso Observatory of the University of Tokyo. Since 1997, we have worked to compile a dictionary of FITS header keywords. The completion of the dictionary enabled us to construct a unified public archive of the data obtained with the various instruments at these telescopes. SMOKA has two kinds of user interface: Simple Search and Advanced Search. Novices can search for data by simply selecting the name of the target with the Simple Search interface, while experts can set detailed constraints on the query using the Advanced Search interface. To improve search efficiency, several new features are implemented, such as archive status plots, calibration data search, an annotation system, and an improved Quick Look image browsing system. We can efficiently develop and operate SMOKA by adopting a three-tier model for the system; Java servlets and JavaServer Pages (JSP) make it possible to separate the front-end presentation from the middle and back-end tiers.
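The role of such a keyword dictionary can be sketched as a mapping from instrument-specific FITS header keywords onto a unified archive schema. The keyword aliases below are hypothetical examples, not SMOKA's actual dictionary:

```python
# Hypothetical alias table: unified field name -> instrument-specific keywords
UNIFIED_KEYS = {
    "object":   ["OBJECT", "TARGET", "OBJ_NAME"],
    "exptime":  ["EXPTIME", "EXP-TIME", "ITIME"],
    "date_obs": ["DATE-OBS", "UT-DATE"],
}

def unify_header(header):
    """Translate one instrument's FITS header into a unified archive record."""
    record = {}
    for unified, aliases in UNIFIED_KEYS.items():
        for key in aliases:
            if key in header:
                record[unified] = header[key]
                break                      # first matching alias wins
    return record

# Headers from two hypothetical instruments with different conventions
h1 = {"OBJECT": "M31", "EXPTIME": 300.0, "DATE-OBS": "2002-01-15"}
h2 = {"TARGET": "M31", "ITIME": 120.0, "UT-DATE": "2002-01-16"}
```

Because both headers map to the same unified record structure, a single query interface can search data from every instrument, which is the point of the dictionary.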

  5. Reconstructing Forty Years of Landsat Observations

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Dwyer, J. L.; Steinwand, D.

    2013-12-01

    In July 1972, NASA launched the Earth Resources Technology Satellite (ERTS), the first of what was to be the series of Earth-observing satellites we now know as the Landsat system. This system, originally conceived in the 1960s within the US Department of the Interior and US Geological Survey (USGS), has continued with little interruption for over 40 years, creating the longest record of satellite-based global land observations. The current USGS archive of Landsat images exceeds 4 million scenes, and the recently launched Landsat 8 platform will extend that archive to nearly 50 years of observations. Clearly, these observations are critical to the study of Earth system processes, and the interaction between these processes and human activities. However, the seven successful Landsat missions represent more of an ad hoc program than a long-term record of consistent observations, due largely to changing Federal policies and challenges in finding an operational home for the program. Technologically, these systems evolved from the original Multispectral Scanner System (MSS) through the Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) systems, to the current Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) systems. Landsat data were collected globally by a network of international cooperators having diverse data management policies. Much of the oldest data were stored on archaic media that could not be retrieved using modern media readers. Collecting these data from various sensors and sources, and reconstructing them into coherent Earth observation records, posed numerous challenges. We present here a brief overview of work done to overcome these challenges and create a consistent, long-term Landsat observation record. Much of the current archive was 'repatriated' from international cooperators and often required the reconstruction of (sometimes absent) metadata for geo-location and radiometric calibration. 
The older MSS data, some of which had been successfully retrieved from outdated wide band video media, required similar metadata reconstruction. TM data from Landsats 4 and 5 relied on questionable on-board lamp data for calibration, thus the calibration history for these missions was reconstructed to account for sensor degradation over time. To improve continuity between platforms, Landsat 7 and 8 missions employed 'under-flight' maneuvers to reduce inter-calibration error. Data from the various sensors, platforms and sources were integrated into a common metadata standard, with quality assurance information, to ensure understandability of the data for long-term preservation. Because of these efforts, the current Landsat archive can now support the creation of the long-term climate data records and essential climate variables required to monitor changes on the Earth's surface quantitatively over decades of observations.

  6. Preservation of Digital Information. Proceedings of the Membership Meeting of the Association of Research Libraries (131st, Washington, DC, October 15-17, 1997).

    ERIC Educational Resources Information Center

    Barrett, Jaia, Ed.; Wetzel, Karen A., Ed.

    The 131st meeting of the Association of Research Libraries (ARL) focused on preservation of digital information. The ARL Preservation Committee convened three panels of experts to highlight major issues raised by the archiving of digital resources, and to encourage discussion about options for operating models and criteria for digital archives.…

  7. Detail view of the sculpted pediment on the south facade ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of the sculpted pediment on the south facade entitled Recorder of the Archives; the artist was James Earle Fraser. The Great Danes in the corner were based on sketches by Fraser's assistant Bruce Moore, and the dogs behind the Great Danes are modeled after Fraser's own dogs. - National Archives, Constitution Avenue, between Seventh & Ninth Streets Northwest, Washington, District of Columbia, DC

  8. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Godtliebsen, F.; Rue, H.

    2012-01-01

    The paper proposes an approach to assessing timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process, whose parameters are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, giving a Beta-distributed probability density for the age estimates along the length of a proxy archive. In the general situation of uncertain tie-point ages, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. The approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. It is illustrated using two ice cores and two lake/marine sediment cores, typical examples of paleoproxy archives with age models based on tie points of mixed origin.
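A minimal Monte Carlo sketch of this idea, under the simplifying assumption of a stationary Gamma process between two absolutely dated tie points (the shape and layer-count parameters are illustrative, not the paper's values):

```python
import numpy as np

def simulate_age_depth(t0, t1, n_layers=100, shape=2.0, n_sims=2000, seed=1):
    """Monte Carlo age estimates (2.5/50/97.5 percentiles) between tie points."""
    gen = np.random.default_rng(seed)
    # Gamma-distributed accumulation increments, one per layer and simulation
    inc = gen.gamma(shape, size=(n_sims, n_layers))
    frac = np.cumsum(inc, axis=1) / inc.sum(axis=1, keepdims=True)
    # Linear accumulation model pinned to the absolutely dated tie points;
    # for a stationary Gamma process these normalized ages are Beta-distributed,
    # consistent with the analytical result quoted in the abstract.
    ages = t0 + (t1 - t0) * frac
    lo, mid, hi = np.percentile(ages, [2.5, 50.0, 97.5], axis=0)
    return lo, mid, hi

lo, mid, hi = simulate_age_depth(t0=0.0, t1=1000.0)   # e.g. years BP
```

The spread between `lo` and `hi` shrinks to zero at the tie points and is widest mid-archive, the qualitative behavior the confidence intervals in the paper exhibit.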

  9. Preserving Data for Renewable Energy

    NASA Astrophysics Data System (ADS)

    Macduff, M.; Sivaraman, C.

    2017-12-01

    The EERE Atmosphere to Electrons (A2e) program established the Data Archive and Portal (DAP) to ensure long-term preservation of, and access to, A2e research data. The DAP has been operated by PNNL for two years, holding data from more than a dozen projects, with 1 PB of data and hundreds of datasets expected to be stored this year. The data are a diverse mix of model runs, observational data, and derived products. While most of the data are public, the DAP also securely stores many proprietary data sets provided by energy producers that are critical to the research goals of the A2e program. The DAP uses Amazon Web Services (AWS) and PNNL resources to provide long-term archival and access with appropriate access controls. As a key element of the DAP, metadata are collected for each dataset to assist with data discovery and usability. Further, the DAP has begun standardizing observational data into NetCDF, which allows users to focus on the data instead of parsing many different formats. A central repository attuned to the unique needs of the A2e research community is helping active tasks today and making many future research efforts possible. In this presentation, we provide an overview of the DAP's capabilities and its benefits to the renewable energy community.

  10. HST-COS Observations of AGNs. III. Spectral Constraints in the Lyman Continuum from Composite COS/G140L Data

    NASA Astrophysics Data System (ADS)

    Tilton, Evan M.; Stevans, Matthew L.; Shull, J. Michael; Danforth, Charles W.

    2016-01-01

    The rest-frame ultraviolet (UV) spectra of active galactic nuclei (AGNs) are important diagnostics of both accretion disk physics and their contribution to the metagalactic ionizing UV background. Though the mean AGN spectrum is well characterized with composite spectra at wavelengths greater than 912 Å, the shorter-wavelength extreme-UV (EUV) remains poorly studied. In this third paper in a series on the spectra of AGNs, we combine 11 new spectra taken with the Cosmic Origins Spectrograph on the Hubble Space Telescope with archival spectra to characterize the typical EUV spectral slope of AGNs from λ_rest ≈ 850 Å down to λ_rest ≈ 425 Å. Parameterizing this slope as a power law, we obtain F_ν ∝ ν^(-0.72±0.26), but we also discuss the limitations and systematic uncertainties of this model. We identify broad emission features in this spectral region, including emission due to ions of O, Ne, Mg, and other species, and we limit the intrinsic He I 504 Å photoelectric absorption edge opacity to τ_HeI < 0.047. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.
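The power-law parameterization amounts to a linear fit in log-log space. A minimal sketch, using a synthetic noiseless spectrum generated with the abstract's best-fit slope for illustration:

```python
import numpy as np

def fit_powerlaw_slope(nu, f_nu):
    """Return alpha from F_nu = C * nu**alpha via a log-log linear fit."""
    alpha, _log_c = np.polyfit(np.log(nu), np.log(f_nu), 1)
    return alpha

lam = np.linspace(425.0, 850.0, 50)     # rest wavelengths in Angstroms
nu = 2.998e18 / lam                     # frequency (c in Angstrom/s)
f_nu = 3.0 * nu ** -0.72                # noiseless synthetic spectrum
alpha = fit_powerlaw_slope(nu, f_nu)
```

A real composite-spectrum fit must also weight by flux uncertainties and mask emission lines and absorption edges, which is where the systematic uncertainties the authors discuss enter.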

  11. Designing Extensible Data Management for Ocean Observatories, Platforms, and Devices

    NASA Astrophysics Data System (ADS)

    Graybeal, J.; Gomes, K.; McCann, M.; Schlining, B.; Schramm, R.; Wilkin, D.

    2002-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) has been collecting science data for 15 years from all kinds of oceanographic instruments and systems, and is building a next-generation observing system, the MBARI Ocean Observing System (MOOS). To meet the data management requirements of the MOOS, the Institute began developing a flexible, extensible data management solution, the Shore Side Data System (SSDS). This data management system must address a wide variety of oceanographic instruments and data sources, including instruments and platforms of the future. Our data management solution will address all elements of the data management challenge, from ingest (including suitable pre-definition of metadata) through to access and visualization. Key to its success will be ease of use, and automatic incorporation of new data streams and data sets. The data will be of many different forms, and come from many different types of instruments. Instruments will be designed for fixed locations (as with moorings), changing locations (drifters and AUVs), and cruise-based sampling. Data from airplanes, satellites, models, and external archives must also be considered. Providing an architecture which allows data from these varied sources to be automatically archived and processed, yet readily accessed, is only possible with the best practices in metadata definition, software design, and re-use of third-party components. The current status of SSDS development will be presented, including lessons learned from our science users and from previous data management designs.

  12. The Fermi Science Support Center Data Servers and Archive

    NASA Astrophysics Data System (ADS)

    Reustle, Alexander; Fermi Science Support Center

    2018-01-01

    The Fermi Science Support Center (FSSC) provides the scientific community with access to Fermi data and other products. The Gamma-Ray Burst Monitor (GBM) data is stored at NASA's High Energy Astrophysics Science Archive Research Center (HEASARC) and is accessible through their searchable Browse web interface. The Large Area Telescope (LAT) data is distributed through a custom FSSC interface where users can request all photons detected from a region on the sky over a specified time and energy range. Through its website the FSSC also provides planning and scheduling products, such as long and short term observing timelines, spacecraft position and attitude histories, and exposure maps. We present an overview of the different data products provided by the FSSC, how they can be accessed, and statistics on the archive usage since launch.

  13. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
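Part of what makes SEG-Y suitable for long-term archiving is that key survey parameters live at fixed byte offsets in the 400-byte binary file header that follows the 3200-byte textual header. A minimal sketch using synthetic bytes and the standard SEG-Y convention of a big-endian 16-bit sample interval at bytes 3217-3218:

```python
import struct

def read_sample_interval(segy_bytes):
    """Sample interval in microseconds, bytes 3217-3218 of the binary header."""
    return struct.unpack(">h", segy_bytes[3216:3218])[0]

# Minimal synthetic file: 3200-byte textual header + 400-byte binary header
header = bytearray(3600)
header[3216:3218] = struct.pack(">h", 2000)   # 2000 us = 2 ms sampling
dt_us = read_sample_interval(bytes(header))
```

Tools such as Seismic Unix read these same fixed offsets, which is why the unprocessed trace data can be archived as-is and still be interpretable decades later.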

  14. The NASA Distributed Active Archive Center Experience in Providing Trustworthy Digital Repositories

    NASA Astrophysics Data System (ADS)

    de Sherbinin, A. M.; Downs, R. R.; Chen, R. S.

    2017-12-01

    Since the early 1990s, NASA's Earth Observing System Data and Information System (EOSDIS) has supported between 10 and 12 discipline-specific Distributed Active Archive Centers (DAACs) that have provided long-term preservation of Earth science data records, particularly from satellite and airborne remote sensing. The focus of this presentation is on two of the DAACs, the Socioeconomic Data and Applications Center (SEDAC) and the Oak Ridge National Laboratory (ORNL) DAAC, that provide archiving and dissemination of third-party data sets. The presentation describes the community of interest for these two DAACs, their data management practices, and the benefits of certification to the DAACs and their user communities. It also describes the organizational, technical, financial, and legal challenges of providing trustworthy long-term data stewardship.

  15. USGS Releases New Digital Aerial Products

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) has initiated distribution of digital aerial photographic products produced by scanning or digitizing film from its historical aerial photography film archive. This archive, located in Sioux Falls, South Dakota, contains thousands of rolls of film that contain more than 8 million frames of historic aerial photographs. The largest portion of this archive consists of original film acquired by Federal agencies from the 1930s through the 1970s to produce 1:24,000-scale USGS topographic quadrangle maps. Most of this photography is reasonably large scale (USGS photography ranges from 1:8,000 to 1:80,000) to support the production of the maps. Two digital products are currently available for ordering: high-resolution scanned products and medium-resolution digitized products.

  16. Landsat International Cooperators and Global Archive Consolidation

    USGS Publications Warehouse

    ,

    2016-04-07

    Landsat missions have always been an important component of U.S. foreign policy, as well as science and technology policy. The program’s longstanding network of International Cooperators (ICs), which operate numerous International Ground Stations (IGS) around the world, embodies the United States’ policy of peaceful use of outer space and the worldwide dissemination of civil space technology for public benefit. Thus, the ICs provide an essential dimension to the Landsat mission. In 2010, the Landsat Global Archive Consolidation (LGAC) effort began, with a goal to consolidate the Landsat data archives of all international ground stations, make the data more accessible to the global Landsat community, and significantly increase the frequency of observations over a given area of interest to improve scientific uses such as change detection and analysis.

  17. Simulation of Extreme Surface Winds by Regional Climate Models in the NARCCAP Archive

    NASA Astrophysics Data System (ADS)

    Hatteberg, R.; Takle, E. S.

    2011-12-01

    Surface winds play a significant role in many natural processes as well as providing a very important ecological service for many human activities. Surface winds ventilate pollutants and heat from our cities, contribute to pollination for our crops, and regulate the fluxes of heat, moisture, and carbon dioxide from the earth's surface. Many environmental models, such as biogeochemical models, crop models, lake models, and pollutant transport models, use surface winds as a key variable. Studies of the impacts of climate change and climate variability on a wide range of natural systems and coupled human-natural systems frequently need information on how surface wind speeds will change as greenhouse gas concentrations in the earth's atmosphere change. We have studied the characteristics of extreme winds, both high winds and low winds, created by regional climate models (RCMs) in the NARCCAP archives. We evaluated the capabilities of five RCMs forced by NCEP reanalysis data as well as global climate model (GCM) data for contemporary and future scenario climates to capture the observed statistical distribution of surface winds, both high-wind events and low-wind conditions. Our domain is limited to the Midwest (37°N to 49°N, 82°W to 101°W) with the Great Lakes masked out, which eliminates orographic effects that may contribute to regional circulations. The majority of this study focuses on the warm season in order to examine derechos on the extreme high end and air pollution and plant processes on the low wind speed end. To examine extreme high winds we focus on derechos, which are long-lasting convectively driven extreme wind events that frequently leave a swath of damage extending across multiple states. These events are unusual in that, despite their relatively small spatial scale, they can persist for hours or even days, drawing energy from well-organized larger mesoscale or synoptic scale processes. 
We examine the ability of NARCCAP RCMs to reproduce these isolated extreme events by assessing their existence, location, magnitude, synoptic linkage, initiation time and duration as compared to the record of observations of derechos in the Midwest and Northeast US. We find that RCMs do reproduce features with close resemblance to derechos although their magnitudes are considerably below those observed (which may be expected given the 50-km grid spacing of the RCM models). Extreme low wind speeds in summer are frequently associated with stagnation conditions leading to high air pollution events in major cities. Low winds also lead to reduced evapotranspiration by crops, which can impact phenological processes (e.g. pollination and seed fertilization, carbon uptake by plants). We evaluate whether RCMs can simulate climatic distributions of low-wind conditions in the northern US. Results show differences among models in their ability to reproduce observed characteristics of low summer-time winds. Only one model reproduces observed high frequency of calm night-time surface winds in summer, which suggests a need to improve model capabilities for simulating extreme stagnation events.
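One of the low-wind metrics described above, the frequency of calm winds, can be sketched as follows; the Weibull-distributed wind speeds stand in for observed and RCM-simulated series, and their shape and scale parameters are purely illustrative:

```python
import numpy as np

def calm_fraction(speeds, threshold=1.0):
    """Fraction of wind-speed samples below a calm threshold (m/s)."""
    return float(np.mean(speeds < threshold))

gen = np.random.default_rng(7)
obs = gen.weibull(1.5, 100_000) * 3.0   # "observed": frequent low winds
rcm = gen.weibull(2.5, 100_000) * 3.0   # "model": too few calm events
```

Comparing `calm_fraction(obs)` against `calm_fraction(rcm)` reproduces the kind of diagnostic the study applies: a model with too large a Weibull shape parameter underpredicts calm night-time winds and hence stagnation events.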

  18. Chemically Dissected Rotation Curves of the Galactic Bulge from Main-sequence Proper Motions

    NASA Astrophysics Data System (ADS)

    Clarkson, William I.; Calamida, Annalisa; Sahu, Kailash C.; Brown, Thomas M.; Gennaro, Mario; Avila, Roberto J.; Valenti, Jeff; Debattista, Victor P.; Rich, R. Michael; Minniti, Dante; Zoccali, Manuela; Aufdemberge, Emily R.

    2018-05-01

    We report results from an exploratory study implementing a new probe of Galactic evolution using archival Hubble Space Telescope imaging observations. Precise proper motions are combined with photometric relative metallicity and temperature indices, to produce the proper-motion rotation curves of the Galactic bulge separately for metal-poor and metal-rich main-sequence samples. This provides a “pencil-beam” complement to large-scale wide-field surveys, which to date have focused on the more traditional bright giant branch tracers. We find strong evidence that the Galactic bulge rotation curves drawn from “metal-rich” and “metal-poor” samples are indeed discrepant. The “metal-rich” sample shows greater rotation amplitude and a steeper gradient against line-of-sight distance, as well as possibly a stronger central concentration along the line of sight. This may represent a new detection of differing orbital anisotropy between metal-rich and metal-poor bulge objects. We also investigate selection effects that would be implied for the longitudinal proper-motion cut often used to isolate a “pure-bulge” sample. Extensive investigation of synthetic stellar populations suggests that instrumental and observational artifacts are unlikely to account for the observed rotation curve differences. Thus, proper-motion-based rotation curves can be used to probe chemodynamical correlations for main-sequence tracer stars, which are orders of magnitude more numerous in the Galactic bulge than the bright giant branch tracers. We discuss briefly the prospect of using this new tool to constrain detailed models of Galactic formation and evolution. Based on observations made with the NASA/ESA Hubble Space Telescope and obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.

  19. Detection of the YORP Effect in Asteroid (3103) Eger

    NASA Astrophysics Data System (ADS)

    Durech, Josef; Vokrouhlicky, D.; Polishook, D.; Krugly, Y. N.; Gaftonyuk, N. M.; Stephens, R. D.; Warner, B. D.; Kaasalainen, M.; Gross, J.; Cooney, W.; Terrel, D.

    2009-09-01

    The rotation state of small bodies of the Solar System is affected by the thermal Yarkovsky-O'Keefe-Radzievski-Paddack (YORP) torque. The directly observable consequence of YORP is a secular change of the asteroid's rotational period in time. We carried out new photometric measurements of asteroid (3103) Eger during its suitable apparitions in 2001-2009. We also used archival data going back to 1987. Using all available photometry covering more than twenty years, we were able to detect a tiny deviation from constant-period rotation. This deviation caused an observable shift between the observed lightcurves and those predicted by the best constant-period model. We used the lightcurve inversion method to derive a shape/spin solution that fitted the data best. We assumed that the rotation rate evolved linearly in time and derived the acceleration of Eger's rotation rate dω/dt = (9 ± 6) × 10^-9 rad/d^2 (maximum estimated uncertainty). The accelerating model provides a significantly better fit than the constant-period model. The value of dω/dt derived from observations is in agreement with the theoretical value computed numerically from the lightcurve inversion shape model and its spin-axis orientation. After the three asteroids for which the YORP effect has already been detected (1862 Apollo, 54509 YORP, and 1620 Geographos), Eger is the fourth one.
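Why such a tiny dω/dt is detectable over two decades follows from a back-of-the-envelope phase calculation: with a linearly evolving rotation rate, the accumulated rotational phase is φ(t) = φ0 + ω0·t + ½(dω/dt)·t², so the offset from a constant-period model grows quadratically in time. The sketch below uses the abstract's best-fit rate and an assumed ~22-year baseline (1987-2009):

```python
import math

def phase_shift_deg(domega_dt, t_days):
    """Phase offset (degrees) vs. a constant-period model after t_days."""
    return math.degrees(0.5 * domega_dt * t_days ** 2)

domega_dt = 9e-9                # rad / day^2, the abstract's best-fit rate
baseline = 22 * 365.25          # days, assumed photometric coverage 1987-2009
shift = phase_shift_deg(domega_dt, baseline)
```

The resulting offset of order ten degrees of rotational phase is easily visible as a shift between observed and model lightcurves, even though the per-rotation period change is minuscule.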

  20. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

    Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system relies on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling, and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have sturdy logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations through to final calibrated products, open to overriding of different sources, formats, management fields, storage, and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files) and can be easily updated through a planned Archiving System Configuration Interface (ASCI).

  1. VizieR Online Data Catalog: Astrometric monitoring of ultracool dwarf binaries (Dupuy+, 2017)

    NASA Astrophysics Data System (ADS)

    Dupuy, T. J.; Liu, M. C.

    2017-09-01

    In Table 1 we list all 33 binaries in our Keck+CFHT astrometric monitoring sample, along with three other binaries that have published orbit and parallax measurements. We began obtaining resolved Keck AO astrometry in 2007-2008, and we combined our new astrometry with available data in the literature or public archives (e.g., HST and Gemini) to refine our orbital period estimates and thereby our prioritization for Keck observations. We present here new Keck/NIRC2 AO imaging and non-redundant aperture-masking observations, in addition to a re-analysis of our own previously published data and publicly available archival data for our sample binaries. Table 2 gives our measured astrometry and flux ratios for all Keck AO data used in our orbital analysis spanning 2003 Apr 15 to 2016 May 13. In total there are 339 distinct measurements (unique bandpass and epoch for a given target), where 302 of these are direct imaging and 37 are non-redundant aperture masking. Eight of the imaging measurements are from six unpublished archival data sets. See section 3.1.1 for further details. In addition to our Keck AO monitoring, we also obtained data for three T dwarf binaries over a three-year HST program using the Advanced Camera for Surveys (ACS) Wide Field Camera (WFC) in the F814W bandpass. See section 3.1.2 for further details. Many of our sample binaries have HST imaging data in the public archive. We have re-analyzed the available archival data coming from the WFPC2 Planetary Camera (WFPC2-PC1), ACS High Resolution Channel (ACS-HRC), and NICMOS Camera 1 (NICMOS-NIC1). See section 3.1.3 for further details. We present here an updated analysis of our data from the Hawaii Infrared Parallax Program that uses the CFHT facility infrared camera WIRCam. Our observing strategy and custom astrometry pipeline are described in detail in Dupuy & Liu (2012, J/ApJS/201/19). See section 3.2 for further explanations. (10 data files).

  2. Natural climate variability and teleconnections to precipitation over the Pacific-North American region in CMIP3 and CMIP5 models

    NASA Astrophysics Data System (ADS)

    Polade, Suraj D.; Gershunov, Alexander; Cayan, Daniel R.; Dettinger, Michael D.; Pierce, David W.

    2013-05-01

    Natural climate variability will continue to be an important aspect of future regional climate even in the midst of long-term secular changes. Consequently, the ability of climate models to simulate major natural modes of variability and their teleconnections provides important context for the interpretation and use of climate change projections. Comparisons reported here indicate that the CMIP5 generation of global climate models shows significant improvements in simulations of the key Pacific climate mode and its teleconnections to North America compared to earlier CMIP3 simulations. The performance of 14 models with simulations in both the CMIP3 and CMIP5 archives is assessed using singular value decomposition analysis of simulated and observed winter Pacific sea surface temperatures (SSTs) and concurrent precipitation over the contiguous United States and northwestern Mexico. Most of the models reproduce basic features of the key natural mode and its teleconnections, albeit with notable regional deviations from observations in both SST and precipitation. Increasing horizontal resolution in the CMIP5 simulations is an important, but not a necessary, factor in the improvement from CMIP3 to CMIP5.

  3. Natural climate variability and teleconnections to precipitation over the Pacific-North American region in CMIP3 and CMIP5 models

    USGS Publications Warehouse

    Polade, Suraj D.; Gershunov, Alexander; Cayan, Daniel R.; Dettinger, Michael D.; Pierce, David W.

    2013-01-01

    Natural climate variability will continue to be an important aspect of future regional climate even in the midst of long-term secular changes. Consequently, the ability of climate models to simulate major natural modes of variability and their teleconnections provides important context for the interpretation and use of climate change projections. Comparisons reported here indicate that the CMIP5 generation of global climate models shows significant improvements in simulations of the key Pacific climate mode and its teleconnections to North America compared to earlier CMIP3 simulations. The performance of 14 models with simulations in both the CMIP3 and CMIP5 archives is assessed using singular value decomposition analysis of simulated and observed winter Pacific sea surface temperatures (SSTs) and concurrent precipitation over the contiguous United States and northwestern Mexico. Most of the models reproduce basic features of the key natural mode and its teleconnections, albeit with notable regional deviations from observations in both SST and precipitation. Increasing horizontal resolution in the CMIP5 simulations is an important, but not a necessary, factor in the improvement from CMIP3 to CMIP5.

  4. A Diffusive-Particle Theory of Free Recall

    PubMed Central

    Fumarola, Francesco

    2017-01-01

    Diffusive models of free recall have been recently introduced in the memory literature, but their potential remains largely unexplored. In this paper, a diffusive model of short-term verbal memory is considered, in which the psychological state of the subject is encoded as the instantaneous position of a particle diffusing over a semantic graph. The model is particularly suitable for studying the dependence of free-recall observables on the semantic properties of the words to be recalled. Besides predicting some well-known experimental features (forward asymmetry, semantic clustering, word-length effect), a novel prediction is obtained on the relationship between the contiguity effect and the syllabic length of words; shorter words, by way of their wider semantic range, are predicted to be characterized by stronger forward contiguity. A fresh analysis of archival free-recall data allows us to confirm this prediction. PMID:29085521
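
    The diffusing-particle picture can be sketched as a random walk on a word graph, with the recall sequence read off as the order in which words are first visited. A toy illustration of that idea (the graph, words, and transition rule below are hypothetical, not the paper's fitted semantic graph):

```python
import random

def free_recall(adjacency, start, n_steps, seed=0):
    """Simulate recall as a random walk on a semantic graph: the recall
    sequence is the order in which words are first visited by the particle."""
    rng = random.Random(seed)
    pos = start
    recalled = [start]
    for _ in range(n_steps):
        pos = rng.choice(adjacency[pos])   # diffuse to a semantic neighbor
        if pos not in recalled:
            recalled.append(pos)
    return recalled

# Hypothetical toy semantic graph: words linked by similarity
graph = {
    "cat": ["dog", "mouse"],
    "dog": ["cat", "bone"],
    "mouse": ["cat", "cheese"],
    "bone": ["dog"],
    "cheese": ["mouse"],
}
sequence = free_recall(graph, "cat", 50)
```

In this framing, semantic clustering falls out naturally: neighboring words on the graph tend to be recalled consecutively.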

  5. The National Deep-Sea Coral and Sponge Database: A Comprehensive Resource for United States Deep-Sea Coral and Sponge Records

    NASA Astrophysics Data System (ADS)

    Dornback, M.; Hourigan, T.; Etnoyer, P.; McGuinn, R.; Cross, S. L.

    2014-12-01

    Research on deep-sea corals has expanded rapidly over the last two decades, as scientists began to realize their value as long-lived structural components of high biodiversity habitats and archives of environmental information. The NOAA Deep Sea Coral Research and Technology Program's National Database for Deep-Sea Corals and Sponges is a comprehensive resource for georeferenced data on these organisms in U.S. waters. The National Database currently includes more than 220,000 deep-sea coral records representing approximately 880 unique species. Database records from museum archives, commercial and scientific bycatch, and from journal publications provide baseline information with relatively coarse spatial resolution dating back as far as 1842. These data are complemented by modern, in-situ submersible observations with high spatial resolution, from surveys conducted by NOAA and NOAA partners. Management of high volumes of modern high-resolution observational data can be challenging. NOAA is working with our data partners to incorporate this occurrence data into the National Database, along with images and associated information related to geoposition, time, biology, taxonomy, environment, provenance, and accuracy. NOAA is also working to link associated datasets collected by our program's research, to properly archive them to the NOAA National Data Centers, to build a robust metadata record, and to establish a standard protocol to simplify the process. Access to the National Database is provided through an online mapping portal. The map displays point-based records from the database. Records can be refined by taxon, region, time, and depth. The queries and extent used to view the map can also be used to download subsets of the database. The database, map, and website are already in use by NOAA, regional fishery management councils, and regional ocean planning bodies, but we envision it as a model that can expand to accommodate data on a global scale.

  6. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.

  7. A Method to Evaluate Genome-Wide Methylation in Archival Formalin-Fixed, Paraffin-Embedded Ovarian Epithelial Cells

    PubMed Central

    Li, Qiling; Li, Min; Ma, Li; Li, Wenzhi; Wu, Xuehong; Richards, Jendai; Fu, Guoxing; Xu, Wei; Bythwood, Tameka; Li, Xu; Wang, Jianxin; Song, Qing

    2014-01-01

    Background The use of DNA from archival formalin-fixed, paraffin-embedded (FFPE) tissue for genetic and epigenetic analyses may be problematic, since the DNA is often degraded and only limited amounts may be available. Thus, it is currently not known whether genome-wide methylation can be reliably assessed in DNA from archival FFPE tissue. Methodology/Principal Findings Ovarian tissues, which were obtained and formalin-fixed and paraffin-embedded in either 1999 or 2011, were sectioned and stained with hematoxylin-eosin (H&E). Epithelial cells were captured by laser microdissection, and their DNA subjected to whole-genomic bisulfite conversion, whole-genomic polymerase chain reaction (PCR) amplification, and purification. Sequencing and software analyses were performed to identify the extent of genomic methylation. We observed that 31.7% of sequence reads from the DNA in the 1999 archival FFPE tissue, and 70.6% of the reads from the 2011 sample, could be matched with the genome. Methylation rates of CpG on the Watson and Crick strands were 32.2% and 45.5%, respectively, in the 1999 sample, and 65.1% and 42.7% in the 2011 sample. Conclusions/Significance We have developed an efficient method that allows DNA methylation to be assessed in archival FFPE tissue samples. PMID:25133528

  8. Simple, Script-Based Science Processing Archive

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Hegde, Mahabaleshwara; Barth, C. Wrandle

    2007-01-01

    The Simple, Scalable, Script-based Science Processing (S4P) Archive (S4PA) is a disk-based archival system for remote sensing data. It is based on the data-driven framework of S4P and is used for data transfer, data preprocessing, metadata generation, data archive, and data distribution. New data are automatically detected by the system. S4P provides services such as data access control, data subscription, metadata publication, data replication, and data recovery. It comprises scripts that control the data flow. The system detects the availability of data on an FTP (file transfer protocol) server, initiates data transfer, preprocesses data if necessary, and archives it on readily available disk drives with FTP and HTTP (Hypertext Transfer Protocol) access, allowing instantaneous data access. There are options for plug-ins for data preprocessing before storage. Publication of metadata to external applications such as the Earth Observing System Clearinghouse (ECHO) is also supported. S4PA includes a graphical user interface for monitoring the system operation and a tool for deploying the system. To ensure reliability, S4P continuously checks stored data for integrity. Further reliability is provided by tape backups of disks made once a disk partition is full and closed. The system is designed for low maintenance, requiring minimal operator oversight.
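
    The data-driven pattern described here (detect newly arrived files, archive them, and keep checksums for later integrity checks) can be sketched in a few lines. This is an illustrative stand-in, not the actual S4PA scripts: local directories replace the FTP server, and the function name and manifest format are invented for the example:

```python
import hashlib
import shutil
from pathlib import Path

def archive_new_files(incoming: Path, archive: Path, seen: set) -> dict:
    """Detect files not yet seen, copy them into the archive, and return
    a name -> MD5 manifest usable for later integrity checks."""
    manifest = {}
    archive.mkdir(parents=True, exist_ok=True)
    for f in sorted(incoming.iterdir()):
        if not f.is_file() or f.name in seen:
            continue                                   # already archived
        manifest[f.name] = hashlib.md5(f.read_bytes()).hexdigest()
        shutil.copy2(f, archive / f.name)              # preserve timestamps
        seen.add(f.name)
    return manifest
```

Run periodically, a loop like this gives the "new data are automatically detected" behavior; re-hashing archived files against the manifest gives the continuous integrity check.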

  9. SST Patterns, Atmospheric Variability, and Inferred Sensitivities in the CMIP5 Model Archive

    NASA Astrophysics Data System (ADS)

    Marvel, K.; Pincus, R.; Schmidt, G. A.

    2017-12-01

    An emerging consensus suggests that global mean feedbacks to increasing temperature are not constant in time. If feedbacks become more positive in the future, the equilibrium climate sensitivity (ECS) inferred from recent observed global energy budget constraints is likely to be biased low. Time-varying feedbacks are largely tied to evolving sea-surface temperature patterns. In particular, recent anomalously cool conditions in the tropical Pacific may have triggered feedbacks that are not reproduced in equilibrium simulations where the tropical Pacific and Southern Ocean have had time to warm. Here, we use AMIP and CMIP5 historical simulations to explore the ECS that may be inferred over the recent historical period. We find that in all but one CMIP5 model, the feedbacks triggered by observed SST patterns are significantly less positive than those arising from historical simulations in which SST patterns are allowed to evolve unconstrained. However, there are substantial variations in feedbacks even when the SST pattern is held fixed, suggesting that atmospheric and land variability contribute to uncertainty in the estimates of ECS obtained from recent observations of the global energy budget.

  10. The X-ray spectra of blazars: Analysis of the complete EXOSAT archive

    NASA Technical Reports Server (NTRS)

    Sambruna, Rita M.; Barr, Paul; Giommi, Paolo; Maraschi, Laura; Tagliaferri, Gianpiero; Treves, Aldo

    1994-01-01

    We analyzed the 0.1-10 keV spectra of 26 blazars (21 BL Lac objects and 5 highly polarized quasars), on the basis of 93 exposures taken from the EXOSAT archives. Fits were performed first with a single power-law model. Indications are found that better fits can be obtained with models where the slope steepens at higher energies. We therefore considered a broken power law and found that in a large fraction of the spectra the fit is significantly improved. Fits with a power law + oxygen edge at 0.6 keV are also explored.
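
    A broken power law of the kind fitted here is simply two spectral slopes joined continuously at a break energy. A minimal sketch of such a model over the 0.1-10 keV EXOSAT band (the parameter values are illustrative, not the paper's fitted values):

```python
import numpy as np

def broken_power_law(E, norm, gamma1, gamma2, E_break):
    """Flux with slope gamma1 below E_break and steeper slope gamma2 above,
    continuous at the break energy (E in keV; values illustrative)."""
    return np.where(
        E < E_break,
        norm * (E / E_break) ** (-gamma1),
        norm * (E / E_break) ** (-gamma2),
    )

E = np.logspace(-1, 1, 50)                      # 0.1-10 keV, the EXOSAT band
f = broken_power_law(E, norm=1.0, gamma1=1.5, gamma2=2.3, E_break=2.0)
```

Because both branches equal `norm` at `E_break`, the spectrum steepens at higher energies without a discontinuity, which is the behavior the fits favored over a single power law.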

  11. Errors and improvements in the use of archived meteorological data for chemical transport modeling: an analysis using GEOS-Chem v11-01 driven by GEOS-5 meteorology

    NASA Astrophysics Data System (ADS)

    Yu, Karen; Keller, Christoph A.; Jacob, Daniel J.; Molod, Andrea M.; Eastham, Sebastian D.; Long, Michael S.

    2018-01-01

    Global simulations of atmospheric chemistry are commonly conducted with off-line chemical transport models (CTMs) driven by archived meteorological data from general circulation models (GCMs). The off-line approach has the advantages of simplicity and expediency, but it incurs errors due to temporal averaging in the meteorological archive and the inability to reproduce the GCM transport algorithms exactly. The CTM simulation is also often conducted at coarser grid resolution than the parent GCM. Here we investigate this cascade of CTM errors by using ²²²Rn-²¹⁰Pb-⁷Be chemical tracer simulations off-line in the GEOS-Chem CTM at rectilinear 0.25° × 0.3125° (≈ 25 km) and 2° × 2.5° (≈ 200 km) resolutions and online in the parent GEOS-5 GCM at cubed-sphere c360 (≈ 25 km) and c48 (≈ 200 km) horizontal resolutions. The c360 GEOS-5 GCM meteorological archive, updated every 3 h and remapped to 0.25° × 0.3125°, is the standard operational product generated by the NASA Global Modeling and Assimilation Office (GMAO) and used as input by GEOS-Chem. We find that the GEOS-Chem ²²²Rn simulation at native 0.25° × 0.3125° resolution is affected by vertical transport errors of up to 20 % relative to the GEOS-5 c360 online simulation, in part due to loss of transient organized vertical motions in the GCM (resolved convection) that are temporally averaged out in the 3 h meteorological archive. There is also significant error caused by operational remapping of the meteorological archive from a cubed-sphere to a rectilinear grid. Decreasing the GEOS-Chem resolution from 0.25° × 0.3125° to 2° × 2.5° induces further weakening of vertical transport as transient vertical motions are averaged out spatially and temporally. The resulting ²²²Rn concentrations simulated by the coarse-resolution GEOS-Chem are overestimated by up to 40 % in surface air relative to the online c360 simulations and underestimated by up to 40 % in the upper troposphere, while the tropospheric lifetimes of ²¹⁰Pb and ⁷Be against aerosol deposition are affected by 5-10 %. The lost vertical transport in the coarse-resolution GEOS-Chem simulation can be partly restored by recomputing the convective mass fluxes at the appropriate resolution to replace the archived convective mass fluxes and by correcting for bias in the spatial averaging of boundary layer mixing depths.
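
    The loss of transient vertical motion under temporal averaging can be illustrated numerically: signed convective bursts that cancel in a 3 h mean still carry transport that an instantaneous (online) field resolves. A toy sketch with synthetic fluxes (not GEOS-5 output; magnitudes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
flux = np.zeros(180)                          # one 3 h archive interval, 1 min steps
up = rng.choice(180, size=10, replace=False)
down = rng.choice(np.setdiff1d(np.arange(180), up), size=10, replace=False)
flux[up] += 5.0                               # transient convective updrafts
flux[down] -= 5.0                             # compensating downdrafts

resolved_transport = np.abs(flux).mean()      # what an online model "sees"
archived_transport = abs(flux.mean())         # what survives the 3 h time average
```

The time-averaged flux cancels to near zero while the instantaneous transport does not, which is the qualitative mechanism behind the weakened vertical transport in the off-line simulations.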

  12. Development of the SOFIA Image Processing Tool

    NASA Technical Reports Server (NTRS)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most of the water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible-light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data containing information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.

  13. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D.; Godtliebsen, F.; Rue, H.

    2012-04-01

    Detailed knowledge of past climate variations is of high importance for gaining better insight into possible future climate scenarios. The relative shortness of available high-quality instrumental climate records necessitates the use of various climate proxy archives when making inferences about past climate evolution. This, however, requires an accurate assessment of timescale errors in proxy-based paleoclimatic reconstructions. We propose here an approach to assessing timescale errors in proxy-based series with chronological uncertainties. The method relies on approximating the physical process(es) forming a proxy archive by a random Gamma process, whose parameters are partly data-driven and partly determined from prior assumptions. For the particular case of a linear accumulation model and absolutely dated tie points, an analytical solution is found, yielding a Beta-distributed probability density on age estimates along the length of a proxy archive. In the general situation of uncertainties in the ages of the tie points, the proposed method employs MCMC simulations of age-depth profiles, yielding empirical confidence intervals on the constructed piecewise-linear best-guess timescale. The approach can be further extended to the more general case of a time-varying expected accumulation between the tie points. It is illustrated using two ice cores and two lake/marine sediment cores, typical examples of paleoproxy archives with age models constructed using tie points of mixed origin.
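
    For the absolutely dated tie-point case, the Gamma-process construction can be sketched directly by simulation: Gamma-distributed layer durations, rescaled so each realization ends exactly at the tie-point age, give the Beta-type marginal age distributions mentioned above, and empirical confidence bands follow from the ensemble. A sketch with illustrative parameters (function name and values are not from the paper):

```python
import numpy as np

def age_confidence_bands(n_layers, tie_point_age, n_sims=2000, shape=1.0, seed=0):
    """Simulate age-depth realizations: Gamma increments per layer, rescaled
    so every realization ends exactly at the dated tie point. Returns the
    empirical 95% band on the age of each layer."""
    rng = np.random.default_rng(seed)
    increments = rng.gamma(shape, 1.0, size=(n_sims, n_layers))
    ages = np.cumsum(increments, axis=1)          # monotone age-depth profiles
    ages *= tie_point_age / ages[:, -1:]          # pin the final age to the tie point
    lo, hi = np.percentile(ages, [2.5, 97.5], axis=0)
    return lo, hi

lo, hi = age_confidence_bands(n_layers=50, tie_point_age=1000.0)
```

The band collapses to zero width at the tie point and is widest in the interior of the record, the qualitative behavior expected of timescale uncertainty between absolute dates.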

  14. ENSO-driven energy budget perturbations in observations and CMIP models

    DOE PAGES

    Mayer, Michael; Fasullo, John T.; Trenberth, Kevin E.; ...

    2016-03-19

    Various observation-based datasets are employed to robustly quantify changes in ocean heat content (OHC), anomalous ocean–atmosphere energy exchanges and atmospheric energy transports during El Niño-Southern Oscillation (ENSO). These results are used as a benchmark to evaluate the energy pathways during ENSO as simulated by coupled climate model runs from the CMIP3 and CMIP5 archives. The models are able to qualitatively reproduce observed patterns of ENSO-related energy budget variability to some degree, but key aspects are seriously biased. Area-averaged tropical Pacific OHC variability associated with ENSO is greatly underestimated by all models because of strongly biased responses of net radiation at top-of-the-atmosphere to ENSO. The latter are related to biases of mean convective activity in the models and project on surface energy fluxes in the eastern Pacific Intertropical Convergence Zone region. Moreover, models underestimate horizontal and vertical OHC redistribution in association with the generally too weak Bjerknes feedback, leading to a modeled ENSO affecting a too shallow layer of the Pacific. Vertical links between SST and OHC variability are too weak even in models driven with observed winds, indicating shortcomings of the ocean models. Furthermore, modeled teleconnections as measured by tropical Atlantic OHC variability are too weak and the tropical zonal mean ENSO signal is strongly underestimated or even completely missing in most of the considered models. In conclusion, results suggest that attempts to infer insight about climate sensitivity from ENSO-related variability are likely to be hampered by biases in ENSO in CMIP simulations that do not bear a clear link to future changes.

  15. The impact of Surface Wind Velocity Data Assimilation on the Predictability of Plume Advection in the Lower Troposphere

    NASA Astrophysics Data System (ADS)

    Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru

    2017-04-01

    We investigated the impact of surface wind velocity data assimilation on the predictability of plume advection in the lower troposphere, exploiting the radioactive cesium emitted by the Fukushima nuclear accident in March 2011 as an atmospheric tracer. This was possible because the radioactive cesium plume dispersed from a single point source, the Fukushima Daiichi Nuclear Power Plant, and its surface concentration was measured at many locations with high frequency and accuracy. To simulate the wind velocity and plume advection, we used a non-hydrostatic regional weather prediction model with a horizontal resolution of 3 km, coupled with an ensemble Kalman filter data assimilation system. The main module of this weather prediction model has been developed and used operationally by the Japan Meteorological Agency (JMA) since before March 2011. The weather observation data assimilated into the model simulation came from two sources: [#1] the JMA observation archives collected for numerical weather predictions (NWPs), and [#2] the land-surface wind velocity data archived by the JMA surface weather observation network. The former dataset [#1] does not contain land-surface wind velocity observations, because their spatial representativeness is relatively small and assimilating them normally degrades NWP performance for forecasts longer than one day. The latter dataset [#2] is usually used for real-time weather monitoring and never for data assimilation in NWPs beyond one day. We conducted two experiments (STD and TEST) to reproduce the behavior of the radioactive cesium plume for 48 hours, from 12 UTC 14 March to 12 UTC 16 March 2011, over the land area of western Japan. The STD experiment replicated the operational NWP using only the #1 dataset, without assimilating land-surface wind observations.
In contrast, the TEST experiment assimilated both the #1 dataset and the #2 dataset, including land-surface wind observations measured at more than 200 stations in the model domain. The meteorological boundary conditions for both experiments were imported from the JMA operational global NWP model results. The modeled radioactive cesium concentrations were examined for plume arrival timing at each observatory, by comparison with the hourly measured cesium concentrations retrieved by Tsuruta et al. from "suspended particulate matter" filter tapes at more than 40 observatories. The average difference in plume arrival times at 40 observatories between the observations and the STD experiment was 82.0 minutes; at this point, the forecast period was 13 hours on average. The average difference for the TEST experiment was 72.8 minutes, smaller than that of the STD experiment with a statistical significance of 99.2%. In summary, land-surface wind velocity data assimilation improves the predictability of plume advection in the lower troposphere, at least in this case of wintertime air pollution over complex terrain. More investigation is needed into the impact of assimilating land-surface weather observations on the predictability of pollutant dispersion, especially in the planetary boundary layer.

  16. The VLBA Extragalactic Proper Motion Catalog and a Measurement of the Secular Aberration Drift

    NASA Astrophysics Data System (ADS)

    Truebenbach, Alexandra E.; Darling, Jeremy

    2017-11-01

    We present a catalog of extragalactic proper motions created using archival VLBI data and our own VLBA astrometry. The catalog contains 713 proper motions, with average uncertainties of ~24 μas yr⁻¹, including 40 new or improved proper motion measurements using relative astrometry with the VLBA. The observations were conducted in the X-band and yielded positions with uncertainties of ~70 μas. We add 10 new redshifts using spectroscopic observations taken at Apache Point Observatory and Gemini North. With the VLBA Extragalactic Proper Motion Catalog, we detect the secular aberration drift—the apparent motion of extragalactic objects caused by the solar system's acceleration around the Galactic center—at a 6.3σ significance. We model the aberration drift as a spheroidal dipole, with the square root of the power equal to 4.89 ± 0.77 μas yr⁻¹, an amplitude of 1.69 ± 0.27 μas yr⁻¹, and an apex at (275.2° ± 10.0°, −29.4° ± 8.8°). Our dipole model detects the aberration drift at a higher significance than some previous studies, but at a lower amplitude than expected or previously measured. The full aberration drift may be partially removed by the no-net-rotation constraint used when measuring archival extragalactic radio source positions. Like the cosmic microwave background dipole, which is induced by the observer's motion, the aberration drift signal should be subtracted from extragalactic proper motions in order to detect cosmological proper motions, including the Hubble expansion, long-period stochastic gravitational waves, and the collapse of large-scale structure.
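
    The dipole fit underlying an aberration-drift measurement reduces to linear least squares: each source's drift-induced proper motion is the tangent-plane projection of a single 3-vector, so every source contributes three linear equations. A sketch with synthetic data (the dipole amplitude, noise level, and source count are illustrative, not the catalog's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_src = 500
xyz = rng.normal(size=(n_src, 3))                 # random source directions
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)

d_true = np.array([0.0, 0.0, 5.0])                # dipole vector, μas/yr (toy)
# Drift proper motion = tangent-plane projection of the dipole, plus noise
pm = d_true - (xyz @ d_true)[:, None] * xyz
pm += rng.normal(scale=1.0, size=pm.shape)

# Each source i contributes the equations (I - n n^T) d = mu_i
rows = [np.eye(3) - np.outer(n, n) for n in xyz]
A = np.vstack(rows)                               # (3*n_src, 3) design matrix
b = pm.ravel()
d_fit, *_ = np.linalg.lstsq(A, b, rcond=None)     # recovered dipole vector
```

The recovered `d_fit` approximates `d_true`; in practice the fit uses measured proper motions and per-source weights, and the spheroidal-harmonic decomposition generalizes this vector dipole.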

  17. A New Test of Copper and Zinc Abundances in Late-Type Stars Using Cu II and Zn II lines in the Near-Ultraviolet

    NASA Astrophysics Data System (ADS)

    Roederer, Ian

    2017-08-01

    The copper (Cu, Z = 29) and zinc (Zn, Z = 30) abundances found in late-type stars provide critical constraints on models that predict the yields of massive star supernovae, hypernovae, Type Ia supernovae, and AGB stars, which are essential ingredients in Galactic chemical evolution models. Furthermore, Zn is commonly used to compare the abundance of iron-group elements in the gas phase in high-redshift DLA systems with metallicities in Local Group stars. It is thus important that the observational Cu and Zn abundances in stars are correct. My proposed archive study will address this issue by using archive STIS spectra of 14 stars to provide the first systematic observational tests of non-LTE calculations of Cu and Zn line formation in late-type stars. The non-LTE calculations predict that all LTE [Cu/Fe] abundance ratios presently found in the literature are systematically lower than the true ratios found in stars. The non-LTE calculations for Zn predict that the LTE values in the literature may be systematically overestimated in low-metallicity stars. The LTE abundances of Cu and Zn are derived from Cu I and Zn I lines. The key advance enabled by the use of NUV spectra is the detection of several lines of Cu II and Zn II, which cannot be detected in the optical or infrared. Cu II and Zn II are largely immune to non-LTE effects in the atmospheres of late-type stars. The metallicities of the 14 stars with NUV spectra span -2.6 < [Fe/H] < -0.1, which covers the range of most Cu and Zn abundances reported in the literature. The proposed study will allow me to test the non-LTE calculations and calibrate the stellar abundances.

  18. Kernel-Phase Interferometry for Super-Resolution Detection of Faint Companions

    NASA Astrophysics Data System (ADS)

    Factor, Samuel M.; Kraus, Adam L.

    2017-06-01

    Direct detection of close in companions (exoplanets or binary systems) is notoriously difficult. While coronagraphs and point spread function (PSF) subtraction can be used to reduce contrast and dig out signals of companions under the PSF, there are still significant limitations in separation and contrast near λ/D. Non-redundant aperture masking (NRM) interferometry can be used to detect companions well inside the PSF of a diffraction limited image, though the mask discards ˜ 95% of the light gathered by the telescope and thus the technique is severely flux limited. Kernel-phase analysis applies interferometric techniques similar to NRM to a diffraction limited image utilizing the full aperture. Instead of non-redundant closure-phases, kernel-phases are constructed from a grid of points on the full aperture, simulating a redundant interferometer. I have developed a new, easy to use, faint companion detection pipeline which analyzes kernel-phases utilizing Bayesian model comparison. I demonstrate this pipeline on archival images from HST/NICMOS, searching for new companions in order to constrain binary formation models at separations inaccessible to previous techniques. Using this method, it is possible to detect a companion well within the classical λ/D Rayleigh diffraction limit using a fraction of the telescope time as NRM. Since the James Webb Space Telescope (JWST) will be able to perform NRM observations, further development and characterization of kernel-phase analysis will allow efficient use of highly competitive JWST telescope time. As no mask is needed, this technique can easily be applied to archival data and even target acquisition images (e.g. from JWST), making the detection of close in companions cheap and simple as no additional observations are needed.
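The kernel-phase construction behind such a pipeline can be sketched in a few lines: build the phase-transfer matrix A that maps pupil-plane phases to Fourier (uv) plane phases, then take the left null space of A, so that K A = 0 and the kernel phases K·Φ are immune to pupil-plane phase errors to first order. A toy sketch with a redundant 1-D aperture (the pupil model here is illustrative, not the pipeline's actual one):

```python
import numpy as np

def kernel_operator(A):
    """Rows of K span the left null space of the phase-transfer matrix A
    (uv points x pupil samples), so K @ A = 0: kernel phases K @ phi_uv
    are insensitive to pupil-plane phase errors to first order."""
    U, s, Vt = np.linalg.svd(A, full_matrices=True)
    rank = int(np.sum(s > 1e-10 * s.max()))
    return U[:, rank:].T  # left-singular vectors with zero singular value

# Toy pupil: 4 evenly spaced 1-D sub-apertures -> redundant baselines
n = 4
positions = np.arange(n)
baseline_index = {}
A_rows = []
for i in range(n):
    for j in range(i + 1, n):
        b = int(positions[j] - positions[i])
        if b not in baseline_index:
            baseline_index[b] = len(baseline_index)
            A_rows.append(np.zeros(n))
        row = np.zeros(n)
        row[i], row[j] = -1.0, 1.0
        A_rows[baseline_index[b]] += row  # redundant pairs accumulate

A = np.array(A_rows)
K = kernel_operator(A)
```

For this 4-element pupil there are 3 distinct baselines but only rank 2 of independent pupil-phase information, so one kernel phase survives; the same counting, done on a grid over the full HST aperture, is what gives kernel-phase analysis its super-resolution observables.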

  19. Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project

    NASA Astrophysics Data System (ADS)

    van Eck, T.; Giardini, D.

    2010-12-01

The EC Research Infrastructure (RI) project, Network of Research Infrastructures for European Seismology (NERIES), implemented a comprehensive European integrated RI for earthquake seismological data that is scalable and sustainable. NERIES opened a significant amount of additional seismological data, integrated different distributed data archives, and implemented advanced analysis tools and software packages. A seismic data portal provides a single access point and overview for the European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN) with real-time access to more than 500 stations from >53 observatories. These data are continuously monitored, quality controlled, and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core of a comprehensive European acceleration database. Standardized parameter analysis and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools. - Data from three one-year OBS deployments at three sites (Atlantic, Ionian, and Ligurian Seas), stored in the standard SEED format, thus creating the core of an integrated database for ocean-, sea-, and land-based seismological observatories.
Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description and several visualisation tools currently being adapted to a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted forecast-testing and model-validation approach, and a core hazard portal developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories, and a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange, and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010-2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES also provided the proof of concept for the ESFRI2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010-2014) is also funded by the EC.

  20. Challenges for Data Archival Centers in Evolving Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Cook, R. B.; Gu, L.; Santhana Vannan, S. K.; Beaty, T.

    2015-12-01

Environmental science has entered a big data era as enormous amounts of data about the Earth environment are continuously collected through field and airborne missions, remote sensing observations, model simulations, sensor networks, etc. An open-access and open-management data infrastructure for data-intensive science is a major grand challenge in global environmental research (BERAC, 2010). Such an infrastructure, as exemplified in EOSDIS, GEOSS, and NSF EarthCube, will support the complete lifecycle of environmental data and ensure that data flow smoothly among the phases of collection, preservation, integration, and analysis. Data archival centers, as the data integration units closest to data providers, serve as the source power to compile and integrate heterogeneous environmental data into this global infrastructure. This presentation discusses the interoperability challenges and practices of the geosciences from the perspective of data archival centers, based on the operational experiences of the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and related environmental data management activities. Specifically, we will discuss the challenges to 1) encourage and help scientists to more actively share data with the broader scientific community, so that valuable environmental data, especially the dark data collected by individual scientists in small independent projects, can be shared and integrated into the infrastructure to tackle big science questions; 2) curate heterogeneous multi-disciplinary data, focusing on the key aspects of identification, format, metadata, data quality, and semantics to make them ready to be plugged into a global data infrastructure, highlighting data curation practices at the ORNL DAAC for global campaigns such as BOREAS, LBA, and SAFARI 2000; and 3) enhance the capabilities to more effectively and efficiently expose and deliver "big" environmental data to a broad range of users and systems.
Experiences and challenges with integrating large data sets via the ORNL DAAC's data discovery and delivery Web services will be discussed.

  1. Ray Modeling Methods for Range Dependent Ocean Environments

    DTIC Science & Technology

    1983-12-01

    the eikonal equation, gives rise to equations for ray paths which are perpendicular to the wave fronts. Equation II.4, the transport equation, leads... databases for use by MEDUSA. The author has assisted in the installation of MEDUSA at computer facilities which possess databases containing archives of...sound velocity profiles, bathymetry, and bottom loss data. At each computer site, programs convert the archival data retrieved by the database system

  2. The Road to Independently Understandable Information

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Robinson, E.

    2017-12-01

The turn of the 21st century was a pivotal time in the Earth and Space Science information ecosystem. The Content Standard for Digital Geospatial Metadata (CSDGM) had existed for nearly a decade and ambitious new standards were just emerging. The U.S. Federal Geographic Data Committee (FGDC) had extended many of the concepts from CSDGM into the international community with ISO 19115:2003, and the Consultative Committee for Space Data Systems (CCSDS) had migrated their Open Archival Information System (OAIS) Reference Model into an international standard (ISO 14721:2003). The OAIS model outlined the roles and responsibilities of archives, with the principal role being preserving information and making it available to users, a "designated community", as a service to the data producer. It was mandatory for the archive to ensure that information is "independently understandable" to the designated community and to maintain that understanding through on-going partnerships between archives and designated communities. Standards can play a role in supporting these partnerships as designated communities expand across disciplinary and geographic boundaries. The ISO metadata standards include many capabilities that might make critical contributions to this goal. These include connections to resources outside of the metadata record (i.e. documentation) and mechanisms for ongoing incorporation of user feedback into the metadata stream. We will demonstrate these capabilities with examples of how they can increase understanding.

  3. KIC 8462852: Maria Mitchell Observatory Photographic Photometry 1922 to 1991

    NASA Astrophysics Data System (ADS)

    Castelaz, M.; Barker, T.

    2018-06-01

A new study of the long-term photometric behavior of the unusual star KIC 8462852 (Boyajian's Star) has been carried out using archival photographic plates from 1922-1991 taken at the Maria Mitchell Observatory (MMO). We find five episodes of sudden, several-day decreases in magnitude occurring in 1935, 1966, 1978, and two in 1980. Episodes of sudden increase in magnitude appear to occur in 1967 and 1977. Inspection of archival light curves of KIC 8462852 from two previous studies, based on the Harvard and Sonneberg plate collections, finds apparent counterparts to these episodes in the MMO light curve. Also, a general dimming trend of 0.12 ± 0.02 magnitude per century is observed in the MMO light curve, significant, but smaller than the trend of 0.164 ± 0.013 magnitude per century observed in the Harvard light curve.

  4. GAUDI: A Preparatory Archive for the COROT Mission

    NASA Astrophysics Data System (ADS)

    Solano, E.; Catala, C.; Garrido, R.; Poretti, E.; Janot-Pacheco, E.; Gutiérrez, R.; González, R.; Mantegazza, L.; Neiner, C.; Fremat, Y.; Charpinet, S.; Weiss, W.; Amado, P. J.; Rainer, M.; Tsymbal, V.; Lyashko, D.; Ballereau, D.; Bouret, J. C.; Hua, T.; Katz, D.; Lignières, F.; Lüftinger, T.; Mittermayer, P.; Nesvacil, N.; Soubiran, C.; van't Veer-Menneret, C.; Goupil, M. J.; Costa, V.; Rolland, A.; Antonello, E.; Bossi, M.; Buzzoni, A.; Rodrigo, C.; Aerts, C.; Butler, C. J.; Guenther, E.; Hatzes, A.

    2005-01-01

The GAUDI database (Ground-based Asteroseismology Uniform Database Interface) is a preparatory archive for the COROT (Convection, Rotation, and Planetary Transits) mission developed at the Laboratorio de Astrofísica Espacial y Física Fundamental (Laboratory for Space Astrophysics and Theoretical Physics, Spain). Its intention is to make the ground-based observations obtained in preparation of the asteroseismology program available in a simple and efficient way. It contains spectroscopic and photometric data, together with inferred physical parameters, for more than 1500 objects gathered since January 1998 over six years of observational campaigns. In this paper, the main functions and characteristics of the system are described. Based on observations collected at La Silla (ESO proposals 67.D-0169, 69.D-0166, and 70.D-0110), Telescopio Nazionale Galileo (proposal 6-20-068), Observatoire de Haute-Provence, the South African Astronomical Observatory, Tautenburg Observatory, and Sierra Nevada Observatory.

  5. Satellite Observations of Stratospheric Gravity Waves Associated With the Intensification of Tropical Cyclones

    NASA Astrophysics Data System (ADS)

    Hoffmann, Lars; Wu, Xue; Alexander, M. Joan

    2018-02-01

    Forecasting the intensity of tropical cyclones is a challenging problem. Rapid intensification is often preceded by the formation of "hot towers" near the eyewall. Driven by strong release of latent heat, hot towers are high-reaching tropical cumulonimbus clouds that penetrate the tropopause. Hot towers are a potentially important source of stratospheric gravity waves. Using 13.5 years (2002-2016) of Atmospheric Infrared Sounder observations of stratospheric gravity waves and tropical cyclone data from the International Best Track Archive for Climate Stewardship, we found empirical evidence that stratospheric gravity wave activity is associated with the intensification of tropical cyclones. The Atmospheric Infrared Sounder and International Best Track Archive for Climate Stewardship data showed that strong gravity wave events occurred about twice as often for tropical cyclone intensification compared to storm weakening. Observations of stratospheric gravity waves, which are not affected by obscuring tropospheric clouds, may become an important future indicator of storm intensification.

  6. Expanding understanding of optical variability in Lake Superior with a 4-year dataset

    NASA Astrophysics Data System (ADS)

    Mouw, Colleen B.; Ciochetto, Audrey B.; Grunert, Brice; Yu, Angela

    2017-07-01

    Lake Superior is one of the largest freshwater lakes on our planet, but few optical observations have been made to allow for the development and validation of visible spectral satellite remote sensing products. The dataset described here focuses on coincidently observing inherent and apparent optical properties along with biogeochemical parameters. Specifically, we observe remote sensing reflectance, absorption, scattering, backscattering, attenuation, chlorophyll concentration, and suspended particulate matter over the ice-free months of 2013-2016. The dataset substantially increases the optical knowledge of the lake. In addition to visible spectral satellite algorithm development, the dataset is valuable for characterizing the variable light field, particle, phytoplankton, and colored dissolved organic matter distributions, and helpful in food web and carbon cycle investigations. The compiled data can be freely accessed at https://seabass.gsfc.nasa.gov/archive/URI/Mouw/LakeSuperior/.

  7. Model Atmospheres and Spectral Irradiance Library of the Exoplanet Host Stars Observed in the MUSCLES Survey

    NASA Astrophysics Data System (ADS)

    Linsky, Jeffrey

    2017-08-01

We propose to compute state-of-the-art model atmospheres (photospheres, chromospheres, transition regions and coronae) of the four K-type and seven M-type exoplanet host stars observed by HST in the MUSCLES Treasury Survey, the nearest host star Proxima Centauri, and TRAPPIST-1. Our semi-empirical models will fit the unique high-resolution panchromatic (X-ray to infrared) spectra of these stars in the MAST High-Level Science Products archive, consisting of COS and STIS UV spectra and near-simultaneous Chandra, XMM-Newton, and ground-based observations. We will compute models with the fully tested SSRPM computer software incorporating 52 atoms and ions in full non-LTE (435,986 spectral lines) and the 20 most abundant diatomic molecules (about 2 million lines). This code has successfully fit the panchromatic spectrum of the M1.5 V exoplanet host star GJ 832 (Fontenla et al. 2016), the first M star with such a detailed model, as well as solar spectra. Our models will (1) predict the unobservable extreme-UV spectra, (2) determine radiative energy losses and balancing heating rates throughout these atmospheres, (3) compute a stellar irradiance library needed to describe the radiation environment of potentially habitable exoplanets to be studied by TESS and JWST, and (4) in the long post-HST era when UV observations will not be possible, serve as a powerful tool for predicting the panchromatic spectra of host stars that have only limited spectral coverage, in particular no UV spectra. The stellar models and spectral irradiance library will be placed quickly in MAST.

  8. Roosevelt Hot Springs, Utah FORGE Observation Well Data

    DOE Data Explorer

    Nash, Greg

    2018-02-22

    This archive contains temperature data for Roosevelt Hot Springs observation wells OH-1, OH-4, OH-5 and OH-7. There are also mud logs for OH-4. These are old datasets obtained from Rocky Mountain Power for use in the Utah FORGE project.

  9. Herschel spectroscopic observations of PPNe and PNe

    NASA Astrophysics Data System (ADS)

    García-Lario, Pedro; Ramos-Medina, J.; Sánchez-Contreras, C.

    2017-10-01

We are building a catalogue of interactively reprocessed observations of evolved stars observed with Herschel. The catalogue will offer not only the PACS and SPIRE spectroscopic data for each observation, but also complementary information from other infrared space observatories. As a first step, we are concentrating our efforts on two main activities: 1) the interactive data reduction of more than 500 individual spectra obtained with PACS in the 55-210 μm range, available in the Herschel Science Archive; 2) the creation of a catalogue, accessible via a web-based interface and through the Virtual Observatory. Our ultimate goal is to carry out a comprehensive and systematic study of the far-infrared properties of low- and intermediate-mass evolved stars using these data and enable science based on Herschel archival data. The objects cover the whole range of possible evolutionary stages in this short-lived phase of stellar evolution, from the AGB to the PN stage, displaying a wide variety of chemical and physical properties.

  10. A regressive storm model for extreme space weather

    NASA Astrophysics Data System (ADS)

    Terkildsen, Michael; Steward, Graham; Neudegg, Dave; Marshall, Richard

    2012-07-01

    Extreme space weather events, while rare, pose significant risk to society in the form of impacts on critical infrastructure such as power grids, and the disruption of high end technological systems such as satellites and precision navigation and timing systems. There has been an increased focus on modelling the effects of extreme space weather, as well as improving the ability of space weather forecast centres to identify, with sufficient lead time, solar activity with the potential to produce extreme events. This paper describes the development of a data-based model for predicting the occurrence of extreme space weather events from solar observation. The motivation for this work was to develop a tool to assist space weather forecasters in early identification of solar activity conditions with the potential to produce extreme space weather, and with sufficient lead time to notify relevant customer groups. Data-based modelling techniques were used to construct the model, and an extensive archive of solar observation data used to train, optimise and test the model. The optimisation of the base model aimed to eliminate false negatives (missed events) at the expense of a tolerable increase in false positives, under the assumption of an iterative improvement in forecast accuracy during progression of the solar disturbance, as subsequent data becomes available.
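The "no missed events" optimisation described here amounts to a simple constraint on the decision threshold: set it no higher than the lowest score the model assigns to any true event, and accept whatever false positives that implies. A minimal sketch of that trade-off (names and data are illustrative, not the paper's actual model):

```python
def zero_miss_threshold(scores, labels):
    """Pick the largest decision threshold that still flags every true
    event (zero false negatives), then count the false positives that
    this choice costs.  scores: model outputs; labels: True for events."""
    positives = [s for s, y in zip(scores, labels) if y]
    thr = min(positives)  # any higher threshold would miss an event
    false_pos = sum(1 for s, y in zip(scores, labels) if not y and s >= thr)
    return thr, false_pos
```

In an operational setting the threshold would then be re-evaluated as the solar disturbance progresses and new observations arrive, which is the iterative refinement the abstract assumes.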

  11. Observations of A0535 + 26 with the SMM satellite

    NASA Technical Reports Server (NTRS)

    Sembay, S.; Schwartz, R. A.; Orwig, L. E.; Dennis, B. R.; Davies, S. R.

    1990-01-01

An examination of archival data from the hard X-ray instruments on the Solar Maximum Mission (SMM) satellite has revealed a previously undetected outburst from the recurrent X-ray transient A0535+26. The outburst occurred in June 1983 and reached a peak intensity of about 2 Crab in the energy range 32-91 keV. The outburst was detected over a span of 18 days, and the pulse period was observed to spin up at an average rate of about -6 x 10^-8 s/s. A recently proposed model for A0535+26 has a pulsar powered by a short-lived accretion disk. A thin accretion disk model is fitted to the present data, assuming an orbital period of 111 days. Two solutions for the magnetic moment of the neutron star are derived. The slow-rotator solution is more consistent with the model than the fast rotator, on the grounds that the conditions for the formation of an accretion disk are more favorable for a lower magnetic field strength.

  12. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer

    J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter

    2016-02-02

    This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  13. The McIntosh Archive: A solar feature database spanning four solar cycles

    NASA Astrophysics Data System (ADS)

    Gibson, S. E.; Malanushenko, A. V.; Hewins, I.; McFadden, R.; Emery, B.; Webb, D. F.; Denig, W. F.

    2016-12-01

The McIntosh Archive consists of a set of hand-drawn solar Carrington maps created by Patrick McIntosh from 1964 to 2009. McIntosh used mainly H-alpha, He I 10830 and photospheric magnetic measurements from both ground-based and NASA satellite observations. With these he traced coronal holes, polarity inversion lines, filaments, sunspots and plage, yielding a unique 45-year record of the features associated with the large-scale solar magnetic field. We will present the results of recent efforts to preserve and digitize this archive. Most of the original hand-drawn maps have been scanned, a method for processing these scans into a digital, searchable format has been developed and streamlined, and an archival repository at NOAA's National Centers for Environmental Information (NCEI) has been created. We will demonstrate how Solar Cycle 23 data may now be accessed and how they may be utilized for scientific applications. In addition, we will discuss how this database of human-recognized features, which overlaps with the onset of high-resolution, continuous modern solar data, may act as a training set for computer feature recognition algorithms.

  14. The Challenges Facing Science Data Archiving on Current Mass Storage Systems

    NASA Technical Reports Server (NTRS)

    Peavey, Bernard; Behnke, Jeanne (Editor)

    1996-01-01

This paper discusses the desired characteristics of a tape-based petabyte science data archive and retrieval system required to store and distribute several terabytes (TB) of data per day over an extended period of time, probably more than 115 years, in support of programs such as the Earth Observing System Data and Information System (EOSDIS). These characteristics take into consideration not only cost-effective and affordable storage capacity, but also rapid access to selected files and the reading rates needed to satisfy thousands of retrieval transactions per day. It seems that where rapid random access to files is not crucial, the tape medium, magnetic or optical, continues to offer cost-effective data storage and retrieval solutions, and is likely to do so for many years to come. However, in environments like EOS these tape-based archive solutions provide less than full user satisfaction. Therefore, the objective of this paper is to describe the performance and operational enhancements that need to be made to current tape-based archival systems in order to achieve greater acceptance by the EOS and similar user communities.

  15. Clinical aspects of the Mayo/IBM PACS project

    NASA Astrophysics Data System (ADS)

    Forbes, Glenn S.; Morin, Richard L.; Pavlicek, William

    1991-07-01

A joint project between Mayo Clinic and IBM to develop a picture archival and communications system has been under development for three years. This project began as a potential solution to a pressing archival problem in magnetic resonance imaging. The project has grown to encompass a much larger sphere of activity including workstations, image retrieval, and report archival. This report focuses on the clinical aspects involved in the design, development, and implementation of such a system. In particular, emphasis is placed on the clinical impact of the system both inside and outside of the radiology department. The primary concerns have centered on fidelity of archival data, ease of use, and diagnostic efficacy. The project to date has been limited to neuroradiology practice. This group consists of nine staff radiologists and fellows. Administrative policy decisions regarding the accessibility and availability of digital data in the clinical environment have been much more difficult and complex than originally conceived. Based on the observations thus far, the authors believe the system will become a useful and valuable adjunct to the clinical practice of radiology.

  16. Electronic archival tags provide first glimpse of bathythermal habitat use by free-ranging adult lake sturgeon Acipenser fulvescens

    USGS Publications Warehouse

    Briggs, Andrew S.; Hondorp, Darryl W.; Quinlan, Henry R.; Boase, James C.; Mohr, Lloyd C.

    2016-01-01

Information on lake sturgeon (Acipenser fulvescens) depth and thermal habitat use during non-spawning periods is unavailable due to the difficulty of observing lake sturgeon away from shallow water spawning sites. In 2002 and 2003, lake sturgeon captured in commercial trap nets near Sarnia, Ontario were implanted with archival tags and released back into southern Lake Huron. Five of the 40 tagged individuals were recaptured and were at large for 32, 57, 286, 301, and 880 days. Temperatures and depths recorded by archival tags ranged from 0 to 23.5 ºC and 0.1 to 42.4 m, respectively. For the three lake sturgeon that were at large for over 200 days, occupied temperatures tracked seasonal fluctuations. Two of these fish occupied deeper waters during winter than summer, while the other occupied similar depths during non-spawning periods. This study provides important insight into depth and thermal habitat use of lake sturgeon throughout the calendar year, along with exploring the feasibility of using archival tags to obtain important physical habitat attributes during non-spawning periods.

  17. Simulation of a data archival and distribution system at GSFC

    NASA Technical Reports Server (NTRS)

    Bedet, Jean-Jacques; Bodden, Lee; Dwyer, AL; Hariharan, P. C.; Berbert, John; Kobler, Ben; Pease, Phil

    1993-01-01

    A version-0 of a Data Archive and Distribution System (DADS) is being developed at GSFC to support existing and pre-EOS Earth science datasets and test Earth Observing System Data and Information System (EOSDIS) concepts. The performance of DADS is predicted using a discrete event simulation model. The goals of the simulation were to estimate the amount of disk space needed and the time required to fulfill the DADS requirements for ingestion (14 GB/day) and distribution (48 GB/day). The model has demonstrated that 4 mm and 8 mm stackers can play a critical role in improving the performance of the DADS, since it takes, on average, 3 minutes to manually mount/dismount tapes compared to less than a minute with stackers. With two 4 mm stackers and two 8 mm stackers, and a single operator per shift, the DADS requirements can be met within 16 hours using a total of 9 GB of disk space. When the DADS has no stacker, and the DADS depends entirely on operators to handle the distribution tapes, the simulation has shown that the DADS requirements can still be met within 16 hours, but a minimum of 4 operators per shift were required. The compression/decompression of data sets is very CPU intensive, and relatively slow when performed in software, thereby contributing to an increase in the amount of disk space needed.
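The mount-time effect the DADS simulation measures can be sketched with a tiny event loop: tape requests queue for whichever mounter (operator or stacker slot) frees up first, and each tape costs a mount plus an I/O phase. This is our own minimal sketch under those simplifying assumptions, not the GSFC discrete event model (which also covered disk staging and compression):

```python
import heapq

def tape_makespan(n_tapes, n_mounters, mount_min, io_min):
    """Serve `n_tapes` requests with `n_mounters` operators or stacker
    slots; each tape takes `mount_min` to mount plus `io_min` of I/O,
    and the mounter is held for both.  Returns total elapsed minutes."""
    free_at = [0.0] * n_mounters  # times at which each mounter is free
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(n_tapes):
        t = heapq.heappop(free_at)      # earliest available mounter
        done = t + mount_min + io_min   # mount, then read/write
        finish = max(finish, done)
        heapq.heappush(free_at, done)
    return finish
```

Comparing, say, `tape_makespan(200, 4, 3.0, 5.0)` (manual mounts, four operators) against `tape_makespan(200, 2, 1.0, 5.0)` (two stackers) illustrates why the study found that stackers let the daily workload fit in the shift with far fewer staff.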

  18. Preserving the Pyramid of STI Using Buckets

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Maly, Kurt

    2004-01-01

    The product of research projects is information. Through the life cycle of a project, information comes from many sources and takes many forms. Traditionally, this body of information is summarized in a formal publication, typically a journal article. While formal publications enjoy the benefits of peer review and technical editing, they are also often compromises in media format and length. As such, we consider a formal publication to represent an abstract to a larger body of work: a pyramid of scientific and technical information (STI). While this abstract may be sufficient for some applications, an in-depth use or analysis is likely to require the supporting layers from the pyramid. We have developed buckets to preserve this pyramid of STI. Buckets provide an archive- and protocol-independent container construct in which all related information objects can be logically grouped together, archived, and manipulated as a single object. Furthermore, buckets are active archival objects and can communicate with each other, people, or arbitrary network services. Buckets are an implementation of the Smart Object, Dumb Archive (SODA) DL model. In SODA, data objects are more important than the archives that hold them. Much of the functionality traditionally associated with archives is pushed down into the objects, such as enforcing terms and conditions, negotiating display, and content maintenance. In this paper, we discuss the motivation, design, and implication of bucket use in DLs with respect to grey literature.

  19. Data Integration Plans for the NOAA National Climate Model Portal (NCMP) (Invited)

    NASA Astrophysics Data System (ADS)

    Rutledge, G. K.; Williams, D. N.; Deluca, C.; Hankin, S. C.; Compo, G. P.

    2010-12-01

NOAA’s National Climatic Data Center (NCDC) and its collaborators have initiated a five-year development and implementation of an operational access capability for the next generation of weather and climate model datasets. The NOAA National Climate Model Portal (NCMP) is being designed using format-neutral, open, web-based standards and tools, where users at all levels of expertise can gain access to and understanding of many of NOAA’s climate and weather model products. NCMP will closely coordinate with and reside under the emerging NOAA Climate Services Portal (NCSP). To carry out its mission, NOAA must be able to successfully integrate model output and other data and information from all of its discipline-specific areas to understand and address the complexity of many environmental problems. The NCMP will be an initial access point for the emerging NCSP, which is the basis for unified access to NOAA climate products and services. NCMP is currently collaborating with the emerging Environmental Projection Center (EPC) expected to be developed at the Earth System Research Laboratory in Boulder, CO. Specifically, NCMP is being designed to: - Enable policy makers and resource managers to make informed national and global policy decisions using integrated climate and weather model outputs, observations, information, products, and other services for the scientist and the non-scientist; - Identify model-to-observation interoperability requirements for climate and weather system analysis and diagnostics; - Promote the coordination of an international reanalysis observational clearinghouse (i.e., Reanalysis.org) spanning the world's numerical processing centers for an “Ongoing Analysis of the Climate System”.
NCMP will initially provide access capabilities to 3 of NOAA’s high-volume Reanalysis data sets of the weather and climate systems: 1) NCEP’s Climate Forecast System Reanalysis (CFS-R); 2) NOAA’s Climate Diagnostics Center/Earth System Research Laboratory (ESRL) Twentieth Century Reanalysis Project data set (20CR, G. Compo, et al.), a historical reanalysis that will provide climate information from 1850 to the present; and 3) the CPC’s Upper Air Reanalysis. NCMP will advance the highly successful NOAA National Operational Model Archive and Distribution System (NOMADS, Rutledge, BAMS 2006) and standards already in use, including Unidata’s THREDDS (TDS), PMEL’s Live Access Server (LAS), and the GrADS Data Server (GDS) from COLA; the Department of Energy (DOE) Earth System Grid (ESG) and the associated IPCC climate model archive located at the Program for Climate Model Diagnostics and Inter-comparison (PCMDI) through the ESG; NOAA’s Unified Access Framework (UAF) effort; and core standards developed by the Open Geospatial Consortium (OGC). The format-neutral OPeNDAP protocol, as used in the NOMADS system, will also be a key aspect of the design of NCMP.

  20. VizieR Online Data Catalog: Compact group galaxies UV and IR SFR (Lenkic+, 2016)

    NASA Astrophysics Data System (ADS)

    Lenkic, L.; Tzanavaris, P.; Gallagher, S. C.; Desjardins, T. D.; Walker, L. M.; Johnson, K. E.; Fedotov, K.; Charlton, J.; Hornschemeier, A. E.; Durrell, P. R.; Gronwall, C.

    2017-07-01

    The sample of CGs studied here is the same sample studied by Walker et al. (2012AJ....143...69W) of 49 CGs: 33 Hickson Compact Groups and 16 Redshift Survey Compact Groups (RSCGs). The RSCG catalogue of 89 CGs was constructed by Barton et al. (1996AJ....112..871B). The data used in this study originated from 'fill-in' observations with UVOT's three UV filters (uvw2, uvm2, uvw1) as well as the bluest optical filter (u). All UV data (PI: Tzanavaris) were downloaded from the Swift archive. The Spitzer Infrared Array Camera images for our sample of CGs are archival data presented by Walker et al. (2012AJ....143...69W). Spitzer MIPS (24um) data were obtained from the Spitzer Heritage Archive. (4 data files).

  1. Qualitative Comparison of IGRA and ESRL Radiosonde Archived Databases

    NASA Technical Reports Server (NTRS)

    Walker, John R.

    2014-01-01

    Multiple databases of atmospheric profile information are freely available to individuals and groups such as the Natural Environments group. Two of the primary database archives provided by NOAA that are most frequently used are those from the Earth Science Research Laboratory (ESRL) and the Integrated Global Radiosonde Archive (IGRA). Inquiries have been made as to why one database is used as opposed to the other, yet to the best of our knowledge, no formal comparison has been performed. The goal of this study is to provide a qualitative comparison of the ESRL and IGRA radiosonde databases. For part of this analysis, 14 upper air observation sites were selected. These sites all have the common attribute of having been used or being planned for use in the development of Range Reference Atmospheres (RRAs) in support of NASA's and DOD's current and future goals.

  2. Enabling data access and interoperability at the EOS Land Processes Distributed Active Archive Center

    NASA Astrophysics Data System (ADS)

    Meyer, D. J.; Gallo, K. P.

    2009-12-01

    The NASA Earth Observation System (EOS) is a long-term, interdisciplinary research mission to study global-scale processes that drive Earth systems. This includes a comprehensive data and information system to provide Earth science researchers with easy, affordable, and reliable access to the EOS and other Earth science data through the EOS Data and Information System (EOSDIS). Data products from EOS and other NASA Earth science missions are stored at Distributed Active Archive Centers (DAACs) to support interactive and interoperable retrieval and distribution of data products.

    The Land Processes DAAC (LP DAAC), located at the US Geological Survey’s (USGS) Earth Resources Observation and Science (EROS) Center, is one of the twelve EOSDIS data centers, providing both Earth science data and expertise, as well as a mechanism for interaction between EOS data investigators, data center specialists, and other EOS-related researchers. The primary mission of the LP DAAC is stewardship for land data products from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on the Terra and Aqua observation platforms. The co-location of the LP DAAC at EROS strengthens the relationship between the EOSDIS and USGS Earth science activities, linking the basic research and technology development mission of NASA to the operational mission requirements of the USGS. This linkage, along with the USGS’ role as steward of land science data such as the Landsat archive, will prove to be especially beneficial when extending both USGS and EOSDIS data records into the Decadal Survey era.

    This presentation provides an overview of the evolution of LP DAAC efforts over the years to improve data discovery, retrieval, and preparation services, toward a future of integrated data interoperability between EOSDIS data centers and data holdings of the USGS and its partner agencies.
Historical developmental case studies are presented, including the MODIS Reprojection Tool (MRT), the scheduling of ASTER for emergency response, the inclusion of Landsat metadata in the EOS Clearinghouse (ECHO), and the distribution of a global digital elevation model (GDEM) developed from ASTER. A software re-use case study describes integrating the MRT and the USGS Global Visualization tool (GloVis) into the MRTWeb service, developed to provide on-the-fly reprojection and reformatting of MODIS land products. Current LP DAAC activities are presented, such as the Open Geospatial Consortium (OGC) geographic information system (GIS) services provided in support of NASA’s Making Earth Science Data Records for Use in Research Environments (MEaSUREs) program. Near-term opportunities are discussed, such as the design and development of services in support of the soon-to-be-completed on-line archive of all LP DAAC ASTER and MODIS data products. Finally, several case studies for future tools and services are explored, such as bringing algorithms to data centers, using the North American ASTER Land Emissivity Database as an example, as well as the potential for integrating data discovery and retrieval services for LP DAAC, Landsat, and USGS Long-term Archive holdings.

  3. Model Data Interoperability for the United States Integrated Ocean Observing System (IOOS)

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.

    2010-05-01

    Model data interoperability for the United States Integrated Ocean Observing System (IOOS) was initiated with a focused one-year project. The problem was that there were many regional and national providers of oceanographic model data; each had unique file conventions, distribution techniques, and analysis tools that made it difficult to compare model results and observational data. To solve this problem, a distributed system was built utilizing a customized middleware layer and a common data model. This allowed each model data provider to keep its existing model and data files unchanged, yet deliver model data via web services in a common form. With standards-based applications that used these web services, end users then had a common way to access data from any of the models. These applications included: (1) 2D mapping and animation in a web browser, (2) advanced 3D visualization and animation in a desktop application, and (3) a toolkit for a common scientific analysis environment. Due to the flexibility and low impact of the approach on providers, rapid progress was made. The system was implemented in all eleven US IOOS regions and at the NOAA National Coastal Data Development Center, allowing common delivery of regional and national oceanographic model forecast and archived results that cover all US waters. The system, based heavily on software technology from the NSF-sponsored Unidata Program Center, is applicable to any structured gridded data, not just oceanographic model data. There is a clear pathway to expand the system to include unstructured grid (e.g. triangular grid) data.
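The middleware-plus-common-data-model idea above can be sketched in a few lines: each provider keeps its native file conventions, and a thin adapter maps them onto one shared schema so clients see a uniform interface. This is a minimal illustrative sketch, not the actual IOOS middleware; all field names, provider layouts, and the temperature variable are assumptions for the example.

```python
# Hypothetical common data model: the one shape every client sees.
COMMON_FIELDS = ("time", "lat", "lon", "sea_surface_temp_C")

def adapt_provider_a(record):
    """Provider A keeps Fahrenheit temperatures under its own key names."""
    return {
        "time": record["timestamp"],
        "lat": record["latitude"],
        "lon": record["longitude"],
        "sea_surface_temp_C": (record["sst_F"] - 32.0) * 5.0 / 9.0,
    }

def adapt_provider_b(record):
    """Provider B already uses Celsius but nests its coordinates."""
    return {
        "time": record["t"],
        "lat": record["pos"][0],
        "lon": record["pos"][1],
        "sea_surface_temp_C": record["sst_C"],
    }

def query(sources):
    """Clients iterate over one common data model regardless of provider."""
    for adapt, record in sources:
        mapped = adapt(record)
        # Every adapter must fill the common model completely.
        assert set(mapped) == set(COMMON_FIELDS)
        yield mapped
```

Because only the adapter changes per provider, new data sources can join without touching their archives, which is the low-impact property the project credits for its rapid progress.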

  4. Buckets: Smart Objects for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.

    2001-01-01

    Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search, and retrieval functionality of archives, repositories, search engines, search interfaces, and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Some of the responsibilities assigned to buckets are the enforcement of their terms and conditions, and maintenance and display of their contents.
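The "smart object, dumb archive" division of labor can be illustrated with a short sketch: the bucket object, not the archive, enforces its own terms and conditions and maintains its own contents, while the archive is reduced to a plain store. This is a hypothetical illustration of the concept; the class, method names, and policy callable are invented for the example and are not the actual bucket implementation.

```python
class Bucket:
    """A smart object that aggregates related items and polices its own access."""

    def __init__(self, bucket_id, terms=None):
        self.bucket_id = bucket_id
        # The access policy lives inside the object, not in the archive.
        self.terms = terms or (lambda user: True)
        self._elements = {}  # e.g. preprint, figures, datasets, software

    def add(self, name, payload):
        """Content maintenance is the bucket's responsibility."""
        self._elements[name] = payload

    def display(self, user):
        """The bucket negotiates display, enforcing its own terms and conditions."""
        if not self.terms(user):
            raise PermissionError(f"user {user!r} denied by bucket {self.bucket_id}")
        return sorted(self._elements)

# A "dumb" archive is just storage; all behavior lives in the buckets it holds.
archive = {}
bucket = Bucket("report-1234", terms=lambda user: user != "anonymous")
bucket.add("report.pdf", b"...")
bucket.add("wind-tunnel-data.csv", b"...")
archive[bucket.bucket_id] = bucket
```

Note how the archive here is a bare dictionary: any functionality removed from it (terms enforcement, display negotiation, maintenance) reappears as a method on the bucket, which is exactly the push-down the SODA model describes.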

  5. Mission operations update for the restructured Earth Observing System (EOS) mission

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita Castro; Chang, Edward S.

    1993-01-01

    The National Aeronautics and Space Administration's (NASA) Earth Observing System (EOS) will provide a comprehensive long term set of observations of the Earth to the Earth science research community. The data will aid in determining global changes caused both naturally and through human interaction. Understanding man's impact on the global environment will allow sound policy decisions to be made to protect our future. EOS is a major component of the Mission to Planet Earth program, which is NASA's contribution to the U.S. Global Change Research Program. EOS consists of numerous instruments on multiple spacecraft and a distributed ground system. The EOS Data and Information System (EOSDIS) is the major ground system developed to support EOS. The EOSDIS will provide EOS spacecraft command and control, data processing, product generation, and data archival and distribution services for EOS spacecraft. Data from EOS instruments on other Earth science missions (e.g., Tropical Rainfall Measuring Mission (TRMM)) will also be processed, distributed, and archived in EOSDIS. The U.S. and various International Partners (IP) (e.g., the European Space Agency (ESA), the Ministry of International Trade and Industry (MITI) of Japan, and the Canadian Space Agency (CSA)) participate in and contribute to the international EOS program. The EOSDIS will also archive processed data from other designated NASA Earth science missions (e.g., UARS) that are under the broad umbrella of Mission to Planet Earth.

  6. Kepler Certified False Positive Table

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Batalha, Natalie Marie; Colon, Knicole Dawn; Coughlin, Jeffrey Langer; Haas, Michael R.; Henze, Chris; Huber, Daniel; Morton, Tim; Rowe, Jason Frank; Mullally, Susan Elizabeth

    2017-01-01

    This document describes the Kepler Certified False Positive table hosted at the Exoplanet Archive, herein referred to as the CFP table. This table is the result of detailed examination by the Kepler False Positive Working Group (FPWG) of declared false positives in the Kepler Object of Interest (KOI) tables (see, for example, Batalha et al. (2012); Burke et al. (2014); Rowe et al. (2015); Mullally et al. (2015); Coughlin et al. (2015b)) at the Exoplanet Archive. A KOI is considered a false positive if it is not due to a planet orbiting the KOI's target star. The CFP table contains all KOIs in the Exoplanet Archive cumulative KOI table. The purpose of the CFP table is to provide a list of certified false positive KOIs. A KOI is certified as a false positive when, in the judgement of the FPWG, there is no plausible planetary interpretation of the observational evidence, which we summarize by saying that the evidence for a false positive is compelling. This certification process involves detailed examination using all available data for each KOI, establishing a high-reliability ground truth set. The CFP table can be used to estimate the reliability of, for example, the KOI tables, which are created using only Kepler photometric data; thus the disposition of individual KOIs may differ between the KOI and CFP tables. Follow-up observers may find the CFP table useful to avoid observing false positives.
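The last use case, screening a follow-up target list against certified false positives, amounts to a simple set-membership filter. The sketch below is purely illustrative: the inline CSV, its column names, and the disposition strings are invented stand-ins, not the actual CFP table schema, which observers should take from the Exoplanet Archive documentation.

```python
import csv
import io

# Toy stand-in for a downloaded CFP-table extract (hypothetical columns).
cfp_csv = """kepoi_name,disposition
K00001.01,NOT EXAMINED
K00002.01,CERTIFIED FP
K00003.01,CERTIFIED FA
"""

# Collect the KOIs the working group certified as false positives.
certified_fp = {
    row["kepoi_name"]
    for row in csv.DictReader(io.StringIO(cfp_csv))
    if row["disposition"] == "CERTIFIED FP"
}

# Screen a planned follow-up target list against that set.
targets = ["K00001.01", "K00002.01", "K00003.01"]
observable = [koi for koi in targets if koi not in certified_fp]
```

The same pattern scales to the full table: build the certified-false-positive set once, then filter any candidate list against it before committing telescope time.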

  7. The Nature of Deeply Buried Ultraluminous Infrared Galaxies: A Unified Model for Highly Obscured Dusty Galaxy Emission

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Elitzur, M.; Armus, L.; Diaz-Santos, T.; Charmandaris, V.

    2018-05-01

    We present models of deeply buried ultraluminous infrared galaxy (ULIRG) spectral energy distributions (SEDs) and use them to construct a three-dimensional diagram for diagnosing the nature of observed ULIRGs. Our goal is to construct a suite of SEDs for a very simple model ULIRG structure, and to explore how well this simple model can (by itself) explain the full range of observed ULIRG properties. We use our diagnostic to analyze archival Spitzer Space Telescope Infrared Spectrograph data of ULIRGs and find that: (1) in general, our model does provide a comprehensive explanation of the distribution of mid-IR ULIRG properties; (2) >75% (in some cases 100%) of the bolometric luminosities of the most deeply buried ULIRGs must be powered by a dust-enshrouded active galactic nucleus; (3) an unobscured “keyhole” view through ≲10% of the obscuring medium surrounding a deeply buried ULIRG is sufficient to make it appear nearly unobscured in the mid-IR; (4) the observed absence of deeply buried ULIRGs with large polycyclic aromatic hydrocarbon (PAH) equivalent widths is naturally explained by our models, showing that deep absorption features are “filled-in” by small quantities of foreground unobscured PAH emission (e.g., from the host galaxy disk) at the level of ∼1% the bolometric nuclear luminosity. The modeling and analysis we present will also serve as a powerful tool for interpreting the high angular resolution spectra of high-redshift sources to be obtained with the James Webb Space Telescope.

  8. A Regional Climate Model Evaluation System based on contemporary Satellite and other Observations for Assessing Regional Climate Model Fidelity

    NASA Astrophysics Data System (ADS)

    Waliser, D. E.; Kim, J.; Mattman, C.; Goodale, C.; Hart, A.; Zimdars, P.; Lean, P.

    2011-12-01

    Evaluation of climate models against observations is an essential part of assessing the impact of climate variations and change on regionally important sectors and improving climate models. Regional climate models (RCMs) are of particular concern. RCMs provide the fine-scale climate information needed by the assessment community by downscaling global climate model projections, such as those contributing to the Coupled Model Intercomparison Project (CMIP), which form one aspect of the quantitative basis of the IPCC Assessment Reports. The lack of reliable fine-resolution observational data and formal tools and metrics has represented a challenge in evaluating RCMs. Recent satellite observations are particularly useful as they provide a wealth of information and constraints on many different processes within the climate system. Due to their large volume and the difficulties associated with accessing and using contemporary observations, however, these datasets have been generally underutilized in model evaluation studies. Recognizing this problem, NASA JPL and UCLA have developed the Regional Climate Model Evaluation System (RCMES) to help make satellite observations, in conjunction with in-situ and reanalysis datasets, more readily accessible to the regional modeling community. The system includes a central database (Regional Climate Model Evaluation Database: RCMED) to store multiple datasets in a common format and codes for calculating and plotting statistical metrics to assess model performance (Regional Climate Model Evaluation Tool: RCMET). This allows the time taken to compare model data with satellite observations to be reduced from weeks to days. RCMES is a component of the recent ExArch project, an international effort to facilitate the archiving of and access to massive amounts of data using cloud-based infrastructure, in this case as applied to the study of climate and climate change.
This presentation will describe RCMES and demonstrate its utility using examples from RCMs applied to the southwest US as well as to Africa based on output from the CORDEX activity. Application of RCMES to the evaluation of multi-RCM hindcast for CORDEX-Africa will be presented in a companion paper in A41.
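The statistical metrics such a tool computes reduce, at their core, to comparisons of paired model and observation values. The sketch below shows two of the most common ones, mean bias and root-mean-square error; it is a minimal illustration of the arithmetic, not RCMET itself (which operates on gridded, regridded fields), and the temperature values are invented for the example.

```python
import math

def bias(model, obs):
    """Mean of (model minus observation); positive means the model runs high."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error of the model against the observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Hypothetical 2 m air temperatures (K): a model time series vs. matching obs.
model_t = [288.1, 290.4, 289.0, 287.6]
obs_t = [287.9, 290.0, 289.5, 287.5]

model_bias = bias(model_t, obs_t)
model_rmse = rmse(model_t, obs_t)
```

In a real evaluation these sums run over every grid cell and time step of the regridded model and observation fields, but the metric definitions are unchanged.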

  9. NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)

    NASA Astrophysics Data System (ADS)

    Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.

    2011-12-01

    Data stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, there is additional information from the product generation and science teams needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and production history as developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIP Federation meeting, NASA is developing a set of Earth science data and information content requirements for long-term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now.
For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, useable, and reproducible. Our experience suggests data centers know what to preserve in most cases, i.e., the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long term climate studies. In this presentation we will describe application of information content requirements, guidance from the ICESat/GLAS Science Team and the flow of additional information from the ICESat Science team and Science Investigator-Led Processing System to the Distributed Active Archive Center.

  10. Three Good Reasons for Celebrating at the ESO/ST-ECF Science Archive Facility

    NASA Astrophysics Data System (ADS)

    2000-12-01

    Great Demand for Data from New "Virtual Observatory" Summary: Due to a happy coincidence, the ESO/ST-ECF Science Archive Facility is celebrating three different milestones at the same time: * its 10th anniversary since its establishment in 1991, * the 10,000th request for data, and * the signing-up of active user number 2000. This Archive contains over 8 Terabytes (1 Terabyte = 1 million million bytes) of valuable observational data from the NASA/ESA Hubble Space Telescope (HST), the ESO Very Large Telescope (VLT), and other ESO telescopes. Its success paves the way for the establishment of "Virtual Observatories" from which first-class data can be obtained by astronomers all over the world. This greatly enhances the opportunities for more (young) scientists to participate in front-line research. PR Photo 34/00: Front page of a new brochure describing the ESO/ST-ECF Science Archive Facility. Just 10 years ago, on the 1st of January 1991, the ESO/ST-ECF (European Southern Observatory/Space Telescope-European Coordinating Facility) Science Archive Facility opened. It has since served the astronomical community with gigabyte after gigabyte of high-quality astronomical data from some of the world's leading telescopes. The Archive, which is located in Garching, just outside Munich (Germany), contains data from the 2.4-m NASA/ESA Hubble Space Telescope, as well as from several ESO telescopes: the four 8.2-m Unit Telescopes of the Very Large Telescope (VLT) at the Paranal Observatory, and the 3.5-m New Technology Telescope (NTT), the 3.6-m telescope, and the MPG/ESO 2.2-m telescope at La Silla. The Archive is a continuously developing project - in terms of amounts of data stored, the number of users, and in particular because of the current dramatic development of innovative techniques for data handling and storage. In the year 2000 more than 2 Terabytes (2000 Gigabytes) of data were distributed to users worldwide.
The archiving of VLT data has been described in ESO PR 10/99. Celebrating the 10th anniversary: Due to a happy coincidence, the Archive passes two other milestones almost exactly at the time of its ten-year anniversary: the 10,000th request for data has just arrived, and active user number 2000 has just signed up to start using the Archive. Dataset number 10000 was requested by Danish astronomer Søren Larsen, who works at the University of California (USA). He asked for images of galaxies taken with the Hubble Space Telescope and expressed great satisfaction with the material: "The extremely sharp images from Hubble have provided a quantum leap forward in our ability to study star clusters in external galaxies. We now know that some galaxies contain extremely bright young star clusters. These might constitute a "link" between open and globular clusters as we know them in the Milky Way galaxy in which we live. We are now trying to understand whether all these clusters really form in the same basic way." Active user number 2000 is Swiss astronomer Frédéric Pont, working at the Universidad de Chile: "We use observations from the ESO VLT Unit Telescopes to map the chemical and star-formation history of dwarf galaxies in the Local Group. The stars we are looking at are very faint and we simply need the large size and excellent quality of VLT to observe them in detail. With the new data, we can really move forward in this fundamental research field." Caption: PR Photo 34/00 shows the front page of the new brochure that describes the ESO/ST-ECF Science Archive Facility (available in PDF version on the web). The collage shows the Hubble Space Telescope above the world's largest optical/infrared telescope, the Very Large Telescope (VLT).
To celebrate this special occasion, a 4-page brochure has been prepared that describes the Archive and its various services. The brochure can be requested from ESO or ST-ECF and is now available in PDF format on the web. As a small token, the two astronomers will receive a commemorative version of the photo that accompanies this release. The ASTROVIRTEL initiative: One of the major new initiatives undertaken by ESO and ST-ECF in connection with the ESO/ST-ECF Science Archive is ASTROVIRTEL (Accessing Astronomical Archives as Virtual Telescopes), cf. ESO PR 09/00. It is a project aimed at helping scientists to cope efficiently with the massive amounts of data now becoming available from the world's leading telescopes and so to exploit the true potential of the Archive treasures. ASTROVIRTEL represents the European effort in an area that many astronomers consider one of the most important developments in observational astronomy in the past decade. The future: The head of the ESO/ST-ECF Science Archive Facility, Benoît Pirenne, believes that the future holds exciting challenges: "Due to the many improvements of the ESO, NASA and ESA telescopes and instruments expected in the coming years, we anticipate a tremendous increase in the amount of data to be archived and re-distributed. It will not be too long before we will have to start counting storage space in Petabytes (1 Petabyte = 1,000 Terabytes). We are now trying to figure out how to best prepare for this new era." But he is also concerned with maintaining and further enhancing the astronomical value of the data that are made available to the users: "Apart from improving the data storage, we need to invest much effort in building automatic software that will help users with the tedious pre-processing and 'cleaning' of the data, thereby allowing them to focus more on scientific than technical problems."

  11. Building A Cloud Based Distributed Active Data Archive Center

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin

    2017-01-01

    NASA's Earth Science Data System (ESDS) Program facilitates the implementation of NASA's Earth Science strategic plan, which is committed to the full and open sharing of Earth science data obtained from NASA instruments with all users. The Earth Science Data and Information System (ESDIS) project manages the Earth Observing System Data and Information System (EOSDIS). Data within EOSDIS are held at Distributed Active Archive Centers (DAACs). One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data.

  12. Knowledge-driven information mining in remote-sensing image archives

    NASA Astrophysics Data System (ADS)

    Datcu, M.; Seidel, K.; D'Elia, S.; Marchetti, P. G.

    2002-05-01

    Users in all domains require information or information-related services that are focused, concise, reliable, low-cost, and timely, and which are provided in forms and formats compatible with the user's own activities. In the current Earth Observation (EO) scenario, the archiving centres generally offer only data, images, and other "low-level" products. Users' needs are only partially satisfied by a number of, usually small, value-adding companies applying time-consuming (mostly manual) and expensive processes that rely on the knowledge of experts to extract information from those data or images.

  13. The Bright γ-ray Flare of 3C 279 in 2015 June: AGILE Detection and Multifrequency Follow-up Observations

    NASA Astrophysics Data System (ADS)

    Pittori, C.; Lucarelli, F.; Verrecchia, F.; Raiteri, C. M.; Villata, M.; Vittorini, V.; Tavani, M.; Puccetti, S.; Perri, M.; Donnarumma, I.; Vercellone, S.; Acosta-Pulido, J. A.; Bachev, R.; Benítez, E.; Borman, G. A.; Carnerero, M. I.; Carosati, D.; Chen, W. P.; Ehgamberdiev, Sh. A.; Goded, A.; Grishina, T. S.; Hiriart, D.; Hsiao, H. Y.; Jorstad, S. G.; Kimeridze, G. N.; Kopatskaya, E. N.; Kurtanidze, O. M.; Kurtanidze, S. O.; Larionov, V. M.; Larionova, L. V.; Marscher, A. P.; Mirzaqulov, D. O.; Morozova, D. A.; Nilsson, K.; Samal, M. R.; Sigua, L. A.; Spassov, B.; Strigachev, A.; Takalo, L. O.; Antonelli, L. A.; Bulgarelli, A.; Cattaneo, P.; Colafrancesco, S.; Giommi, P.; Longo, F.; Morselli, A.; Paoletti, F.

    2018-04-01

    We report the AGILE detection and the results of the multifrequency follow-up observations of a bright γ-ray flare of the blazar 3C 279 in 2015 June. We use AGILE and Fermi gamma-ray data, together with Swift X-ray and optical-ultraviolet data, and ground-based GASP-WEBT optical observations, including polarization information, to study the source variability and the overall spectral energy distribution during the γ-ray flare. The γ-ray flaring data, compared with as yet unpublished simultaneous optical data that will allow constraints on the big blue bump disk luminosity, show very high Compton dominance values of ∼100, with the ratio of γ-ray to optical emission rising by a factor of three in a few hours. The multiwavelength behavior of the source during the flare challenges one-zone leptonic theoretical models. The new observations during the 2015 June flare are also compared with already published data and nonsimultaneous historical 3C 279 archival data.

  14. Dithering Observations with JWST's NIRCam

    NASA Astrophysics Data System (ADS)

    Anderson, Jay

    2011-01-01

    Preparations for planning observations with JWST are already well underway at STScI. Many of the aspects of HST observation planning will carry over to JWST, but some things will be different. With HST, users are able to define arbitrary dither patterns (or use no dithering at all) in their Phase-2 submissions. This has allowed many observers to optimize their data quality for the particular science they are focused on. But, unfortunately, when the data reach the archive, the images are often less valuable to the community than they could be, either because of a lack of good dithering or because the association-based pipeline is not optimized for the particular dither pattern used. JWST will do things differently. Except in rare circumstances, such as planetary-transit observations, JWST users will be forced to dither, and they will have a limited set of dithering options to choose from. The NIRCam teams at STScI and UAz have designed a set of dither patterns that are flexible enough to meet the various anticipated science objectives, but they will also be homogeneous enough that the archive and association products will be of uniformly high quality.

  15. Integrated modeling and field study of potential mechanisms for induced seismicity at The Geysers Geothermal Field, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutqvist, Jonny; Majer, Ernie; Oldenburg, Curt

    2006-06-07

    In this paper, we present progress made in a study aimed at increasing the understanding of the relative contributions of different mechanisms that may be causing the seismicity occurring at The Geysers geothermal field, California. The approach we take is to integrate: (1) coupled reservoir geomechanical numerical modeling, (2) data from recently upgraded and expanded NCPA/Calpine/LBNL seismic arrays, and (3) tens of years of archival InSAR data from monthly satellite passes. We have conducted a coupled reservoir geomechanical analysis to study potential mechanisms induced by steam production. Our simulation results corroborate co-locations of hypocenter field observations of induced seismicity and their correlation with steam production as reported in the literature. Seismic and InSAR data are being collected and processed for use in constraining the coupled reservoir geomechanical model.

  16. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate.
The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  17. VirGO: A Visual Browser for the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Chéreau, F.

    2008-08-01

    VirGO is the next-generation visual browser for the ESO Science Archive Facility, developed by the Virtual Observatory (VO) Systems Department. It is a plug-in for the popular open source software Stellarium, adding capabilities for browsing professional astronomical data. VirGO lets astronomers easily discover and select data from millions of observations in a new visual and intuitive way. Its main feature is to perform real-time access and graphical display of a large number of observations by showing instrumental footprints and image previews, and to allow their selection and filtering for subsequent download from the ESO SAF web interface. It also allows the loading of external FITS files or VOTables, the superimposition of Digitized Sky Survey (DSS) background images, and the visualization of the sky in a `real life' mode as seen from the main ESO sites. All data interfaces are based on Virtual Observatory standards, which allow access to images and spectra from external data centers and interaction with the ESO SAF web interface or any other VO application supporting the PLASTIC messaging system. The main website for VirGO is at http://archive.eso.org/cms/virgo.

  18. Compendium of NASA data base for the Global Tropospheric Experiment's Arctic Boundary Layer Experiments ABLE-3A and ABLE-3B

    NASA Technical Reports Server (NTRS)

    Gregory, Gerald L.; Scott, A. Donald, Jr.

    1994-01-01

    The report provides a compendium of NASA aircraft data that are available from NASA's Global Tropospheric Experiment's (GTE) Arctic Boundary Layer Experiments (ABLE) conducted in July and August of 1988 (ABLE-3A) and 1990 (ABLE-3B). ABLE-3A flight experiments were based at Barrow and Bethel, Alaska, and included survey/transit flights to Thule, Greenland. ABLE-3B flight experiments were based at North Bay (Ontario) and Goose Bay, Canada, and included flights northward to Frobisher Bay, Canada. The primary purposes of the experiments were (1) the measurement of the flux of various trace gases from high-arctic ecosystems, (2) the elucidation of factors important to the production and destruction of ozone, and (3) the documentation of the sources and chemical signatures of air common to and transported into the regions. The report provides a representation, in the form of selected data plots, of aircraft data that are available in archived format via NASA Langley's Distributed Active Archive Center. The archived data bases include data for other species measured on the aircraft as well as numerous supporting data, including meteorological observations/products, results from surface studies, satellite observations, and sonde releases.

  19. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    NASA Astrophysics Data System (ADS)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.
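Several of the standards above exchange tabular results as VOTables, an XML format defined by the IVOA. As a minimal sketch of what "interoperable access" means in practice, the standard library alone can read a simple TABLEDATA-serialized VOTable; the sample table below is invented for illustration, and real tools (e.g. astropy.io.votable) handle the full standard, including datatypes and binary serializations.

```python
import xml.etree.ElementTree as ET

# Invented two-row VOTable in the simple TABLEDATA serialization.
VOTABLE = """<?xml version="1.0"?>
<VOTABLE xmlns="http://www.ivoa.net/xml/VOTable/v1.3">
 <RESOURCE>
  <TABLE>
   <FIELD name="ra" datatype="double"/>
   <FIELD name="dec" datatype="double"/>
   <DATA><TABLEDATA>
    <TR><TD>83.633</TD><TD>22.014</TD></TR>
    <TR><TD>10.684</TD><TD>41.269</TD></TR>
   </TABLEDATA></DATA>
  </TABLE>
 </RESOURCE>
</VOTABLE>"""

NS = "{http://www.ivoa.net/xml/VOTable/v1.3}"

def read_votable(text):
    """Parse a TABLEDATA VOTable into a list of row dicts.
    Cell values are kept as strings; datatype conversion is omitted."""
    root = ET.fromstring(text)
    names = [f.get("name") for f in root.iter(NS + "FIELD")]
    return [
        dict(zip(names, (td.text for td in tr)))
        for tr in root.iter(NS + "TR")
    ]

rows = read_votable(VOTABLE)
print(rows[0]["ra"])  # → 83.633
```

Because the column metadata travels with the data, any VO-aware client can consume such a table without prior knowledge of the service that produced it, which is the core of the interoperability argument made above.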

  20. Remotely sensed data available from the US Geological Survey EROS Data Center

    USGS Publications Warehouse

    Dwyer, John L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy, R.E.; Salomonson, V.V.

    2006-01-01

    The Center for Earth Resources Observation Systems (EROS) is a field center of the geography discipline within the U.S. Geological Survey (USGS) of the Department of the Interior. The EROS Data Center (EDC) was established in the early 1970s as the nation’s principal archive of remotely sensed data. Initially the EDC was responsible for the archive, reproduction, and distribution of black-and-white and color-infrared aerial photography acquired under numerous mapping programs conducted by various Federal agencies including the USGS, Department of Agriculture, Environmental Protection Agency, and NASA. The EDC was also designated the central archive for data acquired by the first satellite sensor designed for broad-scale Earth observations in support of civilian agency needs for Earth resource information. A four-band multispectral scanner (MSS) and a return-beam vidicon (RBV) camera were initially flown on the Earth Resources Technology Satellite-1, subsequently designated Landsat-1. The synoptic coverage, moderate spatial resolution, and multispectral view provided by these data gave scientists an unprecedented perspective from which to study the Earth’s surface and to understand the relationships between human activity and natural systems.
