10 CFR 2.1011 - Management of electronic information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... participants shall make textual (or, where non-text, image) versions of their documents available on a web... of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or PNG (Portable...
10 CFR 2.1011 - Management of electronic information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... participants shall make textual (or, where non-text, image) versions of their documents available on a web... of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or PNG (Portable...
10 CFR 2.1011 - Management of electronic information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... production and service: (i) The participants shall make textual (or, where non-text, image) versions of their... set and be in one of the following acceptable formats: ASCII, native word processing (Word, WordPerfect), PDF Normal, or HTML. (iv) Image files must be formatted as TIFF CCITT G4 for bi-tonal images or...
NASA Astrophysics Data System (ADS)
Paget, A. C.; Brodzik, M. J.; Long, D. G.; Hardman, M.
2016-02-01
The historical record of satellite-derived passive microwave brightness temperatures comprises data from multiple imaging radiometers (SMMR, SSM/I-SSMIS, AMSR-E), spanning nearly 40 years of Earth observations from 1978 to the present. Passive microwave data are used to monitor time series of many climatological variables, including ocean wind speeds, cloud liquid water, sea ice concentration, and ice velocity. Gridded versions of passive microwave data have been produced using various map projections (polar stereographic, Lambert azimuthal equal-area, cylindrical equal-area, quarter-degree Plate Carrée) and data formats (flat binary, HDF). However, none of the currently available versions can be rendered in the common visualization standard, GeoTIFF, without cartographic reprojection. Furthermore, the reprojection details are complicated and often require expert knowledge of obscure software package options. We are producing a consistently calibrated, completely reprocessed data set of this valuable multi-sensor satellite record, using EASE-Grid 2.0, an improved equal-area projection definition that requires no reprojection for translation into GeoTIFF. Our approach has been twofold: 1) define the projection ellipsoid to match the reference datum of the satellite data, and 2) include the file-level metadata required for standard projection software to correctly render the data in the GeoTIFF standard. The Calibrated, Enhanced Resolution Brightness Temperature (CETB) Earth System Data Record (ESDR) leverages image reconstruction techniques to enhance gridded spatial resolution to 3 km and uses newly available intersensor calibrations to improve the quality of derived geophysical products. We expect that our attention to easy GeoTIFF compatibility will foster higher-quality analysis with the CETB product by enabling easy and correct intercomparison with other gridded and in situ data.
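The file-level metadata point is easy to make concrete: once a gridded product carries a complete coordinate reference system and geotransform, any GDAL-based tool can render it without reprojection. A minimal sketch with the GDAL Python bindings, using a hypothetical CETB-style GeoTIFF name:

    # Inspect the georeferencing that lets a GeoTIFF render correctly.
    from osgeo import gdal

    ds = gdal.Open("cetb_ease2_example.tif")        # hypothetical file name
    print(ds.GetProjection())                       # CRS as WKT (projection + ellipsoid/datum)
    print(ds.GetGeoTransform())                     # grid origin and pixel size
    print(ds.GetRasterBand(1).ReadAsArray().shape)  # grid dimensions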
NASA Astrophysics Data System (ADS)
Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.
2015-12-01
An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats, including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, include the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e., columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet support on-the-fly conversion of all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-Convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker Version 2.0.6 but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident indicates that testing a candidate data product with one or more software products written to accept the advertised conventions is a practice that improves interoperability. Differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data, or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
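For context, the failing translation described above corresponds to an operation like the following GDAL sketch; the file and variable names are hypothetical. GDAL derives the georeferencing from the CF grid-mapping variables, so the translation loses its CRS (or fails outright) when those variables are missing:

    from osgeo import gdal

    # Translate one netCDF-4/CF variable (a GDAL "subdataset") to GeoTIFF.
    src = 'NETCDF:"sea_ice_example.nc":ice_concentration'
    gdal.Translate("ice_concentration.tif", src, format="GTiff")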
User's guide for mapIMG 3--Map image re-projection software package
Finn, Michael P.; Mattli, David M.
2012-01-01
Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command-line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path. Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI that was built using the Qt library for cross-platform development. Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last-used input and output folders, and for TIFF/GeoTIFF input images. Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added resampling methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML and a new naming convention was adopted. Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less-than-global datasets, UTM support, and a codebase ported to Qt4.
López, Carlos; Lejeune, Marylène; Escrivà, Patricia; Bosch, Ramón; Salvadó, Maria Teresa; Pons, Lluis E.; Baucells, Jordi; Cugat, Xavier; Álvaro, Tomás; Jaén, Joaquín
2008-01-01
This study investigates the effects of digital image compression on automatic quantification of immunohistochemical nuclear markers. We examined 188 images with a previously validated computer-assisted analysis system. A first group was composed of 47 images captured in TIFF format, and the other three groups contained the same images converted from TIFF to JPEG format with 3×, 23× and 46× compression. Counts of the TIFF format images were compared with those of the other three groups. Overall, differences in the counts increased with the degree of compression. Low-complexity images (≤100 cells/field, without clusters or with small-area clusters) had small differences (<5 cells/field in 95–100% of cases) and high-complexity images showed substantial differences (<35–50 cells/field in 95–100% of cases). Compression does not compromise the accuracy of immunohistochemical nuclear marker counts obtained by computer-assisted analysis systems for digital images with low complexity and could be an efficient method for storing these images. PMID:18755997
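The experimental conditions are straightforward to reproduce with a library such as Pillow; the quality settings below are illustrative stand-ins for the study's 3×, 23× and 46× compression levels, not the authors' exact parameters:

    from PIL import Image

    img = Image.open("field.tif")        # hypothetical source micrograph
    for q in (95, 50, 10):               # decreasing quality, increasing compression
        img.save(f"field_q{q}.jpg", quality=q)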
Kim, J H; Kang, S W; Kim, J-r; Chang, Y S
2014-01-01
Purpose: To evaluate the effect of image compression of spectral-domain optical coherence tomography (OCT) images in the examination of eyes with exudative age-related macular degeneration (AMD). Methods: Thirty eyes from 30 patients who were diagnosed with exudative AMD were included in this retrospective observational case series. Horizontal OCT scans centered at the fovea were conducted using spectral-domain OCT. The images were exported to Tagged Image File Format (TIFF) and to Joint Photographic Experts Group (JPEG) format at 100, 75, 50, 25 and 10% quality. OCT images were taken before and after intravitreal ranibizumab injections, and after relapse. The prevalence of subretinal and intraretinal fluids was determined. Differences in choroidal thickness between the TIFF and JPEG images were compared with the intra-observer variability. Results: The prevalence of subretinal and intraretinal fluids was comparable regardless of the degree of compression. However, the chorio-scleral interface was not clearly identified in many images with a high degree of compression. In images with 25 and 10% JPEG quality, the difference in choroidal thickness between the TIFF images and the respective JPEG images was significantly greater than the intra-observer variability of the TIFF images (P=0.029 and P=0.024, respectively). Conclusions: In OCT images of eyes with AMD, 50% JPEG quality would be an optimal degree of compression for efficient data storage and transfer without sacrificing image quality. PMID:24788012
Proposed color workflow solution from mobile and website to printing
NASA Astrophysics Data System (ADS)
Qiao, Mu; Wyse, Terry
2015-03-01
With the recent introduction of mobile devices and developments in client-side application technologies, there is an explosion of the parameter matrix for color management: hardware platform (computer vs. mobile), operating system (Windows, Mac OS, Android, iOS), client application (Flash, IE, Firefox, Safari, Chrome), and file format (JPEG, TIFF, PDF of various versions). In a modern digital print shop, multiple print solutions (digital presses, wide-format inkjet, dye-sublimation inkjet) are used to produce a wide variety of customizable products, from photo books and personalized greeting cards to canvas prints and mobile phone cases. In this paper, we outline a strategy that spans from client-side application and print file construction to color setup on the printer, to manage consistency and achieve what-you-see-is-what-you-get for customers who use a wide variety of technologies to view and order products.
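Consistency across that matrix ultimately rests on ICC profile conversions somewhere in the pipeline. A minimal sketch with Pillow's ImageCms module (the profile file names are hypothetical; this is not the authors' production workflow):

    from PIL import Image, ImageCms

    img = Image.open("order.jpg")                        # customer upload
    out = ImageCms.profileToProfile(img, "sRGB.icc",     # assumed display profile
                                    "press_cmyk.icc",    # hypothetical press profile
                                    outputMode="CMYK")
    out.save("order_press.tif")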
NAVAIR Portable Source Initiative (NPSI) Standard for Reusable Source Dataset Metadata (RSDM) V2.4
2012-09-26
defining a raster file format: <RasterFileFormat> <FormatName>TIFF</FormatName> <Order>BIP</Order> <DataType>8-BIT_UNSIGNED</DataType> ... interleaved by line (BIL); band interleaved by pixel (BIP). The element RasterFileFormatType/DataType is a restriction of xsd:string.
Polar2Grid 2.0: Reprojecting Satellite Data Made Easy
NASA Astrophysics Data System (ADS)
Hoese, D.; Strabala, K.
2015-12-01
Polar-orbiting multi-band meteorological sensors such as those on the Suomi National Polar-orbiting Partnership (SNPP) satellite pose substantial challenges for taking imagery the last mile to forecast offices, scientific analysis environments, and the general public. To do this quickly and easily, the Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin has created an open-source, modular application system, Polar2Grid. This bundled solution automates tools for converting various satellite products like those from VIIRS and MODIS into a variety of output formats, including GeoTIFFs, AWIPS-compatible NetCDF files, and NinJo forecasting workstation compatible TIFF images. In addition to traditional visible and infrared imagery, Polar2Grid includes three perceptual enhancements for the VIIRS Day-Night Band (DNB), as well as the capability to create sharpened true color, sharpened false color, and user-defined RGB images. Polar2Grid performs conversions and projections in seconds on large swaths of data. Polar2Grid is currently providing VIIRS imagery over the Continental United States, as well as Alaska and Hawaii, from various Direct-Broadcast antennas to operational forecasters at the NOAA National Weather Service (NWS) offices in their AWIPS terminals, within minutes of an overpass of the Suomi NPP satellite. Three years after Polar2Grid development started, the Polar2Grid team is now releasing version 2.0 of the software: supporting more sensors, generating more products, and providing all of its features in an easy-to-use command-line interface.
NASA Astrophysics Data System (ADS)
Rivers, M. L.; Gualda, G. A.
2009-05-01
One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information from tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel-beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, measuring sample volume (useful for porous samples like pumice), and computing volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF-format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating-point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of TIFFs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of TIFFs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine™, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from: http://cars.uchicago.edu/software/idl/tomography.html http://sites.google.com/site/voltools/
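As a rough sketch of the TIFF-stack-to-volume conversion that 'vol_tools' provides a GUI for (the authors' '.volume' layout is defined by their IDL code; the NetCDF structure below is illustrative, not their exact schema):

    import glob
    import numpy as np
    import tifffile
    from netCDF4 import Dataset

    # Stack a sequence of TIFF slices into a 16-bit integer volume.
    vol = np.stack([tifffile.imread(f)
                    for f in sorted(glob.glob("slice_*.tif"))]).astype(np.int16)

    with Dataset("sample_volume.nc", "w") as nc:
        for name, n in zip(("z", "y", "x"), vol.shape):
            nc.createDimension(name, n)
        nc.createVariable("volume", "i2", ("z", "y", "x"))[:] = vol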
Enhancements to TauDEM to support Rapid Watershed Delineation Services
NASA Astrophysics Data System (ADS)
Sazib, N. S.; Tarboton, D. G.
2015-12-01
Watersheds are widely recognized as the basic functional unit for water resources management studies and are important for a variety of problems in hydrology, ecology, and geomorphology. Nevertheless, delineating a watershed spread across a large region is still cumbersome due to the processing burden of working with large Digital Elevation Models. Terrain Analysis Using Digital Elevation Models (TauDEM) software supports the delineation of watersheds and stream networks from within desktop Geographic Information Systems, and a rich set of watershed and stream network attributes are computed. However, the TauDEM desktop tools have limitations: (1) they support only one raster type (TIFF format), (2) they require installation of software for parallel processing, and (3) data must be in a projected coordinate system. This paper presents enhancements to TauDEM that have been developed to extend its generality and support web-based watershed delineation services. The enhancements include (1) reading and writing raster data with the open-source Geospatial Data Abstraction Library (GDAL), no longer limited to the TIFF format, and (2) support for both geographic and projected coordinates. To support web services for rapid watershed delineation, a procedure has been developed for subsetting the domain based on sub-catchments, with preprocessed data prepared and stored for each catchment. This allows the watershed delineation to function locally, while extending to the full extent of watersheds using preprocessed information. Additional capabilities of this program include computation of average watershed properties and geomorphic and channel network variables such as drainage density, shape factor, relief ratio, and stream ordering. The updated version of TauDEM increases its practical applicability in terms of raster data type, size, and coordinate system. The watershed delineation web service functionality is useful for web-based software-as-a-service deployments that alleviate the need for users to install and work with desktop GIS software.
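The GDAL-based generalization can be illustrated in a few lines: the same call opens a GeoTIFF, an Erdas IMG file, or any other GDAL-supported raster, so the delineation code no longer needs format-specific readers. The file name is hypothetical:

    from osgeo import gdal

    ds = gdal.Open("dem.img")                 # any GDAL-supported raster format
    elev = ds.GetRasterBand(1).ReadAsArray()  # elevation grid as a NumPy array
    print(ds.GetProjection()[:60], elev.shape)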
Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.; Clawges, Rick M.
2007-01-01
Historical land-use and land-cover data, available from the U.S. Geological Survey (USGS) for the conterminous United States and Hawaii, have been enhanced for use in geographic information system (GIS) applications. The original digital data sets were created by the USGS in the late 1970s and early 1980s and were later converted by the USGS and the U.S. Environmental Protection Agency (USEPA) to a GIS format in the early 1990s. These data have been available on USEPA's Web site since the early 1990s and have been used for many national applications, despite minor coding and topological errors. During the 1990s, a group of USGS researchers modified the data set for use in the National Water-Quality Assessment Program. These edited files have been further modified to create a more accurate, topologically clean, and seamless national data set. Several different methods, including custom editing software and several batch processes, were applied to create this enhanced version of the national data set. The data sets are included in this report in the commonly used shapefile and Tagged Image File Format (TIFF) formats. In addition, this report includes two polygon data sets (in shapefile format) representing (1) land-use and land-cover source documentation extracted from the previously published USGS data files, and (2) the extent of each polygon data file.
Scanning technology selection impacts acceptability and usefulness of image-rich content.
Alpi, Kristine M; Brown, James C; Neel, Jennifer A; Grindem, Carol B; Linder, Keith E; Harper, James B
2016-01-01
Clinical and research usefulness of articles can depend on image quality. This study addressed whether scans of figures in black and white (B&W), grayscale, or color, or portable document format (PDF) to tagged image file format (TIFF) conversions, as provided by interlibrary loan or document delivery, were viewed as acceptable or useful by radiologists or pathologists. Residency coordinators selected eighteen figures from studies from radiology, clinical pathology, and anatomic pathology journals. With original PDF controls, each figure was prepared in three or four experimental conditions: PDF conversion to TIFF, and scans from print in B&W, grayscale, and color. Twelve independent observers indicated whether they could identify the features and whether the image quality was acceptable. They also ranked all the experimental conditions of each figure in terms of usefulness. Of 982 assessments of 87 anatomic pathology, 83 clinical pathology, and 77 radiology images, 471 (48%) were unidentifiable. Unidentifiability of originals (4%) and conversions (10%) was low. For scans, unidentifiability ranged from 53% for color, to 74% for grayscale, to 97% for B&W. Of 987 responses about acceptability, 41% (n=405) were judged unacceptable: 97% of B&W, 66% of grayscale, 41% of color, and 1% of conversions. Hypothesized order (original, conversion, color, grayscale, B&W) matched 67% of rankings (n=215). PDF to TIFF conversion provided acceptable content. Color images are rarely useful in grayscale (12%) or B&W (less than 1%). Acceptability of grayscale scans of noncolor originals was 52%. Digital originals are needed for most images. Print images in color or grayscale should be scanned using those modalities.
Analyzing huge pathology images with open source software.
Deroulers, Christophe; Ameisen, David; Badoual, Mathilde; Gerin, Chloé; Granier, Alexandre; Lartaud, Marc
2013-06-06
Digital pathology images are increasingly used both for diagnosis and research, because slide scanners are nowadays broadly available and because the quantitative study of these images yields new insights in systems biology. However, such virtual slides pose a technical challenge since the images often occupy several gigabytes and cannot be fully opened in a computer's memory. Moreover, there is no standard format. Therefore, most common open source tools such as ImageJ fail at treating them, and the others require expensive hardware while still being prohibitively slow. We have developed several cross-platform open source software tools to overcome these limitations. The NDPITools provide a way to transform microscopy images initially in the loosely supported NDPI format into one or several standard TIFF files, and to create mosaics (division of huge images into small ones, with or without overlap) in various TIFF and JPEG formats. They can be driven through ImageJ plugins. The LargeTIFFTools achieve similar functionality for huge TIFF images which do not fit into RAM. We test the performance of these tools on several digital slides and compare them, when applicable, to standard software. A statistical study of the cells in a tissue sample from an oligodendroglioma was performed on an average laptop computer to demonstrate the efficiency of the tools. Our open source software enables dealing with huge images with standard software on average computers. They are cross-platform, independent of proprietary libraries and very modular, allowing them to be used in other open source projects. They have excellent performance in terms of execution speed and RAM requirements. They open promising perspectives both to the clinician who wants to study a single slide and to the research team or data centre who do image analysis of many slides on a computer cluster. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5955513929846272.
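Memory-mapped access is one standard way to work with TIFFs that exceed RAM, in the spirit of the tools described above (this sketch uses the tifffile package and assumes an uncompressed TIFF; it is not the authors' NDPITools or LargeTIFFTools):

    import tifffile

    big = tifffile.memmap("huge.tif")       # maps the image without loading it all
    tile = big[20000:20512, 30000:30512]    # read only a 512x512 region
    print(tile.mean())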
DOE Office of Scientific and Technical Information (OSTI.GOV)
Temple, Brian Allen; Armstrong, Jerawan Chudoung
This document is a mid-year report on a deliverable for the PYTHON Radiography Analysis Tool (PyRAT) for project LANL12-RS-107J in FY15. The deliverable, number 2 in the work package, is titled “Add the ability to read in more types of image file formats in PyRAT”. Currently, PyRAT can read only uncompressed TIFF files. It is planned to expand the file formats that PyRAT can read, making it easier to use in more situations. The file formats added include JPEG (.jpeg/.jpg), PNG, and formatted ASCII files.
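A hedged sketch of what such multi-format ingest can look like in Python (PyRAT's actual reader interface is not described in this report; the helper below is illustrative):

    import numpy as np
    from PIL import Image

    def load_radiograph(path):
        """Return a 2-D array from a TIFF/JPEG/PNG image or a formatted ASCII grid."""
        if path.lower().endswith((".txt", ".asc", ".dat")):
            return np.loadtxt(path)          # formatted ASCII file
        return np.asarray(Image.open(path))  # TIFF, JPEG, PNG, ...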
NASA Astrophysics Data System (ADS)
Dunham, G.; Harding, E. C.; Loisel, G. P.; Lake, P. W.; Nielsen-Weber, L. B.
2016-11-01
Fuji TR image plate is frequently used as a replacement detector medium for x-ray imaging and spectroscopy diagnostics at the NIF, Omega, and Z facilities. However, the familiar Fuji BAS line of image plate scanners is no longer supported by the industry, and so a replacement scanning system is needed. While the General Electric Typhoon line of scanners could replace the Fuji systems, the shift away from photostimulated luminescence units to 16-bit grayscale Tagged Image File Format (TIFF) leaves a discontinuity when comparing data collected from both systems. For the purposes of quantitative spectroscopy, a known unit of intensity applied to the grayscale values of the TIFF is needed. The DITABIS Super Micron image plate scanning system was tested and shown to potentially rival the resolution and dynamic range of Kodak RAR 2492 x-ray film. However, the absolute sensitivity of the scanner is unknown. In this work, a methodology to cross calibrate Fuji TR image plate against the absolutely calibrated Kodak RAR 2492 x-ray film is presented. Details of the experimental configurations used are included. An energy-dependent scale factor to convert Fuji TR IP scanned on a DITABIS Super Micron scanner from 16-bit grayscale TIFF to intensity units (i.e., photons per square micron) is discussed.
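Once such a scale factor is known, applying it is a one-line array operation; the sketch below uses a hypothetical factor value, not the calibration derived in this work:

    import numpy as np
    import tifffile

    counts = tifffile.imread("ip_scan.tif").astype(np.float64)  # 16-bit grayscale TIFF
    k = 1.0e-3                 # hypothetical photons/um^2 per grayscale unit at one energy
    intensity = k * counts     # photons per square micron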
Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service
NASA Astrophysics Data System (ADS)
Nonogaki, S.; Nemoto, T.
2014-12-01
Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of an intended area and zoom level. At the moment, the available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format that is available in many kinds of Geographic Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should contribute to better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file formats.
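The cutting function can be approximated with GDAL directly, clipping a window (in map coordinates) out of a larger georeferenced image; the file names and bounds below are hypothetical:

    from osgeo import gdal

    gdal.Translate(
        "clip.tif",
        "geology_mosaic.tif",                 # hypothetical mosaicked WMTS tiles
        projWin=[139.5, 36.0, 140.0, 35.5],   # ulx, uly, lrx, lry
    )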
Harvey, Craig A.; Kolpin, Dana W.; Battaglin, William A.
1996-01-01
A geographic information system (GIS) procedure was developed to compile low-altitude aerial photography, digitized data, and land-use data from U.S. Department of Agriculture Consolidated Farm Service Agency (CFSA) offices into a high-resolution (approximately 5 meters) land-use GIS data set. The aerial photography consisted of 35-mm slides, which were scanned into tagged image file format (TIFF) images. These TIFF images were then imported into the GIS, where they were registered into a geographically referenced coordinate system. Boundaries between land uses were delineated from these GIS data sets using on-screen digitizing techniques. Crop types were determined using information obtained from the U.S. Department of Agriculture CFSA offices. Crop information not supplied by the CFSA was attributed by manual classification procedures. Automated methods to delineate field boundaries and classify land use were investigated. It was determined that, with these data sources, automated methods were less efficient and accurate than manual methods of delineating field boundaries and classifying land use.
Processed Thematic Mapper Satellite Imagery for Selected Areas within the U.S.-Mexico Borderlands
Dohrenwend, John C.; Gray, Floyd; Miller, Robert J.
2000-01-01
The study is summarized in the Adobe Acrobat Portable Document Format (PDF) file OF00-309.PDF. This publication also contains satellite full-scene images of selected areas along the U.S.-Mexico border. These images are presented as high-resolution images in JPEG format (IMAGES). The folder LOCATIONS contains TIFF images showing exact positions of easily identified reference locations for each of the Landsat TM scenes located at least partly within the U.S. A reference location table (BDRLOCS.DOC in MS Word format) lists the latitude and longitude of each reference location with a nominal precision of 0.001 minute of arc.
2015-12-24
... sources can create errors in digital circuits. These effects can be simulated using Simulation Program with Integrated Circuit Emphasis (SPICE) or ... compute summary statistics. Noisy analog circuits can be simulated in SPICE or Cadence Spectre software via noisy voltage ...
ImageJ: Image processing and analysis in Java
NASA Astrophysics Data System (ADS)
Rasband, W. S.
2012-06-01
ImageJ is a public domain Java image processing program inspired by NIH Image. It can display, edit, analyze, process, save and print 8-bit, 16-bit and 32-bit images. It can read many image formats including TIFF, GIF, JPEG, BMP, DICOM, FITS and "raw". It supports "stacks", a series of images that share a single window. It is multithreaded, so time-consuming operations such as image file reading can be performed in parallel with other operations.
McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R
2007-05-01
The objectives of this study were to compare the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative assessment of image quality and to compare the results with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed by using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R² > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.
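Fixed-ratio JPEG 2000 compression of the kind tested here can be reproduced with Pillow's OpenJPEG-backed writer; the file name is hypothetical and this is not the study's software:

    from PIL import Image

    img = Image.open("handwrist.tif")         # hypothetical digitized radiograph
    for ratio in (20, 40, 60, 80):            # the study's compression levels
        img.save(f"handwrist_{ratio}to1.jp2",
                 quality_mode="rates", quality_layers=[ratio])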
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
SchemaOnRead provides tools for implementing schema-on-read including a single function call (e.g., schemaOnRead("filename")) that reads text (TXT), comma separated value (CSV), raster image (BMP, PNG, GIF, TIFF, and JPG), R data (RDS), HDF5, NetCDF, spreadsheet (XLS, XLSX, ODS, and DIF), Weka Attribute-Relation File Format (ARFF), Epi Info (REC), Pajek network (PAJ), R network (NET), Hypertext Markup Language (HTML), SPSS (SAV), Systat (SYS), and Stata (DTA) files. It also recursively reads folders (e.g., schemaOnRead("folder")), returning a nested list of the contained elements.
1998-07-01
... all the MS Word files into FrameMaker + SGML format and use the FrameMaker application to SGML-tag all of the data in accordance with the Army TM ... Document Type Definitions (DTDs) in MIL-STD-2361. The edited SGML-tagged files are saved as PDF files for delivery to the field. The FrameMaker ... as TIFF files and being imported into FrameMaker prior to saving the TMs as PDF files. Since the hardware to be used by the AN/PPS-5 technician is ...
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
Digital seismic-reflection data from western Rhode Island Sound, 1980
McMullen, K.Y.; Poppe, L.J.; Soderberg, N.K.
2009-01-01
During 1980, the U.S. Geological Survey (USGS) conducted a seismic-reflection survey in western Rhode Island Sound aboard the Research Vessel Neecho. Data from this survey were recorded in analog form and archived at the USGS Woods Hole Science Center's Data Library. Due to recent interest in the geology of Rhode Island Sound and in an effort to make the data more readily accessible while preserving the original paper records, the seismic data from this cruise were scanned and converted to Tagged Image File Format (TIFF) images and SEG-Y data files. Navigation data were converted from U.S. Coast Guard Long Range Aids to Navigation (LORAN-C) time delays to latitudes and longitudes, which are available in Environmental Systems Research Institute, Inc. (ESRI) shapefile format and as eastings and northings in space-delimited text format.
Jones, William R.; Garber, Adrienne
2013-01-01
The Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) funds over 100 wetland restoration projects across Louisiana. Integral to the success of CWPPRA is its long-term monitoring program, which enables State and Federal agencies to determine the effectiveness of each restoration effort. One component of this monitoring program is the classification of high-resolution, color-infrared aerial photography at the U.S. Geological Survey's National Wetlands Research Center in Lafayette, Louisiana. Color-infrared aerial photography (9- by 9-inch) is obtained before project construction and several times after construction. Each frame is scanned on a photogrammetric scanner that produces a high-resolution image in Tagged Image File Format (TIFF). By using image-processing software, these TIFF files are then orthorectified and mosaicked to produce a seamless image of a project area and its associated reference area (a control site near the project that has common environmental features, such as marsh type, soil types, and water salinities). The project and reference areas are then classified according to pixel value into two distinct classes, land and water. After initial land-water ratios have been established by using photography obtained before and after project construction, subsequent comparisons can be made over time to determine land-water change.
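A toy version of the land/water step is a simple threshold on the near-infrared band, where open water absorbs strongly; the threshold below is a placeholder, not the center's calibrated classification procedure:

    import numpy as np
    import tifffile

    cir = tifffile.imread("project_mosaic.tif")   # hypothetical CIR mosaic, NIR in band 0
    water = cir[..., 0].astype(float) < 60        # placeholder NIR threshold
    print(f"water fraction: {water.mean():.3f}, land fraction: {1 - water.mean():.3f}")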
Data publication and sharing using the SciDrive service
NASA Astrophysics Data System (ADS)
Mishin, Dmitry; Medvedev, D.; Szalay, A. S.; Plante, R. L.
2014-01-01
Despite progress in scientific data storage over recent years, the problem remains of a public data storage and sharing system for relatively small scientific datasets. These are the collections forming the "long tail" of the power-law distribution of dataset sizes. The aggregated size of the long-tail data is comparable to the size of all data collections from large archives, and the value of the data is significant. The SciDrive project's main goal is to provide the scientific community with a place to reliably and freely store such data and to make it accessible to the broad scientific community. The primary target audience of the project is the astronomy community, and it will be extended to other fields. We aim to create a simple way of publishing a dataset, which can then be shared with other people. The data owner controls the permissions to modify and access the data and can assign a group of users or open access to everyone. The data contained in the dataset is automatically recognized by a background process. Known data formats are extracted according to the user's settings. Currently, tabular data can be automatically extracted to the user's MyDB table, where the user can make SQL queries against the dataset and merge it with other public CasJobs resources. Other data formats can be processed using a set of plugins that upload the data or metadata to user-defined side services. The current implementation targets some of the data formats commonly used by the astronomy community, including FITS, ASCII and Excel tables, TIFF images, and YT simulation data archives. Along with generic metadata, format-specific metadata is also processed. For example, basic information about celestial objects is extracted from FITS files and TIFF images, if present. A 100 TB implementation has just been put into production at Johns Hopkins University. The system features a public data storage REST service supporting the VOSpace 2.0 and Dropbox protocols, an HTML5 web portal, a command-line client, and a standalone Java client to synchronize a local folder with the remote storage. We use the VAO SSO (Single Sign On) service from NCSA for user authentication, which provides free registration for everyone.
A seamless, high-resolution digital elevation model (DEM) of the north-central California coast
Foxgrover, Amy C.; Barnard, Patrick L.
2012-01-01
A seamless, 2-meter resolution digital elevation model (DEM) of the north-central California coast has been created from the most recent high-resolution bathymetric and topographic datasets available. The DEM extends approximately 150 kilometers along the California coastline, from Half Moon Bay north to Bodega Head. Coverage extends inland to an elevation of +20 meters and offshore to at least the 3 nautical mile limit of state waters. This report describes the procedures of DEM construction, details the input data sources, and provides the DEM for download in both ESRI Arc ASCII and GeoTIFF file formats with accompanying metadata.
The survey on data format of Earth observation satellite data at JAXA.
NASA Astrophysics Data System (ADS)
Matsunaga, M.; Ikehata, Y.
2017-12-01
JAXA's Earth observation satellite data are distributed by a portal web site for search and delivery called "G-Portal". Users can download the satellite data of GPM, TRMM, Aqua, ADEOS-II, ALOS (search only), ALOS-2 (search only), MOS-1, MOS-1b, ERS-1 and JERS-1 from G-Portal. However, the data formats differ from satellite to satellite (HDF4, HDF5, NetCDF4, CEOS, etc.), and these formats are not familiar to new data users. Although an HDF-type self-describing format is very convenient and useful for large datasets, the older formats are neither readable by open GIS tools nor compliant with OGC standards. Recently, satellite data have been widely applied to various needs such as disaster response, earth resources, monitoring the global environment, Geographic Information Systems (GIS), and so on. In order to remove a barrier to using Earth observation satellite data for new community users, JAXA has been providing format-converted products such as GeoTIFF or KMZ. In addition, JAXA provides a format conversion tool itself. We investigate the trends in data formats for data archiving, dissemination, and utilization; we then study how to improve the current product formats for users in various application fields and make a recommendation for new products.
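As one example of such a format conversion, a single HDF5 dataset can be translated to GeoTIFF with GDAL; the granule and dataset path below are hypothetical, and real swath products may additionally need georeferencing options:

    from osgeo import gdal

    src = 'HDF5:"gcom_example.h5"://Image_data/precipitation'  # hypothetical subdataset
    gdal.Translate("precipitation.tif", src, format="GTiff")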
Visualization of GPM Standard Products at the Precipitation Processing System (PPS)
NASA Astrophysics Data System (ADS)
Kelley, O.
2010-12-01
Many of the standard data products for the Global Precipitation Measurement (GPM) constellation of satellites will be generated at and distributed by the Precipitation Processing System (PPS) at NASA Goddard. PPS will provide several means to visualize these data products. These visualization tools will be used internally by PPS analysts to investigate potential anomalies in the data files, and these tools will also be made available to researchers. Currently, a free data viewer called THOR, the Tool for High-resolution Observation Review, can be downloaded and installed on Linux, Windows, and Mac OS X systems. THOR can display swath and grid products, and to a limited degree, the low-level data packets that the satellite itself transmits to the ground system. Observations collected since the 1997 launch of the Tropical Rainfall Measuring Mission (TRMM) satellite can be downloaded from the PPS FTP archive, and in the future, many of the GPM standard products will also be available from this FTP site. To provide easy access to this 80-terabyte and growing archive, PPS currently operates an on-line ordering tool called STORM that provides geographic and time searches, browse-image display, and the ability to order user-specified subsets of standard data files. Prior to the anticipated 2013 launch of the GPM core satellite, PPS will expand its visualization tools by integrating an on-line version of THOR within STORM to provide on-the-fly image creation of any portion of an archived data file at a user-specified degree of magnification. PPS will also provide OpenDAP access to the data archive and OGC WMS image creation of both swath and gridded data products. During the GPM era, PPS will continue to provide real-time, globally gridded 3-hour rainfall estimates to the public in a compact binary format (3B42RT) and in a GIS format (2-byte TIFF images + ESRI WorldFiles).
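The TIFF-plus-WorldFile pairing mentioned above georeferences an image with six plain-text coefficients (pixel x-size, two rotation terms, negative pixel y-size, and the map coordinates of the upper-left pixel center). A sketch of using them, with hypothetical file names:

    import numpy as np
    import tifffile

    rain = tifffile.imread("3hr_rain.tif")         # 2-byte TIFF rainfall grid
    a, d, b, e, c, f = np.loadtxt("3hr_rain.tfw")  # world-file lines A, D, B, E, C, F
    row, col = 100, 250
    x = a * col + b * row + c                      # map x (longitude) of the pixel
    y = d * col + e * row + f                      # map y (latitude)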
What is meant by Format Version? Product Version? Collection?
Atmospheric Science Data Center
2017-10-12
The format version is used to distinguish between software deliveries to ASDC that result in a product format change. The format version is given in the MISR data file name using the designator _Fnn_, where nn is the version number. ...
The Snow Data System at NASA JPL
NASA Astrophysics Data System (ADS)
Horn, J.; Painter, T. H.; Bormann, K. J.; Rittger, K.; Brodzik, M. J.; Skiles, M.; Burgess, A. B.; Mattmann, C. A.; Ramirez, P.; Joyce, M.; Goodale, C. E.; McGibbney, L. J.; Zimdars, P.; Yaghoobi, R.
2017-12-01
The Snow Data System at NASA JPL includes data processing pipelines built with open source software, Apache 'Object Oriented Data Technology' (OODT). Processing is carried out in parallel across a high-powered computing cluster. The pipelines use input data from satellites such as MODIS, VIIRS and Landsat. They apply algorithms to the input data to produce a variety of outputs in GeoTIFF format. These outputs include daily data for SCAG (Snow Cover And Grain size) and DRFS (Dust Radiative Forcing in Snow), along with 8-day composites and MODICE annual minimum snow and ice calculations. This poster will describe the Snow Data System, its outputs and their uses and applications. It will also highlight recent advancements to the system and plans for the future.
The Snow Data System at NASA JPL
NASA Astrophysics Data System (ADS)
Joyce, M.; Laidlaw, R.; Painter, T. H.; Bormann, K. J.; Rittger, K.; Brodzik, M. J.; Skiles, M.; Burgess, A. B.; Mattmann, C. A.; Ramirez, P.; Goodale, C. E.; McGibbney, L. J.; Zimdars, P.; Yaghoobi, R.
2016-12-01
The Snow Data System at NASA JPL includes data processing pipelines built with open source software, Apache 'Object Oriented Data Technology' (OODT). Processing is carried out in parallel across a high-powered computing cluster. The pipelines use input data from satellites such as MODIS, VIIRS and Landsat. They apply algorithms to the input data to produce a variety of outputs in GeoTIFF format. These outputs include daily data for SCAG (Snow Cover And Grain size) and DRFS (Dust Radiative Forcing in Snow), along with 8-day composites and MODICE annual minimum snow and ice calculations. This poster will describe the Snow Data System, its outputs and their uses and applications. It will also highlight recent advancements to the system and plans for the future.
Dying star creates sculpture of gas and dust
NASA Astrophysics Data System (ADS)
2004-09-01
Dying star creates sculpture of gas and dust (Credits: ESA, NASA, HEIC and The Hubble Heritage Team (STScI/AURA); acknowledgment: R. Corradi (Isaac Newton Group of Telescopes, Spain) and Z. Tsvetanov (NASA)). The so-called Cat's Eye Nebula, formally catalogued NGC 6543 and seen here in this detailed view from the NASA/ESA Hubble Space Telescope, is one of the most complex planetary nebulae ever seen in space. A planetary nebula forms when Sun-like stars gently eject their outer gaseous layers to form bright nebulae with amazing twisted shapes. Hubble first revealed NGC 6543's surprisingly intricate structures, including concentric gas shells, jets of high-speed gas and unusual shock-induced knots of gas, in 1994. This new image, taken with Hubble's Advanced Camera for Surveys (ACS), reveals the full beauty of a bull's-eye pattern of eleven or more concentric rings, or shells, around the Cat’s Eye. Each ‘ring’ is actually the edge of a spherical bubble seen projected onto the sky, which is why it appears bright along its outer edge. An enormous but extremely faint halo of gaseous material surrounds the Cat’s Eye Nebula and is over three light-years across. Some planetary nebulae have been found to have halos like this one, likely formed of material ejected during earlier active episodes in the star's evolution, most likely some 50 000 to 90 000 years ago. A second image was taken by Romano Corradi with the Nordic Optical Telescope on La Palma in the Canary Islands; it is constructed from two narrow-band exposures showing oxygen atoms (1800 seconds, in blue) and nitrogen atoms (1800 seconds, in red). Although the rings may be the key to explaining the final ‘gasp’ of the dying central star, the mystery behind the Cat’s Eye Nebula’s nested ‘Russian doll’ structure remains largely unsolved. Observations suggest that the star ejected its mass in a series of pulses at 1500-year intervals. These convulsions created dust shells, each of which contains as much mass as all of the planets in our Solar System combined (but still only one percent of the Sun's mass). These concentric shells make a layered, onion-skin structure around the dying star. The view from Hubble is like seeing an onion cut in half, where each layer of skin is discernible.
Until recently, it was thought that shells around planetary nebulae were a rare phenomenon. However, Romano Corradi (Isaac Newton Group of Telescopes, Spain) and collaborators, in a paper published in the European journal Astronomy & Astrophysics in April 2004, have shown that the formation of these rings is likely to be the rule rather than the exception. The bull's-eye patterns seen around planetary nebulae came as a surprise to astronomers, who had not expected episodes of mass loss at the end of stellar lives to repeat every 1500 years or so. Several explanations have been proposed, including cycles of magnetic activity somewhat similar to our own Sun's sunspot cycle, the action of companion stars orbiting around the dying star, and stellar pulsations. Another school of thought is that the material is ejected smoothly from the star and the rings are created later through the formation of waves in the outflowing material. It will take further observations and more theoretical studies to decide between these and other possible explanations. Approximately 1000 years ago the pattern of mass loss suddenly changed, and the Cat's Eye Nebula itself started forming inside the dusty shells. It has been expanding ever since, as can be seen by comparing Hubble images taken in 1994, 1997, 2000 and 2002. But what has caused this dramatic change? Many aspects of the process that leads a star to lose its gaseous envelope are poorly known, and the study of planetary nebulae is one of the few ways to recover information about the last few thousand years in the life of a Sun-like star. Notes for editors: The group of astronomers involved in the April 2004 Astronomy & Astrophysics paper are: R.L.M. Corradi (Isaac Newton Group of Telescopes, Spain), P. Sanchez-Blazquez (Universidad Complutense, Spain), G. Mellema (Foundation for Research in Astronomy, The Netherlands), C. Giammanco (Instituto de Astrofisica de Canarias, Spain) and H.E. Schwarz (Cerro Tololo Inter-American Observatory, Chile). The Hubble Space Telescope is a project of international co-operation between ESA and NASA.
Image quality (IQ) guided multispectral image compression
NASA Astrophysics Data System (ADS)
Zheng, Yufeng; Chen, Genshe; Wang, Zhonghai; Blasch, Erik
2016-05-01
Image compression is necessary for data transportation, as it saves both transfer time and storage space. In this paper, we focus our discussion on lossy compression. There are many standard image formats and corresponding compression algorithms, for example, JPEG (DCT, discrete cosine transform), JPEG 2000 (DWT, discrete wavelet transform), BPG (better portable graphics) and TIFF (LZW, Lempel-Ziv-Welch). The image quality (IQ) of the decompressed image is measured by numerical metrics such as root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and the Structural Similarity (SSIM) index. Given an image and a specified IQ, we investigate how to select a compression method and its parameters to achieve the expected compression. Our approach consists of three steps. The first step is to compress a set of images of interest with varying parameters and compute their IQs for each compression method. The second step is to create several regression models per compression method by analyzing the IQ measurements versus the compression parameters over a number of compressed images. The third step is to compress the given image at the specified IQ using the selected compression method (JPEG, JPEG2000, BPG, or TIFF) according to the regression models. The IQ may be specified by a compression ratio (e.g., 100), in which case we select the compression method with the highest IQ (SSIM or PSNR); or the IQ may be specified by an IQ metric (e.g., SSIM = 0.8, or PSNR = 50), in which case we select the compression method with the highest compression ratio. Our experiments on thermal (long-wave infrared) grayscale images showed very promising results.
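A minimal sketch of this three-step scheme, for a single codec (JPEG via Pillow) and a single metric (PSNR), might look as follows; the function names, quality grid, and quadratic regression are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of the IQ-guided scheme for one codec (JPEG via Pillow)
# and one metric (PSNR). All names are illustrative assumptions.
import io
import numpy as np
from PIL import Image

def psnr(a, b):
    """Peak signal-to-noise ratio between two 8-bit grayscale arrays."""
    rmse = np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))
    return float("inf") if rmse == 0 else 20.0 * np.log10(255.0 / rmse)

def fit_quality_model(img, qualities=range(10, 96, 5)):
    """Steps 1-2: compress at several qualities, regress quality on PSNR."""
    ref = np.asarray(img.convert("L"))
    qs, iqs = [], []
    for q in qualities:
        buf = io.BytesIO()
        img.convert("L").save(buf, format="JPEG", quality=q)
        buf.seek(0)
        iqs.append(psnr(ref, np.asarray(Image.open(buf))))
        qs.append(q)
    return np.polyfit(iqs, qs, deg=2)      # quality as a function of PSNR

def compress_to_target(img, target_psnr, model):
    """Step 3: choose the regressed quality for the requested IQ."""
    q = int(np.clip(np.polyval(model, target_psnr), 1, 95))
    buf = io.BytesIO()
    img.convert("L").save(buf, format="JPEG", quality=q)
    return buf.getvalue()
```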
Tularosa Basin Play Fairway Analysis: Weights of Evidence; Mineralogy, and Temperature Anomaly Maps
Adam Brandt
2015-11-15
This submission has two shapefiles and a TIFF image. The weights of evidence analysis was applied to data representing heat of the earth and fracture permeability using training sites around the Southwest; this is shown in the TIFF image. A shapefile of surface temperature anomalies was derived from the statistical analysis of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) thermal infrared data which had been converted to surface temperatures; these anomalies have not been field checked. The second shapefile shows outcrop mineralogy that was originally mapped by the New Mexico Bureau of Geology and Mineral Resources and supplemented with mineralogic information related to rock fracability risk for EGS. Further metadata can be found within each file.
Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.
2009-01-01
In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.
Zots, M A; Ivashkina, O I; Ivanova, A A; Anokhin, K V
2014-03-01
We studied the formation of spatial and nonspatial memory in mice during learning in three different condensed versions of the Morris water maze task. Learning in the combined version caused the formation of both spatial and nonspatial memory, whereas learning in the condensed versions (spatial and nonspatial) led to memory formation specific to that version.
IIPImage: Large-image visualization
NASA Astrophysics Data System (ADS)
Pillay, Ruven
2014-08-01
IIPImage is an advanced, high-performance, feature-rich image server system that enables online access to full-resolution floating point (as well as other bit depth) images at terabyte scales. Paired with the VisiOmatic (ascl:1408.010) celestial image viewer, the system can comfortably handle gigapixel-size images as well as advanced image features such as 8-, 16- and 32-bit depths, CIELAB colorimetric images and scientific imagery such as multispectral images. Streaming is tile-based, which enables viewing, navigating and zooming in real time around gigapixel-size images. Source images can be in either TIFF or JPEG2000 format. Whole images or regions within images can also be rapidly and dynamically resized and exported by the server from a single source image, without the need to store multiple files in various sizes.
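As a flavour of the export capability, the following sketch requests a dynamically resized region from an IIPImage server over HTTP; the server URL and image path are placeholders, and the FIF/RGN/WID/CVT query parameters follow the IIP protocol as we understand it and should be verified against the deployed server's documentation.

```python
# Sketch of exporting a resized region from an IIPImage server over HTTP.
# Server URL and image path are placeholders; query parameters should be
# checked against the IIP protocol documentation.
from urllib.parse import urlencode
from urllib.request import urlopen

IIP_SERVER = "http://example.org/fcgi-bin/iipsrv.fcgi"  # hypothetical host

def fetch_region(image, x, y, w, h, width=1024):
    """Export a region (relative 0-1 coordinates) as a JPEG of given width."""
    query = urlencode({
        "FIF": image,                  # source TIFF or JPEG2000 on the server
        "RGN": f"{x},{y},{w},{h}",     # left, top, width, height (0-1)
        "WID": width,                  # output width in pixels
        "CVT": "jpeg",                 # convert and export on the fly
    })
    with urlopen(f"{IIP_SERVER}?{query}") as resp:
        return resp.read()

jpeg_bytes = fetch_region("/data/survey.tif", 0.25, 0.25, 0.5, 0.5)
```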
Water and Streambed-Sediment Quality in the Upper Elk River Basin, Missouri and Arkansas, 2004-06
Smith, Brenda J.; Richards, Joseph M.; Schumacher, John G.
2007-01-01
The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, collected water and streambed-sediment samples in the Upper Elk River Basin in southwestern Missouri and northwestern Arkansas from October 2004 through December 2006. The samples were collected to determine the stream-water quality and streambed-sediment quality. In 1998, the Missouri Department of Natural Resources included a 21.5-mile river reach of the Elk River on the 303(d) list of impaired waters in Missouri, as required by Section 303(d) of the Federal Clean Water Act. The Elk River is on the 303(d) list for excess nutrient loading. The total phosphorus distribution by decade indicates that concentrations since 2000 have increased significantly from those in the 1960s, 1980s, and 1990s. The nitrate as nitrogen (nitrate) concentrations also have increased significantly in post-1985 samples relative to pre-1985 samples collected at the Elk River near Tiff City. Concentrations have increased significantly since the 1960s: concentrations in the 1970s and 1980s, though similar to each other, increased from those in the 1960s, and concentrations in the 1990s and 2000s increased still more. Nitrate concentrations increased significantly in samples that were collected during large discharges (greater than 355 cubic feet per second) from the Elk River near Tiff City. Nitrate concentrations were largest in Indian Creek. Several sources of nitrate are present in the basin, including poultry facilities in the upper part of the basin, effluent inflow from the communities of Anderson and Lanagan, land-applied animal waste, chemical fertilizer, and possibly leaking septic systems. Total phosphorus concentrations were largest in Little Sugar Creek. The median concentration of total phosphorus in samples from Little Sugar Creek near Pineville was almost four times the median concentration in samples from the Elk River near Tiff City. Median concentrations of nutrient species were greater in the stormwater samples than in the ambient samples. Nitrate concentrations in stormwater samples ranged from 133 to 179 percent of the concentrations in the ambient samples. The total phosphorus concentrations in the stormwater samples ranged from about 200 to more than 600 percent of the concentrations in the ambient samples. Base-flow conditions, as reflected by the seepage run of the summer of 2006, indicate that 52 percent of the discharge at the Elk River near Tiff City is contributed by Indian Creek. Little Sugar Creek contributes 32 percent and Big Sugar Creek 9 percent of the discharge in the Elk River near Tiff City. Only about 7 percent of the discharge at Tiff City comes from the mainstem of the Elk River. Concentrations of dissolved ammonia plus organic nitrogen as nitrogen, dissolved ammonia as nitrogen, dissolved phosphorus, and dissolved orthophosphorus were detected in all streambed-sediment leachate samples. Concentrations of leachable nutrients in streambed-sediment samples generally tended to be slightly larger along the major forks of the Elk River than at tributary sites, with sites in the upper reaches of the major forks having among the largest concentrations. Concentrations of leachable nutrients in the major forks generally decreased with increasing distance downstream.
López, Carlos; Jaén Martinez, Joaquín; Lejeune, Marylène; Escrivà, Patricia; Salvadó, Maria T; Pons, Lluis E; Alvaro, Tomás; Baucells, Jordi; García-Rojo, Marcial; Cugat, Xavier; Bosch, Ramón
2009-10-01
The volume of digital image (DI) storage continues to be an important problem in computer-assisted pathology. DI compression enables the size of files to be reduced but with the disadvantage of loss of quality. Previous results indicated that the efficiency of computer-assisted quantification of immunohistochemically stained cell nuclei may be significantly reduced when compressed DIs are used. This study attempts to show, with respect to immunohistochemically stained nuclei, which morphometric parameters may be altered by the different levels of JPEG compression, and the implications of these alterations for automated nuclear counts, and further, develops a method for correcting this discrepancy in the nuclear count. For this purpose, 47 DIs from different tissues were captured in uncompressed TIFF format and converted to 1:3, 1:23 and 1:46 compression JPEG images. Sixty-five positive objects were selected from these images, and six morphological parameters were measured and compared for each object in TIFF images and those of the different compression levels using a set of previously developed and tested macros. Roundness proved to be the only morphological parameter that was significantly affected by image compression. Factors to correct the discrepancy in the roundness estimate were derived from linear regression models for each compression level, thereby eliminating the statistically significant differences between measurements in the equivalent images. These correction factors were incorporated in the automated macros, where they reduced the nuclear quantification differences arising from image compression. Our results demonstrate that it is possible to carry out unbiased automated immunohistochemical nuclear quantification in compressed DIs with a methodology that could be easily incorporated in different systems of digital image analysis.
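The correction idea lends itself to a compact sketch: regress the reference (TIFF) roundness on the roundness measured at a given JPEG compression level, then apply the fitted line to later measurements. The arrays below are synthetic stand-ins, not the study's data.

```python
# Hypothetical sketch of the correction-factor idea: a linear regression
# maps roundness measured on a compressed image back to the TIFF scale.
# Synthetic data stand in for the study's 65 measured objects.
import numpy as np

rng = np.random.default_rng(0)
roundness_tiff = rng.uniform(0.6, 1.0, 65)                        # reference
roundness_jpeg = 0.92 * roundness_tiff + 0.03 + rng.normal(0, 0.01, 65)

slope, intercept = np.polyfit(roundness_jpeg, roundness_tiff, deg=1)

def corrected_roundness(measured):
    """Correct a roundness value measured on a JPEG-compressed image."""
    return slope * measured + intercept
```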
Jones, William R.; Garber, Adrienne
2012-01-01
The Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) funds over 100 wetland restoration projects across Louisiana. Integral to the success of CWPPRA is its long-term monitoring program, which enables State and Federal agencies to determine the effectiveness of each restoration effort. One component of this monitoring program is the analysis of high-resolution, color-infrared aerial photography at the U.S. Geological Survey's National Wetlands Research Center in Lafayette, Louisiana. Color-infrared aerial photography (9- by 9-inch) is obtained before project construction and several times after construction. Each frame is scanned on a photogrammetric scanner that produces a high-resolution image in Tagged Image File Format (TIFF). By using image-processing software, these TIFF files are then orthorectified and mosaicked to produce a seamless image of a project area and its associated reference area (a control site near the project that has common environmental features, such as marsh type, soil types, and water salinities). The project and reference areas are then classified according to pixel value into two distinct classes, land and water. After initial land and water ratios have been established by using photography obtained before and after project construction, subsequent comparisons can be made over time to determine land-water change. Several challenges are associated with the land-water interpretation process. Primarily, land-water classifications are often complicated by the presence of floating aquatic vegetation that occurs throughout the freshwater systems of coastal Louisiana and that is sometimes difficult to differentiate from emergent marsh. Other challenges include tidal fluctuations and water movement from strong winds, which may result in flooding and inundation of emergent marsh during certain conditions. Compensating for these events is difficult but possible by using other sources of imagery to verify marsh conditions for other dates.
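A minimal sketch of the land-water step, assuming a single near-infrared band and a global threshold (the actual workflow uses interactive image-processing software and scene-specific judgment):

```python
# Hypothetical land-water classification: threshold the near-infrared band
# of an orthorectified color-infrared mosaic into land (1) and water (0).
# Band index and threshold value are assumptions.
import numpy as np
import rasterio

with rasterio.open("project_mosaic.tif") as src:    # placeholder mosaic
    nir = src.read(1).astype(float)                 # CIR band 1 ~ near-IR
    profile = src.profile

water_threshold = 60.0            # water reflects little NIR; tune per scene
land_water = (nir > water_threshold).astype(np.uint8)

profile.update(count=1, dtype="uint8", nodata=None)
with rasterio.open("land_water.tif", "w", **profile) as dst:
    dst.write(land_water, 1)
```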
Boleneus, David E.; Appelgate, Larry M.; Joseph, Nancy L.; Brandt, Theodore R.
2001-01-01
Geologic maps of the western part of the Belt Basin of western Montana and northern Idaho were converted into digital raster (TIFF image) format to facilitate their manipulation in geographic information systems. The 85-mile x 100-mile map area mostly contains rocks belonging to the lower and middle Belt Supergroup. The area is of interest because these Middle Proterozoic strata contain vein-type lead-zinc-silver deposits in the Coeur d'Alene Mining District, in the St. Regis and Revett formations, and strata-bound copper-silver deposits, such as the Troy mine, within the Revett Formation. The Prichard Formation is also prospective for strata-bound lead-zinc deposits because equivalent Belt strata in southern British Columbia, Canada, host the Sullivan lead-zinc deposit. Map data converted to digital images include 13 geological maps at scales ranging from 1:48,000 to 1:12,000. Geologic map images produced from these maps by color scanning were registered to grid tick coverages in a Universal Transverse Mercator (North American Datum of 1927, zone 11) projection using ArcView Image Analysis. Geo-registering errors vary from 10 ft to 114 ft.
Rothkirch, André; Gatta, G Diego; Meyer, Mathias; Merkel, Sébastien; Merlini, Marco; Liermann, Hanns Peter
2013-09-01
Fast detectors employed at third-generation synchrotrons have reduced collection times significantly and require the optimization of commercial as well as customized software packages for data reduction and analysis. In this paper a procedure to collect, process and analyze single-crystal data sets collected at high pressure at the Extreme Conditions beamline (P02.2) at PETRA III, DESY, is presented. A new data image format called 'Esperanto' is introduced that is supported by the commercial software package CrysAlisPro (Agilent Technologies UK Ltd). The new format acts as a vehicle to transform the most common area-detector data formats via translator software. Such a conversion tool has been developed; it converts TIFF data collected on a Perkin Elmer detector, as well as data collected on a MAR345/555, for import into the CrysAlisPro software. In order to demonstrate the validity of the new approach, a complete structure refinement of boron-mullite (Al5BO9) collected at a pressure of 19.4 (2) GPa is presented. Details pertaining to the data collections and refinements of B-mullite are presented.
Kato, A; Ohno, N
2009-03-01
The study of dental morphology is essential to phylogeny. Advances in three-dimensional (3D) measurement devices have enabled us to make 3D images of teeth without destruction of samples. However, acquiring raw fundamental data on tooth shape requires complex equipment and techniques. An online database of 3D teeth models is therefore indispensable. We aimed to explore the basic methodology for constructing 3D teeth models, with application to data sharing. Geometric information on a human permanent upper left incisor was obtained using micro-computed tomography (micro-CT). Enamel, dentine, and pulp were segmented by thresholding of different gray-scale intensities. Segmented data were separately exported in STereoLithography Interface Format (STL). STL data were converted to Wavefront OBJ (OBJect), as many 3D computer graphics programs support the Wavefront OBJ format. Data were also converted to QuickTime Virtual Reality (QTVR) format, which allows the image to be viewed from any direction. In addition to the Wavefront OBJ and QTVR data, the original CT series were provided as 16-bit Tag Image File Format (TIFF) images on the website. In conclusion, 3D teeth models were constructed in general-purpose data formats, using micro-CT and commercially available programs. Teeth models that can be used widely would benefit all those who study dental morphology.
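Under the assumption of open-source tooling (the authors used commercial programs), the pipeline from a TIFF CT series to STL and OBJ meshes can be sketched as follows; the file paths and the enamel threshold are hypothetical.

```python
# Open-source sketch of the described pipeline: read a 16-bit TIFF CT
# series, threshold enamel, build a surface with marching cubes, and
# export STL and Wavefront OBJ. Paths and threshold are hypothetical.
import glob
import numpy as np
import tifffile
import trimesh
from skimage import measure

slices = sorted(glob.glob("tooth_ct/*.tif"))
volume = tifffile.imread(slices)                   # stack of 16-bit slices
enamel_mask = (volume > 30000).astype(np.uint8)    # gray-scale threshold

verts, faces, _, _ = measure.marching_cubes(enamel_mask, level=0.5)
mesh = trimesh.Trimesh(vertices=verts, faces=faces)
mesh.export("enamel.stl")                          # STL from segmentation
mesh.export("enamel.obj")                          # OBJ for graphics programs
```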
Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.
2007-01-01
In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
NASA Astrophysics Data System (ADS)
Bruno, L. S.; Rodrigo, B. P.; Lucio, A. de C. Jorge
2016-10-01
This paper presents a system based on a Multilayer Perceptron neural network for the segmentation of agricultural images acquired by drones. The application allows a user to supervise the training of the classes that will later be interpreted by the neural network; these classes are generated manually, with pre-selected attributes, in the application. After attribute selection, a segmentation process extracts the relevant information for different types of images, RGB or hyperspectral. The application can also extract geographical coordinates from the image metadata, georeferencing every pixel in the image. Despite the excessive memory consumption of hyperspectral regions of interest, segmentation is possible using bands chosen by the user, which can be combined in different ways to obtain different results.
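A hypothetical sketch of the supervised pixel-classification idea using scikit-learn's Multilayer Perceptron, with per-pixel RGB values standing in for the paper's user-selected attributes; the training samples and all names are illustrative, not the authors' application.

```python
# Illustrative MLP pixel classification; attributes, labels, and file
# names are assumptions standing in for the paper's interactive workflow.
import numpy as np
from PIL import Image
from sklearn.neural_network import MLPClassifier

img = np.asarray(Image.open("field_rgb.tif").convert("RGB"))  # drone image
pixels = img.reshape(-1, 3).astype(float) / 255.0             # row per pixel

# user-supplied training pairs: (attribute vector, class label)
train_x = np.array([[0.2, 0.5, 0.1], [0.4, 0.3, 0.2], [0.1, 0.1, 0.3]])
train_y = np.array([0, 1, 2])                                 # e.g. crop, soil, shadow

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(train_x, train_y)

segmented = clf.predict(pixels).reshape(img.shape[:2])        # class map
```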
A software platform for the analysis of dermatology images
NASA Astrophysics Data System (ADS)
Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon
2017-11-01
The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability to read a file containing a dermatology image and supports image formats such as Windows bitmap, JPEG, JPEG2000, Portable Network Graphics and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image, followed by thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
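The automated ROI step as described (smoothing followed by thresholding) can be sketched with scikit-image; the Gaussian sigma, the use of Otsu's method, and the assumption that lesions are darker than surrounding skin are ours, not necessarily the platform's.

```python
# Sketch of an automated ROI selection: smooth, then threshold.
# Parameter choices are illustrative assumptions.
from skimage import filters, io

image = io.imread("lesion.tiff", as_gray=True)   # any supported format
smoothed = filters.gaussian(image, sigma=2.0)    # suppress noise and hair
threshold = filters.threshold_otsu(smoothed)     # automatic global threshold
roi_mask = smoothed < threshold                  # lesion darker than skin
```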
PredictABEL: an R package for the assessment of risk prediction models.
Kundu, Suman; Aulchenko, Yurii S; van Duijn, Cornelia M; Janssens, A Cecile J W
2011-04-01
The rapid identification of genetic markers for multifactorial diseases from genome-wide association studies is fuelling interest in investigating the predictive ability and health care utility of genetic risk models. Various measures are available for the assessment of risk prediction models, each addressing a different aspect of performance and utility. We developed PredictABEL, a package in R that covers the descriptive tables, measures and figures used in the analysis of risk prediction studies, including measures of model fit, predictive ability and clinical utility, risk distributions, calibration plots and receiver operating characteristic plots. Tables and figures are saved as separate files in a user-specified format, including publication-quality EPS and TIFF formats. All figures are available in a ready-made layout, but they can be customized to the preferences of the user. The package has been developed for the analysis of genetic risk prediction studies, but can also be used for studies that include only non-genetic risk factors. PredictABEL is freely available at the websites of GenABEL ( http://www.genabel.org ) and CRAN ( http://cran.r-project.org/ ).
Lossless Data Embedding—New Paradigm in Digital Watermarking
NASA Astrophysics Data System (ADS)
Fridrich, Jessica; Goljan, Miroslav; Du, Rui
2002-12-01
One common drawback of virtually all current data embedding methods is the fact that the original image is inevitably distorted due to data embedding itself. This distortion typically cannot be removed completely due to quantization, bit-replacement, or truncation at the grayscales 0 and 255. Although the distortion is often quite small and perceptual models are used to minimize its visibility, the distortion may not be acceptable for medical imagery (for legal reasons) or for military images inspected under nonstandard viewing conditions (after enhancement or extreme zoom). In this paper, we introduce a new paradigm for data embedding in images (lossless data embedding) that has the property that the distortion due to embedding can be completely removed from the watermarked image after the embedded data has been extracted. We present lossless embedding methods for the uncompressed formats (BMP, TIFF) and for the JPEG format. We also show how the concept of lossless data embedding can be used as a powerful tool to achieve a variety of nontrivial tasks, including lossless authentication using fragile watermarks, steganalysis of LSB embedding, and distortion-free robust watermarking.
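A toy illustration of the lossless principle (not the authors' algorithm): losslessly compress the image's LSB plane, embed the compressed plane together with the payload in the LSBs, and on extraction restore the original LSBs bit-exactly. This only works when the LSB plane compresses well; the paper addresses that limitation with more sophisticated constructions.

```python
# Toy lossless embedding for an 8-bit grayscale numpy image; not the
# paper's method, only a demonstration of reversibility.
import zlib
import numpy as np

def embed(pixels, payload):
    """Overwrite LSBs with (lengths + compressed original LSB plane + payload)."""
    lsb = np.packbits(pixels.ravel() & 1).tobytes()
    comp = zlib.compress(lsb)
    record = (len(comp).to_bytes(4, "big") + len(payload).to_bytes(4, "big")
              + comp + payload)
    bits = np.unpackbits(np.frombuffer(record, dtype=np.uint8))
    if bits.size > pixels.size:
        raise ValueError("LSB plane not compressible enough for this payload")
    flat = pixels.copy().ravel()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(pixels.shape)

def extract_and_restore(pixels):
    """Recover the payload and return the bit-exact original image."""
    raw = np.packbits(pixels.ravel() & 1).tobytes()
    n_comp = int.from_bytes(raw[:4], "big")
    n_pay = int.from_bytes(raw[4:8], "big")
    comp, payload = raw[8:8 + n_comp], raw[8 + n_comp:8 + n_comp + n_pay]
    original_lsb = np.unpackbits(
        np.frombuffer(zlib.decompress(comp), dtype=np.uint8))[:pixels.size]
    flat = pixels.copy().ravel()
    flat[:] = (flat & 0xFE) | original_lsb
    return payload, flat.reshape(pixels.shape)
```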
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) copying of ADAMS documents to paper (applies to images, OCR TIFF, and PDF text) is $0.30 per page. (B) EFT... is $0.30 per page. (vi) Priority rates (rush processing) are as follows: (A) The priority rate...
The Open Microscopy Environment: open image informatics for the biological sciences
NASA Astrophysics Data System (ADS)
Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.
2016-07-01
Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
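For example, an OME-TIFF and its embedded OME-XML metadata can be read from Python with the open-source tifffile package, one of several readers for the format (Bio-Formats itself is Java); the file name is a placeholder.

```python
# Reading an OME-TIFF and its OME Data Model metadata with tifffile.
import tifffile

with tifffile.TiffFile("example.ome.tif") as tif:
    pixels = tif.asarray()          # image planes as a numpy array
    ome_xml = tif.ome_metadata      # OME Data Model metadata (XML text)

print(pixels.shape)
```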
http://www.esa.int/esaSC/Pr_21_2004_s_en.html
NASA Astrophysics Data System (ADS)
2004-09-01
[Image: X-ray brightness map. Credits: ESA/XMM-Newton/Patrick Henry et al.] This map shows 'surface brightness', or how luminous the region is. The larger of the two galaxy clusters is brighter, shown here as a white and red spot. A second cluster resides at about '2 o'clock' from this, shown by a patch of yellow surrounded by green. Luminosity is related to density, so the densest regions (the cluster cores) are the brightest regions. The white colour corresponds to regions of the highest surface brightness, followed by red, orange, yellow, green, blue and purple. [Image: Artist's impression of a cosmic head-on collision. Credits: NASA] The tiny dots in this artist's concept are galaxies containing thousands of millions of stars. [Image: Temperature map. Credits: ESA/XMM-Newton/Patrick Henry et al.] This image shows the temperature of gas in and around the two merging galaxy clusters, based directly on X-ray data. The galaxies themselves are difficult to identify; the image highlights the hot 'invisible' gas between the clusters, heated by shock waves. The white colour corresponds to regions of the highest temperature, millions of degrees, hotter than the surface of the Sun, followed by red, orange, yellow and blue. The event details what the scientists are calling the 'perfect cosmic storm': galaxy clusters that collided like two high-pressure weather fronts and created hurricane-like conditions, tossing galaxies far from their paths and churning shock waves of 100-million-degree gas through intergalactic space. This unprecedented view of a merger in action crystallises the theory that the Universe built its magnificent hierarchical structure from the 'bottom up', essentially through mergers of smaller galaxies and galaxy clusters into bigger ones. "Here before our eyes we see the making of one of the biggest objects in the Universe," said Dr Patrick Henry of the University of Hawaii, who led the study. "What was once two distinct but smaller galaxy clusters 300 million years ago is now one massive cluster in turmoil." Henry and his colleagues, Alexis Finoguenov and Ulrich Briel of the Max-Planck Institute for Extraterrestrial Physics in Germany, present these results in an upcoming issue of the Astrophysical Journal. The forecast for the new super-cluster, they said, is 'clear and calm' now that the worst of the storm has passed. Galaxy clusters are the largest gravitationally bound structures in the Universe, containing hundreds to thousands of galaxies. Our Milky Way galaxy is part of a small group of galaxies but is not gravitationally bound to the closest cluster, the Virgo Cluster. We are destined for a collision in a few thousand million years, though. The cluster named Abell 754 in the constellation Hydra has been known for decades. However, to the scientists' surprise, the new observation reveals that the merger may have occurred from the opposite direction to what was thought.
They found evidence for this by tracing the wreckage left today in the merger's wake, spanning a distance of millions of light years. While other large mergers are known, none has been measured in such detail as Abell 754. For the first time, the scientists could create a complete 'weather map' of Abell 754 and thus determine a forecast. This map contains information about the temperature, pressure and density of the new cluster. As in all clusters, most of the ordinary matter is in the form of gas between the galaxies and is not locked up in the galaxies or stars themselves. The massive forces of the merging clusters accelerated the intergalactic gas to great speeds. This resulted in shock waves that heated the gas to very high temperatures, causing it to radiate X-ray light, far more energetic than the visible light our eyes can detect. XMM-Newton, in orbit, detects this type of high-energy light. The dynamics of the merger revealed by XMM-Newton point to a cluster in transition. "One cluster has apparently smashed into the other from the 'north-west' and has since made one pass through," said Finoguenov. "Now, gravity will pull the remnants of this first cluster back towards the core of the second. Over the next few thousand million years, the remnants of the clusters will settle and the merger will be complete." The observation implies that the largest structures in the Universe are essentially still forming in the modern era. Abell 754 is relatively close, about 800 million light years away. The construction boom may be over in a few more thousand million years, though. A mysterious substance dubbed 'dark energy' appears to be accelerating the Universe's expansion rate. This means that objects are flying apart from each other at an ever-increasing speed and that clusters may eventually never have the opportunity to collide with each other. X-ray observations of galaxy clusters such as Abell 754 will help to better define dark energy and also dark matter, an 'invisible' and mysterious substance that appears to comprise over 80 percent of a galaxy cluster's mass. Notes for editors: This observation was announced at a NASA Internet press conference today. A paper describing these results, by Patrick Henry and his collaborators, will be published in the Astrophysical Journal. Images and other visual material are available at: http://www.gsfc.nasa.gov/topstory/2004/0831galaxymerger_media.html More about XMM-Newton: ESA's XMM-Newton can detect more X-ray sources than any previous satellite and is helping to solve many cosmic mysteries of the violent Universe, from black holes to the formation of galaxies. It was launched on 10 December 1999, using an Ariane-5 rocket, from French Guiana. It is expected to return data for a decade. XMM-Newton's high-tech design uses over 170 wafer-thin cylindrical mirrors spread over three telescopes. Its orbit takes it almost a third of the way to the Moon, so that astronomers can enjoy long, uninterrupted views of celestial objects.
Tularosa Basin Play Fairway Analysis: Strain Analysis
Adam Brandt
2015-11-15
A DEM of the Tularosa Basin was divided into twelve zones, and a ZR ratio was calculated for each zone. This submission has a TIFF image of the zoning designations, along with a table of the respective ZR ratio calculations in the metadata.
77 FR 23382 - Airworthiness Directives; Sikorsky Aircraft Corporation Helicopters
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-19
... prior to that time. (e) Required Actions Within 90 days: (1) By making pen and ink changes, insert into... depicted in the circled area of Figure 1 of this AD. [GRAPHIC] [TIFF OMITTED] TR19AP12.000 (f) Alternative...
77 FR 42971 - Airworthiness Directives; Various Restricted Category Helicopters
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... specifies replacing any mast with a crack, pitting, or corrosion beyond surface rust that is removed with a... rust with a wire brush or steel wool. [GRAPHIC] [TIFF OMITTED] TR23JY12.003 (2) If there is a crack...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... result of management measures designed to meet the Pacific Coast Groundfish FMP objective of achieving..., subpart G, are revised to read as follows: BILLING CODE 3510-22-P [GRAPHIC] [TIFF OMITTED] TR23AU10.046 [[Page 51687
Cluster finds giant gas vortices at the edge of Earth's magnetic bubble
NASA Astrophysics Data System (ADS)
2004-08-01
12 August 2004. ESA's quartet of space-weather watchers, Cluster, has discovered vortices of ejected solar material high above the Earth. The superheated gases trapped in these structures are probably tunnelling their way into the Earth's magnetic 'bubble', the magnetosphere. This discovery possibly solves a 17-year mystery of how the magnetosphere is constantly topped up with electrified gases when it should be acting as a barrier. [Image: Three-dimensional cut-away view of Earth's magnetosphere. Credits: H. Hasegawa (Dartmouth College)] The curly features sketched on the boundary layer are the Kelvin-Helmholtz vortices discovered by Cluster. They originate where two adjacent flows travel at different speeds. In this case, one of the flows is the heated gas inside the boundary layer of the magnetosphere, the other the solar wind just outside it. The arrows show the direction of the magnetic field, in red that associated with the solar wind and in green the one inside Earth's magnetosphere. The white dashed arrow shows the trajectory followed by Cluster. [Image: Computer simulation of electrified gas density across the vortices along Cluster's trajectory. Credits: H. Hasegawa (Dartmouth College)] This computer simulation shows how the density of the electrified gas is expected to vary across the vortices along Cluster's trajectory (white dashed line). The density is lower inside the boundary layer (blue region) and higher outside, in the region dominated by the solar wind (shown in red). The density variations measured by the instruments on board Cluster match those predicted by this model. The Earth's magnetic field is our planet's first line of defence against the bombardment of the solar wind. The solar wind itself is launched from the Sun and carries the Sun's magnetic field throughout the Solar System. Sometimes this magnetic field is aligned with Earth's and sometimes it points in the opposite direction. When the two fields point in opposite directions, scientists understand how 'doors' in Earth's field can open. This phenomenon, called 'magnetic reconnection', allows the solar wind to flow in and collect in the reservoir known as the boundary layer. On the contrary, when the fields are aligned, they should present an impenetrable barrier to the flow. However, spacecraft measurements of the boundary layer, dating back to 1987, present a puzzle because they clearly show that the boundary layer is fuller when the fields are aligned than when they are not. So how is the solar wind getting in? Thanks to the data from the four formation-flying spacecraft of ESA's Cluster mission, scientists have made a breakthrough. On 20 November 2001, the Cluster flotilla was heading around from behind Earth and had just arrived at the dusk side of the planet, where the solar wind slides past Earth's magnetosphere. There it began to encounter gigantic vortices of gas at the magnetopause, the outer 'edge' of the magnetosphere. "These vortices were really huge structures, about six Earth radii across," says Hiroshi Hasegawa of Dartmouth College, New Hampshire, who has been analysing the data with help from an international team of colleagues.
Their results place the size of the vortices at almost 40 000 kilometres each, and this is the first time such structures have been detected. These vortices are known as products of Kelvin-Helmholtz instabilities (KHI). They can occur when two adjacent flows are travelling at different speeds, so that one is slipping past the other. Good examples of such instabilities are the waves whipped up by the wind slipping across the surface of the ocean. Although KHI waves had been observed before, this is the first time that vortices have actually been detected. When a KHI wave rolls up into a vortex, it becomes known as a 'Kelvin cat's eye'. The data collected by Cluster have shown density variations of the electrified gas, right at the magnetopause, precisely like those expected when travelling through a 'Kelvin cat's eye'. Scientists had postulated that, if these structures were to form at the magnetopause, they might be able to pull large quantities of the solar wind inside the boundary layer as they collapse. Once the solar wind particles are carried into the inner part of the magnetosphere, they can be excited strongly, allowing them to smash into Earth's atmosphere and give rise to the aurorae. Cluster's discovery strengthens this scenario but does not show the precise mechanism by which the gas is transported into Earth's magnetic bubble. Thus, scientists still do not know whether this is the only process that fills up the boundary layer when the magnetic fields are aligned. For those measurements, Hasegawa says, scientists will have to wait for a future generation of magnetospheric satellites. Notes for editors: The results of this investigation have appeared in today's issue of the scientific journal Nature, in a paper entitled 'Transport of solar wind into Earth's magnetosphere through rolled-up Kelvin-Helmholtz vortices', by H. Hasegawa, M. Fujimoto, T.D. Phan, H. Reme, A. Balogh, M.W. Dunlop, C. Hashimoto and R. TanDokoro. More about magnetic reconnection: Solar wind particles follow 'magnetic field lines', rather like beads on a wire. The 'doors' that open in Earth's magnetosphere during oppositely aligned magnetic configurations are caused by a phenomenon called 'magnetic reconnection'. During this process, Earth's field lines spontaneously break and join themselves to the Sun's, allowing the solar wind to pass freely into Earth's magnetosphere. Magnetic reconnection is not possible in the aligned case, however, hence the need for a different mechanism to inject the particles into Earth's magnetosphere. More about Cluster: Cluster is a mission of international co-operation between ESA and NASA. It involves four spacecraft, launched on two Russian rockets during the summer of 2000. They are now flying in formation around Earth, relaying the most detailed information ever about how the solar wind affects our planet in 3D. The solar wind is the perpetual stream of subatomic particles given out by the Sun, and it can damage communications satellites and power stations on Earth. The Cluster mission is expected to continue until at least 2005. The ongoing archiving of the Cluster data (the Cluster Active Archive) is part of the International Living with a Star programme (ILWS), in which space agencies worldwide work together to investigate how variations in the Sun affect the environment of Earth and the other planets. In particular, ILWS concentrates on those aspects of the Sun-Earth system that may affect mankind and society.
ILWS is a collaborative initiative between Europe, the United States, Russia, Japan and Canada.
NASA Technical Reports Server (NTRS)
Godfrey, Gary S.
2003-01-01
This project illustrates an animation of the orbiter mate to the external tank, an animation of the OMS pod installation on the orbiter, and a simulation of the landing gear mechanism at the Kennedy Space Center. A detailed storyboard was created to reflect each animation or simulation. Solid models were collected and translated into Pro/Engineer's prt and asm formats. These solid models included computer files of the orbiter, external tank, solid rocket booster, mobile launch platform, transporter, vehicle assembly building, OMS pod fixture, and landing gear. A depository of the above solid models was established. These solid models were translated into several formats. This depository contained the following files: stl for stereolithography, stp for neutral file work, shrinkwrap for compression, tiff for Photoshop work, jpeg for Internet use, and prt and asm for Pro/Engineer use. Solid models were created of the material handling sling, bay 3 platforms, and orbiter contact points. Animations were developed using mechanisms to reflect each storyboard. Every effort was made to build all models technically correctly for engineering use. The result was an animated routine that could be used by NASA for training material handlers and uncovering engineering safety issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillion, D.
This code enables one to display, take line-outs on, and perform various transformations on an image created by an array of integer*2 data. Uncompressed eight-bit TIFF files created on either the Macintosh or the IBM PC may also be read in and converted to a 16-bit signed integer image. This code is designed to handle all the formats used for PDS (photo-densitometer) files at the Lawrence Livermore National Laboratory. These formats are all explained by the application code. The image may be zoomed infinitely and the gray scale mapping can be easily changed. Line-outs may be horizontal or vertical with arbitrary width, angled with arbitrary end points, or taken along any path. This code is usually used to examine spectrograph data. Spectral lines may be identified and a polynomial fit from position to wavelength may be found. The image array can be remapped so that the pixels all have the same change-of-lambda width. It is not necessary to do this, however. Line-outs may be printed, saved as Cricket tab-delimited files, or saved as PICT2 files. The plots may be linear, semilog, or logarithmic with nice values and proper scientific notation. Typically, spectral lines are curved.
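Two of the described steps translate directly into a modern sketch: promoting an eight-bit TIFF to a 16-bit signed integer image, and fitting a polynomial that maps pixel position to wavelength. The line positions and wavelengths below are hypothetical.

```python
# Modern sketch of two steps from the code description; the identified
# line positions and their wavelengths are placeholders.
import numpy as np
from PIL import Image

img16 = np.asarray(Image.open("scan.tif")).astype(np.int16)  # 8-bit -> int16

pos = np.array([120.0, 431.0, 708.0])   # pixel positions of identified lines
lam = np.array([253.7, 365.0, 435.8])   # known wavelengths in nm
coeffs = np.polyfit(pos, lam, deg=2)    # position-to-wavelength calibration
wavelength_axis = np.polyval(coeffs, np.arange(img16.shape[1]))
```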
Improving Access to Precipitation Data for GIS Users: Designing for Ease of Use
NASA Technical Reports Server (NTRS)
Stocker, Erich F.; Kelley, Owen A.
2007-01-01
The Global Precipitation Measurement (GPM) mission is a NASA/JAXA-led international mission to configure a constellation of space-based radiometers to monitor precipitation over the globe. The GPM goal of making global 3-hour precipitation products available in near real-time will make such global products more useful to a broader community of modelers and Geographic Information Systems (GIS) users than is currently the case with remotely sensed precipitation products. Given the existing interest in making Tropical Rainfall Measuring Mission (TRMM) data available to a growing community of GIS users, a community that will certainly expand during the GPM era, it is clear that data systems must make a greater effort to provide data in formats easily used by GIS. We describe precipitation GIS products being developed for TRMM data. These products will serve as prototypes for production efforts during the GPM era. We describe efforts to convert TRMM precipitation data to GeoTIFF, Shapefile, and ASCII grid. Clearly, our goal is to format GPM data so that it can be easily used within GIS applications. We welcome feedback on these efforts and on any additions or direction changes that should be undertaken by the data system.
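As a sketch of the GeoTIFF conversion step, a gridded precipitation array on a regular latitude-longitude grid can be written out with the open-source rasterio package; the grid size, extent, and file name below are placeholders, not the TRMM production configuration.

```python
# Writing a gridded precipitation array to GeoTIFF; the 0.25-degree grid
# spanning 50N-50S is a stand-in, not the production product definition.
import numpy as np
import rasterio
from rasterio.transform import from_origin

rain = np.random.rand(400, 1440).astype("float32")   # stand-in rain grid
transform = from_origin(-180.0, 50.0, 0.25, 0.25)    # west, north, dx, dy

with rasterio.open(
    "precip_3hr.tif", "w", driver="GTiff",
    height=rain.shape[0], width=rain.shape[1], count=1,
    dtype="float32", crs="EPSG:4326", transform=transform,
) as dst:
    dst.write(rain, 1)
```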
NASA Earth Observations (NEO): Data Imagery for Education and Visualization
NASA Astrophysics Data System (ADS)
Ward, K.
2008-12-01
NASA Earth Observations (NEO) has dramatically simplified public access to georeferenced imagery of NASA remote sensing data. NEO targets the non-traditional data users who are currently underserved by functionality and formats available from the existing data ordering systems. These users include formal and informal educators, museum and science center personnel, professional communicators, and citizen scientists. NEO currently serves imagery from 45 different datasets with daily, weekly, and/or monthly temporal resolutions, with more datasets currently under development. The imagery from these datasets is produced in coordination with several data partners who are affiliated either with the instrument science teams or with the respective data processing center. NEO is a system of three components -- website, WMS (Web Mapping Service), and ftp archive -- which together are able to meet the wide-ranging needs of our users. Some of these needs include the ability to: view and manipulate imagery using the NEO website -- e.g., applying color palettes, resizing, exporting to a variety of formats including PNG, JPEG, KMZ (Google Earth), GeoTIFF; access the NEO collection via a standards-based API (WMS); and create customized exports for select users (ftp archive) such as Science on a Sphere, NASA's Earth Observatory, and others.
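Programmatic access through the WMS component might look like the following OWSLib sketch; the endpoint URL and layer name are assumptions to be checked against the NEO documentation.

```python
# Sketch of a WMS GetMap request against NEO's service with OWSLib;
# endpoint and layer name are assumptions.
from owslib.wms import WebMapService

wms = WebMapService("https://neo.gsfc.nasa.gov/wms/wms", version="1.1.1")
img = wms.getmap(
    layers=["MOD_LSTD_M"],                 # hypothetical monthly layer name
    styles=[""],
    srs="EPSG:4326",
    bbox=(-180, -90, 180, 90),
    size=(720, 360),
    format="image/png",
)
with open("neo_layer.png", "wb") as f:
    f.write(img.read())
```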
Providing Internet Access to High-Resolution Lunar Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMoon server is a computer program that provides Internet access to high-resolution Lunar images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of the Moon. The OnMoon server implements the Open Geospatial Consortium (OGC) Web Map Service (WMS) server protocol and supports Moon-specific extensions. Unlike other Internet map servers that provide Lunar data using an Earth coordinate system, the OnMoon server supports encoding of data in Moon-specific coordinate systems. The OnMoon server offers access to most of the available high-resolution Lunar image and elevation data. This server can generate image and map files in the tagged image file format (TIFF) or the Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. Full-precision spectral arithmetic processing is also available, by use of a custom SLD extension. This server can dynamically add shaded relief based on the Lunar elevation to any image layer. This server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
Providing Internet Access to High-Resolution Mars Images
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
The OnMars server is a computer program that provides Internet access to high-resolution Mars images, maps, and elevation data, all suitable for use in geographical information system (GIS) software for generating images, maps, and computational models of Mars. The OnMars server is an implementation of the Open Geospatial Consortium (OGC) Web Map Service (WMS) server. Unlike other Mars Internet map servers that provide Martian data using an Earth coordinate system, the OnMars WMS server supports encoding of data in Mars-specific coordinate systems. The OnMars server offers access to most of the available high-resolution Martian image and elevation data, including an 8-meter-per-pixel uncontrolled mosaic of most of the Mars Global Surveyor (MGS) Mars Observer Camera Narrow Angle (MOCNA) image collection, which is not available elsewhere. This server can generate image and map files in the tagged image file format (TIFF), Joint Photographic Experts Group (JPEG), 8- or 16-bit Portable Network Graphics (PNG), or Keyhole Markup Language (KML) format. Image control is provided by use of the OGC Style Layer Descriptor (SLD) protocol. The OnMars server also implements tiled WMS protocol and super-overlay KML for high-performance client application programs.
A web-based subsetting service for regional scale MODIS land products
DOE Office of Scientific and Technical Information (OSTI.GOV)
SanthanaVannan, Suresh K; Cook, Robert B; Holladay, Susan K
2009-12-01
The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor has provided valuable information on various aspects of the Earth System since March 2000. The spectral, spatial, and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth System processes at regional, continental, and global scales. The size of the MODIS products and their native HDF-EOS format are not optimal for use in field investigations at individual sites (100 x 100 km or smaller). In order to make MODIS data readily accessible for field investigations, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) has developed an online system that provides MODIS land products in an easy-to-use format and in file sizes more appropriate to field research. This system provides MODIS land products in a nonproprietary comma-delimited ASCII format and in GIS-compatible formats (GeoTIFF and ASCII grid). Web-based visualization tools are also available as part of this system, and these tools provide a quick snapshot of the data. Quality control tools and a multitude of data delivery options are available to meet the demands of various user communities. This paper describes the important features and design goals of the system, particularly in the context of data archive and distribution for regional-scale analysis. The paper also discusses the ways in which data from this system can be used for validation, data intercomparison, and modeling efforts.
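Consuming the system's outputs is deliberately simple; for instance, the comma-delimited ASCII subset and the GeoTIFF can be read with standard open-source packages (the file names are placeholders for a delivered subset).

```python
# Reading the delivered subset formats; file names are placeholders.
import pandas as pd
import rasterio

subset = pd.read_csv("modis_ndvi_subset.csv")   # comma-delimited time series
print(subset.head())

with rasterio.open("modis_ndvi_subset.tif") as src:
    ndvi = src.read(1)                          # GIS-compatible GeoTIFF grid
```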
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-28
.... Since the publication of that rule, it has come to DOE's attention that, due to a technical oversight, a...-percent confidence limit of the true mean (X L ) divided by 0.97, i.e., [GRAPHIC] [TIFF OMITTED] TR28AP10...
Personalization of structural PDB files.
Woźniak, Tomasz; Adamiak, Ryszard W
2013-01-01
The PDB format is the format most commonly used by programs to define the three-dimensional structure of biomolecules. However, different programs often use different versions of the format, and thus far no comprehensive solution for unifying the PDB formats has been developed. Here we present an open-source, Python-based tool called PDBinout for the processing and conversion of various versions of the PDB file format for biostructural applications. Moreover, PDBinout allows users to create their own PDB versions. PDBinout is freely available under the LGPL licence at http://pdbinout.ibch.poznan.pl.
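PDBinout itself is the tool for this job; as a flavour of the problem it addresses, the hypothetical snippet below shows the fixed-column parsing of ATOM/HETATM records whose version-to-version variations make PDB files awkward to unify.

```python
# Hypothetical fixed-column parsing of PDB ATOM/HETATM records, to
# illustrate the format that PDBinout normalizes across versions.
def parse_atom_line(line):
    """Split one ATOM/HETATM record by the PDB fixed-column layout."""
    return {
        "serial": int(line[6:11]),
        "name": line[12:16].strip(),
        "res_name": line[17:20].strip(),
        "chain": line[21],
        "res_seq": int(line[22:26]),
        "x": float(line[30:38]),
        "y": float(line[38:46]),
        "z": float(line[46:54]),
    }

with open("structure.pdb") as fh:
    atoms = [parse_atom_line(l) for l in fh
             if l.startswith(("ATOM", "HETATM"))]
```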
Digitized Database of Old Seismograms Recorded in Romania
NASA Astrophysics Data System (ADS)
Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea
2016-08-01
The aim of this paper is to describe a management system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedures. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to reconsider the historical instrumental data, the analog seismograms are converted to digital images and then to digital waveforms (digitization/vectorization). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In the next step, the high-resolution scanned seismograms will be processed to obtain the digital/numeric waveforms. We used a Colortrac SmartLF Cx40 scanner, which provides images in TIFF or JPG format. For digitization, the Teseo2 algorithm, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS project, will be used.
Iurov, Iu B; Khazatskiĭ, I A; Akindinov, V A; Dovgilov, L V; Kobrinskiĭ, B A; Vorsanova, S G
2000-08-01
The original software package FISHMet has been developed and tested for improving the efficiency of diagnosing hereditary diseases caused by chromosome aberrations and for chromosome mapping by the fluorescence in situ hybridization (FISH) method. The program allows the creation and analysis of pseudocolor chromosome images and hybridization signals under Windows 95. It supports computer analysis and editing of the results of pseudocolor in situ hybridization, including the successive overlay of initial black-and-white images acquired through fluorescent filters (blue, green, and red), and the editing of each image individually or of a summary pseudocolor image in BMP, TIFF, and JPEG formats. Components of a computer image-analysis system (LOMO, Leitz Ortoplan, and Axioplan fluorescence microscopes; COHU 4910 and Sanyo VCB-3512P CCD cameras; Miro-Video, Scion LG-3, and VG-5 image-capture boards; and Pentium 100 and Pentium 200 computers) and specialized software for image capture and visualization (Scion Image PC and Video-Cup) were used with good results in the study.
GNU Data Language (GDL) - a free and open-source implementation of IDL
NASA Astrophysics Data System (ADS)
Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter
2010-05-01
GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens, the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge, where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically typed, vectorized, and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS, and data input/output. GDL supports several data formats, such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, DICOM, etc. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last of which allows output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as the GNU Scientific Library, PLplot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL, are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key accomplishments, and the weaknesses - areas where contributions and users' feedback are welcome! While still in the beta stage of development, GDL has proved to be a useful tool for classroom work on data analysis. Its use for teaching meteorological data processing at the University of Warsaw will serve as an example.
77 FR 12843 - Fees for Sanitation Inspections of Cruise Ships
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-02
... used to determine the fees: [GRAPHIC] [TIFF OMITTED] TN02MR12.006 The average cost per inspection is multiplied by size and cost factors to determine the fee for vessels in each size category. The size and cost... exists rodent, insect, or other vermin infestations, contaminated food or water, or other sanitary...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillion, D.
This code enables one to display, take line-outs on, and perform various transformations on an image created by an array of integer*2 data. Uncompressed eight-bit TIFF files created on either the Macintosh or the IBM PC may also be read in and converted to a 16-bit signed integer image. This code is designed to handle all the formats used for PDS (photo-densitometer) files at the Lawrence Livermore National Laboratory. These formats are all explained by the application code. The image may be zoomed infinitely, and the gray-scale mapping can be easily changed. Line-outs may be horizontal or vertical with arbitrary width, angled with arbitrary end points, or taken along any path. This code is usually used to examine spectrograph data. Spectral lines may be identified, and a polynomial fit from position to wavelength may be found. The image array can be remapped so that the pixels all have the same change-of-lambda width. It is not necessary to do this, however. Line-outs may be printed, saved as Cricket tab-delimited files, or saved as PICT2 files. The plots may be linear, semilog, or logarithmic with nice values and proper scientific notation. Typically, spectral lines are curved; their shapes can be fitted by identifying points on these lines and fitting polynomials.
Dwyer, John L.; Schmidt, Gail L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy , R.E.; Salomonson, V.V.
2006-01-01
The MODIS Reprojection Tool (MRT) is designed to help individuals work with MODIS Level-2G, Level-3, and Level-4 land data products. These products are referenced to a global tiling scheme in which each tile is approximately 10° latitude by 10° longitude and non-overlapping (Fig. 9.1). If desired, the user may reproject only selected portions of the product (spatial or parameter subsetting). The software may also be used to convert MODIS products to file formats (generic binary and GeoTIFF) that are more readily compatible with existing software packages. The MODIS land products distributed by the Land Processes Distributed Active Archive Center (LP DAAC) are in the Hierarchical Data Format - Earth Observing System (HDF-EOS), developed by the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign for the NASA EOS Program. Each HDF-EOS file comprises one or more science data sets (SDSs) corresponding to geophysical or biophysical parameters. Metadata are embedded in the HDF file as well as contained in a .met file that is associated with each HDF-EOS file. The MRT supports 8-bit, 16-bit, and 32-bit integer data (both signed and unsigned), as well as 32-bit float data. The data type of the output is the same as the data type of each corresponding input SDS.
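The MRT itself is a standalone application; purely as an illustration of the same HDF-EOS-to-GeoTIFF conversion, here is a hedged sketch using GDAL's Python bindings instead. The subdataset string is hypothetical; the real names come from GetSubDatasets() on the actual file.

```python
from osgeo import gdal

# Hypothetical HDF-EOS subdataset name; list the real ones with
# gdal.Open("MOD13A1.hdf").GetSubDatasets().
src = 'HDF4_EOS:EOS_GRID:"MOD13A1.hdf":MODIS_Grid_16DAY_500m_VI:NDVI'

# Reproject from the MODIS sinusoidal grid to geographic coordinates and
# write a GeoTIFF; the output keeps the data type of the input SDS.
gdal.Warp("ndvi_wgs84.tif", src, dstSRS="EPSG:4326", format="GTiff")
```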
Navigator GPS Receiver for Fast Acquisition and Weak Signal Space Applications
NASA Technical Reports Server (NTRS)
Winternitz, Luke; Moreau, Michael; Boegner, Gregory J.; Sirotzky, Steve
2004-01-01
NASA Goddard Space Flight Center (GSFC) is developing a new space-borne GPS receiver that can operate effectively in the full range of Earth orbiting missions, from Low Earth Orbit (LEO) to geostationary and beyond. Navigator is designed to be a fully space-flight-qualified GPS receiver optimized for fast signal acquisition and weak signal tracking. The fast acquisition capabilities provide exceptional time-to-first-fix performance (TTFF) with no a priori receiver state or GPS almanac information, even in the presence of the high Doppler shifts present in LEO (or near perigee in highly eccentric orbits). The fast acquisition capability also makes it feasible to implement extended correlation intervals and therefore significantly reduce Navigator's acquisition threshold. This greatly improves GPS observability when the receiver is above the GPS constellation (and satellites must be tracked from the opposite side of the Earth) by providing at least 10 dB of increased acquisition sensitivity. Fast acquisition and weak signal tracking algorithms have been implemented and validated on a hardware development board. A fully functional version of the receiver, employing most of the flight parts, with integrated navigation software is expected by mid-2005. An ultimate goal of this project is to license the Navigator design to an industry partner who will then market the receiver as a commercial product.
Kodak's New Photo CD Portfolio: Multimedia for the Rest of Us.
ERIC Educational Resources Information Center
Bonime, Andrew
1994-01-01
Describes Photo CD Portfolio, an Eastman Kodak product that provides interactive multimedia CD-ROM production capability. The article focuses on the capabilities of the tool's simplest authoring system, Create It, which allows users to work with Photo CD, PICT, or TIFF images, add graphics, text and audio, and create menus with branching. (KRN)
Rea, A.H.; Becker, C.J.
1997-01-01
This compact disc contains 25 digital map data sets covering the State of Oklahoma that may be of interest to the general public, private industry, schools, and government agencies. Fourteen data sets are statewide. These data sets include: administrative boundaries; 104th U.S. Congressional district boundaries; county boundaries; latitudinal lines; longitudinal lines; geographic names; indexes of U.S. Geological Survey 1:100,000- and 1:250,000-scale topographic quadrangles; a shaded-relief image; Oklahoma State House of Representatives district boundaries; Oklahoma State Senate district boundaries; locations of U.S. Geological Survey stream gages; watershed boundaries and hydrologic cataloging unit numbers; and locations of weather stations. Eleven data sets are divided by county and are located in 77 county subdirectories. These data sets include: census block group boundaries with selected demographic data; city and major highways text; geographic names; land surface elevation contours; elevation points; an index of U.S. Geological Survey 1:24,000-scale topographic quadrangles; roads, streets, and address ranges; highway text; school district boundaries; streams, rivers, and lakes; and the public land survey system. All data sets are provided in a readily accessible format. Most data sets are provided in Digital Line Graph (DLG) format. The attributes for many of the DLG files are stored in related dBASE(R)-format files and may be joined to the data set polygon attribute or arc attribute tables using dBASE(R)-compatible software. (Any use of trade names in this publication is for descriptive purposes only and does not imply endorsement by the U.S. Government.) Point attribute tables are provided in dBASE(R) format only, and include the X and Y map coordinates of each point. Annotation (text plotted in map coordinates) is provided in AutoCAD Drawing Exchange Format (DXF) files. The shaded-relief image is provided in TIFF format. All data sets except the shaded-relief image also are provided in ARC/INFO export-file format.
Delgado-Herrera, Leticia; Banderas, Benjamin; Ojo, Oluwafunke; Kothari, Ritesh; Zeiher, Bernhardt
2017-01-01
Subjects with diarrhea-predominant irritable bowel syndrome (IBS-D) experience abdominal cramping, bloating, pressure, and pain. Due to an absence of clinical biomarkers for IBS-D severity, evaluation of clinical therapy benefits depends on valid and reliable symptom assessments. A patient-reported outcome (PRO) instrument has been developed, comprising two questionnaires - the IBS-D Daily Symptom Diary and the IBS-D Symptom Event Log - suitable for clinical trials and real-world settings. This program aimed to support conversion of the instrument from pen-and-paper to electronic format. Digital technology (Android/iOS) and a traditional mode-of-administration study in the target population were used to migrate the validated pen-and-paper IBS-D PRO measure to an electronic format. Equivalence interviews, conducted in three waves, each had three parts: 1) conceptual equivalence testing between formats, 2) electronic-version report-history cognitive debriefing, and 3) electronic-version usability evaluation. After each interview wave, preliminary analyses were conducted and modifications made to the electronic version before the next wave. Final revisions were based on a full analysis of the equivalence interviews. The final analysis evaluated subjects' ability to read, understand, and provide meaningful responses to the instruments across both formats. Responses were classified according to conceptual equivalence between formats, and mobile-format usability was assessed with a questionnaire and open-ended probes. Equivalence interviews (n=25) demonstrated conceptual equivalence between formats. Mobile-application cognitive debriefing showed some subjects experienced difficulty with font/screen visibility and with understanding or reading some report-history charts and summary screens. To address these difficulties, minor revisions were made, and landscape orientation and zoom-in/zoom-out features were incorporated. This study indicates that the two administration modes are conceptually equivalent; accordingly, the psychometric reliability established for the pen-and-paper version extends to the electronic format. Subjects found both mobile applications (Android/iOS) offered many advantages over the paper version, such as real-time assessment of their experience.
75 FR 52267 - Waiver of Statement of Account Filing Deadline for the 2010/1 Period
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-25
... available in a print format, a PDF format, and a software ``fill-in'' format created by Gralin Associates... retransmission of multicast streams. The paper and PDF versions of the form have been available to cable... recognize that the paper and PDF versions of the SOA have been available since July, many large and small...
75 FR 81157 - Version One Regional Reliability Standard for Transmission Operations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... processing software should be filed in native applications or print-to-PDF format and not in a scanned format..., Inc. v. FERC, 564 F.3d 1342 (D.C. Cir. 2009). \\3\\ NERC designates the version number of a Reliability...
Digital soils survey map of the Patagonia Mountains, Arizona
Norman, Laura; Wissler, Craig; Guertin, D. Phillip; Gray, Floyd
2002-01-01
The 'Soil Survey of Santa Cruz and Parts of Cochise and Pima Counties, Arizona,' a product of the USDA's Soil Conservation Service and the Forest Service in cooperation with the Arizona Agricultural Experiment Station, was released in 1979 and reflects site conditions in 1971, when soil scientists identified soil types on aerial photographs. These maps were published at a scale of 1:20,000. The soil maps were automated for incorporation into hydrologic modeling within a GIS. The aerial photos onto which the soil units were drawn had not been orthorectified and contained distortion. A total of 15 maps composed the study area. These maps were scanned into TIFF format using an 8-bit black-and-white drum scanner at 100 dpi. The images were imported into ERDAS IMAGINE, and the white borders were removed through subsetting and decollaring processes. Five CD-ROMs containing Digital Orthophoto Quarter Quads (DOQQs) were used to register and rectify the scanned soil maps. Polygonal data were then attributed according to the source datasets.
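The register-and-rectify step can be sketched with GDAL's Python bindings: ground-control points picked against the DOQQs are attached to the scan, and the image is warped into the map projection. All coordinates and the EPSG code below are placeholders, not the survey's actual values.

```python
from osgeo import gdal

# Placeholder GCPs: gdal.GCP(mapX, mapY, z, pixel, line), as if picked
# against the DOQQs. EPSG:26912 (UTM zone 12N, NAD83) is illustrative.
gcps = [
    gdal.GCP(505000.0, 3480000.0, 0, 120.0, 115.0),
    gdal.GCP(512000.0, 3480000.0, 0, 1890.0, 118.0),
    gdal.GCP(512000.0, 3473000.0, 0, 1885.0, 1760.0),
    gdal.GCP(505000.0, 3473000.0, 0, 118.0, 1758.0),
]
gdal.Translate("soil_gcps.tif", "soil_scan.tif",
               GCPs=gcps, outputSRS="EPSG:26912")
gdal.Warp("soil_rectified.tif", "soil_gcps.tif",
          dstSRS="EPSG:26912")  # polynomial warp derived from the GCPs
```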
Brabb, Earl E.; Colgan, Joseph P.; Best, Timothy C.
2000-01-01
Introduction Debris flows, debris avalanches, mud flows and lahars are fast-moving landslides that occur in a wide variety of environments throughout the world. They are particularly dangerous to life and property because they move quickly, destroy objects in their paths, and often strike without warning. This map represents a significant effort to compile the locations of known debris flows in the United States and to predict where future flows might occur. The files 'dfipoint.e00' and 'dfipoly.e00' contain the locations of over 6600 debris flows from published and unpublished sources. The locations are referenced by numbers that correspond to entries in a bibliography, which is part of the pamphlet 'mf2329pamphlet.pdf'. The areas of possible future debris flows are shown in the file 'susceptibility.tif', which is a georeferenced TIFF file that can be opened in an image editing program or imported into a GIS such as ARC/INFO. All other databases are in ARC/INFO export (.e00) format.
Java Image I/O for VICAR, PDS, and ISIS
NASA Technical Reports Server (NTRS)
Deen, Robert G.; Levoe, Steven R.
2011-01-01
This library, written in Java, supports input and output of images and metadata (labels) in the VICAR, PDS image, and ISIS-2 and ISIS-3 file formats. Three levels of access exist. The first level comprises the low-level, direct access to the file. This allows an application to read and write specific image tiles, lines, or pixels and to manipulate the label data directly. This layer is analogous to the C-language "VICAR Run-Time Library" (RTL), which is the image I/O library for the (C/C++/Fortran) VICAR image processing system from JPL MIPL (Multimission Image Processing Lab). This low-level library can also be used to read and write labeled, uncompressed images stored in formats similar to VICAR, such as ISIS-2 and -3, and a subset of PDS (image format). The second level of access involves two codecs based on Java Advanced Imaging (JAI) to provide access to VICAR and PDS images in a file-format-independent manner. JAI is supplied by Sun Microsystems as an extension to desktop Java, and has a number of codecs for formats such as GIF, TIFF, and JPEG. Although Sun has deprecated the codec mechanism (replaced by IIO), it is still used in many places. The VICAR and PDS codecs allow any program written using the JAI codec spec to use VICAR or PDS images automatically, with no specific knowledge of the VICAR or PDS formats. Support for metadata (labels) is included, but is format-dependent. The PDS codec, when processing PDS images with an embedded VICAR label ("dual-labeled images," such as those used for MER), presents the VICAR label in a new way that is compatible with the VICAR codec. The third level of access involves VICAR, PDS, and ISIS Image I/O plugins. The Java core includes an "Image I/O" (IIO) package that is similar in concept to the JAI codec, but is newer and more capable. Applications written to the IIO specification can use any image format for which a plug-in exists, with no specific knowledge of the format itself.
Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.
2009-01-01
In April and July of 1981, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Alabama-Mississippi-Louisiana Shelf in the northern Gulf of Mexico. Work was conducted onboard the Texas A&M University R/V Carancahua and the R/V Gyre to develop a geologic understanding of the study area and to locate potential hazards related to offshore oil and gas production. While the R/V Carancahua only collected boomer data, the R/V Gyre used a 400-Joule minisparker, 3.5-kilohertz (kHz) subbottom profiler, 12-kHz precision depth recorder, and two air guns. The authors selected the minisparker data set because, unlike with the boomer data, it provided the most complete record. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer and minisparker paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.
About a method for compressing x-ray computed microtomography data
NASA Astrophysics Data System (ADS)
Mancini, Lucia; Kourousias, George; Billè, Fulvio; De Carlo, Francesco; Fidler, Aleš
2018-04-01
The management of scientific data is of high importance, especially for experimental techniques that produce large data volumes. One such technique is x-ray computed tomography (CT), and its community has introduced advanced data formats which allow for better management of experimental data. Rather than the organization of the data and the associated metadata, the main topic of this work is data compression and its applicability to experimental data collected from a synchrotron-based CT beamline at the Elettra-Sincrotrone Trieste facility (Italy), using images acquired from various types of samples. The study covers parallel-beam geometry, but it could easily be extended to cone-beam geometry. The reconstruction workflow used is the one currently in operation at the beamline. Contrary to standard image compression studies, this manuscript proposes a systematic framework and workflow for the critical examination of different compression techniques and does so by applying it to experimental data. Beyond the methodology framework, this study presents and examines the use of JPEG-XR in combination with HDF5 and TIFF formats, providing insights and strategies on data compression and image quality issues that can be used and implemented at other synchrotron facilities and laboratory systems. In conclusion, projection data compression using JPEG-XR appears to be a promising, efficient method to reduce data file size and thus to facilitate data handling and image reconstruction.
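As a sketch of the study's core step, a projection image can be JPEG XR-encoded and the size reduction and reconstruction error measured. The imagecodecs package is one implementation choice, not necessarily the one used at Elettra, and its level argument is assumed here to control lossy quality.

```python
import numpy as np
import imagecodecs  # third-party package with JPEG XR support

# Synthetic stand-in for a 16-bit CT projection.
projection = np.random.randint(0, 65535, (2048, 2048), dtype=np.uint16)

encoded = imagecodecs.jpegxr_encode(projection, level=0.9)  # assumed quality knob
decoded = imagecodecs.jpegxr_decode(encoded)

ratio = projection.nbytes / len(encoded)
mse = np.mean((projection.astype(np.float64) - decoded) ** 2)
psnr = 10 * np.log10(65535.0 ** 2 / mse)
print(f"compression ratio {ratio:.1f}:1, PSNR {psnr:.1f} dB")
```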
Using component technologies for web based wavelet enhanced mammographic image visualization.
Sakellaropoulos, P; Costaridou, L; Panayiotakis, G
2000-01-01
The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that, at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned using component technologies, enabling the integration of stand-alone, domain-specific mammographic image functionality in a web-browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real-time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reyhan, M; Yue, N
Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5x1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs, and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, a paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2 to 886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (-6.1 cGy, 5.5 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic=0.997*Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
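A minimal sketch of the pipeline described above (thresholding, erosion, per-ROI mean, calibration). The threshold, erosion depth, and calibration curve are placeholders, not the authors' values:

```python
import numpy as np
from scipy import ndimage
import tifffile

image = tifffile.imread("films.tiff").astype(float)
if image.ndim == 3:                    # scanned RGB: keep the red channel
    image = image[..., 0]

# Film is darker than the scanner background; erosion shaves off film
# edges and orientation markings (threshold and depth are illustrative).
mask = image < 0.8 * image.max()
mask = ndimage.binary_erosion(mask, iterations=5)

labels, n = ndimage.label(mask)        # one label per piece of film
means = ndimage.mean(image, labels, index=np.arange(1, n + 1))

def dose_cGy(pixel_value):             # placeholder calibration curve
    return 886.6 * (1.0 - pixel_value / image.max())

for i, m in enumerate(means, start=1):
    print(f"film {i}: mean pixel {m:.1f} -> {dose_cGy(m):.1f} cGy")
```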
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
Detailed descriptions of the data and format of the machine-readable astronomical catalog are given. The machine version is identical in data content to the published edition, but minor modifications in the data format were made in order to effect uniformity with machine versions of other astronomical catalogs. Stellar motions and positions at epoch and equinox 1950.0 are reported.
Psyplot: Visualizing rectangular and triangular Climate Model Data with Python
NASA Astrophysics Data System (ADS)
Sommer, Philipp
2016-04-01
The development and use of climate models often requires the visualization of geo-referenced data. Creating visualizations should be fast, attractive, flexible, easily applicable, and easily reproducible. There is a wide range of software tools available for visualizing raster data, but they are often inaccessible to many users (e.g., because they are difficult to use in a script or have low flexibility). In order to facilitate easy visualization of geo-referenced data, we developed a new framework called "psyplot," which can aid earth system scientists with their daily work. It is written purely in the programming language Python and primarily built upon the Python packages matplotlib, cartopy, and xray. The package can visualize data stored on the hard disk (e.g., NetCDF, GeoTIFF, or any other file format supported by the xray package), or directly from memory or Climate Data Operators (CDOs). Furthermore, data can be visualized on a rectangular grid (following or not following the CF Conventions) and on a triangular grid (following the CF or UGRID Conventions). Psyplot visualizes 2D scalar and vector fields, enabling the user to easily manage and format multiple plots at the same time, and to export the plots into all common picture formats and movies covered by the matplotlib package. The package can currently be used in an interactive Python session or in Python scripts, and will soon be extended for use with a graphical user interface (GUI). Finally, the psyplot framework enables flexible configuration, allows easy integration into other scripts that use matplotlib, and provides a flexible foundation for further development.
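A minimal usage sketch of the psyplot interface described above; the mapplot method comes from the psy-maps plugin, and "t2m" is a hypothetical variable name in the NetCDF file:

```python
import psyplot.project as psy

# Plot a 2D field from a NetCDF file on a map and export it; "t2m" is a
# hypothetical variable name, and mapplot requires the psy-maps plugin.
p = psy.plot.mapplot("data.nc", name="t2m", cmap="viridis")
p.export("t2m.png")   # any raster format supported by matplotlib
p.close()
```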
NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.
2012-01-01
A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.
The AMCP Format for Formulary Submissions: Welcome to Version 4.0.
2016-05-01
Managed care pharmacists are increasingly presented with complex considerations related to prescription drug formulary management. As prescription drug spending soars, and new effective, but expensive drugs rush to the market, pharmacists and other health care decision makers must evaluate a myriad of important clinical and economic considerations in determining the relative value and, subsequently, the appropriate placement of a product within a formulary. The AMCP Format for Formulary Submissions, Version 4.0, is the next iteration of the Format, which was first released in 2000. Version 4.0, developed by pharmacists from health plan, manufacturer, and academic perspectives, provides updated recommendations on acquiring and evaluating clinical and economic evidence to inform formulary and medical policy decisions. It also includes new guidance related to emerging special topic considerations such as biosimilars, specialty pharmacy products, and companion diagnostic tests. Version 4.0 has been modified to improve the usability of the Format, with clarifying guidance related to logistical considerations such as a recommended time frame for implementation of Version 4.0, as well as dossier updates and ongoing communication between manufacturers and health care decision makers. The Format should be used as a framework for ongoing evidence-based dialogue between manufacturers and payers. The evolving health care landscape will require new levels of collaboration and communication among key stakeholders to successfully navigate the challenges of this new environment. The Format provides a framework to support these critical interactions related to product value by facilitating an evidence-based, transparent approach. This document was prepared by Jeff Lee, PharmD, FCCP, on behalf of the AMCP Format Executive Committee. Committee members reviewed and provided feedback on the final draft. No conflicts of interest, financial or otherwise, were reported.
Visualization of small scale structures on high resolution DEMs
NASA Astrophysics Data System (ADS)
Kokalj, Žiga; Zakšek, Klemen; Pehani, Peter; Čotar, Klemen; Oštir, Krištof
2015-04-01
Knowledge of terrain morphology is very important for observation of numerous processes and events, and digital elevation models are therefore one of the most important datasets in geographic analyses. Furthermore, recognition of natural and anthropogenic microrelief structures, which can be observed on detailed terrain models derived from aerial laser scanning (lidar) or structure-from-motion photogrammetry, is of paramount importance in many applications. In this paper we thus examine and evaluate methods of raster lidar data visualization for the determination (recognition) of microrelief features and present a series of strategies to assist in selecting the preferred visualization for structures of various shapes and sizes, set in varied landscapes. Often the answer is not definite, and more frequently a combination of techniques has to be used to map a very diverse landscape. Researchers have only very recently been able to benefit from free software for calculation of advanced visualization techniques. These tools are often difficult to understand, have numerous options that confuse the user, or require and produce non-standard data formats, because they were written for specific purposes. We therefore designed the Relief Visualization Toolbox (RVT) as a free, easy-to-use, standalone application to create visualizations from high-resolution digital elevation data. It is tailored to beginners in relief interpretation, but it can also be used by more advanced users in data processing and geographic information systems. It offers a range of techniques, such as simple hillshading and its derivatives, slope gradient, trend removal, positive and negative openness, sky-view factor, and anisotropic sky-view factor. All included methods have been proven to be effective for detection of small-scale features, and the default settings are optimised to accomplish this task. However, the usability of the tool goes beyond computation for visualization purposes, as sky-view factor, for example, is an essential variable in many fields, e.g., in meteorology. RVT produces two types of results: 1) the original files have a full range of values and are intended for further analyses in geographic information systems; 2) the simplified versions are histogram-stretched for visualization purposes and saved as 8-bit GeoTIFF files. This means that they can be explored in non-GIS software, e.g., with simple picture viewers, which is essential when a larger community of non-specialists needs to be considered, e.g., in public collaborative projects. The tool recognizes all frequently used single-band raster formats and supports elevation raster file data conversion.
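For flavor, the simplest technique in the toolbox, single-direction hillshading followed by the 8-bit stretch used for RVT's simplified outputs, can be sketched in a few lines of numpy. This illustrates the standard algorithm, not RVT's own code:

```python
import numpy as np

def hillshade(dem, cellsize=1.0, azimuth_deg=315.0, elevation_deg=45.0):
    """Single-direction hillshade of a DEM array (illustrative sketch)."""
    az = np.radians(360.0 - azimuth_deg + 90.0)       # to math convention
    alt = np.radians(elevation_deg)
    dy, dx = np.gradient(dem, cellsize)
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    # simple 8-bit stretch, mimicking the simplified GeoTIFF output
    return np.clip(255 * (shaded + 1) / 2, 0, 255).astype(np.uint8)

dem = np.random.rand(100, 100) * 50                   # placeholder DEM
img = hillshade(dem, cellsize=0.5)
```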
COMPPAP - COMPOSITE PLATE BUCKLING ANALYSIS PROGRAM (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Smith, J. P.
1994-01-01
The Composite Plate Buckling Analysis Program (COMPPAP) was written to help engineers determine buckling loads of orthotropic (or isotropic) irregularly shaped plates without requiring hand calculations from design curves or extensive finite element modeling. COMPPAP is a one-element finite element program that utilizes high-order displacement functions. The high order of the displacement functions enables the user to produce results more accurate than traditional h-finite elements. This program uses these high-order displacement functions to perform a plane stress analysis of a general plate followed by a buckling calculation based on the stresses found in the plane stress solution. The current version assumes a flat plate (constant thickness) subject to a constant edge load (normal or shear) on one or more edges. COMPPAP uses the power method to find the eigenvalues of the buckling problem. The power method provides an efficient solution when only one eigenvalue is desired. Once the eigenvalue is found, the eigenvector, which corresponds to the plate buckling mode shape, results as a by-product. A positive feature of the power method is that the dominant eigenvalue is the first found, which in this case is the plate buckling load. The reported eigenvalue expresses a load factor to induce plate buckling. COMPPAP is written in ANSI FORTRAN 77. Two machine versions are available from COSMIC: a PC version (MSC-22428), which is for IBM PC 386 series and higher computers and compatibles running MS-DOS; and a UNIX version (MSC-22286). The distribution medium for both machine versions includes source code for both single and double precision versions of COMPPAP. The PC version includes source code which has been optimized for implementation within DOS memory constraints as well as sample executables for both the single and double precision versions of COMPPAP. The double precision versions of COMPPAP have been successfully implemented on an IBM PC 386 compatible running MS-DOS, a Sun4 series computer running SunOS, an HP-9000 series computer running HP-UX, and a CRAY X-MP series computer running UNICOS. COMPPAP requires 1Mb of RAM and the BLAS and LINPACK math libraries, which are included on the distribution medium. The COMPPAP documentation provides instructions for using the commercial post-processing package PATRAN for graphical interpretation of COMPPAP output. The UNIX version includes two electronic versions of the documentation: one in LaTeX format and one in PostScript format. The standard distribution medium for the PC version (MSC-22428) is a 5.25 inch 1.2Mb MS-DOS format diskette. The standard distribution medium for the UNIX version (MSC-22286) is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. COMPPAP was developed in 1992.
COMPPAP - COMPOSITE PLATE BUCKLING ANALYSIS PROGRAM (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Smith, J. P.
1994-01-01
The Composite Plate Buckling Analysis Program (COMPPAP) was written to help engineers determine buckling loads of orthotropic (or isotropic) irregularly shaped plates without requiring hand calculations from design curves or extensive finite element modeling. COMPPAP is a one-element finite element program that utilizes high-order displacement functions. The high order of the displacement functions enables the user to produce results more accurate than traditional h-finite elements. This program uses these high-order displacement functions to perform a plane stress analysis of a general plate followed by a buckling calculation based on the stresses found in the plane stress solution. The current version assumes a flat plate (constant thickness) subject to a constant edge load (normal or shear) on one or more edges. COMPPAP uses the power method to find the eigenvalues of the buckling problem. The power method provides an efficient solution when only one eigenvalue is desired. Once the eigenvalue is found, the eigenvector, which corresponds to the plate buckling mode shape, results as a by-product. A positive feature of the power method is that the dominant eigenvalue is the first found, which in this case is the plate buckling load. The reported eigenvalue expresses a load factor to induce plate buckling. COMPPAP is written in ANSI FORTRAN 77. Two machine versions are available from COSMIC: a PC version (MSC-22428), which is for IBM PC 386 series and higher computers and compatibles running MS-DOS; and a UNIX version (MSC-22286). The distribution medium for both machine versions includes source code for both single and double precision versions of COMPPAP. The PC version includes source code which has been optimized for implementation within DOS memory constraints as well as sample executables for both the single and double precision versions of COMPPAP. The double precision versions of COMPPAP have been successfully implemented on an IBM PC 386 compatible running MS-DOS, a Sun4 series computer running SunOS, an HP-9000 series computer running HP-UX, and a CRAY X-MP series computer running UNICOS. COMPPAP requires 1Mb of RAM and the BLAS and LINPACK math libraries, which are included on the distribution medium. The COMPPAP documentation provides instructions for using the commercial post-processing package PATRAN for graphical interpretation of COMPPAP output. The UNIX version includes two electronic versions of the documentation: one in LaTeX format and one in PostScript format. The standard distribution medium for the PC version (MSC-22428) is a 5.25 inch 1.2Mb MS-DOS format diskette. The standard distribution medium for the UNIX version (MSC-22286) is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. COMPPAP was developed in 1992.
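The power method described in both entries is easy to illustrate. The numpy sketch below finds the dominant eigenvalue and eigenvector of an ordinary eigenproblem; COMPPAP's buckling problem is a generalized eigenproblem, so this shows only the iteration scheme itself:

```python
import numpy as np

def power_method(A, iterations=1000, tol=1e-10):
    """Dominant eigenvalue and eigenvector by power iteration."""
    x = np.random.rand(A.shape[0])
    lam = 0.0
    for _ in range(iterations):
        y = A @ x
        lam_new = np.linalg.norm(y)   # magnitude of the dominant eigenvalue
        x = y / lam_new
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, x                     # load factor and mode shape by-product

A = np.array([[4.0, 1.0], [2.0, 3.0]])
load_factor, mode = power_method(A)
print(load_factor)                    # converges to 5.0, the dominant eigenvalue
```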
Atmospheric Science Data Center
2018-02-21
... MISR-Versioning-V23 Version Number: F13_0023 (aerosol), F08_0023 (land) Production Start Date: 11/1/2017 Product Updates: This is a major revision to aerosol and land surface products, including both product format and algorithm ...
Paleozoic and mesozoic GIS data from the Geologic Atlas of the Rocky Mountain Region: Volume 1
Graeber, Aimee; Gunther, Gregory
2017-01-01
The Rocky Mountain Association of Geologists (RMAG) is, once again, publishing portions of the 1972 Geologic Atlas of the Rocky Mountain Region (Mallory, ed., 1972) as a geospatial map and data package. Georeferenced TIFF (GeoTIFF) images of map figures from this atlas have served as the basis for these data products. Shapefiles and file geodatabase features have been generated and cartographically represented for select pages from the following chapters:
• Phanerozoic Rocks (page 56)
• Cambrian System (page 63)
• Ordovician System (pages 78 and 79)
• Silurian System (pages 87-89)
• Devonian System (pages 93, 94, and 96-98)
• Mississippian System (pages 102 and 103)
• Pennsylvanian System (pages 114 and 115)
• Permian System (pages 146 and 149-154)
• Triassic System (pages 168 and 169)
• Jurassic System (pages 179 and 180)
• Cretaceous System (pages 197-201, 207-210, 215-218, 221, 222, 224, 225, and 227)
The primary purpose of this publication is to provide regional-scale, as well as local-scale, geospatial data of the Rocky Mountain Region for use in geoscience studies. An important aspect of this interactive map product is that it does not require extensive GIS experience or highly specialized software.
Debruyne, Philippe; Rossenbacker, Tom; Vankelecom, Bart; Charlier, Filip; Roosen, John; Ector, Bavo; Janssens, Luc
2014-02-01
Radiofrequency ablation (RFA) can unfavorably cause coagulum on the ablation electrode. The aim of this study was to assess this phenomenon on three different multielectrode catheters used to treat persistent atrial fibrillation with duty-cycled RFA. Twenty-six consecutive patients were treated with the pulmonary vein ablation catheter (PVAC) and the multiarray ablation catheter (MAAC). In 13 patients, additional ablation with the multiarray septal catheter (MASC) was performed. The multichannel RF generator GENius™ (Medtronic Inc., Minneapolis, MN, USA) independently delivered energy in a bipolar and unipolar mode (ratio of 4/1, 2/1, or 1/1) to any of the electrodes. Versions 14.2, 14.3, and 14.4 of the generator were used. Coagulum presence was determined postablation by careful visual inspection of the catheter electrodes. No coagulum formation was visualized on the PVACs. Coagulum formation was visualized in 59% of the electrodes of the MAACs using a 2/1 mode and the 14.2 software version, versus 69% using the 14.4 version and a 2/1 mode (P = 0.7), versus 14% of the electrodes applying a 1/1 ratio and the 14.4 software version (P < 0.001). Duty-cycled RFA in a 2/1 bipolar/unipolar ratio generates a substantial frequency of coagulum formation on the multielectrode catheters MAAC and MASC. The use of the 14.4 version of the software to drive the RF generator and the use of energy in the default 1/1 bipolar/unipolar ratio could significantly reduce the frequency of coagulum formation but, so far, could not completely overcome it. The PVAC did not form coagulum, regardless of generator version or energy ratio used.
Robles, Noemí; Rajmil, Luis; Rodriguez-Arjona, Dolors; Azuara, Marta; Codina, Francisco; Raat, Hein; Ravens-Sieberer, Ulrike; Herdman, Michael
2015-06-03
The objectives of the study were to develop web-based Spanish and Catalan versions of the EQ-5D-Y and to compare scores and psychometric properties with the paper version. Web-based and paper versions of the EQ-5D-Y were included in a cross-sectional study in Palafolls (Barcelona), Spain, and administered to students (n = 923) aged 8 to 18 years from 2 primary schools and 1 secondary school and to their parents. All students completed both the web-based and paper versions during school time with an interval of at least 2 h between administrations. The order of administration was randomized. Participants completed the EQ-5D-Y, a measure of mental health status (the Strengths and Difficulties Questionnaire), and sociodemographic variables using a self-administered questionnaire. The parents' questionnaire included parental level of education and the presence of chronic conditions in children. Missing values and floor and ceiling effects were compared between versions. Mean score differences were computed for the visual analogue scale (VAS). Percentage of agreement, the kappa index (k), and the intraclass correlation coefficient (ICC) were computed to analyze the level of agreement between web-based and paper versions on EQ-5D-Y dimensions and the VAS. Known-groups validity was analyzed and compared between the two formats. The participation rate was 77 % (n = 715). Both formats of the EQ-5D-Y showed low percentages of missing values (n = 2 and n = 4 to 9 for the web and paper versions, respectively) and a high ceiling effect by dimension (range 79 % to 96 %). Percent agreement for EQ-5D-Y dimensions on the web and paper versions was acceptable (range 89 % to 97 %), and k ranged from 0.55 (0.48-0.61, usual activities dimension) to 0.75 (0.68-0.82, mobility dimension). The mean score difference on the VAS was 0.07, and the ICC for VAS scores on the two formats was 0.84 (0.82-0.86). Both formats showed acceptable ability to discriminate according to self-perceived health, reported chronic conditions, and mental health status. The digital EQ-5D-Y showed almost identical VAS scores and acceptable levels of agreement on dimensions. Both formats demonstrated acceptable levels of construct validity. Availability of the Spanish and Catalan web versions will facilitate their use in HRQOL assessment and in economic evaluation.
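For illustration, the agreement statistics named above can be computed in a few lines; the paired responses below are made up, not the study's data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired responses (levels 1-3) for one EQ-5D-Y dimension,
# web-based versus paper administration.
web   = np.array([1, 1, 2, 1, 3, 2, 1, 1, 2, 1])
paper = np.array([1, 1, 2, 2, 3, 2, 1, 1, 1, 1])

percent_agreement = np.mean(web == paper) * 100
kappa = cohen_kappa_score(web, paper)
print(f"agreement {percent_agreement:.0f}%, kappa {kappa:.2f}")
```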
Exposing Coverage Data to the Semantic Web within the MELODIES project: Challenges and Solutions
NASA Astrophysics Data System (ADS)
Riechert, Maik; Blower, Jon; Griffiths, Guy
2016-04-01
Coverage data, typically big in data volume, assigns values to a given set of spatiotemporal positions, together with metadata on how to interpret those values. Existing storage formats like netCDF, HDF and GeoTIFF all have various restrictions that prevent them from being preferred formats for use over the web, especially the semantic web. Factors that are relevant here are the processing complexity, the semantic richness of the metadata, and the ability to request partial information, such as a subset or just the appropriate metadata. Making coverage data available within web browsers opens the door to new ways for working with such data, including new types of visualization and on-the-fly processing. As part of the European project MELODIES (http://melodiesproject.eu) we look into the challenges of exposing such coverage data in an interoperable and web-friendly way, and propose solutions using a host of emerging technologies like JSON-LD, the DCAT and GeoDCAT-AP ontologies, the CoverageJSON format, and new approaches to REST APIs for coverage data. We developed the CoverageJSON format within the MELODIES project as an additional way to expose coverage data to the web, next to having simple rendered images available using standards like OGC's WMS. CoverageJSON partially incorporates JSON-LD but does not encode individual data values as semantic resources, making use of the technology in a practical manner. The development also focused on it being a potential output format for OGC WCS. We will demonstrate how existing netCDF data can be exposed as CoverageJSON resources on the web together with a REST API that allows users to explore the data and run operations such as spatiotemporal subsetting. We will show various use cases from the MELODIES project, including reclassification of a Land Cover dataset client-side within the browser with the ability for the user to influence the reclassification result by making use of the above technologies.
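For concreteness, a minimal CoverageJSON document for a tiny grid is sketched below (trimmed; a complete file also carries "referencing" entries describing the coordinate reference system and richer parameter metadata):

```python
import json

coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
            "x": {"values": [-0.5, 0.5]},
            "y": {"values": [51.0, 52.0]},
        },
    },
    "parameters": {
        "temp": {
            "type": "Parameter",
            "observedProperty": {"label": {"en": "Air temperature"}},
        },
    },
    "ranges": {
        "temp": {
            "type": "NdArray", "dataType": "float",
            "axisNames": ["y", "x"], "shape": [2, 2],
            "values": [280.1, 280.4, 279.8, 280.0],
        },
    },
}
print(json.dumps(coverage, indent=2))
```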
A Performance Comparison for Two Versions of the Vulcan Photometer
NASA Technical Reports Server (NTRS)
Borucki, W. J.; Caldwell, D. A.; Koch, D. G.; Jenkins, J. M.; Showen, R. L.
2001-01-01
Analysis of the images produced by the first version (V1) of the Vulcan photometer indicated that two major sources of noise were sky brightness and image motion. To reduce the effect of the sky brightness, a second version (V2) with a longer focal length and a larger format detector was developed and tested. The first version consisted of a 15-centimeter (cm) focal-length, f/1.5 Aerojet Delft reconnaissance lens and a 2048 x 2048 format front-illuminated charge-coupled device (CCD) with 9-micron pixels. The second version used a 30-cm focal-length, f/2.5 Kodak AeroEktar lens and a 4096 x 4096 format CCD with 9-micron pixels. Both have a 49-square-degree field of view (FOV), but the area of sky subtended by each pixel in the V2 version is one-fourth that of the V1 version. This modification substantially reduces the shot noise due to the sky background and allows fainter stars to be monitored for planetary transits. To remove the data gap and consequent signal-level change caused by flipping the photometer around the declination axis and to reduce image movement on the detector, several other modifications were incorporated. These include modifying the mount and stiffening the photometer and autoguider structures to reduce flexure. This paper compares the performance characteristics of each photometer and discusses tests to identify sources of systematic noise.
Sun, Zhi-Jing; Zhu, Lan; Liang, Maolian; Xu, Tao; Lang, Jing-He
2016-08-01
WeChat is a promising tool for capturing electronic data; however, no research has examined its use. This study evaluates the reliability and feasibility of WeChat for administering the Pelvic Floor Impact Questionnaire Short Form 7 to women with pelvic floor disorders. Sixty-eight women undergoing pelvic floor rehabilitation were recruited between June and December 2015 and randomized to two groups in a crossover design. All participants completed two questionnaire formats. One group completed the paper version followed by the WeChat version; the other group completed the questionnaires in reverse order. Two weeks later, each group completed the two versions in reverse order. The WeChat version's reliability was assessed using intraclass correlation coefficients and test-retest reliability. Forty-two women (61.8%) preferred the WeChat to the paper format, eight (11.8%) preferred the paper format, and 18 (26.5%) had no preference. The younger women preferred WeChat. Completion time was 116.5 (61.3) seconds for the WeChat version and 133.4 (107.0) seconds for the paper version, with no significant difference (P = 0.145). Age and education did not impact completion time (P > 0.05). Consistency between the WeChat and paper versions was excellent. The intraclass correlation coefficients of the Pelvic Floor Impact Questionnaire Short Form 7 and the three subscales ranged from 0.915 to 0.980. The Bland-Altman analysis and linear regression results also showed high consistency. The test-retest study had a Pearson's correlation coefficient of 0.908, demonstrating a strong correlation. WeChat-based questionnaires were well accepted by women with pelvic floor disorders and had good data quality and reliability.
ERIC Educational Resources Information Center
Yin, Yue; Olson, Judith; Olson, Melfried; Solvin, Hannah; Brandon, Paul R.
2015-01-01
This study compared two versions of professional development (PD) designed for teachers using formative assessment (FA) in mathematics classrooms that were networked with Texas Instruments Navigator (NAV) technology. Thirty-two middle school mathematics teachers were randomly assigned to one of the two groups: FA-then-NAV group and FA-and-NAV…
Code of Federal Regulations, 2014 CFR
2014-04-01
... submit a public version of a database in pdf format. The public version of the database must be publicly... interested party that files with the Department a request for an expedited antidumping review, an..., whichever is later. If the interested party that files the request is unable to locate a particular exporter...
Naval Research Lab Review 1999
1999-01-01
Center offers high-quality output from computer-generated files in EPS, PostScript, PICT, TIFF, Photoshop, and PowerPoint. Photographic-quality color... research described in this NRL Review can be obtained from the Public Affairs Office, Code 1230, (202) 767-2541. Information concerning Technology
1984-02-01
An Introduction to Geometric Programming - Patrick D. Allen and David W. Baker ... Space and Time ... Zarwyn, US Army Electronics R&D Command. GEOMETRIC PROGRAMMING; SPACE AND TIME ANALYSIS IN DYNAMIC PROGRAMMING ALGORITHMS ... The physical and parameter space can be connected by asymptotic matching. The purpose of the asymptotic analysis is to define the simplest problems
Atmospheric Science Data Center
2013-12-19
UAEMIAAE Aerosol product. (File version details) File version F07_0015 has better ... properties. File version F08_0016 has improved cloud screening procedure resulting in better aerosol optical depth. ... Coverage: August - October 2004 File Format: HDF-EOS Tools: FTP Access: Data Pool ...
ERIC Educational Resources Information Center
Schonfeld, Roger C.; King, Donald W.; Okerson, Ann; Fenton, Eileen Gifford
2004-01-01
Many academic and research libraries are in the midst of what may ultimately be a major transition for various parts of their collection--a shift from print to electronic format. Libraries that had long subscribed only to print versions of journals are, in increasing numbers, licensing electronic versions to replace the print. What effects will…
Sirota, Miroslav; Juanchich, Marie
2018-03-27
The Cognitive Reflection Test, measuring intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making, and beliefs. Across studies, the response format of CRT items sometimes differs, based on the assumed construct equivalence of tests with open-ended versus multiple-choice items (the equivalence hypothesis). Evidence and theoretical reasons, however, suggest that the cognitive processes measured by these response formats and their associated performances might differ (the nonequivalence hypothesis). We tested the two hypotheses experimentally by assessing the performance in tests with different response formats and by comparing their predictive and construct validity. In a between-subjects experiment (n = 452), participants answered stem-equivalent CRT items in an open-ended, a two-option, or a four-option response format and then completed tasks on belief bias, denominator neglect, and paranormal beliefs (benchmark indicators of predictive validity), as well as on actively open-minded thinking and numeracy (benchmark indicators of construct validity). We found no significant differences between the three response formats in the numbers of correct responses, the numbers of intuitive responses (with the exception of the two-option version, which had a higher number than the other tests), and the correlational patterns of the indicators of predictive and construct validity. All three test versions were similarly reliable, but the multiple-choice formats were completed more quickly. We speculate that the specific nature of the CRT items helps build construct equivalence among the different response formats. We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons. Supplementary materials and data are available at https://osf.io/mzhyc/ .
Application of a digital technique in evaluating the reliability of shade guides.
Cal, E; Sonugelen, M; Guneri, P; Kesercioglu, A; Kose, T
2004-05-01
There appears to be a need for a reliable method for quantification of tooth colour and analysis of shade. Therefore, the primary objective of this study was to show the applicability of graphic software in colour analysis and, secondly, to investigate the reliability of commercial shade guides produced by the same manufacturer, using this digital technique. After confirming the reliability and reproducibility of the digital method by using self-assessed coloured images, three shade guides from the same manufacturer were photographed in daylight and in studio environments with a digital camera and saved in Tagged Image File Format (TIFF). Colour analysis of each photograph was performed using the Adobe Photoshop 4.0 graphics program. Luminosity and red, green, and blue (L and RGB) values of each shade tab of each shade guide were measured, and the data were subjected to statistical analysis using a repeated-measures ANOVA. The L and RGB values of the images taken in daylight differed significantly from those of the images taken in the studio environment (P < 0.05). In both environments, the luminosity and red values of the shade tabs were significantly different from each other (P < 0.05). It was concluded that, when the environmental conditions were kept constant, the Adobe Photoshop 4.0 colour analysis program could be used to analyse the colour of images. On the other hand, the results revealed that the accuracy of shade tabs widely used in colour matching should be readdressed.
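As an illustration of the measurement step, mean L and RGB values for a region of a shade-tab image can also be extracted programmatically. The ROI coordinates are placeholders, and the Rec. 601 weighting is an assumption about what the Photoshop histogram reports as luminosity:

```python
import numpy as np
from PIL import Image

img = np.asarray(Image.open("shade_tab.tif").convert("RGB"), dtype=float)
region = img[100:200, 150:250]        # illustrative ROI on the shade tab

r = region[..., 0].mean()
g = region[..., 1].mean()
b = region[..., 2].mean()
luminosity = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 weighting
print(f"L={luminosity:.1f}  R={r:.1f}  G={g:.1f}  B={b:.1f}")
```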
Identification of stars and digital version of the catalogue of 1958 by Brodskaya and Shajn
NASA Astrophysics Data System (ADS)
Gorbunov, M. A.; Shlyapnikov, A. A.
2017-12-01
The following topics are considered: the identification of objects on search maps, the determination of their coordinates at the epoch 2000, and the conversion of the published version of the 1958 catalogue by Brodskaya and Shajn into a machine-readable format. Statistics for the photometric and spectral data from the original catalogue are presented. A digital version of the catalogue is described, as well as its presentation in HTML, VOTable, and AJS formats and the basic principles of working with it in the interactive application of the International Virtual Observatory, the Aladin Sky Atlas.
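As a sketch of the machine-readable conversion, a table with hypothetical column names can be written straight to VOTable with astropy for loading into the Aladin Sky Atlas:

```python
from astropy.table import Table

# Hypothetical rows and column names standing in for the digitized catalogue.
catalogue = Table(
    rows=[(1, 245.8921, 58.0412, 9.84, "B8"),
          (2, 245.9177, 58.1205, 10.32, "A0")],
    names=("ID", "RA_2000", "DEC_2000", "mag", "sp_type"),
)
catalogue.write("brodskaya_shajn_1958.vot", format="votable", overwrite=True)
```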
High-resolution seismic-reflection data offshore of Dana Point, southern California borderland
Sliter, Ray W.; Ryan, Holly F.; Triezenberg, Peter J.
2010-01-01
The U.S. Geological Survey collected high-resolution shallow seismic-reflection profiles in September 2006 in the offshore area between Dana Point and San Mateo Point in southern Orange and northern San Diego Counties, California. Reflection profiles were located to image folds and reverse faults associated with the San Mateo fault zone and high-angle strike-slip faults near the shelf break (the Newport-Inglewood fault zone) and at the base of the slope. Interpretations of these data were used to update the USGS Quaternary fault database and in shaking hazard models for the State of California developed by the Working Group for California Earthquake Probabilities. This cruise was funded by the U.S. Geological Survey Coastal and Marine Catastrophic Hazards project. Seismic-reflection data were acquired aboard the R/V Sea Explorer, which is operated by the Ocean Institute at Dana Point. A SIG ELC820 minisparker seismic source and a SIG single-channel streamer were used. More than 420 km of seismic-reflection data were collected. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats.
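A hedged sketch of rendering one archived SEG-Y transect as a raster image, analogous to the JPEG/TIFF images in the report; segyio is one reader choice, and the file name is a placeholder:

```python
import numpy as np
import segyio
import matplotlib.pyplot as plt

# Read all traces of a single-channel line (placeholder file name).
with segyio.open("line_01.sgy", ignore_geometry=True) as f:
    section = f.trace.raw[:]          # (traces, samples) array

clip = np.percentile(np.abs(section), 95)
plt.imshow(section.T, cmap="gray", aspect="auto", vmin=-clip, vmax=clip)
plt.xlabel("trace")
plt.ylabel("sample (two-way traveltime)")
plt.savefig("line_01.png", dpi=200)
```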
Integrated Multibeam and LIDAR Bathymetry Data Offshore of New London and Niantic, Connecticut
Poppe, L.J.; Danforth, W.W.; McMullen, K.Y.; Parker, Castle E.; Lewit, P.G.; Doran, E.F.
2010-01-01
Nearshore areas within Long Island Sound are of great interest to the Connecticut and New York research and resource management communities because of their ecological, recreational, and commercial importance. Although advances in multibeam echosounder technology permit the construction of high-resolution representations of sea-floor topography in deeper waters, limitations inherent in collecting fixed-angle multibeam data make using this technology in shallower waters (less than 10 meters deep) difficult and expensive. These limitations have often resulted in data gaps between areas for which multibeam bathymetric datasets are available and the adjacent shoreline. To address this problem, the geospatial data sets released in this report seamlessly integrate complete-coverage multibeam bathymetric data acquired off New London and Niantic Bay, Connecticut, with hydrographic Light Detection and Ranging (LIDAR) data acquired along the nearshore. The result is a more continuous sea floor representation and a much smaller gap between the digital bathymetric data and the shoreline than previously available. These data sets are provided online and on CD-ROM in Environmental Systems Research Institute (ESRI) raster-grid and GeoTIFF formats in order to facilitate access, compatibility, and utility.
Qi, Zhen; Tu, Li-Ping; Chen, Jing-Bo; Hu, Xiao-Juan; Xu, Jia-Tuo; Zhang, Zhi-Feng
2016-01-01
Background and Goal. The application of digital image processing techniques and machine learning methods to tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied. However, the outcomes are difficult to generalize because of a lack of color reproducibility and image standardization. Our study explores tongue color classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts were chosen to identify the selected tongue pictures, taken by the TDA-1 tongue imaging device in TIFF format and corrected through an ICC profile. We then compared the mean L*a*b* values of the different tongue colors and evaluated the performance of machine learning methods for tongue color classification. Results. The L*a*b* values of the five tongue colors are statistically different. The random forest method performed better than SVM in classification. The SMOTE algorithm can increase classification accuracy by correcting the imbalance among the color samples. Conclusions. Given standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible.
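A minimal sketch of the classification pipeline the abstract describes follows: RGB is converted to CIE L*a*b*, and a random forest is trained on SMOTE-balanced samples. The feature choice (mean L*a*b* per image) and all data below are illustrative assumptions, not the authors' code.

    import numpy as np
    from skimage import color
    from sklearn.ensemble import RandomForestClassifier
    from imblearn.over_sampling import SMOTE

    def mean_lab(rgb_image):
        """Mean L*, a*, b* of an RGB image with values scaled to [0, 1]."""
        return color.rgb2lab(rgb_image).reshape(-1, 3).mean(axis=0)

    # placeholder L*a*b* features and imbalanced colour labels standing in for real images
    rng = np.random.default_rng(0)
    X = rng.uniform([20.0, -10.0, 0.0], [80.0, 40.0, 40.0], size=(60, 3))
    y = rng.choice(["pale", "light red", "red", "dark red", "purple"], size=60)

    X_bal, y_bal = SMOTE(k_neighbors=1).fit_resample(X, y)    # balance the colour classes
    clf = RandomForestClassifier(n_estimators=200).fit(X_bal, y_bal)
    print(clf.predict([[65.0, 25.0, 15.0]]))                  # classify a new L*a*b* sample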
Theiler, R; Spielberger, J; Bischoff, H A; Bellamy, N; Huber, J; Kroesen, S
2002-06-01
The Western Ontario and McMaster Universities (WOMAC) Osteoarthritis Index is a previously described self-administered questionnaire covering three domains: pain, stiffness and function. It has been validated in patients with osteoarthritis (OA) of the hip or knee in a paper-based format. The aim of this study was to validate the WOMAC 3.0, using a numerical rating scale, in a computerized touch-screen format that allows immediate evaluation of the questionnaire. In the computerized version, cartoons and written and audio instructions were included to facilitate application. Fifty demographically balanced patients with radiographically proven primary hip or knee OA completed both the classical paper version and the new computerized WOMAC version. Subjects were randomized to complete either the paper format or the computerized format first, to balance possible order effects. The intraclass correlation coefficients for pain, stiffness and function were 0.915, 0.745 and 0.940, respectively. The Spearman correlation coefficients for pain, stiffness and function were 0.88, 0.77 and 0.87, respectively. These data indicate that the computerized WOMAC OA Index 3.0 is comparable to the paper WOMAC in all three dimensions. The computerized version would allow physicians to obtain an immediate result and, if available, a direct comparison with a previous examination. Copyright 2002 OsteoArthritis Research Society International. Published by Elsevier Science Ltd. All rights reserved.
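For readers unfamiliar with the agreement statistics reported here, the following minimal sketch computes a Spearman correlation on made-up paired scores from the two administration formats using SciPy.

    from scipy.stats import spearmanr

    paper    = [4, 7, 5, 9, 2, 6, 8, 3]    # hypothetical pain-subscale scores, paper format
    computer = [5, 7, 4, 9, 2, 6, 7, 3]    # the same patients, touch-screen format

    rho, p = spearmanr(paper, computer)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")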
Validation of the insomnia severity index as a web-based measure.
Thorndike, Frances P; Ritterband, Lee M; Saylor, Drew K; Magee, Joshua C; Gonder-Frederick, Linda A; Morin, Charles M
2011-01-01
Although the Insomnia Severity Index (ISI) is already administered online, this frequently used instrument has not been validated for Web delivery. This study compares online and paper-and-pencil ISI versions completed by participants in a randomized controlled trial testing an Internet-delivered intervention for insomnia. Forty-three adults with insomnia completed both ISI versions pre-intervention (Assessment 1) and post-intervention (Assessment 2). Correlations between total scores of the two versions were significant (rs ≥ .98, ps < .001). For both ISI versions, internal consistency was acceptable (Assessment 1, α = .61; Assessment 2, α ≥ .88). Among participants not receiving the parent study intervention, correlations between one format at Assessment 1 and the alternative format at Assessment 2 were generally significant (rs = .26-.82). Together, the findings suggest the ISI can be delivered online.
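Internal consistency of the kind reported above is usually Cronbach's alpha; a minimal NumPy implementation straight from the definition follows, with made-up item scores.

    import numpy as np

    def cronbach_alpha(items):
        """items: array of shape (n_respondents, n_items)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
        return k / (k - 1) * (1 - item_vars / total_var)

    scores = [[3, 2, 4, 3], [1, 1, 2, 1], [4, 3, 4, 4], [2, 2, 3, 2]]  # made-up item scores
    print(f"alpha = {cronbach_alpha(scores):.2f}")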
JAMI: a Java library for molecular interactions and data interoperability.
Sivade Dumousseau, M; Koch, M; Shrivastava, A; Alonso-López, D; De Las Rivas, J; Del-Toro, N; Combe, C W; Meldal, B H M; Heimbach, J; Rappsilber, J; Sullivan, J; Yehudi, Y; Orchard, S
2018-04-11
A number of different molecular interaction data download formats now exist, designed to allow access to these valuable data by diverse user groups. These formats include the PSI-XML and MITAB standard interchange formats developed by the Molecular Interaction workgroup of the HUPO-PSI, in addition to other, use-specific downloads produced by other resources. The onus is currently on the user to ensure that a piece of software is capable of reading and writing all necessary versions of each format. This problem may grow as data providers strive to meet ever more sophisticated user demands and data types. A collaboration between EMBL-EBI and the University of Cambridge has produced JAMI, a single library that unifies standard molecular interaction data formats such as PSI-MI XML and PSI-MITAB. JAMI, a free, open-source library, enables the development of molecular interaction computational tools and pipelines without the need to produce different versions of software to read different versions of the data formats. Software and tools developed on top of the JAMI framework are able to integrate and support both PSI-MI XML and PSI-MITAB. The use of JAMI avoids the need to chain conversions between formats in order to reach a desired output format, and it prevents code and unit-test duplication as the code becomes more modular. JAMI's model interfaces are abstracted from the underlying format, hiding the complexity and requirements of each data format from developers using JAMI as a library.
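JAMI itself is a Java library; the following Python sketch only illustrates the design principle the abstract describes, one format-independent interaction model with format-specific readers hidden behind it. All names are hypothetical and are not JAMI's actual API.

    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class Interaction:                      # format-independent model object
        participant_a: str
        participant_b: str
        detection_method: str

    class InteractionReader(ABC):
        @abstractmethod
        def read(self, path: str) -> list[Interaction]: ...

    class MitabReader(InteractionReader):   # MITAB is tab-separated text
        def read(self, path):
            records = []
            with open(path) as fh:
                for line in fh:
                    cols = line.rstrip("\n").split("\t")
                    # MITAB 2.5 columns 1, 2 and 7: interactors A and B, detection method
                    records.append(Interaction(cols[0], cols[1], cols[6]))
            return records

    READERS = {".txt": MitabReader}          # an XML reader would register the same way

    def load(path: str) -> list[Interaction]:
        ext = path[path.rfind("."):]
        return READERS[ext]().read(path)     # callers only ever see Interaction objects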
Seiter, John S; Weger, Harry
2005-04-01
Unlike televised debates that use a single-screen format, debates that use a split screen presenting both debaters simultaneously show viewers the nonverbal reactions of each speaker's opponent. The authors examined how appropriate or inappropriate such nonverbal behaviors are perceived to be. Students watched one of four versions of a televised debate. One version used a single-screen format, showing only the speaker, whereas the other three versions used a split-screen format in which the speaker's opponent displayed constant, occasional, or no nonverbal disagreement with the speaker. Students then rated the debaters' appropriateness. Analysis indicated that the opponent was perceived to be less appropriate when he displayed any background disagreement than when he did not. The students perceived the speaker as most appropriate when his opponent displayed constant nonverbal disagreement.
Cartographic Production for the FLaSH Map Study: Generation of Rugosity Grids, 2008
Robbins, Lisa L.; Knorr, Paul O.; Hansen, Mark
2010-01-01
Project Summary: This series of raster data is a U.S. Geological Survey (USGS) Data Series release from the Florida Shelf Habitat Project (FLaSH). This disc contains two raster images in Environmental Systems Research Institute, Inc. (ESRI) raster grid format, JPEG image format, and georeferenced Tagged Image File Format (GeoTIFF). Data are also provided in non-image ASCII format. Rugosity grids at two resolutions (250 m and 1000 m) were generated for West Florida shelf waters to 250 m depth using a custom algorithm that follows the methods of Valentine and others (2004). The Methods portion of this document describes the specific steps used to generate the raster images. Rugosity, also referred to as roughness, ruggedness, or the surface-area ratio (Riley and others, 1999; Wilson and others, 2007), is a visual and quantitative measurement of terrain complexity and a common variable in ecological habitat studies. The rugosity of an area can affect biota by influencing habitat, providing shelter from the elements, determining the quantity and type of living space, influencing the type and quantity of flora, and affecting predator-prey relationships by providing cover and concealment; as an expression of vertical relief, it can also influence local environmental conditions such as temperature and moisture. In the marine environment, rugosity can furthermore influence current flow rate and direction, increase the residence time of water in an area through eddying and current deflection, influence local water conditions such as chemistry, turbidity, and temperature, and influence the rate and nature of sedimentary deposition. State-of-the-art computer-mapping techniques and data-processing tools were used to develop shelf-wide raster and vector data layers. The Florida Shelf Habitat (FLaSH) Mapping Project (http://coastal.er.usgs.gov/flash) endeavors to locate available data, identify data gaps, synthesize existing information, and expand our understanding of geologic processes in our dynamic coastal and marine systems.
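A minimal sketch of a surface-area-ratio rugosity computation in the spirit of the methods cited above follows; the exact algorithm used for these grids differs in detail, and the toy bathymetry is made up.

    import numpy as np

    def rugosity(z, cell):
        """Surface area / planar area for depth grid `z` with square cells of size `cell` (m)."""
        dzdy, dzdx = np.gradient(z, cell)                     # slopes along rows and columns
        surface = np.sqrt(1.0 + dzdx**2 + dzdy**2) * cell**2  # true surface area per cell
        return surface.sum() / (cell**2 * z.size)             # ratio >= 1; 1 is perfectly flat

    z = np.random.default_rng(1).normal(-50.0, 2.0, size=(100, 100))  # toy bathymetry (m)
    print(f"rugosity = {rugosity(z, cell=250.0):.3f}")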
Diet History Questionnaire II: Size Formats
Two serving size formats are used on the NCI versions of the DHQ as shown below. Format 1 is used for nearly all serving size questions. Format 2 is used only in special cases, where 'never' is allowed as a response.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1982-01-01
Modifications, corrections, and the record format are provided for the machine-readable version of the 'Morphological Catalogue of Galaxies.' In addition to hundreds of individual corrections, a detailed comparison of the machine-readable version with the published catalogue resulted in the addition of 116 missing objects, the deletion of 10 duplicate records, and a format modification to increase storage efficiency.
Wong, Florence L.; Grim, Muriel S.
2015-01-01
Contours and derivative raster files of depth-to-basement, sediment-thickness, and bathymetry data for the area offshore of Washington, Oregon, and California are provided here as GIS-ready shapefiles and GeoTIFF files. The data were used to generate paper maps in 1992 and 1993 from 1984 surveys of the U.S. Exclusive Economic Zone by the U.S. Geological Survey for depth to basement and sediment thickness, and from older data for the bathymetry.
Documentation for the machine-readable character coded version of the SKYMAP catalogue
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1981-01-01
The SKYMAP catalogue is a compilation of astronomical data prepared primarily for purposes of attitude guidance for satellites. In addition to the SKYMAP Master Catalogue data base, a software package of data base management and utility programs is available. The tape version of the SKYMAP Catalogue, as received by the Astronomical Data Center (ADC), contains logical records consisting of a combination of binary and EBCDIC data. Certain character-coded data in each record are redundant in that the same data are present in binary form. In order to facilitate wider use of all SKYMAP data by the astronomical community, a formatted (character) version was prepared by eliminating all redundant character data and converting all binary data to character form. The character version of the catalogue is described. The document is intended to describe the formatted tape fully, so that users can process the data without problems and guesswork; it should be distributed with any character version of the catalogue.
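The conversion problem described here, mixed binary and EBCDIC records turned into plain characters, can be illustrated with Python's built-in EBCDIC codecs; the code page (cp037) and the toy record below are assumptions, not the actual SKYMAP layout.

    import struct

    record = b"\xc8\xc4" + struct.pack(">f", 4.83)   # toy record: EBCDIC text + binary float

    text = record[:2].decode("cp037")                # EBCDIC bytes 0xC8 0xC4 -> "HD"
    value, = struct.unpack(">f", record[2:6])        # big-endian binary field
    print(text, round(value, 2))                     # fully character-coded output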
Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide
NASA Technical Reports Server (NTRS)
Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman
1993-01-01
This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS) Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in December 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1982-01-01
The contents and format of the machine-readable version of the catalogue distributed by the Astronomical Data Center are described. Coding for the various scales and abbreviations used in the catalogue is tabulated, and certain revisions made to the machine version to improve storage efficiency and notation are discussed.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)
NASA Technical Reports Server (NTRS)
Pearson, R. W.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. The MIT X Window System is licensed by the Massachusetts Institute of Technology.
View Toward 'Vera Rubin Ridge' on Mount Sharp, Mars
2017-07-11
This look ahead from NASA's Curiosity Mars rover includes four geological layers to be examined by the mission, and higher reaches of Mount Sharp beyond the planned study area. The redder rocks of the foreground are part of the Murray formation. Pale gray rocks in the middle distance of the right half of the image are in the Clay Unit. A band between those terrains is "Vera Rubin Ridge." Rounded brown knobs beyond the Clay Unit are in the Sulfate Unit, beyond which lie higher portions of the mountain. The view combines six images taken with the rover's Mast Camera (Mastcam) on Jan. 24, 2017, during the 1,589th Martian day, or sol, of Curiosity's work on Mars, when the rover was still more than half a mile (about a kilometer) north of Vera Rubin Ridge. The panorama has been white-balanced so that the colors of the rock and sand materials resemble how they would appear under daytime lighting conditions on Earth. It spans from east-southeast on the left to south on the right. The Sol 1589 location was just north of the waypoint labeled "Ogunquit Beach" on a map of the area that also shows locations of the Murray formation, Vera Rubin Ridge, Clay Unit and Sulfate Unit. The ridge was informally named in early 2017 in memory of Vera Cooper Rubin (1928-2016), whose astronomical observations provided evidence for the existence of the universe's dark matter. Annotated and full resolution TIFF files are available at https://photojournal.jpl.nasa.gov/catalog/PIA21716
Lu, Hao; Papathomas, Thomas G; van Zessen, David; Palli, Ivo; de Krijger, Ronald R; van der Spek, Peter J; Dinjens, Winand N M; Stubbs, Andrew P
2014-11-25
In the prognosis and therapeutics of adrenal cortical carcinoma (ACC), the selection of the most proliferatively active areas (hotspots) within a slide and objective quantification of the immunohistochemical Ki67 Labelling Index (LI) are of critical importance. In addition to intratumoral heterogeneity in proliferative rate, i.e. levels of Ki67 expression within a given ACC, a lack of uniformity and reproducibility in the method of quantification of the Ki67 LI may confound an accurate assessment. We have implemented an open source toolset, Automated Selection of Hotspots (ASH), for automated hotspot detection and quantification of the Ki67 LI. ASH utilizes the NanoZoomer Digital Pathology Image (NDPI) splitter to convert the NDPI-format digital slide scanned from the Hamamatsu instrument into a conventional TIFF or JPEG image, followed by automated segmentation and an adaptive step-finding hotspot detection algorithm. Quantitative hotspot ranking is provided by functionality from the open source application ImmunoRatio as part of the ASH protocol. The output is a ranked set of hotspots with concomitant quantitative values based on whole-slide ranking. We have implemented an open source tool for automated detection and quantitative ranking of hotspots to support histopathologists in selecting the 'hottest' hotspot areas in adrenocortical carcinoma. To provide the wider community with easy access to ASH, we implemented a Galaxy virtual machine (VM) of ASH, which is available from http://bioinformatics.erasmusmc.nl/wiki/Automated_Selection_of_Hotspots . The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/13000_2014_216.
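A minimal sketch of the whole-slide preprocessing step follows, using OpenSlide (which reads Hamamatsu NDPI files) rather than the NDPI splitter the authors used; the file name and the chosen downsample are hypothetical.

    import openslide

    slide = openslide.OpenSlide("specimen.ndpi")               # hypothetical NDPI slide
    level = slide.get_best_level_for_downsample(32)            # low-power overview level
    region = slide.read_region((0, 0), level, slide.level_dimensions[level])
    region.convert("RGB").save("overview.tiff")                # conventional TIFF for analysis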
Facial affect interpretation in boys with attention deficit/hyperactivity disorder.
Boakes, Jolee; Chapman, Elaine; Houghton, Stephen; West, John
2008-01-01
Recent studies have produced mixed evidence of impairments in facial affect interpretation for children with attention deficit/hyperactivity disorder (ADHD). This study investigated the presence and nature of such impairments across different stimulus formats. Twenty-four boys with ADHD and 24 age-matched comparison boys completed a 72-trial task that included facial expressions of happiness, sadness, fear, anger, surprise, and disgust. Three versions of each expression were used: a static version, a dynamic version, and a dynamic version presented within a relevant situational context. Expressions were also presented in one of two portrayal modes (cartoon versus real-life). Results indicated significant impairments for boys with ADHD on two of the six emotions (fear and disgust), which were consistent across stimulus formats. Directions for further research to identify mediating factors in the expression of such impairments in children with ADHD are discussed.
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
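Two of the preprocessing steps MATtrack performs, contrast stretching and temporal smoothing, can be sketched with tifffile and scikit-image as follows; the file name, percentile limits, and window size are assumptions, not MATtrack's actual parameters.

    import numpy as np
    import tifffile
    from skimage import exposure

    stack = tifffile.imread("timelapse_multipage.tif").astype(float)   # shape (t, y, x)

    p2, p98 = np.percentile(stack, (2, 98))
    stretched = exposure.rescale_intensity(stack, in_range=(p2, p98))  # contrast stretching

    kernel = np.ones(3) / 3.0                                          # 3-frame moving average
    smoothed = np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode="same"), 0, stretched)   # temporal smoothing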
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
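A minimal sketch of the labelled-record idea follows, using h5py; it writes plain HDF5 in an SDA-like layout and does not reproduce the full SDA 1.0 metadata conventions, which the cited report defines.

    import h5py
    import numpy as np

    with h5py.File("archive.sda", "w") as f:
        rec = f.create_dataset("velocity trace", data=np.linspace(0.0, 1.0, 5))
        rec.attrs["RecordType"] = "numeric"      # a primitive record, labelled by name

    with h5py.File("archive.sda", "r") as f:
        print(f["velocity trace"][...], f["velocity trace"].attrs["RecordType"])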
Interformat reliability of digital psychiatric self-report questionnaires: a systematic review.
Alfonsson, Sven; Maathz, Pernilla; Hursti, Timo
2014-12-03
Research on Internet-based interventions typically uses digital versions of pen-and-paper self-report symptom scales. However, adaptation into the digital format could affect the psychometric properties of established self-report scales. Several studies have investigated differences between digital and pen-and-paper versions of instruments, but no systematic review of the results has yet been done. This review aims to assess the interformat reliability of self-report symptom scales used in digital or online psychotherapy research. Three databases (MEDLINE, Embase, and PsycINFO) were systematically reviewed for studies investigating the reliability between digital and pen-and-paper versions of psychiatric symptom scales. From a total of 1504 publications, 33 were included in the review, and the interformat reliability of 40 different symptom scales was assessed. Significant differences in mean total scores between formats were found in 10 of 62 analyses. These differences were found in just a few studies, which indicates that the results were due to study effects and sample effects rather than to unreliable instruments. The interformat reliability ranged from r=.35 to r=.99; however, the majority of instruments showed a strong correlation between format scores. The quality of the included studies varied, and several studies had insufficient power to detect small differences between formats. When digital versions of self-report symptom scales are compared with pen-and-paper versions, most scales show high interformat reliability. This supports the reliability of results obtained in psychotherapy research on the Internet and their comparability to traditional psychotherapy research. There are, however, some instruments that consistently show low interformat reliability, suggesting that these conclusions cannot be generalized to all questionnaires. Most studies had at least some methodological issues, with insufficient statistical power being the most common. Future studies should preferably describe the transformation of the instrument into digital format and the procedure for data collection in more detail.
75 FR 65359 - Common Formats for Patient Safety Data Collection and Event Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-22
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Common Formats..., and analyze confidential information regarding the quality and safety of healthcare delivery. The... Information Technology (HIT) Device format and the remaining Common Formats Version 1.1 can be accessed...
Dias, Juliana Chioda Ribeiro; Maroco, João; Campos, Juliana Alvares Duarte Bonini
2015-03-01
Online data collection is becoming increasingly common and has some advantages compared to traditional paper-and-pencil formats, such as reducing loss of data, increasing participants' privacy, and decreasing the effect of social desirability. However, the validity and reliability of this administration format must be established before results can be considered acceptable. The aim of this study was to evaluate the validity, reliability, and equivalence of paper-and-pencil and online versions of the Weight Concerns Scale (WCS) when applied to Brazilian university students. A crossover design was used, and the Portuguese version of the WCS (in both paper-and-pencil and online formats) was completed by 100 college students. The results indicated adequate fit in both formats. The simultaneous fit of data for both groups was excellent, with strong invariance between models. Adequate convergent validity, internal consistency, and mean score equivalence of the WCS in both formats were observed. Thus, the WCS presented adequate reliability and validity in both administration formats, with equivalence/stability between answers.
Fuzzy Arden Syntax: A fuzzy programming language for medicine.
Vetterlein, Thomas; Mandl, Harald; Adlassnig, Klaus-Peter
2010-05-01
The programming language Arden Syntax has been optimised for use in clinical decision support systems. We describe an extension of this language named Fuzzy Arden Syntax, whose original version was introduced in S. Tiffe's dissertation on "Fuzzy Arden Syntax: Representation and Interpretation of Vague Medical Knowledge by Fuzzified Arden Syntax" (Vienna University of Technology, 2003). The primary aim is to provide an easy means of processing vague or uncertain data, which frequently appear in medicine. Fuzzy equivalents have been added to Arden Syntax for both propositional and numerical data types. The Boolean data type was generalised to represent any truth degree between the two extremes 0 (falsity) and 1 (truth); fuzzy data types were introduced to represent fuzzy sets. The operations on truth values and real numbers were generalised accordingly. Because the conditions deciding whether a certain programme unit is executed may be indeterminate, a Fuzzy Arden Syntax programme may split; the data in the different branches may optionally be aggregated subsequently. Fuzzy Arden Syntax makes it convenient to formulate Medical Logic Modules (MLMs) based on the principle of continuously graded applicability of statements, and ad hoc decisions about sharp value boundaries can be avoided. As an illustrative example shows, an MLM making use of the features of Fuzzy Arden Syntax is not significantly more complex than its Arden Syntax equivalent; in the ideal case, a programme handling crisp data remains practically unchanged when compared with its fuzzified version. In the latter case, the output data, which can be a set of weighted alternatives, typically depend continuously on the input data. In typical applications an Arden Syntax MLM can produce a different output after only slight changes of the input; discontinuities are in fact unavoidable when the input varies continuously but the output is taken from a discrete set of possibilities. This inconvenience can, however, be attenuated by means of the mechanisms on which the programme flow under Fuzzy Arden Syntax is based. Writing a programme that makes use of these possibilities is not significantly more difficult than writing one according to the usual practice. 2010 Elsevier B.V. All rights reserved.
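The graded-truth idea can be sketched outside Arden Syntax as well; the following Python fragment, with an illustrative membership function and weights, shows how an indeterminate condition yields weighted alternatives rather than a sharp cut-off.

    def fever_degree(temp_c):
        """Fuzzy 'fever' truth: 0 below 37.0 C, 1 above 38.5 C, graded in between."""
        return min(1.0, max(0.0, (temp_c - 37.0) / 1.5))

    def recommend(temp_c):
        t = fever_degree(temp_c)                 # e.g. 37.9 C -> 0.6
        # the programme 'splits': both branches contribute, weighted by applicability
        return {"antipyretic": t, "observe only": 1.0 - t}

    print(recommend(37.9))   # weighted alternatives instead of a sharp cut-off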
1993-04-01
FREIGHT INVOICE (VERSION 003020): Formatting Invoice Information for the DoD Transportation Payment System Using the ASC X12.55 Transaction Set 859 Generic Freight Invoice. This EDI convention (859.003020) describes the formatting of a transportation invoice using the ASC X12.55 Transaction Set 859 Generic Freight Invoice (003020). It contains information for the design of interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1977-01-12
A code, SIGMA1, has been designed to Doppler broaden evaluated cross sections in the ENDF/B format. The code can only be applied to tabulated data that vary linearly in energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide to the code.
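The lin-lin restriction means that evaluating a tabulated cross section is plain linear interpolation; a minimal sketch with a made-up energy grid follows.

    import numpy as np

    energy = np.array([1.0e-5, 1.0, 1.0e3, 2.0e6])   # eV (made-up grid)
    sigma  = np.array([50.0, 20.0, 5.0, 2.0])        # barns

    def xs(e):
        return np.interp(e, energy, sigma)           # lin-lin between tabulated points

    print(xs(500.0))   # interpolated cross section at 500 eV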
IAU MDC Photographic Meteor Orbits Database: Version 2013
NASA Astrophysics Data System (ADS)
Neslušan, L.; Porubčan, V.; Svoreň, J.
2014-05-01
A new 2013 version of the IAU MDC photographic meteor orbits database, an upgrade of the 2003 version (Lindblad et al. 2003, EMP 93:249-260), is presented. An additional 292 orbits have been added to the 2003 version, so the new version of the database consists of 4,873 meteors with their geophysical and orbital parameters, compiled in 41 catalogues. For storing the data, a new format is applied that enables simpler handling of the parameters, including the errors of their determination. The data can be downloaded from the IAU MDC web site: http://www.astro.sk/IAUMDC/Ph2013/
GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Desjardins, M. L.
1994-01-01
GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and draw cross sections in the case of gridded data and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalars and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C-language and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100Mb of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11,R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1 which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. Data for rendering regional maps is included with this package. The standard distribution medium for the UNIX version of GEMPAK 5.1 is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the VMS version of GEMPAK 5.1 is a 6250 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VMS version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. This program was developed in 1985. The current version, GEMPAK 5.1, was released in 1992. The package is delivered with source code. An extensive collection of subroutine libraries allows users to format data for use by GEMPAK, to develop new programs, and to enhance existing ones.
Bowman, Caroline H; Evans, Cathryn E Y; Turnbull, Oliver H
2005-02-01
In the last decade, the Iowa Gambling Task (IGT) has become a widely employed neuropsychological research instrument for the investigation of executive function. The task has been employed in a wide range of formats, from 'manual' procedures to more recently introduced computerised versions. Computer-based formats often require that responses on the task be artificially delayed by a number of seconds between trials to collect skin-conductance data. Participants, however, may become frustrated when they want to select from a particular deck in the time-limited versions--so that an unintended emotional experience of frustration might well disrupt a task presumed to be reliant on emotion-based learning. We investigated the effect of the various types of Iowa Gambling Task format on performance, using three types of task: the classic manual administration, with no time limitations; a computerised administration with a 6-s enforced delay; and a control computerised version which had no time constraints. We also evaluated the subjective experience of participants on each task. There were no significant differences in performance between formats in behavioural terms. Subjective experience measures on the task also showed consistent effects across all three formats--with substantial, and rapidly developing, awareness of which decks were 'good' and 'bad.'
Mosaic of Digital Raster Soviet Topographic Maps of Afghanistan
Chirico, Peter G.; Warner, Michael B.
2005-01-01
EXPLANATION: The data contained in this publication include scanned, geographically referenced digital raster graphics (DRGs) of Soviet 1:200,000-scale topographic map quadrangles. The original Afghanistan topographic map series at 1:200,000 scale, for the entire country, was published by the Soviet military between 1985 and 1991 (MTDGS, 85-91). Hard copies of these original paper maps were scanned using a large-format scanner, reprojected into Geographic Coordinate System (GCS) coordinates, and then clipped to remove the map collars to create a seamless topographic map base for the entire country. An index of all available topographic map sheets is displayed here: Index_Geo_DD.pdf. This publication also includes the original topographic map quadrangles projected in the Universal Transverse Mercator (UTM) projection. The country of Afghanistan spans three UTM Zones: Zone 41, Zone 42, and Zone 43. Maps are stored as GeoTIFFs in their respective UTM zone projection. Indexes of all available topographic map sheets in their respective UTM zone are displayed here: Index_UTM_Z41.pdf, Index_UTM_Z42.pdf, Index_UTM_Z43.pdf. An Adobe Acrobat PDF file of the U.S. Department of the Army's Technical Manual 30-548 is available (U.S. Army, 1958). This document has been translated into English for assistance in reading Soviet topographic map symbols.
Managing multiple image stacks from confocal laser scanning microscopy
NASA Astrophysics Data System (ADS)
Zerbe, Joerg; Goetze, Christian H.; Zuschratter, Werner
1999-05-01
A major goal in neuroanatomy is to obtain precise information about the functional organization of neuronal assemblies and their interconnections. Therefore, the analysis of histological sections frequently requires high-resolution images in combination with an overview of the structure. To overcome this conflict we have previously introduced software for the automatic acquisition of multiple image stacks (3D-MISA) in confocal laser scanning microscopy. Here, we describe a Windows NT based software package for fast and easy navigation through the multiple image stacks (MIS browser), the visualization of individual channels and layers, and the selection of user-defined subregions. In addition, the MIS browser provides useful tools for the visualization and evaluation of the data volume, for instance brightness and contrast corrections of individual layers and channels. Moreover, it includes a maximum intensity projection, panning and zoom in/out functions within selected channels or focal planes (x/y), and tracking along the z-axis. The import module accepts any TIFF format and reconstructs the original image arrangement after the user has defined the sequence of images in x/y and z and the number of channels. The implemented export module allows storage of user-defined subregions (new single image stacks) for further 3D reconstruction and evaluation.
NASA Astrophysics Data System (ADS)
Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.
2017-12-01
The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural-hazard detection, monitoring, and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes four components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real-time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacities for undergraduate and graduate education in Earth system and climate sciences, and related applications, to understand the basic principles and technology in real-time applications with remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.
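The GIS-compatible GeoTIFF output mentioned above is essentially a raster array plus georeferencing metadata. A minimal sketch of that step using GDAL's Python bindings follows; the grid extent, resolution, and file names are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch: write a derived satellite product (a 2-D numpy array)
# as a GeoTIFF so GIS tools can georeference it. Extent and resolution are
# invented for illustration.
import numpy as np
from osgeo import gdal, osr

data = np.random.rand(1000, 1500).astype(np.float32)   # stand-in product grid

driver = gdal.GetDriverByName("GTiff")
ds = driver.Create("product.tif", data.shape[1], data.shape[0], 1,
                   gdal.GDT_Float32)

# Affine geotransform: (x_min, pixel_width, 0, y_max, 0, -pixel_height)
ds.SetGeoTransform((-110.0, 0.02, 0.0, 45.0, 0.0, -0.02))

srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)                 # WGS84 lat/lon
ds.SetProjection(srs.ExportToWkt())

band = ds.GetRasterBand(1)
band.WriteArray(data)
band.SetNoDataValue(-9999.0)
ds.FlushCache()
ds = None                                # close the dataset, flush to disk
```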
Sliter, Ray W.; Triezenberg, Peter J.; Hart, Patrick E.; Watt, Janet T.; Johnson, Samuel Y.; Scheirer, Daniel S.
2009-01-01
The U.S. Geological Survey (USGS) collected high-resolution shallow seismic-reflection and marine magnetic data in June 2008 in the offshore areas between the towns of Cayucos and Pismo Beach, Calif., from the nearshore (~6-m depth) to just west of the Hosgri Fault Zone (~200-m depth). These data are in support of the California State Waters Mapping Program and the Cooperative Research and Development Agreement (CRADA) between the Pacific Gas & Electric Co. and the U.S. Geological Survey. Seismic-reflection and marine magnetic data were acquired aboard the R/V Parke Snavely, using a SIG 2Mille minisparker seismic source and a Geometrics G882 cesium-vapor marine magnetometer. More than 550 km of seismic and marine magnetic data were collected simultaneously along shore-perpendicular transects spaced 800 m apart, with an additional 220 km of marine magnetometer data collected across the Hosgri Fault Zone, resulting in line spacing locally as small as 400 m. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats, as well as preliminary gridded marine-magnetic-anomaly and residual-magnetic-anomaly (shallow magnetic source) maps.
47 CFR 11.56 - Obligation to process CAP-formatted EAS messages.
Code of Federal Regulations, 2013 CFR
2013-10-01
...), and Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1...) “Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1.0” (Oct...
47 CFR 11.56 - Obligation to process CAP-formatted EAS messages.
Code of Federal Regulations, 2012 CFR
2012-10-01
...), and Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1...) “Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1.0” (Oct...
47 CFR 11.56 - Obligation to process CAP-formatted EAS messages.
Code of Federal Regulations, 2014 CFR
2014-10-01
...), and Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1...) “Common Alerting Protocol, v. 1.2 USA Integrated Public Alert and Warning System Profile Version 1.0” (Oct...
Study on generation and sharing of on-demand global seamless data—Taking MODIS NDVI as an example
NASA Astrophysics Data System (ADS)
Shen, Dayong; Deng, Meixia; Di, Liping; Han, Weiguo; Peng, Chunming; Yagci, Ali Levent; Yu, Genong; Chen, Zeqiang
2013-04-01
By applying the advanced Geospatial Data Abstraction Library (GDAL) and BigTIFF technology in a Geographical Information System (GIS) with Service Oriented Architecture (SOA), this study has derived global datasets from tile-based input data and implemented a Virtual Web Map Service (VWMS) and a Virtual Web Coverage Service (VWCS) to provide software tools for visualization and acquisition of global data. Taking the MODIS Normalized Difference Vegetation Index (NDVI) as an example, this study demonstrates the feasibility and efficiency of the proposed approach.
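The tile-to-global-mosaic step described above can be imitated with GDAL's Python bindings: build a lightweight virtual mosaic over the tiles, then write it out with BigTIFF enabled so the result may exceed the classic 4 GB TIFF limit. A hedged sketch; the tile file names are placeholders, not the study's actual MODIS NDVI granules.

```python
# Stitch per-tile GeoTIFFs into one global BigTIFF mosaic (illustrative).
import glob
from osgeo import gdal

tiles = sorted(glob.glob("ndvi_tiles/*.tif"))      # hypothetical tile files
vrt = gdal.BuildVRT("global_ndvi.vrt", tiles)      # virtual mosaic, no copy yet
gdal.Translate(
    "global_ndvi.tif",
    vrt,
    creationOptions=["BIGTIFF=YES", "COMPRESS=DEFLATE", "TILED=YES"],
)
vrt = None                                         # flush and close the VRT
```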
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...
Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki
2004-12-01
Medical Markup Language (MML), as a set of standards, has been developed over the last 8 years to allow the exchange of medical data between different medical information providers. MML Version 2.21 used XML as a metalanguage and was announced in 1999. In 2001, MML was updated to Version 2.3, which contained 12 modules. The latest version--Version 3.0--is based on the HL7 Clinical Document Architecture (CDA). During the development of this new version, the structure of MML Version 2.3 was analyzed, subdivided into several categories, and redefined so the information defined in MML could be described in HL7 CDA Level One. As a result of this development, it has become possible to exchange MML Version 3.0 medical documents via HL7 messages.
Excoffier, Laurent; Lischer, Heidi E L
2010-05-01
We present here a new version of the Arlequin program, available in three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. The command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.
GREET 1.5 : transportation fuel-cycle model. Vol. 1 : methodology, development, use, and results.
DOT National Transportation Integrated Search
1999-10-01
This report documents the development and use of the most recent version (Version 1.5) of the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model. The model, developed in a spreadsheet format, estimates the full fuel...
Equivalence of electronic and paper-based patient-reported outcome measures.
Campbell, Niloufar; Ali, Faraz; Finlay, Andrew Y; Salek, Sam S
2015-08-01
Electronic formats (ePROs) of paper-based patient-reported outcomes (PROs) should be validated before they can be reliably used. This review aimed to examine studies investigating measurement equivalence between ePROs and their paper originals to identify methodologies used and to determine the extent of such validation. Three databases (OvidSP, Web of Science and PubMed) were searched using a set of keywords. Results were examined for compliance with inclusion criteria. Articles or abstracts that directly compared screen-based electronic versions of PROs with their validated paper-based originals, with regard to their measurement equivalence, were included. Publications were excluded if the only instruments reported were stand-alone visual analogue scales or interactive voice response formats. Papers published before 2007 were excluded, as a previous meta-analysis examined papers published before this time. Fifty-five studies investigating 79 instruments met the inclusion criteria. 53 % of the 79 instruments studied were condition specific. Several instruments, such as the SF-36, were reported in more than one publication. The most frequently reported formats for ePROs were Web-based versions. In 78 % of the publications, there was evidence of equivalence or comparability between the two formats as judged by study authors. Of the 30 publications that provided preference data, 87 % found that overall participants preferred the electronic format. When examining equivalence between paper and electronic versions of PROs, formats are usually judged by authors to be equivalent. Participants prefer electronic formats. This literature review gives encouragement to the further widespread development and use of ePROs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cullen, D.E.
1978-07-04
The code SIGMA1 Doppler broadens evaluated cross sections in the ENDF/B format. The code can be applied only to data that vary as a linear function of energy and cross section between tabulated points. This report describes the methods used in the code and serves as a user's guide to the code. 6 figures, 2 tables.
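The idea behind such broadening can be illustrated numerically. The sketch below smears a linear-linear tabulated cross section with an energy-dependent Gaussian of the standard Doppler width; this is a heavily simplified illustration, not SIGMA1's exact kernel-broadening integral, and the function name and grid density are invented.

```python
# Crude Doppler-broadening illustration for a lin-lin tabulated cross section.
import numpy as np

def doppler_broaden(E, sigma, T, A, n_fine=5000):
    """Broaden sigma(E) [barns], tabulated lin-lin on ascending grid E [eV],
    to temperature T [K] for a target of mass ratio A (target/neutron)."""
    k = 8.617333e-5                              # Boltzmann constant [eV/K]
    Ef = np.linspace(E[0], E[-1], n_fine)        # dense uniform grid
    sf = np.interp(Ef, E, sigma)                 # lin-lin interpolation, as the data require
    out = np.empty_like(Ef)
    for i, e in enumerate(Ef):
        # Gaussian Doppler width; floor at the grid spacing to stay stable
        width = max(np.sqrt(4.0 * e * k * T / A), Ef[1] - Ef[0])
        w = np.exp(-(((Ef - e) / width) ** 2))
        out[i] = np.dot(w, sf) / w.sum()         # normalized smearing
    return Ef, out
```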
USSAERO version D computer program development using ANSI standard FORTRAN 77 and DI-3000 graphics
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1986-01-01
The D version of the Unified Subsonic Supersonic Aerodynamic Analysis (USSAERO) program is the result of numerous modifications and enhancements to the B01 version. These changes include conversion to ANSI standard FORTRAN 77; use of the DI-3000 graphics package; removal of the overlay structure; a revised input format; the addition of an input data analysis routine; and an increase in the number of aeronautical components allowed.
Berent, Jarosław
2007-01-01
This paper presents the new DNAStat version 1.2 for processing genetic profile databases and biostatistical calculations. This new version contains, besides all the options of its predecessor 1.0, an option to export calculation-results files in .xls format for Microsoft Office Excel, as well as the option of importing/exporting the population base of systems as .txt files for processing in Microsoft Notepad or EditPad.
GEMPAK 5.1 - A GENERAL METEOROLOGICAL PACKAGE (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Desjardins, M. L.
1994-01-01
GEMPAK is a general meteorological software package developed at NASA/Goddard Space Flight Center. It includes programs to analyze and display surface, upper-air, and gridded data, including model output. There are very general programs to list, edit, and plot data on maps, to display profiles and time series, to draw and fill contours, to draw streamlines, to plot symbols for clouds, sky cover, and pressure tendency, and to draw cross sections of gridded and sounding data. In addition, there are Barnes objective analysis programs to grid surface and upper-air data. The programs include the capabilities to derive meteorological parameters from those found in the dataset, to perform vertical interpolations of sounding data to different coordinate systems, and to compute an extensive set of gridded diagnostic quantities by specifying various nested combinations of scalar and vector arithmetic, algebraic, and differential operators. The GEMPAK 5.1 graphics/transformation subsystem, GEMPLT, provides device-independent graphics. GEMPLT also has the capability to display output in a variety of map projections or overlaid on satellite imagery. GEMPAK 5.1 is written in FORTRAN 77 and C and has been implemented on VAX computers under VMS and on computers running the UNIX operating system. During installation and normal use, this package occupies approximately 100 MB of hard disk space. The UNIX version of GEMPAK includes drivers for several graphic output systems including MIT's X Window System (X11,R4), Sun GKS, PostScript (color and monochrome), Silicon Graphics, and others. The VMS version of GEMPAK also includes drivers for several graphic output systems including PostScript (color and monochrome). The VMS version is delivered with the object code for the Transportable Applications Environment (TAE) program, version 4.1, which serves as a user interface. A color monitor is recommended for displaying maps on video display devices. Data for rendering regional maps are included with this package. The standard distribution medium for the UNIX version of GEMPAK 5.1 is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the VMS version of GEMPAK 5.1 is a 6250 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VMS version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. This program was developed in 1985. The current version, GEMPAK 5.1, was released in 1992. The package is delivered with source code. An extensive collection of subroutine libraries allows users to format data for use by GEMPAK, to develop new programs, and to enhance existing ones.
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (HP9000/7XX VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements enable the engineer to give an exact description of an orbit; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach in the design of orbital trajectories and the planning of maneuvers. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second degree central body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
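Why mean elements propagate so quickly can be seen from the first-order secular J2 rates, which give closed-form drifts of the node and perigee with no numerical integration. The following is a textbook sketch, not OSMEAN's full first-order theory (which also models higher zonals and lunar-solar perturbations):

```python
# Secular J2 drift rates for mean classical elements (textbook first order).
import math

MU = 398600.4418      # Earth GM [km^3/s^2]
RE = 6378.137         # Earth equatorial radius [km]
J2 = 1.08263e-3

def j2_secular_rates(a, e, i):
    """Secular rates (rad/s) of RAAN and argument of perigee for mean
    elements: semimajor axis a [km], eccentricity e, inclination i [rad]."""
    n = math.sqrt(MU / a**3)                  # mean motion
    p = a * (1.0 - e**2)                      # semilatus rectum
    factor = 1.5 * n * J2 * (RE / p) ** 2
    raan_dot = -factor * math.cos(i)
    argp_dot = 0.5 * factor * (5.0 * math.cos(i) ** 2 - 1.0)
    return raan_dot, argp_dot

# A sun-synchronous-like orbit drifts its node roughly one degree per day:
rd, _ = j2_secular_rates(7078.0, 0.001, math.radians(98.6))
print(math.degrees(rd) * 86400.0, "deg/day")
```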
OSMEAN - OSCULATING/MEAN CLASSICAL ORBIT ELEMENTS CONVERSION (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Guinn, J. R.
1994-01-01
OSMEAN is a sophisticated FORTRAN algorithm that converts between osculating and mean classical orbit elements. Mean orbit elements are advantageous for trajectory design and maneuver planning since they can be propagated very quickly; however, mean elements cannot describe the exact orbit at any given time. Osculating elements enable the engineer to give an exact description of an orbit; however, computation costs are significantly higher due to the numerical integration procedure required for propagation. By calculating accurate conversions between osculating and mean orbit elements, OSMEAN allows the engineer to exploit the advantages of each approach in the design of orbital trajectories and the planning of maneuvers. OSMEAN is capable of converting mean elements to osculating elements or vice versa. The conversion is based on modelling of all first order aspherical and lunar-solar gravitation perturbations as well as a second-order aspherical term based on the second degree central body zonal perturbation. OSMEAN is written in FORTRAN 77 for HP 9000 series computers running HP-UX (NPO-18796) and DEC VAX series computers running VMS (NPO-18741). The HP version requires 388K of RAM for execution and the DEC VAX version requires 254K of RAM for execution. Sample input and output are listed in the documentation. Sample input is also provided on the distribution medium. The standard distribution medium for the HP 9000 series version is a .25 inch streaming magnetic IOTAMAT tape cartridge in UNIX tar format. It is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format or on a 3.5 inch diskette in UNIX tar format. The standard distribution medium for the DEC VAX version is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. It is also available on a TK50 tape cartridge in DEC VAX BACKUP format. OSMEAN was developed on a VAX 6410 in 1989, and was ported to the HP 9000 series platform in 1991. It is a copyrighted work with all copyright vested in NASA.
NASA Astrophysics Data System (ADS)
Bhattacharya, R.; Liberty, L. M.; Almeida, R. V.; Hubbard, J.
2016-12-01
We explore the structural and depositional evolution of the Stevenson Basin, Gulf of Alaska, from a dense network of 2-D marine seismic profiles that span the Gulf of Alaska continental margin. The grid of 71 seismic profiles was acquired as part of a 1975 Mineral Management Services (MMS) exploration project to assess basin architecture along the Alaska continental shelf. We obtained unmigrated and stacked seismic profiles in TIFF format. We converted the data to SEGY format and migrated each profile. Within the Stevenson Basin, we identify key seismic horizons, including the regional Eocene-Miocene unconformity, that provide insights into its depositional and structural history. Using these observations combined with stacking velocities, sonic logs from wells, and refraction velocities from the Edge profile of Ye et al. (1997), we develop a local 3D velocity model that we use to depth-convert the seismic reflection profiles. By using ties to >2.5 km deep exploration wells, we note the Stevenson Basin is one of many Eocene and younger depocenters that span the forearc between Kodiak and Prince William Sound. Well logs and seismic data suggest basal strata consist of Eocene sediments that are unconformably overlain by Neogene and younger strata. Faults that breach the sea floor suggest active deformation within and at the bounds of this basin, including on new faults that do not follow any pre-existing structural trends. This assessment is consistent with slip models that place tsunamigenic faults that ruptured during the 1964 Great Alaska earthquake in the vicinity of the basin. The catalog of faults, their slip history, and the depositional evolution of the Stevenson Basin all suggest that the basin evolution may be controlled by heterogeneities along the incoming plate.
LVFS: A Big Data File Storage Bridge for the HPC Community
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.
2015-12-01
Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures, and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture needed for the future HPC community. First, it allows for the seamless integration of new and emerging hardware technologies which are significantly different from existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and a near doubling of storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts HDF data into GeoTIFF for visualization, using CO2 surface fluxes produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite.
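The HDF-to-GeoTIFF conversion that LVFS performs transparently can be imitated offline with GDAL. A hedged sketch: the HDF5 subdataset path, output bounds, and file names below are invented placeholders (real OCO-2 product names differ), and this converts one grid at a time rather than virtualizing the file system.

```python
# One-shot HDF5 subdataset -> GeoTIFF conversion (illustrative names).
from osgeo import gdal

src = gdal.Open('HDF5:"fluxes.h5"://co2_flux')     # hypothetical subdataset
gdal.Translate(
    "co2_flux.tif",
    src,
    format="GTiff",
    outputSRS="EPSG:4326",                         # assumed lat/lon grid
    outputBounds=[-180.0, 90.0, 180.0, -90.0],     # assumed global extent
    creationOptions=["COMPRESS=DEFLATE"],
)
src = None
```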
TRMM Version 7 Near-Realtime Data Products
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz; Kelley, Owen
2012-01-01
The TRMM data system has been providing near-realtime (NRT) data products to the community since late 1999. While the TRMM project never had near-realtime production requirements, the science and applications communities had a great interest in receiving TRMM data as quickly as possible. As a result, these NRT data are provided on a best-effort basis, with the objective of having the swath data products available within three hours of data collection 90% of the time. In July of 2011 the Joint Precipitation Measurement Missions Science Team (JPST) authorized the reprocessing of TRMM mission data using the new version 7 algorithms. The reprocessing of the 14+ years of the mission was concluded within 30 days. The version 7 algorithms had substantial changes in the data product file formats, both for data and metadata. In addition, the algorithms themselves had major modifications and improvements. The general approach to moving the NRT products to a new version is to wait for the regular production algorithms to have run for a while and shake out any issues that might arise from the new version before updating the NRT products. Because of the substantial changes in data/metadata formats as well as the algorithm improvements themselves, the update of NRT to V7 followed an even more conservative path than usual. This was done to ensure that applications agencies and other users of the TRMM NRT would not be faced with short timeframes for conversion to the new format. This paper will describe the process by which the TRMM NRT was updated to V7 and the V7 data products themselves.
Suggested Format for Acute Toxicity Studies
This document suggests the format for final reports on pesticide studies (right column of the tables in the document) and provides instructions for the creation of PDF Version 1.3 electronic submission documents (left column of the tables).
Documentation for the machine-readable version of the Henry Draper Catalogue (edition 1985)
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.
1985-01-01
An updated, corrected and extended machine-readable version of the catalog is described. Published and unpublished errors discovered in the previous version were corrected; letters indicating supplemental stars in the BD have been moved to a new byte to distinguish them from double-star components; and the machine-readable portion of The Henry Draper Extension (HDE) (HA 100) was converted to the same format as the main catalog, with additional data added as necessary.
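Machine-readable catalogs of this era are fixed-width records, with each field at a fixed byte position; the mention of moving a flag "to a new byte" refers to exactly this layout. A sketch of how such a record is parsed follows; the column positions and field names are invented for illustration, so the actual HD catalog documentation must be consulted for the real byte layout.

```python
# Parse fixed-width catalog records (hypothetical column positions).
def parse_record(line):
    return {
        "hd_number": int(line[0:6]),          # catalog running number
        "ra_1900":   line[6:14].strip(),      # hypothetical position fields
        "dec_1900":  line[14:22].strip(),
        "ptm_mag":   line[22:27].strip(),     # photometric magnitude
        "spectype":  line[27:30].strip(),     # spectral type
        "suppl":     line[30:31].strip(),     # supplemental-star flag byte
    }

with open("hd_catalog.dat") as f:             # placeholder file name
    stars = [parse_record(line) for line in f]
```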
CT Scans of Cores Metadata, Barrow, Alaska 2015
Katie McKnight; Tim Kneafsey; Craig Ulrich
2015-03-11
Individual ice cores were collected from the Barrow Environmental Observatory in Barrow, Alaska, throughout 2013 and 2014. Cores were drilled along different transects to sample polygonal features (i.e., the trough, center, and rim of high-, transitional-, and low-center polygons). Most cores were drilled to around 1 meter depth; a few deep cores were drilled to around 3 meters. Three-dimensional images of the frozen cores were constructed using a medical X-ray computed tomography (CT) scanner. The TIFF files can be loaded into ImageJ (open-source imaging software) to examine soil structure and densities within each core.
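Besides ImageJ, such TIFF stacks can be inspected programmatically. A minimal sketch using the tifffile package; the file name is a placeholder and the density proxy is only illustrative, since CT values need calibration before quantitative use.

```python
# Load a multi-page TIFF stack of CT slices and take a quick axial profile.
import numpy as np
import tifffile

volume = tifffile.imread("core_scan.tif")   # (slices, rows, cols) for a multi-page TIFF
print(volume.shape, volume.dtype)

# Crude per-slice mean intensity as a density proxy along the core axis:
profile = volume.mean(axis=(1, 2))
print(profile[:10])
```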
Computer Simulation Performed for Columbia Project Cooling System
NASA Technical Reports Server (NTRS)
Ahmad, Jasim
2005-01-01
This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.
A UNIMARC Bibliographic Format Database for ABCD
ERIC Educational Resources Information Center
Megnigbeto, Eustache
2012-01-01
Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD.…
The Development of the Planet Formation Concept Inventory: A Preliminary Analysis of Version 1
NASA Astrophysics Data System (ADS)
Simon, Molly; Impey, Chris David; Buxner, Sanlyn
2018-01-01
The topic of planet formation is poorly represented in the educational literature, especially at the college level. As recently as 2014, when developing the Test of Astronomy Standards (TOAST), Slater (2014) noted that for two topics (formation of the Solar System and cosmology), “high quality test items that reflect our current understanding of students’ conceptions were not available [in the literature]” (Slater, 2014, p. 8). Furthermore, nearly half of ASTR 101 enrollments are at 2-year/community colleges where both instructors and students have little access to current research and models of planet formation. In response, we administered six student-supplied response (SSR) short-answer questions on the topic of planet formation to n = 1,050 students enrolled in introductory astronomy and planetary science courses at The University of Arizona in the Fall 2016 and Spring 2017 semesters. After analyzing and coding the data from the SSR questions, we developed a preliminary version of the Planet Formation Concept Inventory (PFCI). The PFCI is a multiple-choice instrument with 20 planet-formation questions and 4 demographic questions. We administered version 1 of the PFCI to six introductory astronomy and planetary science courses (n ~ 700 students) during the Fall 2017 semester. We provided students with 7-8 multiple-choice with explanation of reasoning (MCER) questions from the PFCI. Students selected an answer (similar to a traditional multiple-choice test), and then briefly explained why they chose the answer they did. We also conducted interviews with ~15 students to receive feedback on the quality of the questions and the clarity of the instrument. We will present an analysis of the MCER responses and student interviews, and discuss any modifications that will be made to the instrument as a result.
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITH NASADIG)
NASA Technical Reports Server (NTRS)
Anderson, G. E.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
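The internode radiation interchange at the heart of TRASYS can be illustrated with a toy computation: for a closed black-body enclosure, the net radiative power into node i is Q_i = σ A_i Σ_j F_ij (T_j⁴ − T_i⁴). The sketch below uses an invented three-node geometry and ignores the gray/specular surfaces, shadowing, and orbital heat fluxes that TRASYS actually handles.

```python
# Toy internode radiation interchange for a closed black-body enclosure.
import numpy as np

SIGMA = 5.670374419e-8                 # Stefan-Boltzmann [W m^-2 K^-4]

A = np.array([1.0, 1.0, 2.0])          # node areas [m^2] (invented)
F = np.array([[0.0, 0.4, 0.6],         # form factors; rows sum to 1, and
              [0.4, 0.0, 0.6],         # reciprocity A_i F_ij = A_j F_ji holds
              [0.3, 0.3, 0.4]])        # node 3 is concave and sees itself
T = np.array([350.0, 300.0, 250.0])    # node temperatures [K]

Q = SIGMA * A * (F @ T**4 - T**4)      # net radiative power into each node [W]
print(Q, Q.sum())                      # sum ~ 0: energy conserved in a closed enclosure
```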
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (CRAY VERSION WITH NASADIG)
NASA Technical Reports Server (NTRS)
Anderson, G. E.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITHOUT NASADIG)
NASA Technical Reports Server (NTRS)
Vogt, R. A.
1994-01-01
The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data. The flexible structure of TRASYS allows considerable freedom in the definition and choice of solution method for a thermal radiation problem. The program's flexible structure has also allowed TRASYS to retain the same basic input structure as the authors update it in order to keep up with changing requirements. Among its other important features are the following: 1) up to 3200 node problem size capability with shadowing by intervening opaque or semi-transparent surfaces; 2) choice of diffuse, specular, or diffuse/specular radiant interchange solutions; 3) a restart capability that minimizes recomputing; 4) macroinstructions that automatically provide the executive logic for orbit generation that optimizes the use of previously completed computations; 5) a time variable geometry package that provides automatic pointing of the various parts of an articulated spacecraft and an automatic look-back feature that eliminates redundant form factor calculations; 6) capability to specify submodel names to identify sets of surfaces or components as an entity; and 7) subroutines to perform functions which save and recall the internodal and/or space form factors in subsequent steps for nodes with fixed geometry during a variable geometry run. There are two machine versions of TRASYS v27: a DEC VAX version and a Cray UNICOS version. Both versions require installation of the NASADIG library (MSC-21801 for DEC VAX or COS-10049 for CRAY), which is available from COSMIC either separately or bundled with TRASYS. 
The NASADIG (NASA Device Independent Graphics Library) plot package provides a pictorial representation of input geometry, orbital/orientation parameters, and heating rate output as a function of time. NASADIG supports Tektronix terminals. The CRAY version of TRASYS v27 is written in FORTRAN 77 for batch or interactive execution and has been implemented on CRAY X-MP and CRAY Y-MP series computers running UNICOS. The standard distribution medium for MSC-21959 (CRAY version without NASADIG) is a 1600 BPI 9-track magnetic tape in UNIX tar format. The standard distribution medium for COS-10040 (CRAY version with NASADIG) is a set of two 6250 BPI 9-track magnetic tapes in UNIX tar format. Alternate distribution media and formats are available upon request. The DEC VAX version of TRASYS v27 is written in FORTRAN 77 for batch execution (only the plotting driver program is interactive) and has been implemented on a DEC VAX 8650 computer under VMS. Since the source codes for MSC-21030 and COS-10026 are in VAX/VMS text library files and DEC Command Language files, COSMIC will only provide these programs in the following formats: MSC-21030, TRASYS (DEC VAX version without NASADIG) is available on a 1600 BPI 9-track magnetic tape in VAX BACKUP format (standard distribution medium) or in VAX BACKUP format on a TK50 tape cartridge; COS-10026, TRASYS (DEC VAX version with NASADIG), is available in VAX BACKUP format on a set of three 6250 BPI 9-track magnetic tapes (standard distribution medium) or a set of three TK50 tape cartridges in VAX BACKUP format. TRASYS was last updated in 1993.
Effectiveness of Electronic Textbooks with Embedded Activities on Student Learning
ERIC Educational Resources Information Center
Porter, Paula L.
2010-01-01
Current versions of electronic textbooks mimic the format and structure of printed textbooks; however, the electronic capabilities of these new versions of textbooks offer the potential of embedding interactive features of web-based learning within the context of a textbook. This dissertation research study was conducted to determine if student…
ERIC Educational Resources Information Center
Abramson, Theodore; Kagen, Edward
This study investigated attribute-by-treatment interactions between prior familiarity and response mode to programmed materials for college-level subjects by manipulating subjects' familiarity. The programs were a revised version of Diagnosis of Myocardial Infarction in standard format and in a reading version. Materials to familiarize subjects…
RELEASE NOTES FOR MODELS-3 VERSION 4.1 PATCH: SMOKE TOOL AND FILE CONVERTER
This software patch to the Models-3 system corrects minor errors in the Models-3 framework, provides substantial improvements in the ASCII to I/O API format conversion of the File Converter utility, and adds new functionalities to the SMOKE Tool. Version 4.1 of the Models-3 system...
Data files from the Grays Harbor Sediment Transport Experiment Spring 2001
Landerman, Laura A.; Sherwood, Christopher R.; Gelfenbaum, Guy; Lacy, Jessica; Ruggiero, Peter; Wilson, Douglas; Chisholm, Tom; Kurrus, Keith
2005-01-01
This publication consists of two DVD-ROMs, both of which are presented here. This report describes data collected during the Spring 2001 Grays Harbor Sediment Transport Experiment, and provides additional information needed to interpret the data. Two DVDs accompany this report; both contain documentation in HTML format that assists the user in navigating through the data. DVD-ROM-1 contains a digital version of this report in .pdf format, raw Aquatec acoustic backscatter (ABS) data in .zip format, sonar data files in .avi format, and coastal processes and morphology data in ASCII format. ASCII data files are provided in .zip format; bundled coastal processes ASCII files are separated by deployment and instrument; bundled morphology ASCII files are separated into monthly data collection efforts containing the beach profiles collected (or extracted from the surface map) at that time; weekly surface maps are also bundled together. DVD-ROM-2 contains a digital version of this report in .pdf format, the binary data files collected by the SonTek instrumentation, calibration files for the pressure sensors, and Matlab m-files for loading the ABS data into Matlab and cleaning up the optical backscatter (OBS) burst time-series data.
NASA Technical Reports Server (NTRS)
Roman, N. G.; Warren, W. H., Jr.
1984-01-01
The machine-readable, character-coded version of the catalog, as it is currently being distributed from the Astronomical Data Center (ADC), is described. The format and data provided in the magnetic tape version differ somewhat from those of the published catalog, which was also produced from a tape prepared at the ADC. The primary catalog data are positions and proper motions (equinox 1950.0) for 14597 stars.
NASA Astrophysics Data System (ADS)
Blaschek, Michael; Gerken, Daniel; Ludwig, Ralf; Duttmann, Rainer
2015-04-01
Geoportals are important elements of spatial data infrastructures (SDIs) that are strongly based on GIS-related web services. These services distribute, document and visualize (spatial) data in a standardized manner, an important but challenging task, especially in large scientific projects with many data suppliers and producers from various countries. This presentation introduces the free and open-source geoportal solution developed within the research project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins, www.climb-fp7.eu), which serves as the central platform for exchanging project-related spatial data and information. This collaboration, financed under the EU FP7 framework and coordinated at LMU Munich, involved 21 partner institutions from nine European and non-European countries. The CLIMB Geoportal (lgi-climbsrv.geographie.uni-kiel.de) stores and provides spatially distributed data on the current state and future changes of hydrological conditions within the seven CLIMB test sites around the Mediterranean. Hydrological modelling output, validated by the CLIMB partners, is offered to the public in the form of Web Map Services (WMS), whereas downloading the underlying data through Web Coverage Services (WCS) is possible for registered users only. A selection of common indicators, such as discharge and drought index, as well as uncertainty measures, including their changes over time, is provided at different spatial resolutions. Besides map information, the portal enables the graphical display of time series of selected variables calculated by the individual models applied within the CLIMB project. The CLIMB Geoportal is based on version 2.0c5 of the open-source geospatial content management system GeoNode. It includes a GeoServer instance providing the OGC-compliant web services and comes with a metadata catalog (pycsw) as well as a built-in WebGIS client based on GeoExt (GeoExplorer). PostgreSQL, enhanced by PostGIS (versions 9.2.1/2.0.1), serves as the database backend for all base data of the study sites and for the time series of relevant hydrological indicators. Spatial model results in raster format are stored as GeoTIFF files. Owing to the high number of model outputs, the generation of metadata (XML) and graphical rendering instructions (SLD) associated with each WMS layer has been automated using the statistical software R. Additional applications programmed during the project include a Java-based interface for convenient download of the climate data initially needed as input for hydrological modelling, as well as a tool for displaying time series of selected risk indicators, directly integrated into the portal structure and implemented using Python (Django) and JavaScript. The CLIMB Geoportal shows that relevant results of even large international research projects, involving many partners and varying national standards in data handling, can be effectively disseminated to stakeholders, policy makers and other interested parties. It is thus a successful example of using free and open-source software to provide long-term visibility of, and access to, data produced within a particular (environmental) research project.
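The per-layer SLD generation described above (done in R by the CLIMB team) is essentially templating: one styling document per GeoTIFF layer. The same batch idea might look like this in Python; the color ramp, value range, and directory names are placeholders, and real SLDs usually carry richer rules.

```python
# Batch-generate minimal SLD styling files, one per raster layer (sketch).
import glob
import os

SLD_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<StyledLayerDescriptor version="1.0.0" xmlns="http://www.opengis.net/sld">
  <NamedLayer><Name>{name}</Name>
    <UserStyle><FeatureTypeStyle><Rule>
      <RasterSymbolizer>
        <ColorMap>
          <ColorMapEntry color="#2b83ba" quantity="{vmin}"/>
          <ColorMapEntry color="#d7191c" quantity="{vmax}"/>
        </ColorMap>
      </RasterSymbolizer>
    </Rule></FeatureTypeStyle></UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>
"""

for tif in glob.glob("model_results/*.tif"):       # hypothetical layer files
    name = os.path.splitext(os.path.basename(tif))[0]
    with open(f"{name}.sld", "w") as out:
        out.write(SLD_TEMPLATE.format(name=name, vmin=0.0, vmax=100.0))
```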
Image editing with Adobe Photoshop 6.0.
Caruso, Ronald D; Postel, Gregory C
2002-01-01
The authors introduce Photoshop 6.0 for radiologists and demonstrate basic techniques of editing gray-scale cross-sectional images intended for publication and for incorporation into computerized presentations. For basic editing of gray-scale cross-sectional images, the Tools palette and the History/Actions palette pair should be displayed. The History palette may be used to undo a step or series of steps. The Actions palette is a menu of user-defined macros that save time by automating an action or series of actions. Converting an image to 8-bit gray scale is the first editing function. Cropping is the next action. Both decrease file size. Use of the smallest file size necessary for the purpose at hand is recommended. Final file size for gray-scale cross-sectional neuroradiologic images (8-bit, single-layer TIFF [tagged image file format] at 300 pixels per inch) intended for publication varies from about 700 Kbytes to 3 Mbytes. Final file size for incorporation into computerized presentations is about 10-100 Kbytes (8-bit, single-layer, gray-scale, high-quality JPEG [Joint Photographic Experts Group]), depending on source and intended use. Editing and annotating images before they are inserted into presentation software is highly recommended, both for convenience and flexibility. Radiologists should find that image editing can be carried out very rapidly once the basic steps are learned and automated. Copyright RSNA, 2002
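The editing pipeline the authors describe, converting to 8-bit gray scale, cropping, then saving a 300-ppi TIFF for print or a high-quality JPEG for slides, can also be scripted. A minimal sketch using the Pillow library; file names and crop box are placeholders.

```python
# Script the gray-scale/crop/export workflow described above (Pillow).
from PIL import Image

img = Image.open("ct_slice.tif")
img = img.convert("L")                      # 8-bit gray scale
img = img.crop((40, 40, 560, 560))          # crop box: (left, upper, right, lower)

img.save("figure_print.tif", dpi=(300, 300))   # single-layer TIFF for publication
img.save("slide.jpg", quality=90)              # smaller JPEG for presentations
```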
Some Processing and Dynamic-Range Issues in Side-Scan Sonar Work
NASA Astrophysics Data System (ADS)
Asper, V. L.; Caruthers, J. W.
2007-05-01
Often side-scan sonar data are collected in such a way that they afford little opportunity to do more than simply display them as images. These images are often limited in dynamic range and stored only in an 8-bit TIFF format of numbers representing less than true intensity values. Furthermore, there is little prior knowledge during a survey of the best range in which to set those eight bits. This can result in clipping of strong targets and/or loss of shadow depth, so that the bits that can be recovered from the image are not fully representative of target or bottom backscatter strengths. Several top-of-the-line sonars do have a means of logging high-bit-rate digital data (sometimes only as an option), but only dedicated specialists pay much attention to such data, if they record them at all. Most users of side-scan sonars are interested only in the images. Discussed in this paper are issues related to the storing and processing of high-bit-rate digital data to preserve their integrity for future enhanced, after-the-fact use and for the ability to recover actual backscatter strengths. This work was supported by the Office of Naval Research, Code 321OA, and the Naval Oceanographic Office, Mine Warfare Program.
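The dynamic-range tradeoff described above can be illustrated numerically: with a fixed linear gain chosen before the survey, strong returns saturate the eight available bits, whereas a logarithmic (dB) mapping of high-bit-rate samples preserves their ordering. A minimal sketch with synthetic data:

```python
# Quantizing high-bit-rate backscatter samples into 8 bits. A fixed linear
# gain clips strong targets; a logarithmic mapping spreads the same 8 bits
# over many decades. The data values here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
intensity = rng.lognormal(mean=4.0, sigma=2.0, size=100_000)  # fake backscatter

# Linear mapping with a gain chosen before the survey: values above the
# assumed maximum saturate at 255 and are indistinguishable afterward.
assumed_max = 500.0
linear8 = np.clip(intensity / assumed_max * 255.0, 0, 255).astype(np.uint8)

# Logarithmic (dB) mapping keeps strong targets distinguishable.
db = 10.0 * np.log10(intensity)
log8 = np.interp(db, (db.min(), db.max()), (0, 255)).astype(np.uint8)

print("saturated samples (linear):", int((linear8 == 255).sum()))
print("saturated samples (log):   ", int((log8 == 255).sum()))
```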
Applications of the JPEG standard in a medical environment
NASA Astrophysics Data System (ADS)
Wittenberg, Ulrich
1993-10-01
JPEG is a very versatile image coding and compression standard for single images. Medical images make higher demands on image quality and precision than the usual 'pretty pictures'. In this paper the potential applications of the various JPEG coding modes in a medical environment are evaluated. For legal reasons the lossless modes are especially interesting. The spatial modes are equally important because medical data may well exceed the maximum of 12-bit precision allowed for the DCT modes. The performance of the spatial predictors is investigated. From the user's point of view, the progressive modes, which provide a fast but coarse approximation of the final image, reduce the subjective time one has to wait for it, and so reduce the user's frustration. Even the lossy modes will find some applications, but they have to be handled with care, because repeated lossy coding and decoding leads to a degradation of image quality. The amount of this degradation is investigated. The JPEG standard alone is not sufficient for a PACS because it does not store enough additional data, such as the creation date or details of the imaging modality. It will therefore serve as an embedded coding format within standards like TIFF or ACR/NEMA. It is concluded that the JPEG standard is versatile enough to match the requirements of the medical community.
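The repeated-coding degradation mentioned above can be measured with a short experiment. The following sketch re-encodes an image for several generations and reports the accumulated error; it is an illustration of the effect studied, not the paper's own procedure, and assumes the third-party Pillow and NumPy packages with a placeholder input file.

```python
# Generation-loss experiment: repeatedly JPEG-compress and decode the same
# image and track the accumulated error against the original.
import io
import numpy as np
from PIL import Image

original = Image.open("scan.png").convert("L")
ref = np.asarray(original, dtype=np.float64)

current = original
for generation in range(1, 11):
    buf = io.BytesIO()
    current.save(buf, format="JPEG", quality=75)   # lossy encode
    buf.seek(0)
    current = Image.open(buf).convert("L")         # decode
    rms = np.sqrt(np.mean((np.asarray(current, dtype=np.float64) - ref) ** 2))
    print(f"generation {generation}: RMS error {rms:.2f} gray levels")
```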
Vaughan, R. Greg; Heasler, Henry; Jaworowski, Cheryl; Lowenstern, Jacob B.; Keszthelyi, Laszlo P.
2014-01-01
Maps that define the current distribution of geothermally heated ground are useful for setting a baseline for thermal activity, in order to better detect and understand future anomalous hydrothermal and (or) volcanic activity. Monitoring changes in these dynamic thermal areas also supports decisions regarding the development of Yellowstone National Park infrastructure, the preservation and protection of park resources, and visitor safety. Because of the challenges associated with field-based monitoring of a large, complex geothermal system spread over a large and remote area, satellite-based thermal infrared images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were used to map the location and spatial extent of active thermal areas, to generate thermal anomaly maps, and to quantify the radiative component of the total geothermal heat flux. ASTER thermal infrared data acquired during winter nights were used to minimize the contribution of solar heating of the surface. The ASTER thermal infrared mapping results were compared to maps of thermal areas based on field investigations and high-resolution aerial photos. Field validation of the ASTER thermal mapping is an ongoing task. The purpose of this report is to make available ASTER-based maps of Yellowstone's thermal areas. We include an appendix containing the names and characteristics of Yellowstone's thermal areas, georeferenced TIFF files containing ASTER thermal imagery, and several spatial data sets in Esri shapefile format.
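For readers working with the georeferenced TIFF files in the appendix, the following sketch outlines one way to estimate the radiative component of geothermal heat flux from a surface-temperature grid via the Stefan-Boltzmann law. It assumes a single-band GeoTIFF of surface temperature in kelvin and the third-party rasterio package; the file name, emissivity, and background temperature are illustrative, and this is not the report's actual processing chain.

```python
# Rough radiative-flux estimate from a surface temperature grid.
import numpy as np
import rasterio

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
EMISSIVITY = 0.97        # assumed surface emissivity
T_BACKGROUND = 263.0     # assumed winter-night background temperature, K

with rasterio.open("aster_surface_temp_K.tif") as src:
    temp = src.read(1).astype(np.float64)
    pixel_area = abs(src.transform.a * src.transform.e)  # m^2, if CRS is metric

# Radiant exitance above background, per pixel (W m^-2).
excess = EMISSIVITY * SIGMA * (temp**4 - T_BACKGROUND**4)
thermal = excess > 0
print(f"radiative geothermal output: {np.sum(excess[thermal]) * pixel_area / 1e6:.1f} MW")
```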
Sliter, Ray W.; Triezenberg, Peter J.; Hart, Patrick E.; Draut, Amy E.; Normark, William R.; Conrad, James E.
2008-01-01
The U.S. Geological Survey (USGS) collected high-resolution shallow seismic-reflection data in September, 2007, and June-July, 2008, from the continental shelf offshore of southern California between Gaviota and Mugu Canyon, in support of the California's State Waters Mapping Program. Data were acquired using SIG 2mille mini-sparker and Edgetech chirp 512 instruments aboard the R/V Zephyr (Sept. 2007) and R/V Parke Snavely (June-July 2008). The survey area spanned approximately 120 km of coastline, and included shore-perpendicular transects spaced 1.0-1.5 km apart that extended offshore to at least the 3-mile limit of State waters, in water depths ranging from 10 m near shore to 300 m near the offshore extent of Mugu and Hueneme submarine canyons. Subbottom acoustic penetration spanned tens to several hundred meters, variable by location. This report includes maps of the surveyed transects, linked to Google Earth software, as well as digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats. The images of sediment deposits, tectonic structure, and natural-gas seeps collected during this study provide geologic information that is essential to coastal zone and resource management at Federal, State and local levels, as well as to future research on the sedimentary, tectonic, and climatic record of southern California.
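The SEG-Y transect files in this release can be inspected with standard seismology tooling; a minimal sketch, assuming the third-party ObsPy package and a placeholder file name:

```python
# Load one SEG-Y transect and report its basic shape.
import numpy as np
from obspy import read

stream = read("transect_001.sgy", format="SEGY")
section = np.stack([trace.data for trace in stream])  # traces x samples
print(f"{len(stream)} traces, {section.shape[1]} samples per trace")
```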
Course Format Effects on Learning Outcomes in an Introductory Statistics Course
ERIC Educational Resources Information Center
Sami, Fary
2011-01-01
The purpose of this study was to determine if course format significantly impacted student learning and course completion rates in an introductory statistics course taught at Harford Community College. In addition to the traditional lecture format, the College offers an online and a hybrid (blend of traditional and online) version of this class.…
Shaffer, Victoria A; Owens, Justin; Zikmund-Fisher, Brian J
2013-12-17
Previous research has examined the impact of patient narratives on treatment choices, but to our knowledge, no study has examined the effect of narratives on information search. Further, no research has considered the relative impact of their format (text vs video) on health care decisions in a single study. Our goal was to examine the impact of video and text-based narratives on information search in a Web-based patient decision aid for early stage breast cancer. Fifty-six women were asked to imagine that they had been diagnosed with early stage breast cancer and needed to choose between two surgical treatments (lumpectomy with radiation or mastectomy). Participants were randomly assigned to view one of four versions of a Web decision aid. Two versions of the decision aid included videos of interviews with patients and physicians or videos of interviews with physicians only. To distinguish between the effect of narratives and the effect of videos, we created two text versions of the Web decision aid by replacing the patient and physician interviews with text transcripts of the videos. Participants could freely browse the Web decision aid until they developed a treatment preference. We recorded participants' eye movements using the Tobii 1750 eye-tracking system equipped with Tobii Studio software. A priori, we defined 24 areas of interest (AOIs) in the Web decision aid. These AOIs were either separate pages of the Web decision aid or sections within a single page covering different content. We used multilevel modeling to examine the effect of narrative presence, narrative format, and their interaction on information search. There was a significant main effect of condition, P=.02; participants viewing decision aids with patient narratives spent more time searching for information than participants viewing the decision aids without narratives. The main effect of format was not significant, P=.10. However, there was a significant condition by format interaction on fixation duration, P<.001. When comparing the two video decision aids, participants viewing the narrative version spent more time searching for information than participants viewing the control version of the decision aid. In contrast, participants viewing the narrative version of the text decision aid spent less time searching for information than participants viewing the control version of the text decision aid. Further, narratives appear to have a global effect on information search; these effects were not limited to specific sections of the decision aid that contained topics discussed in the patient stories. The observed increase in fixation duration with video patient testimonials is consistent with the idea that the vividness of the video content could cause greater elaboration of the message, thereby encouraging greater information search. Conversely, because reading requires more effortful processing than watching, reading patient narratives may have decreased participant motivation to engage in more reading in the remaining sections of the Web decision aid. These findings suggest that the format of patient stories may be equally as important as their content in determining their effect on decision making. More research is needed to understand why differences in format result in fundamental differences in information search.
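The multilevel analysis described above can be sketched as follows. This is an illustration of the modeling approach, not the authors' code; it assumes the third-party pandas and statsmodels packages, and the file and column names are hypothetical.

```python
# Fixation duration modeled with narrative condition, format, and their
# interaction, with a random intercept per participant.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("fixations.csv")  # participant, aoi, condition, format, duration

model = smf.mixedlm(
    "duration ~ condition * format",   # fixed effects and interaction
    data,
    groups=data["participant"],        # random intercept per participant
)
result = model.fit()
print(result.summary())
```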
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules of thumb," which specify a set of actions to be performed in a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or to create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm, including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern-match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN, and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs, including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases.
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform-specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or Microsoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request.
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
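A flavor of the rule-based paradigm described above, expressed through the third-party "clipspy" Python binding (a modern embedding route, not part of the COSMIC distribution described in this entry):

```python
# A minimal CLIPS rule driven from Python via clipspy: define a fact
# template, a rule that pattern-matches on it, assert a fact, and run.
import clips

env = clips.Environment()
env.build("(deftemplate animal (slot kind))")
env.build("""
(defrule make-sound
  (animal (kind duck))
  =>
  (printout t "quack" crlf))
""")
env.assert_string("(animal (kind duck))")
env.run()   # fires make-sound once, printing "quack"
```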
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
The program description, system requirements, distribution details, and documentation for this version are identical to those given in the UNIX version entry (MSC-22433) above.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
The program description, system requirements, distribution details, and documentation for this version are identical to those given in the UNIX version entry (MSC-22433) above.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
The program description, system requirements, distribution details, and documentation for this version are identical to those given in the UNIX version entry (MSC-22433) above.
DOT National Transportation Integrated Search
1999-12-01
Volume III of the Logical Architecture contract deliverable documents the Data Dictionary. This formatted version of the Teamwork model data dictionary is mechanically produced from the Teamwork CDIF (Case Data Interchange Format) output file. It is ...
Standard interface files and procedures for reactor physics codes, version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, B.M.
Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized. (15 references) (auth)
32 CFR 989.14 - Environmental assessment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... format for the EA may be the same as the EIS. The alternatives section of an EA and an EIS are similar... Alternative (FONPA) must be submitted (five hard copies and an electronic version) to the MAJCOM EPF when the... submitted (five hard copies and an electronic version) through the MAJCOM EPF to HQ USAF/A7CI for SAF/IEE...
NASA Technical Reports Server (NTRS)
Gladden, Roy
2007-01-01
Version 2.0 of the autogen software has been released. "Autogen" (automated sequence generation) signifies both a process and software used to implement the process of automated generation of sequences of commands in a standard format for uplink to spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes.
Dietrich, John D.; Johnson, Ronald C.
2013-01-01
Thirteen stratigraphic cross sections of the Eocene Green River Formation in the Piceance Basin of northwestern Colorado are presented in this report. Originally published in a much larger and more detailed form by Self and others (2010), they are shown here in simplified, page-size versions that are easily accessed and used for presentation purposes. Modifications to the original versions include the elimination of the detailed lithologic columns and oil-yield histograms from Fischer assay data and the addition of ground-surface lines to give the depth of the various oil shale units shown on the cross section.
Covariance Data File Formats for Whisper-1.0 & Whisper-1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan
2017-01-09
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014. During 2015-2016, Whisper was updated to version 1.1 and is to be included with the upcoming release of MCNP6.2. This report describes the file formats used for the covariance data in both Whisper-1.0 and Whisper-1.1.
ESDAPT - APT PROGRAMMING EDITOR AND INTERPRETER
NASA Technical Reports Server (NTRS)
Premack, T.
1994-01-01
ESDAPT is a graphical programming environment for developing APT (Automatically Programmed Tool) programs for controlling numerically controlled machine tools. ESDAPT has a graphical user interface that provides the user with an APT-syntax-sensitive text editor and windows for displaying geometry and tool paths. APT geometry statements can also be created using menus and screen picks. ESDAPT interprets APT geometry statements and displays the results in its view windows. Tool paths are generated by batching the APT source to an APT processor (COSMIC P-APT recommended). The tool paths are then displayed in the view windows. Hardcopy output of the view windows is in color PostScript format. ESDAPT is written in C-language, yacc, lex, and XView for use on Sun4 series computers running SunOS. ESDAPT requires 4Mb of disk space, 7Mb of RAM, and MIT's X Window System, Version 11 Release 4, or OpenWindows version 3 for execution. Program documentation in PostScript format and an executable for OpenWindows version 3 are provided on the distribution media. The standard distribution medium for ESDAPT is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. This program was developed in 1992.
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
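A skeletal SED-ML document can be assembled with ordinary XML tooling. The sketch below uses Python's standard-library ElementTree; the element and attribute names follow the Level 1 Version 2 specification, but the result is a bare skeleton and is not guaranteed to validate against the full schema.

```python
# Minimal SED-ML-style skeleton: one model, one time-course simulation,
# and one task tying them together.
import xml.etree.ElementTree as ET

sed = ET.Element("sedML", {"xmlns": "http://sed-ml.org/sed-ml/level1/version2",
                           "level": "1", "version": "2"})
models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", id="m1", language="urn:sedml:language:sbml",
              source="model.xml")
sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1", initialTime="0",
              outputStartTime="0", outputEndTime="100", numberOfPoints="1000")
tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", id="t1", modelReference="m1",
              simulationReference="sim1")
print(ET.tostring(sed, encoding="unicode"))
```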
Interference, aging, and visuospatial working memory: the role of similarity.
Rowe, Gillian; Hasher, Lynn; Turcotte, Josée
2010-11-01
Older adults' performance on working memory (WM) span tasks is known to be negatively affected by the buildup of proactive interference (PI) across trials. PI has been reduced in verbal tasks and performance increased by presenting distinctive items across trials. In addition, reversing the order of trial presentation (i.e., starting with the longest sets first) has been shown to reduce PI in both verbal and visuospatial WM span tasks. We considered whether making each trial visually distinct would improve older adults' visuospatial WM performance, and whether combining the 2 PI-reducing manipulations, distinct trials and reversed order of presentation, would prove additive, thus providing even greater benefit. Forty-eight healthy older adults (age range = 60-77 years) completed 1 of 3 versions of a computerized Corsi block test. For 2 versions of the task, trials were either all visually similar or all visually distinct, and were presented in the standard ascending format (shortest set size first). In the third version, visually distinct trials were presented in a reverse order of presentation (longest set size first). Span scores were reliably higher in the ascending version for visually distinct compared with visually similar trials, F(1, 30) = 4.96, p = .03, η² = .14. However, combining distinct trials and a descending format proved no more beneficial than administering the descending format alone. Our findings suggest that a more accurate measurement of the visuospatial WM span scores of older adults (and possibly neuropsychological patients) might be obtained by reducing within-test interference.
Data File Standard for Flow Cytometry, version FCS 3.1.
Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R
2010-01-01
The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high throughput, plate based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
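The fixed-layout HEADER that FCS 3.1 retains from earlier versions can be parsed in a few lines; a minimal sketch (placeholder file name) reading the version string and the TEXT/DATA/ANALYSIS segment offsets:

```python
# Parse the ASCII HEADER of an FCS file: a 6-byte version string followed
# by right-justified 8-byte byte offsets for the TEXT, DATA, and ANALYSIS
# segments.
def read_fcs_header(path: str) -> dict:
    with open(path, "rb") as f:
        header = f.read(58)
    version = header[0:6].decode("ascii")          # e.g. "FCS3.1"
    fields = ("text_begin", "text_end", "data_begin",
              "data_end", "analysis_begin", "analysis_end")
    offsets = {}
    for i, name in enumerate(fields):
        raw = header[10 + 8 * i : 18 + 8 * i].decode("ascii").strip()
        offsets[name] = int(raw) if raw else 0     # large segments may use "0"
    return {"version": version, **offsets}

print(read_fcs_header("sample.fcs"))
```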
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... supplied to the DFO in the following formats: One hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are requested to provide two versions...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-11
... supplied to the DFO in the following formats: one hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide versions of each...
Lee, Hang Wai; Chan, Albert S C; Kwong, Fuk Yee
2007-07-07
A rhodium-(S)-xyl-BINAP complex-catalyzed tandem formate decarbonylation and [2 + 2 + 1] carbonylative cyclization is described; this cooperative process utilizes formate as a condensed CO source, and the newly developed cascade protocol can be extended to its enantioselective version, providing up to 94% ee of the cyclopentenone adducts.
Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1991-01-01
The Transferable Output ASCII Data (TOAD) Gateway, release 1.0, is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.
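The hub-and-spoke design described above, in which every format is converted to and from one intermediate representation, can be sketched as follows. The CSV and free-form readers/writers are illustrative only, and the TOAD format itself is not reproduced here.

```python
# Hub-and-spoke conversion: every format is translated to and from one
# common in-memory table, so adding a new format needs only one reader
# and one writer rather than a converter for every pair of formats.
import csv
from typing import List

Table = List[List[str]]  # rows of fields: the common intermediate form

def read_csv(path: str) -> Table:
    with open(path, newline="") as f:
        return [row for row in csv.reader(f)]

def write_freeform(table: Table, path: str) -> None:
    with open(path, "w") as f:
        for row in table:
            f.write(" ".join(row) + "\n")

write_freeform(read_csv("input.csv"), "output.dat")
```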
NASA Astrophysics Data System (ADS)
Heitzig, Jobst; Kornek, Ulrike
2018-06-01
In the PDF version of this Article originally published, the term gi′ in equation (6) and the term wi at the end of the Methods section were incorrectly formatted. These have now been corrected.
TADPLOT program, version 2.0: User's guide
NASA Technical Reports Server (NTRS)
Hammond, Dana P.
1991-01-01
The TADPLOT Program, Version 2.0 is described. The TADPLOT program is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication and viewgraph quality plots. The user-interface was designed to be independent from any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be assessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to the file format will be transparent, while additional file formats can be integrated with minimal impact on the user-interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.
EPA Remote Sensing Information Gateway
NASA Astrophysics Data System (ADS)
Paulsen, H. K.; Szykman, J. J.; Plessel, T.; Freeman, M.; Dimmick, F.
2009-12-01
The Remote Sensing Information Gateway (RSIG) was developed by the U.S. Environmental Protection Agency (EPA) to assist researchers in easily obtaining and combining a variety of environmental datasets related to air quality research. Current datasets available include, but are not limited to, surface PM2.5 and O3 data, satellite-derived aerosol optical depth, and 3-dimensional output from U.S. EPA's Models-3/Community Multiscale Air Quality (CMAQ) modeling system. The presentation will include a demonstration that illustrates several scenarios of how researchers use the tool to help them visualize and obtain data for their work, with a particular focus on episode analysis related to biomass burning impacts on air quality. The presentation will provide an overview of how RSIG works and how the code has been, and can be, adapted for other projects. One example is the Virtual Estuary, which focuses on automating the retrieval and pre-processing of a variety of data needed for estuarine research. RSIG is available to the community and can be accessed online at http://www.epa.gov/rsig. Once the Java policy file is configured, you can run the RSIG applet on your computer and connect to the RSIG server to visualize and retrieve available data sets. The applet allows the user to specify the temporal/spatial areas of interest and the types of data to retrieve. The applet then communicates with RSIG subsetter codes located on the data owners' remote servers; the subsetter codes assemble and transfer via ordinary Internet protocols only the specified data to the researcher's computer. This is much faster than the usual method of transferring large files via FTP and greatly reduces network traffic. The RSIG applet then visualizes the transferred data on a latitude-longitude map, automatically locating the data in the correct geographic position. Images, animations, and aggregated data can be saved or exported in a variety of data formats: Binary External Data Representation (XDR) format (simple, efficient), NetCDF-COARDS format, NetCDF-IOAPI format (regridding the data to a CMAQ grid), HDF (unsubsetted satellite files), ASCII tab-delimited spreadsheet, MCMC (used for input into the HB model), PNG images, MPG movies, KMZ movies (for display in Google Earth and similar applications), GeoTIFF RGB format, and 32-bit float format. RSIG's source code is freely available to researchers with permission from the EPA principal investigator, Dr. Jim Szykman; contacts for obtaining the code are available at the RSIG website.
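From the client side, the subsetting pattern described above, requesting only a variable, bounding box, and time window rather than whole files, looks roughly like this. The endpoint URL and parameter names below are hypothetical, not the actual RSIG service interface; the sketch assumes the third-party requests package.

```python
# Ask a server-side subsetter for just the window of interest instead of
# transferring whole files over FTP. Endpoint and parameters are made up
# for illustration.
import requests

params = {
    "variable": "pm25",
    "bbox": "-90,30,-80,40",            # lon/lat corners of the subset
    "time": "2009-07-01/2009-07-07",
    "format": "ascii",
}
response = requests.get("https://example.gov/subsetter", params=params, timeout=60)
response.raise_for_status()
with open("pm25_subset.txt", "wb") as f:
    f.write(response.content)
```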
Breast Cancer Epidemiology in Puerto Rico
2009-06-01
Award 5 potential population controls from the above-mentioned area is available to identify eligible controls. The survey instrument to be used ...be available for English-speaking participants. An electronic version of the questionnaire is being developed using Microsoft Access. At this moment...questionnaire electronic version revision, and to test it using a formal interview format. We hope to begin interviewing in September 2009
ERIC Educational Resources Information Center
Department of Education, Washington, DC.
This interactive teleconference (in VHS format, Spanish language version) presents renowned national experts, local educators, and community leaders who share ideas on how to improve schools and reach the National Educational Goals. The 60-minute Satellite Town Meeting focuses on laying the foundation for school success through readiness to read.…
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
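Since SBML Level 3 Version 2 is an XML encoding, a model document can be built programmatically. The following is a minimal sketch using the python-libsbml bindings (assuming that package and its API are available); it creates a Level 3 Version 2 document with one compartment and one species and prints the XML.

    # Minimal sketch: build an SBML Level 3 Version 2 Core document and
    # serialize it to XML with python-libsbml (pip install python-libsbml).
    import libsbml

    document = libsbml.SBMLDocument(3, 2)   # Level 3, Version 2
    model = document.createModel()
    model.setId("example_model")

    comp = model.createCompartment()        # a single compartment...
    comp.setId("cell")
    comp.setConstant(True)

    species = model.createSpecies()         # ...holding a single species
    species.setId("glucose")
    species.setCompartment("cell")
    species.setHasOnlySubstanceUnits(False)
    species.setBoundaryCondition(False)
    species.setConstant(False)

    print(libsbml.writeSBMLToString(document))  # the XML encoding

Any tool that supports SBML as an input/output format should accept the resulting document, which is the interoperability point the abstract makes.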
78 FR 24107 - Version 5 Critical Infrastructure Protection Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
... native applications or print-to-PDF format and not a scanned format. Mail/Hand Delivery: Those unable to... criteria that characterize their impact for the application of cyber security requirements commensurate... recognition. Requirement R2 requires testing to verify response plan effectiveness and consistent application...
Accelerating Malware Detection via a Graphics Processing Unit
2010-09-01
…Processing Unit; PE (Portable Executable); COFF (Common Object File Format)…operating systems for the future [Szo05]. The PE format is an updated version of the common object file format (COFF) [Mic06]. Microsoft released a new…[NAs02]. These alerts can be costly in terms of time and resources for individuals and organizations to investigate each misidentified file [YWL07] [Vak10]
ERIC Educational Resources Information Center
Tsai, Fu-Hsing
2013-01-01
This study developed a game-based formative assessment, called tic-tac-toe quiz for single-player version (TRIS-Q-SP), in an energy education e-learning system. This assessment game combined tic-tac-toe with online assessment, and revised the rule of tic-tac-toe for stimulating students to use online formative assessment actively. Additionally, to…
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Phillips, T. A.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
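NETS itself is an ANSI C program, but the training loop the abstract describes (present input/output pairs, propagate errors backward through the hidden layers, and repeat until an acceptable error is reached) can be illustrated generically. The following numpy sketch is not NETS code; it is a minimal back-propagation network with one hidden layer trained on XOR-style input/output pairs.

    # Generic back-propagation sketch (not NETS itself): one hidden layer,
    # trained on input/output pairs until the error is acceptably small.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
    Y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs (XOR)

    W1 = rng.normal(size=(2, 4))   # input layer -> hidden layer weights
    W2 = rng.normal(size=(4, 1))   # hidden layer -> output layer weights
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for step in range(20000):
        h = sigmoid(X @ W1)             # forward pass through the hidden layer
        y = sigmoid(h @ W2)             # network response at the output layer
        err = y - Y
        if np.mean(err ** 2) < 1e-3:    # stop once the error is acceptable
            break
        dy = err * y * (1 - y)          # backward pass: output-layer delta
        dh = (dy @ W2.T) * h * (1 - h)  # hidden-layer delta
        W2 -= 0.5 * h.T @ dy            # gradient-descent weight updates
        W1 -= 0.5 * X.T @ dh

    print(np.round(y, 2))  # approximates the XOR targets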
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)
NASA Technical Reports Server (NTRS)
Baffes, P. T.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
Owens, Justin; Zikmund-Fisher, Brian J
2013-01-01
Background Previous research has examined the impact of patient narratives on treatment choices, but to our knowledge, no study has examined the effect of narratives on information search. Further, no research has considered the relative impact of their format (text vs video) on health care decisions in a single study. Objective Our goal was to examine the impact of video and text-based narratives on information search in a Web-based patient decision aid for early stage breast cancer. Methods Fifty-six women were asked to imagine that they had been diagnosed with early stage breast cancer and needed to choose between two surgical treatments (lumpectomy with radiation or mastectomy). Participants were randomly assigned to view one of four versions of a Web decision aid. Two versions of the decision aid included videos of interviews with patients and physicians or videos of interviews with physicians only. To distinguish between the effect of narratives and the effect of videos, we created two text versions of the Web decision aid by replacing the patient and physician interviews with text transcripts of the videos. Participants could freely browse the Web decision aid until they developed a treatment preference. We recorded participants’ eye movements using the Tobii 1750 eye-tracking system equipped with Tobii Studio software. A priori, we defined 24 areas of interest (AOIs) in the Web decision aid. These AOIs were either separate pages of the Web decision aid or sections within a single page covering different content. Results We used multilevel modeling to examine the effect of narrative presence, narrative format, and their interaction on information search. There was a significant main effect of condition, P=.02; participants viewing decision aids with patient narratives spent more time searching for information than participants viewing the decision aids without narratives. The main effect of format was not significant, P=.10. However, there was a significant condition by format interaction on fixation duration, P<.001. When comparing the two video decision aids, participants viewing the narrative version spent more time searching for information than participants viewing the control version of the decision aid. In contrast, participants viewing the narrative version of the text decision aid spent less time searching for information than participants viewing the control version of the text decision aid. Further, narratives appear to have a global effect on information search; these effects were not limited to specific sections of the decision aid that contained topics discussed in the patient stories. Conclusions The observed increase in fixation duration with video patient testimonials is consistent with the idea that the vividness of the video content could cause greater elaboration of the message, thereby encouraging greater information search. Conversely, because reading requires more effortful processing than watching, reading patient narratives may have decreased participant motivation to engage in more reading in the remaining sections of the Web decision aid. These findings suggest that the format of patient stories may be equally as important as their content in determining their effect on decision making. More research is needed to understand why differences in format result in fundamental differences in information search. PMID:24345424
Navarro-Mateu, Fernando; Morán-Sánchez, Inés; Alonso, Jordi; Tormo, Ma José; Pujalte, Ma Luisa; Garriga, Ascensión; Aguilar-Gaxiola, Sergio; Navarro, Carmen
2013-01-01
To develop a Spanish version of the WHO Composite International Diagnostic Interview (WHO-CIDI) applicable to Spain, through cultural adaptation of its most recent Latin American (LA v20.0) version. A 1-week training course on the WHO-CIDI was provided by certified trainers. An expert panel reviewed the LA version, identified words or expressions that needed to be adapted to the cultural or linguistic norms of Spain, and proposed alternative expressions that were agreed on through consensus. The entire process was supervised and approved by a member of the WHO-CIDI Editorial Committee. The changes were incorporated into a Computer-Assisted Personal Interview (CAPI) format, and the feasibility and administration time were pilot-tested in a convenience sample of 32 volunteers. A total of 372 questions were slightly modified (almost 7% of the approximately 5000 questions in the survey) and incorporated into the CAPI version of the WHO-CIDI. Most of the changes were minor, but important, linguistic adaptations; others related to specific Spanish institutions and currency. In the pilot study, the instrument's mean administration time was 2 h 10 min, with an interquartile range from 1.5 to nearly 3 h. All the changes made were tested and officially approved. The Latin American version of the WHO-CIDI was successfully adapted and pilot-tested in its computerized format and is now ready for use in Spain. Copyright © 2012 SESPAS. Published by Elsevier España. All rights reserved.
Standard Electronic Format Specification for Tank Characterization Data Loader Version 3.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
ADAMS, M.R.
2001-01-31
The purpose of this document is to describe the standard electronic format for data files that will be sent for entry into the Tank Characterization Database (TCD). Two file types are needed for each data load: (1) Analytical Results and (2) Sample Descriptions.
surface reports in the NMC observational files. This revision represents the final update to NMC/NCEP Office Note Number 124, the format for representing meteorological surface observational data at NMC and, later, the observational data format at NCEP. An accurate version of this Office Note is still necessary for historical…
The Effect of Anisotropic Scatter on Atmospheric Neutron Transport
2015-03-26
Laboratory, ENDF-6 Formats Manual: Data Formats and Procedures for the Evaluated Nuclear Data Files ENDF/B-VI and ENDF/B-VII. BNL-90365-2009, Revision 2. Upton, NY: BNL, December 2011. [7] Microsoft Visual Studio Professional 2013, Version 12.0.30501.00 Update 2. Computer Software. Microsoft Corporation
Dependency Tree Annotation Software
2015-11-01
formats, and it provides numerous options for customizing how dependency trees are displayed. Built entirely in Java, it can run on a wide range of… A tree can be saved as an image, a .mxe (mxGraph editing) file, a .conll file, and several other file formats. DTE uses the open-source Java version
The NJOY Nuclear Data Processing System, Version 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macfarlane, Robert; Muir, Douglas W.; Boicourt, R. M.
The NJOY Nuclear Data Processing System, version 2016, is a comprehensive computer code package for producing pointwise and multigroup cross sections and related quantities from evaluated nuclear data in the ENDF-4 through ENDF-6 legacy card-image formats. NJOY works with evaluated files for incident neutrons, photons, and charged particles, producing libraries for a wide variety of particle transport and reactor analysis codes.
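ENDF-6 is a fixed-width, 80-column "card-image" format: columns 1-66 carry the data fields, and columns 67-75 identify the material (MAT), file (MF), and section (MT) each record belongs to. A minimal Python sketch of scanning those identifiers follows; the tape file name is illustrative.

    # Scan an ENDF-6 card-image file and tally its (MF, MT) sections.
    # Columns 67-70 = MAT, 71-72 = MF, 73-75 = MT on each 80-column record.
    from collections import Counter

    def endf_sections(path):
        """Yield (MAT, MF, MT) for each record of an ENDF-6 file."""
        with open(path) as f:
            for line in f:
                if len(line) < 75:
                    continue  # skip short or malformed records
                try:
                    yield int(line[66:70]), int(line[70:72]), int(line[72:75])
                except ValueError:
                    continue  # skip records without numeric identifiers

    counts = Counter((mf, mt) for _, mf, mt in endf_sections("tape20"))  # illustrative name
    print(counts.most_common(5))

Processing codes like NJOY walk these sections to locate, for example, MF=3 (pointwise cross sections) for a given reaction MT before producing multigroup libraries.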
Reynolds, Nicholas P; Adamcik, Jozef; Berryman, Joshua T; Handschin, Stephan; Zanjani, Ali Asghar Hakami; Li, Wen; Liu, Kun; Zhang, Afang; Mezzenga, Raffaele
2017-12-20
The original version of this article contained an error in Fig. 5c. The label for the back series of columns was incorrectly given as '1.5 mM pH 2', rather than the correct '1.5 mM pH 7'. This has now been corrected in both the PDF and HTML versions of the article.
Goldman, Alvin; de Vignemont, Frederique
2009-04-01
Theories of embodied cognition abound in the literature, but it is often unclear how to understand them. We offer several interpretations of embodiment, the most interesting being the thesis that mental representations in bodily formats (B-formats) have an important role in cognition. Potential B-formats include motoric, somatosensory, affective and interoceptive formats. The literature on mirroring and related phenomena provides support for a limited-scope version of embodied social cognition under the B-format interpretation. It is questionable, however, whether such a thesis can be extended. We show the limits of embodiment in social cognition.
Beebook: light field mapping app
NASA Astrophysics Data System (ADS)
De Donatis, Mauro; Di Pietro, Gianfranco; Rinnone, Fabio
2014-05-01
In the last decade, mobile systems for digital field mapping have been developed (see the Wikipedia entry on "digital geologic mapping"), despite the skepticism of many traditional geologists. Until now, the hardware was often heavy (tablet PCs) and the software sometimes difficult even for expert GIS users. At present, the advent of light tablets and applications makes things easier, but a complete solution is still lacking for a complex survey like a geological one, where information, hypotheses, data, and interpretations must all be managed. Beebook, a new app for Android devices, has been developed for fast and easy mapping work in the field to address this problem. The main features are:
• off-line raster management (GeoTIFF and other raster formats);
• on-line map visualization (Google Maps, OSM, WMS, WFS);
• spatial reference (SR) management and conversion using PROJ.4;
• vector file mash-up (KML and SQLite formats);
• editing of vector data on the map (lines, points, polygons);
• augmented reality using the "Mixare" platform;
• export of vector data in KML, CSV, and SQLite (SpatiaLite) formats;
• notes: GPS-located or manually placed points linked to other application files (pictures, spreadsheets, etc.);
• forms: creation, editing, and filling of customized forms;
• GPS: status control, track logging, and positioning on the map;
• sharing: synchronization and sharing of data, forms, positioning, and other information among users.
The input methods range from the digital keyboard to finger touch, voice recording, and stylus. The most efficient way of inserting information is the stylus (or pen), since field geologists are familiar with annotations and sketches; we therefore suggest using devices with a stylus. The main point is that Beebook is the first "transparent" mobile GIS for tablets and smartphones, deriving from previous experience with traditional mapping and with the conception and development of earlier digital mapping software (MapIT, BeeGIS, Geopaparazzi). Building on those experiences, we developed a tool that is easy to use and applicable not only to geology but to any field survey.
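The spatial-reference conversion mentioned in the feature list (handled in Beebook via PROJ.4) looks like the following from Python, using the pyproj bindings to the same PROJ library. This is an illustrative sketch, not Beebook code, and the example coordinates are arbitrary.

    # Spatial-reference conversion with pyproj (bindings to PROJ), analogous
    # to the PROJ.4-based SR handling described above. Not Beebook code.
    from pyproj import Transformer

    # WGS84 geographic coordinates -> UTM zone 33N projected coordinates.
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:32633", always_xy=True)
    lon, lat = 12.64, 43.73   # arbitrary example point in central Italy
    x, y = transformer.transform(lon, lat)
    print(f"UTM 33N: {x:.1f} E, {y:.1f} N")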
Enhancing the Accessibility and Utility of UAVSAR L-band SAR Data
NASA Astrophysics Data System (ADS)
Atwood, D.; Arko, S. A.; Gens, R.; Sanches, R. R.
2011-12-01
The UAVSAR instrument, developed at the NASA Jet Propulsion Laboratory, is a reconfigurable L-band, quad-polarimetric Synthetic Aperture Radar (SAR) designed specifically for repeat-track differential interferometry (InSAR). It offers a resolution of approximately 5 m and swaths greater than 16 km. Although designed to be flown aboard a UAV (Uninhabited Aerial Vehicle), it is currently being flown aboard a Gulfstream III in an ambitious set of campaigns around the world. The archive, which dates from 2009, contains data from more than 100 missions over North America, Central America, the Caribbean, and Greenland. Compared with most satellite SAR data, UAVSAR offers higher resolution, full polarimetry, and an impressive noise floor. For scientists, these datasets present wonderful opportunities for understanding Earth processes and developing new algorithms for information extraction. Yet despite the diverse range of coverage, UAVSAR is still relatively under-utilized. In its capacity as the NASA SAR DAAC, the Alaska Satellite Facility (ASF) is interested in expanding recognition of these data and serving data products that can be readily downloaded into a Geographic Information System (GIS) environment. Two hurdles exist: the large size of the data products and the format of the data. Data volumes are in excess of several gigabytes, making downloads slow and overwhelming many software programs. Second, while the data are appropriately formatted for expert users, they may prove challenging for scientists who have not previously worked with SAR. This paper will address ways that ASF is working to reduce data volume while maintaining the integrity of the data. At the same time, the creation of value-added products that permit immediate visualization in a GIS environment will be described. Conversion of the UAVSAR polarimetric data to radiometrically terrain-corrected Pauli images in GeoTIFF format will permit researchers to understand the scattering mechanisms that characterize various land cover classes in their study areas. Specific examples of UAVSAR polarimetric classifications will be used to demonstrate the benefit of the UAVSAR products for Earth science projects.
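The Pauli products described above combine the quad-pol channels into three physically interpretable magnitudes (|HH-VV| for double-bounce, |2HV| for volume, |HH+VV| for surface scattering) that map naturally onto the RGB bands of a GeoTIFF. The following numpy/rasterio sketch shows the idea; the input arrays, grid spacing, and georeferencing are placeholders, since real UAVSAR products carry their own annotation-derived geometry.

    # Sketch: form a Pauli RGB composite from quad-pol channels and write a
    # three-band GeoTIFF. Inputs, CRS, and transform are placeholders.
    import numpy as np
    import rasterio
    from rasterio.transform import from_origin

    hh = np.load("hh.npy")  # complex HH channel (placeholder input)
    hv = np.load("hv.npy")  # complex HV channel
    vv = np.load("vv.npy")  # complex VV channel

    pauli = np.stack([
        np.abs(hh - vv),    # red: double-bounce scattering
        np.abs(2 * hv),     # green: volume scattering
        np.abs(hh + vv),    # blue: surface (odd-bounce) scattering
    ]).astype("float32")

    transform = from_origin(-119.0, 36.0, 5e-5, 5e-5)  # placeholder origin/spacing
    with rasterio.open(
        "pauli.tif", "w", driver="GTiff",
        height=pauli.shape[1], width=pauli.shape[2], count=3,
        dtype="float32", crs="EPSG:4326", transform=transform,
    ) as dst:
        dst.write(pauli)  # bands 1-3 = R, G, B Pauli magnitudes

Because the result is an ordinary georeferenced GeoTIFF, it loads directly into GIS packages, which is the immediate-visualization goal described above.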
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (HP9000 SERIES 700/800 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (IBM RS/6000 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION WITH MOTIF)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (SUN4 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
TAE+ 5.2 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.2 (DEC RISC ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System and the Open Software Foundation's Motif. The HP 9000 Series 700/800 version of TAE 5.2 requires Version 11 Release 5 of the X Window System. All other machine versions of TAE 5.2 require Version 11, Release 4 of the X Window System. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. 
Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is available on media suitable for five different machine platforms: (1) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), (2) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), (3) HP9000 Series 700/800 computers running HP-UX 9.x and X11/R5 (HP 4mm DDS DAT tape cartridge in UNIX tar format), (4) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and (5) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2. Version 5.1 of TAE Plus remains available for DEC VAX computers running VMS, HP9000 Series 300/400 computers running HP-UX, and HP 9000 Series 700/800 computers running HP-UX 8.x and X11/R4. Please contact COSMIC for details on these versions of TAE Plus.
Evaluation of Droplet Splashing Algorithm in LEWICE 3.0
NASA Technical Reports Server (NTRS)
Homenko, Hilary N.
2004-01-01
The Icing Branch at NASA Glenn Research Center has developed a computer program to simulate ice formation on the leading edge of an aircraft wing during flight through cold, moist air. As part of the branch's current research, members have developed software known as LEWICE. This program is capable of predicting the formation of ice under designated weather conditions. The success of LEWICE is an asset to airplane manufacturers, ice protection system manufacturers, and the airline industry. Simulations of ice formation conducted in the tunnel and in flight are costly and time-consuming, yet the danger of in-flight icing continues to be a concern for both commercial and military pilots. The LEWICE software is a step towards inexpensive and time-efficient prediction of ice collection. In its most recent version, LEWICE contains an algorithm for droplet splashing. Droplet splashing is a natural occurrence that affects the accumulation of ice on aircraft surfaces: at impingement, water droplets lose a portion of their mass to splashing. Because part of each droplet joins the airflow and fails to freeze, early versions of LEWICE without the splashing algorithm over-predicted the collection of ice on the leading edge. The objective of my project was to determine whether the revised version of LEWICE accurately reflected the ice collection data obtained from the Icing Research Tunnel (IRT). The experimental data from the IRT were collected by Mark Potapczuk in January, March, and July of 2001 and April and December of 2002. Experimental data points were the result of ice tracings conducted shortly after testing in the tunnel. Run sheets, which included a record of velocity, temperature, liquid water content, and droplet diameter, served as the input to the LEWICE computer program. Parameters identical to the tunnel conditions were used to run LEWICE 2.0 and LEWICE 3.0, and the results from the IRT and both versions of LEWICE were compared graphically. After entering the raw experimental data and computer output into a spreadsheet, I mapped each ice formation onto a clean airfoil. The LEWICE output provided the data points to graphically depict the ice formations developed by the program. For the weather conditions of runs conducted in January 2001, it was evident that the splashing algorithm of LEWICE 3.0 predicts ice formations more accurately than LEWICE 2.0. Especially at conditions with droplet sizes between 80 and 160 microns, the splashing algorithm of the new LEWICE version compensated for the loss of droplet mass as a result of splashing. In contrast, LEWICE 2.0 consistently over-predicted the mass of the ice in conditions with droplet sizes exceeding 80 microns. This evidence confirms that changes made to the algorithms of LEWICE 3.0 have increased the accuracy of predicting ice collection.
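The graphical comparison described above amounts to overlaying each predicted ice shape and the corresponding tunnel tracing on the clean-airfoil outline. A minimal matplotlib sketch of that overlay follows; the file names and the two-column x,y format are assumptions for illustration only.

    # Overlay a tunnel ice tracing and a LEWICE-predicted shape on the clean
    # airfoil. File names and the two-column x,y format are illustrative.
    import numpy as np
    import matplotlib.pyplot as plt

    airfoil = np.loadtxt("clean_airfoil.xy")   # columns: x, y (placeholder files)
    traced = np.loadtxt("irt_tracing.xy")
    predicted = np.loadtxt("lewice_shape.xy")

    plt.plot(airfoil[:, 0], airfoil[:, 1], "k-", label="clean airfoil")
    plt.plot(traced[:, 0], traced[:, 1], "b--", label="IRT tracing")
    plt.plot(predicted[:, 0], predicted[:, 1], "r:", label="LEWICE 3.0")
    plt.axis("equal")                          # preserve the shape geometry
    plt.legend()
    plt.show()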
Implementing the HDF-EOS5 software library for data products in the UNAVCO InSAR archive
NASA Astrophysics Data System (ADS)
Baker, Scott; Meertens, Charles; Crosby, Christopher
2017-04-01
UNAVCO is a non-profit university-governed consortium that operates the U.S. National Science Foundation (NSF) Geodesy Advancing Geosciences and EarthScope (GAGE) facility and provides operational support to the Western North America InSAR Consortium (WInSAR). The seamless synthetic aperture radar archive (SSARA) is a seamless distributed access system for SAR data and higher-level data products. Under the NASA-funded SSARA project, a user-contributed InSAR archive for interferograms, time series, and other derived data products was developed at UNAVCO. The InSAR archive development has led to the adoption of the HDF-EOS5 data model, file format, and library. The HDF-EOS software library was designed to support NASA Earth Observing System (EOS) science data products and provides data structures for radar geometry (Swath) and geocoded (Grid) data based on the HDF5 data model and file format provided by the HDF Group. HDF-EOS5 inherits the benefits of HDF5 (open-source software support, internal compression, portability, support for structured data, self-describing file metadata, enhanced performance, and XML support) and provides a way to standardize InSAR data products. Instrument- and datatype-independent services, such as subsetting, can be applied to files across a wide variety of data products through the same library interface. The library allows integration with GIS software packages such as ArcGIS and GDAL, conversion to other data formats like NetCDF and GeoTIFF, and is extensible with new data structures to support future requirements. UNAVCO maintains a GitHub repository that provides example software for creating data products from popular InSAR processing software packages like GMT5SAR and ISCE as well as examples for reading and converting the data products into other formats. Digital object identifiers (DOI) have been incorporated into the InSAR archive, allowing users to assign a permanent location for their processed result and easily reference the final data products. A metadata attribute is added to the HDF-EOS5 file when a DOI is minted for a data product. These data products are searchable through the SSARA federated query, providing access to processed data for both expert and non-expert InSAR users. The archive facilitates timely distribution of processed data, with particular importance for geohazards and event response.
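As a rough illustration of the GeoTIFF conversion path mentioned above, the sketch below uses GDAL's Python bindings to list the HDF5 subdatasets in a product file and translate one of them to GeoTIFF. The file name and internal dataset path are hypothetical stand-ins, not the actual UNAVCO product layout; the real grid names should be discovered with GetSubDatasets() or gdalinfo first.

```python
# Minimal sketch: convert one grid from an HDF-EOS5 file to GeoTIFF with GDAL.
# The file name and internal dataset path below are hypothetical; inspect a
# real product first to find its actual Grid names.
from osgeo import gdal

src = gdal.Open('insar_product.he5')         # open the HDF5/HDF-EOS5 container
for name, desc in src.GetSubDatasets():      # enumerate internal subdatasets
    print(name, '->', desc)

# Pick one subdataset (hypothetical path) and write it out as GeoTIFF.
gdal.Translate('unwrapped_phase.tif',
               'HDF5:"insar_product.he5"://HDFEOS/GRIDS/WGS84/Data_Fields/unwrappedPhase',
               format='GTiff')
```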
Andersson, Gerhard; Engström, Ingemar
2010-01-01
Background Self-report measures can guide clinical decisions and are useful when evaluating treatment outcomes. However, many clinicians do not use self-report measures systematically in their clinical practice. Internet-based questionnaires could facilitate administration, but the psychometric properties of the online version of an instrument should be explored before implementation. The recommendation from the International Test Commission is to test the psychometric properties of each questionnaire separately. Objective Our objective was to compare the psychometric properties of paper-and-pencil versions and Internet versions of two questionnaires measuring depressive symptoms. Methods The 87 participating patients were recruited from primary care and psychiatric care within the public health care system in Sweden. Participants completed the Beck Depression Inventory (BDI-II) and the Montgomery-Åsberg Depression Rating Scale—Self-rated (MADRS-S), both on paper and on the Internet. The order was randomized to control for order effects. Symptom severity in the sample ranged from mild to severe depressive symptoms. Results Psychometric properties of the two administration formats were mostly equivalent. The internal consistency was similar for the Internet and paper versions, and significant correlations were found between the formats for both the MADRS-S (r = .84) and the BDI-II (r = .89). Differences between paper and Internet total scores were not statistically significant for either questionnaire, or for the MADRS-S question dealing with suicidality (item 9) when analyzed separately. The score on the BDI-II question about suicidality (item 9) was significantly lower when administered via the Internet than on paper, but the difference was small (effect size, Cohen's d = 0.14). There were significant main effects for order of administration on both questionnaires and significant interaction effects between format and order. This should not, however, pose a problem in clinical use as long as the administration format is not changed when repeated measurements are made. Conclusions The MADRS-S can be transferred to online use without affecting the psychometric properties in a clinically meaningful way. The full BDI-II also seems to retain its properties when transferred; however, the item measuring suicidality in the Internet version needs further investigation since it was associated with a lower score in this study. The use of online questionnaires offers clinicians a more practical way of measuring depressive symptoms and has the potential to save resources. PMID:21169165
SEGY to ASCII Conversion and Plotting Program 2.0
Goldman, Mark R.
2005-01-01
INTRODUCTION SEGY has long been a standard format for storing seismic data and header information. Almost every seismic processing package can read and write seismic data in SEGY format. In the data processing world, however, ASCII is the 'universal' standard format. Very few general-purpose plotting or computation programs will accept data in SEGY format. The software presented in this report, referred to as SEGY to ASCII (SAC), converts seismic data written in SEGY format (Barry et al., 1975) to an ASCII data file, and then creates a PostScript file of the seismic data using a general plotting package (GMT; Wessel and Smith, 1995). The resulting PostScript file may be plotted by any standard PostScript plotting program. There are two versions of SAC: one version for plotting a SEGY file that contains a single gather, such as a stacked CDP or migrated section, and a second version for plotting multiple gathers from a SEGY file containing more than one gather, such as a collection of shot gathers. Note that if a SEGY file has multiple gathers, then each gather must have the same number of traces per gather, and each trace must have the same sample interval and number of samples per trace. SAC will read several common variants of SEGY data, including SEGY files with sample values written in either IBM or IEEE floating-point format. In addition, utility programs are provided to convert non-standard Seismic Unix (.sux) SEGY files and PASSCAL (.rsy) SEGY files to standard SEGY files. SAC allows complete user control over all plotting parameters, including label size and font, tick mark intervals, trace scaling, and the inclusion of a title and descriptive text. SAC shell scripts create a PostScript image of the seismic data in vector rather than bitmap format, using GMT's pswiggle command. Although this can produce a very large PostScript file, the image quality is generally superior to that of a bitmap image, and commercial programs such as Adobe Illustrator can manipulate the image more efficiently.
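For readers unfamiliar with the SEGY layout that a converter such as SAC must handle, the sketch below pulls three fields from the 400-byte binary file header using the classic SEGY rev 0 byte positions. The file name is a placeholder; a full converter would also parse per-trace headers and decode IBM or IEEE sample values.

```python
# Minimal sketch of reading key fields from a SEGY binary file header.
# Byte offsets follow the classic SEGY rev 0 layout; the file name is
# hypothetical. Values are stored big-endian.
import struct

with open('line1.segy', 'rb') as f:
    f.read(3200)                      # skip the 3200-byte EBCDIC/ASCII text header
    binhdr = f.read(400)              # 400-byte binary file header (bytes 3201-3600)

sample_interval_us, = struct.unpack('>h', binhdr[16:18])  # bytes 3217-3218
samples_per_trace,  = struct.unpack('>h', binhdr[20:22])  # bytes 3221-3222
format_code,        = struct.unpack('>h', binhdr[24:26])  # commonly 1=IBM float, 5=IEEE float

print(sample_interval_us, samples_per_trace, format_code)
```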
In Vitro, Matrix-Free Formation Of Solid Tumor Spheroids
NASA Technical Reports Server (NTRS)
Gonda, Steve R.; Marley, Garry M.
1993-01-01
Clinostatic bioreactor promotes formation of relatively large solid tumor spheroids exhibiting diameters from 750 to 2,100 micrometers. Process useful in studying efficacy of chemotherapeutic agents and interactions between cells not constrained by solid matrices. Two versions have been demonstrated: one for anchorage-independent cells and one for anchorage-dependent cells.
Identity Formation in Adolescents from Italian, Mixed, and Migrant Families
ERIC Educational Resources Information Center
Crocetti, Elisabetta; Fermani, Alessandra; Pojaghi, Barbara; Meeus, Wim
2011-01-01
The purpose of this study was to compare identity formation in adolescents from Italian (n = 261), mixed (n = 100), and migrant families (n = 148). Participants completed the Italian version of the Utrecht-Management of Identity Commitments Scale that assesses identity processes in educational and relational domains. Within a variable-centered…
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
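The core GetData operations described above (extracting a time segment, interpolating to common output times, and generating calculated output signals) are easy to illustrate. GetData itself is FORTRAN; the stand-alone Python fragment below, with synthetic signals, only demonstrates the idea.

```python
# Minimal sketch of GetData-style time-history manipulation: select a time
# segment, interpolate two signals to common output times, and compute a
# calculated output signal. The signals here are synthetic.
import numpy as np

t_a = np.linspace(0.0, 10.0, 101)          # signal A sampled every 0.1 s
a   = np.sin(t_a)
t_b = np.linspace(0.0, 10.0, 51)           # signal B sampled every 0.2 s
b   = np.cos(t_b)

t_out = np.arange(2.0, 8.0, 0.05)          # selected segment, common output times
a_out = np.interp(t_out, t_a, a)           # linear interpolation to t_out
b_out = np.interp(t_out, t_b, b)
calc  = a_out + 2.0 * b_out                # a calculated output signal
```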
EDAM: an ontology of bioinformatics operations, types of data and identifiers, topics and formats
Ison, Jon; Kalaš, Matúš; Jonassen, Inge; Bolser, Dan; Uludag, Mahmut; McWilliam, Hamish; Malone, James; Lopez, Rodrigo; Pettifer, Steve; Rice, Peter
2013-01-01
Motivation: Advancing the search, publication and integration of bioinformatics tools and resources demands consistent machine-understandable descriptions. A comprehensive ontology allowing such descriptions is therefore required. Results: EDAM is an ontology of bioinformatics operations (tool or workflow functions), types of data and identifiers, application domains and data formats. EDAM supports semantic annotation of diverse entities such as Web services, databases, programmatic libraries, standalone tools, interactive applications, data schemas, datasets and publications within bioinformatics. EDAM applies to organizing and finding suitable tools and data and to automating their integration into complex applications or workflows. It includes over 2200 defined concepts and has successfully been used for annotations and implementations. Availability: The latest stable version of EDAM is available in OWL format from http://edamontology.org/EDAM.owl and in OBO format from http://edamontology.org/EDAM.obo. It can be viewed online at the NCBO BioPortal and the EBI Ontology Lookup Service. For documentation and license please refer to http://edamontology.org. This article describes version 1.2 available at http://edamontology.org/EDAM_1.2.owl. Contact: jison@ebi.ac.uk PMID:23479348
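A minimal sketch of programmatic access to EDAM, assuming network access to the published OWL file: the ontology is plain RDF, so a generic RDF library such as rdflib can load it and enumerate concept labels. The query shown is generic RDF rather than anything specific to EDAM's internal structure.

```python
# Minimal sketch: load the EDAM ontology (RDF/XML) and print a few
# concept labels. Assumes network access to the published OWL file.
import rdflib
from rdflib.namespace import RDFS

g = rdflib.Graph()
g.parse('http://edamontology.org/EDAM.owl', format='xml')

# List the first ten (concept, label) pairs found in the graph.
for cls, label in list(g.subject_objects(RDFS.label))[:10]:
    print(cls, '->', label)
```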
NASA Technical Reports Server (NTRS)
2002-01-01
Hurricane season in the eastern Pacific started off with a whimper late last month as Alma, a Category 2 hurricane, slowly made its way up the coast of Baja California, packing sustained winds of 110 miles per hour and gusts of 135 miles per hour. The above image of the hurricane was acquired on May 29, 2002, and displays the rainfall rates occurring within the storm. Click the image above to see an animated data visualization (3.8 MB) of the interior of Hurricane Alma. The images of the clouds seen at the beginning of the movie were retrieved from the National Oceanic and Atmospheric Administration's (NOAA's) Geostationary Operational Environmental Satellite (GOES) network. As the movie continues, the clouds are peeled away to reveal an image of rainfall levels in the hurricane. The rainfall data were obtained by the Precipitation Radar aboard NASA's Tropical Rainfall Measuring Mission (TRMM) satellite. The Precipitation Radar bounces radio waves off clouds to retrieve a reading of the number of large, rain-sized droplets within the clouds. Using these data, scientists can tell how much precipitation is occurring within and beneath a hurricane. In the movie, yellow denotes areas where 0.5 inches of rain is falling per hour, green denotes 1 inch per hour, and red denotes over 2 inches per hour. (Please note that high-resolution still images of Hurricane Alma are available in the NASA Visible Earth in TIFF format.) Image and animation courtesy Lori Perkins, NASA Goddard Space Flight Center Scientific Visualization Studio
Jackson, Charlene R; Fedorka-Cray, Paula J; Wineland, Nora; Tankson, Jeanetta D; Barrett, John B; Douris, Aphrodite; Gresham, Cheryl P; Jackson-Hall, Carolina; McGlinchey, Beth M; Price, Maria Victoria
2007-01-01
In 2003 the United States Department of Agriculture established USDA VetNet. It was modeled after PulseNet USA, the national molecular subtyping network for foodborne disease surveillance. The objectives of USDA VetNet are: to use pulsed-field gel electrophoresis (PFGE) to subtype zoonotic pathogens submitted to the animal arm of the National Antimicrobial Resistance Monitoring System (NARMS); to examine VetNet and PulseNet PFGE patterns; and to use the data for surveillance and investigation of suspected foodborne illness outbreaks. Whereas PulseNet subtypes seven foodborne disease-causing bacteria (Escherichia coli O157:H7, Salmonella, Shigella, Listeria monocytogenes, Campylobacter, Yersinia pestis, and Vibrio cholerae), VetNet at present subtypes nontyphoidal Salmonella serotypes and Campylobacter from animals, including diagnostic specimens, healthy farm animals, and carcasses of food-producing animals at slaughter. By the end of 2005, VetNet had two functioning databases: the NARMS Salmonella and the NARMS Campylobacter databases. The Salmonella database contained 6763 Salmonella isolates and 2514 unique XbaI patterns, while the Campylobacter database contained 58 Campylobacter isolates and 53 unique SmaI patterns. Both databases contain the PFGE images in tagged image file format (TIFF), demographic information, and the antimicrobial resistance profiles assigned by NARMS. In the future, veterinary diagnostic laboratories will be invited to participate in VetNet. The establishment of USDA VetNet enhances the mission of the agriculture and public health communities in the surveillance and investigation of foodborne illness outbreaks.
Definition of the Flexible Image Transport System (FITS), Version 3.0
NASA Technical Reports Server (NTRS)
Pence, W. D.; Chiapetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.
2010-01-01
The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the "standard") was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new Version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at http://fits.gsfc.nasa.gov/, which also provides many links to FITS-related resources.
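As a small illustration of working with FITS files, the sketch below uses astropy (one of the many packages that implement the standard) to list HDUs and evaluate the world coordinate system mapping described above. The file name is a placeholder.

```python
# Minimal sketch: inspect a FITS file and its world coordinate system with
# astropy. The file name is hypothetical; the WCS call assumes the header
# carries celestial WCS keywords as described in the version 3.0 standard.
from astropy.io import fits
from astropy.wcs import WCS

with fits.open('image.fits') as hdul:
    hdul.info()                        # list the HDUs in the file
    header = hdul[0].header
    data = hdul[0].data                # e.g., a 64-bit integer image array
    wcs = WCS(header)                  # array-element-to-sky mapping
    print(wcs.pixel_to_world(0, 0))    # sky coordinate of pixel (0, 0)
```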
Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.
Markiewicz, Tomasz
2011-03-30
Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on JavaServer Pages (JSP) with the Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides an image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, as well as the quantitative output. Additionally, the results are stored in a server database. The internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz 4 GB RAM server with 768x576 pixel, 1.28 MB TIFF format images referring to meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: analysis by CAMI locally on the server, 3.5 seconds; remote analysis, 26 seconds, of which 22 seconds were used for data transfer via the internet connection. With a JPEG format image (102 KB) the time was reduced to 14 seconds. The results have confirmed that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is the implementation of the JSP page in multithreaded form, so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis system.
Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology
2011-01-01
Background Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on JavaServer Pages (JSP) with the Tomcat server as the servlet container. Methods In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. Results The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides an image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, as well as the quantitative output. Additionally, the results are stored in a server database. The internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz 4 GB RAM server with 768x576 pixel, 1.28 MB TIFF format images referring to meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: analysis by CAMI locally on the server, 3.5 seconds; remote analysis, 26 seconds, of which 22 seconds were used for data transfer via the internet connection. With a JPEG format image (102 KB) the time was reduced to 14 seconds. Conclusions The results have confirmed that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is the implementation of the JSP page in multithreaded form, so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis system. PMID:21489188
Access to Land Data Products Through the Land Processes DAAC
NASA Astrophysics Data System (ADS)
Klaassen, A. L.; Gacke, C. K.
2004-12-01
The Land Processes Distributed Active Archive Center (LP DAAC) was established as part of NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) initiative to process, archive, and distribute land-related data collected by EOS sensors, thereby promoting the inter-disciplinary study and understanding of the integrated Earth system. The LP DAAC is responsible for archiving, product development, distribution, and user support of Moderate Resolution Imaging Spectroradiometer (MODIS) land products derived from data acquired by the Terra and Aqua satellites, and for processing and distribution of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. These data are applied in scientific research, management of natural resources, emergency response to natural disasters, and Earth science education. There are several web interfaces by which the inventory may be searched and the products ordered. The LP DAAC web site (http://lpdaac.usgs.gov/) provides product-specific information and links to data access tools. The primary search and order tool is the EOS Data Gateway (EDG) (http://edcimswww.cr.usgs.gov/pub/imswelcome/), which allows users to search data holdings, retrieve descriptions of data sets, view browse images, and place orders. The EDG is the only tool that searches the entire inventory of ASTER and MODIS products available from the LP DAAC. The Data Pool (http://lpdaac.usgs.gov/datapool/datapool.asp) is an online archive that provides immediate FTP access to selected LP DAAC data products. The data can be downloaded by going directly to the FTP site, where users can navigate to the desired granule, metadata file, or browse image. It includes the ability to convert files from the standard HDF-EOS data format into GeoTIFF, to change the data projections, or to perform spatial subsetting by using the HDF-EOS to GeoTIFF Converter (HEG) for selected data types. The Browse Tool, also known as the USGS Global Visualization Viewer (http://lpdaac.usgs.gov/aster/glovis.asp), provides an easy online method to search, browse, and order the LP DAAC ASTER and MODIS land data by viewing browse images to define spatial and temporal queries. The LP DAAC User Services Office is the interface for support for the ASTER and MODIS data products and services. The user services representatives are available to answer questions, assist with data ordering, provide technical support and referrals, and provide information on a variety of tools available to assist in data preparation. The LP DAAC User Services contact information is: LP DAAC User Services, U.S. Geological Survey EROS Data Center, 47914 252nd Street, Sioux Falls, SD 57198-0001; Voice: (605) 594-6116; Toll Free: 866-573-3222; Fax: 605-594-6963; E-mail: edc@eos.nasa.gov. "This abstract was prepared under Contract number 03CRCN0001 between SAIC and U.S. Geological Survey. Abstract has not been reviewed for conformity with USGS editorial standards and has been submitted for approval by the USGS Director."
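A rough sketch of the kind of HDF-EOS-to-GeoTIFF conversion the HEG tool performs, done here with GDAL's Python bindings instead of HEG itself. The MODIS granule name and subdataset string are illustrative assumptions and should be confirmed against a real file with GetSubDatasets().

```python
# Minimal sketch: reproject one HDF-EOS (HDF4) grid field and write GeoTIFF,
# analogous to what HEG does for selected data types. The granule name and
# subdataset string are illustrative; list real ones with GetSubDatasets().
from osgeo import gdal

subdataset = ('HDF4_EOS:EOS_GRID:"MOD13A2.hdf":'
              'MODIS_Grid_16DAY_1km_VI:1 km 16 days NDVI')
gdal.Warp('ndvi_wgs84.tif', subdataset,
          dstSRS='EPSG:4326',          # change the data projection
          format='GTiff')              # convert to GeoTIFF
```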
CWG - MUTUAL COUPLING PROGRAM FOR CIRCULAR WAVEGUIDE-FED APERTURE ARRAY (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Bailey, M. C.
1994-01-01
Mutual Coupling Program for Circular Waveguide-fed Aperture Array (CWG) was developed to calculate the electromagnetic interaction between elements of an antenna array of circular apertures with specified aperture field distributions. The field distributions were assumed to be a superposition of the modes which could exist in a circular waveguide. Various external media were included to provide flexibility of use, for example, the flexibility to determine the effects of dielectric covers (i.e., thermal protection system tiles) upon the impedance of aperture-type antennas. The impedance and radiation characteristics of planar array antennas depend upon the mutual interaction between all the elements of the array. These interactions are influenced by several parameters (e.g., the array grid geometry, the geometry and excitation of each array element, the medium outside the array, and the internal network feeding the array). For the class of array antenna whose radiating elements consist of small holes in a flat conducting plate, the electromagnetic problem can be divided into two parts, the internal and the external. In solving the external problem for an array of circular apertures, CWG will compute the mutual interaction between various combinations of circular modal distributions and apertures. CWG computes the mutual coupling between various modes assumed to exist in circular apertures that are located in a flat conducting plane of infinite dimensions. The apertures can radiate into free space, a homogeneous medium, a multilayered region, or a reflecting surface. These apertures are assumed to be excited by one or more modes corresponding to the modal distributions in circular waveguides of the same cross sections as the apertures. The apertures may be of different sizes and also of different polarizations. However, the program assumes that each aperture field contains the same modal distributions, and calculates the complex scattering matrix between all mode and aperture combinations. The scattering matrix can then be used to determine the complex modal field amplitudes for each aperture with a specified array excitation. CWG is written in VAX FORTRAN for DEC VAX series computers running VMS (LAR-15236) and IBM PC series and compatible computers running MS-DOS (LAR-15226). It requires 360K of RAM for execution. To compile the source code for the PC version, the NDP Fortran compiler and linker will be required; however, the distribution medium for the PC version of CWG includes a sample MS-DOS executable which was created using NDP Fortran with the -vms compiler option. The standard distribution medium for the PC version of CWG is a 3.5 inch 1.44Mb MS-DOS format diskette. The standard distribution medium for the VAX version of CWG is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VAX version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Both machine versions of CWG include an electronic version of the documentation in Microsoft Word for Windows format. CWG was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
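As a toy illustration of the last point (not CWG itself, which is FORTRAN and computes the scattering matrix from the electromagnetics), the fragment below shows how a complex scattering matrix S, once known, yields the coupled modal amplitudes for a specified excitation via b = S a.

```python
# Illustrative sketch only: applying a known complex scattering matrix S to
# a specified modal excitation a to get coupled amplitudes b = S a. The
# matrix here is a random stand-in, not a physical result.
import numpy as np

n = 4                                   # mode/aperture combinations (toy size)
rng = np.random.default_rng(0)
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

a = np.zeros(n, dtype=complex)
a[0] = 1.0                              # excite one mode with unit amplitude
b = S @ a                               # coupled modal amplitudes
print(np.abs(b))                        # magnitude of coupling into each mode
```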
CWG - MUTUAL COUPLING PROGRAM FOR CIRCULAR WAVEGUIDE-FED APERTURE ARRAY (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Bailey, M. C.
1994-01-01
Mutual Coupling Program for Circular Waveguide-fed Aperture Array (CWG) was developed to calculate the electromagnetic interaction between elements of an antenna array of circular apertures with specified aperture field distributions. The field distributions were assumed to be a superposition of the modes which could exist in a circular waveguide. Various external media were included to provide flexibility of use, for example, the flexibility to determine the effects of dielectric covers (i.e., thermal protection system tiles) upon the impedance of aperture-type antennas. The impedance and radiation characteristics of planar array antennas depend upon the mutual interaction between all the elements of the array. These interactions are influenced by several parameters (e.g., the array grid geometry, the geometry and excitation of each array element, the medium outside the array, and the internal network feeding the array). For the class of array antenna whose radiating elements consist of small holes in a flat conducting plate, the electromagnetic problem can be divided into two parts, the internal and the external. In solving the external problem for an array of circular apertures, CWG will compute the mutual interaction between various combinations of circular modal distributions and apertures. CWG computes the mutual coupling between various modes assumed to exist in circular apertures that are located in a flat conducting plane of infinite dimensions. The apertures can radiate into free space, a homogeneous medium, a multilayered region, or a reflecting surface. These apertures are assumed to be excited by one or more modes corresponding to the modal distributions in circular waveguides of the same cross sections as the apertures. The apertures may be of different sizes and also of different polarizations. However, the program assumes that each aperture field contains the same modal distributions, and calculates the complex scattering matrix between all mode and aperture combinations. The scattering matrix can then be used to determine the complex modal field amplitudes for each aperture with a specified array excitation. CWG is written in VAX FORTRAN for DEC VAX series computers running VMS (LAR-15236) and IBM PC series and compatible computers running MS-DOS (LAR-15226). It requires 360K of RAM for execution. To compile the source code for the PC version, the NDP Fortran compiler and linker will be required; however, the distribution medium for the PC version of CWG includes a sample MS-DOS executable which was created using NDP Fortran with the -vms compiler option. The standard distribution medium for the PC version of CWG is a 3.5 inch 1.44Mb MS-DOS format diskette. The standard distribution medium for the VAX version of CWG is a 1600 BPI 9-track magnetic tape in DEC VAX BACKUP format. The VAX version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. Both machine versions of CWG include an electronic version of the documentation in Microsoft Word for Windows format. CWG was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
Computer versus paper--does it make any difference in test performance?
Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin
2015-01-01
CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context, it is the first study that controls for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit greatly from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. Both groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guessed significantly more than low-performing students using the paper-and-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The longer processing time for the paper-and-pencil version might be due to the time needed to write the answer down and to check that it was transferred correctly. It is still not known why students using the computer version (particularly low-performing students) guess at a higher rate. Further studies are necessary to understand this finding.
Haley, Stephen M; Fragala-Pinkham, Maria; Ni, Pengsheng
2006-07-01
To examine the relative sensitivity to detect functional mobility changes with a full-length parent questionnaire compared with a computerized adaptive testing version of the questionnaire after a 16-week group fitness programme. Prospective, pre- and posttest study with a 16-week group fitness intervention. Three community-based fitness centres. Convenience sample of children (n = 28) with physical or developmental disabilities. A 16-week group exercise programme held twice a week in a community setting. A full-length (161-item) paper version of a mobility parent questionnaire based on the Pediatric Evaluation of Disability Inventory, but expanded to include expected skills of children up to 15 years old, was compared with a 15-item computer adaptive testing version. Both measures were administered at pre- and posttest intervals. Both the full-length Pediatric Evaluation of Disability Inventory and the 15-item computer adaptive testing version detected significant changes between pre- and posttest scores and had large effect sizes and standardized response means, with a modest decrease for the computer adaptive test as compared with the 161-item paper version. Correlations between the computer adaptive and paper formats across pre- and posttest scores ranged from r = 0.76 to 0.86. Both functional mobility test versions were able to detect positive functional changes at the end of the intervention period. Greater variability in score estimates was generated by the computerized adaptive testing version, which led to a relative reduction in sensitivity as defined by the standardized response mean. Extreme scores were generally more difficult for the computer adaptive format to estimate with as much accuracy as scores in the mid-range of the scale. However, the reduction in accuracy and sensitivity, which did not influence the group effect results in this study, is counterbalanced by the large reduction in testing burden.
A Java-based tool for the design of classification microarrays.
Meng, Da; Broschat, Shira L; Call, Douglas R
2008-08-04
Classification microarrays are used for purposes such as identifying strains of bacteria and determining genetic relationships to understand the epidemiology of an infectious disease. For these cases, mixed microarrays, which are composed of DNA from more than one organism, are more effective than conventional microarrays composed of DNA from a single organism. Selection of probes is a key factor in designing successful mixed microarrays because redundant sequences are inefficient and limited representation of diversity can restrict application of the microarray. We have developed a Java-based software tool, called PLASMID, for use in selecting the minimum set of probe sequences needed to classify different groups of plasmids or bacteria. The software program was successfully applied to several different sets of data. The utility of PLASMID was illustrated using existing mixed-plasmid microarray data as well as data from a virtual mixed-genome microarray constructed from different strains of Streptococcus. Moreover, use of data from expression microarray experiments demonstrated the generality of PLASMID. In this paper we describe a new software tool for selecting a set of probes for a classification microarray. While the tool was developed for the design of mixed microarrays (and mixed-plasmid microarrays in particular), it can also be used to design expression arrays. The user can choose from several clustering methods (including hierarchical, non-hierarchical, and a model-based genetic algorithm), several probe ranking methods, and several different display methods. A novel approach is used for probe redundancy reduction, and probe selection is accomplished via stepwise discriminant analysis. Data can be entered in different formats (including Excel and comma-delimited text), and dendrogram, heat map, and scatter plot images can be saved in several different formats (including JPEG and TIFF). Weights generated using stepwise discriminant analysis can be stored for analysis of subsequent experimental data. Additionally, PLASMID can be used to construct virtual microarrays with genomes from public databases, which can then be used to identify an optimal set of probes.
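As a sketch of the redundancy idea behind probe selection (hierarchical clustering is one of the method families mentioned above; the discriminant-analysis step is omitted here), the fragment below clusters synthetic presence/absence probe signatures so that near-identical probes share a cluster label.

```python
# Minimal sketch: hierarchically cluster probes by their presence/absence
# pattern across strains so redundant probes fall into the same cluster.
# The 0/1 data are synthetic; PLASMID itself is Java and adds stepwise
# discriminant analysis for the final selection.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
probes = rng.integers(0, 2, size=(20, 8))     # 20 probes x 8 strains

Z = linkage(probes, method='average', metric='hamming')
clusters = fcluster(Z, t=0.25, criterion='distance')
print(clusters)          # probes sharing a label carry nearly the same information
```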
Progress of Interoperability in Planetary Research for Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Hare, T. M.; Gaddis, L. R.
2015-12-01
For nearly a decade there has been a push in the planetary science community to support interoperable methods of accessing and working with geospatial data. Common geospatial data products for planetary research include image mosaics, digital elevation or terrain models, geologic maps, geographic location databases (i.e., craters, volcanoes), or any data that can be tied to the surface of a planetary body (including moons, comets, or asteroids). Several U.S. and international cartographic research institutions have converged on mapping standards that embrace standardized image formats that retain geographic information (e.g., GeoTIFF, GeoJPEG2000), digital geologic mapping conventions, planetary extensions for symbols that comply with U.S. Federal Geographic Data Committee cartographic and geospatial metadata standards, and notably on-line mapping services as defined by the Open Geospatial Consortium (OGC). The latter include defined standards such as the OGC Web Map Service (simple image maps), Web Feature Service (feature streaming), Web Coverage Service (rich scientific data streaming), and Catalog Service for the Web (data searching and discoverability). While these standards were developed for application to Earth-based data, they have been modified to support the planetary domain. The motivation to support common, interoperable data format and delivery standards is not only to improve access for higher-level products but also to address the increasingly distributed nature of the rapidly growing volumes of data. The strength of using an OGC approach is that it provides consistent access to data that are distributed across many facilities. While data-streaming standards are well supported by the more sophisticated tools used in the Geographic Information System (GIS) and remote sensing industries, they are also supported by many lightweight browsers, which facilitates both large and small focused science applications as well as public use. Here we provide an overview of the interoperability initiatives that are currently ongoing in the planetary research community, examples of their successful application, and challenges that remain.
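A minimal sketch of OGC-style access from a script, using OWSLib; the service URL and layer name are placeholders rather than a real planetary endpoint, but any compliant Web Map Service answers the same GetMap request.

```python
# Minimal sketch: request a map image from an OGC Web Map Service with
# OWSLib. The endpoint URL and layer name are hypothetical placeholders.
from owslib.wms import WebMapService

wms = WebMapService('https://example.org/planetary/wms', version='1.1.1')
print(list(wms.contents))                     # layers advertised by the service

img = wms.getmap(layers=['mars_mola_shade'],  # hypothetical layer name
                 srs='EPSG:4326',
                 bbox=(-180, -90, 180, 90),
                 size=(512, 256),
                 format='image/png')
with open('mars.png', 'wb') as f:
    f.write(img.read())                       # save the returned PNG tile
```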
Analytic Patch Configuration (APC) gateway version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1990-01-01
The Analytic Patch Configuration (APC) Gateway is an interactive software tool which translates aircraft configuration geometry files from one format into another. This initial release of the APC Gateway accommodates six formats: the four accepted APC formats (89f, 89fd, 89u, and 89ud), the PATRAN 2.x phase 1 neutral file format, and the Integrated Aerodynamic Analysis System (IAAS) General Geometry (GG) format. Written in ANSI FORTRAN 77 and completely self-contained, the APC Gateway is very portable and has already been installed on CDC/NOS, VAX/VMS, SUN, SGI/IRIS, CONVEX, and CRAY hosts.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-08
... official version of this document is the document published in the Federal Register. Free Internet access... free, at 1-800-877-8339. Accessible Format: Individuals with disabilities can obtain this document in... Document Format (PDF). To use PDF, you must have Adobe Acrobat Reader, which is available free at this site...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-09
... official version of this document is the document published in the Federal Register. Free Internet access... (TTY), call the Federal Relay Service (FRS), toll free, at 1-800-877-8339. Accessible Format... Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at the site...
ERIC Educational Resources Information Center
Ginsburg, Herbert P.; Lee, Young-Sun; Pappas, Sandra
2016-01-01
Formative assessment involves the gathering of information that can guide the teaching of individual or groups of children. This approach requires a sound understanding of children's thinking and learning, as well as an effective method for gaining the information. We propose that formative assessment should employ a version of clinical…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-19
... request materials in accessible formats for people with disabilities (Braille, large print, electronic files, audio format), send an email to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau... whether a census block group is identified as having hard rock was modified for the non-contiguous areas...
Is All Formative Influence Immoral?
ERIC Educational Resources Information Center
Tillson, John
2018-01-01
Is it true that all formative influence is unethical, and that we ought to avoid influencing children (and indeed anyone at all)? There are more or less defensible versions of this doctrine, and we shall follow some of the strands of argument that lead to this conclusion. It seems that in maintaining that all influence is immoral, one commits…
A new version of Visual tool for estimating the fractal dimension of images
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Felea, D.; Besliu, C.; Jipa, Al.; Bordeianu, C. C.; Stan, E.; Esanu, T.
2010-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images (Grossu et al., 2009 [1]). The earlier version was limited to bi-dimensional sets of points, stored in bitmap files. The application was extended for working also with comma separated values files and three-dimensional images.
New version program summary
Program title: Fractal Analysis v02
Catalogue identifier: AEEG_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 9999
No. of bytes in distributed program, including test data, etc.: 4 366 783
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 30 M
Classification: 14
Catalogue identifier of previous version: AEEG_v1_0
Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1999
Does the new version supersede the previous version?: Yes
Nature of problem: Estimating the fractal dimension of 2D and 3D images.
Solution method: Optimized implementation of the box-counting algorithm.
Reasons for new version: The previous version was limited to bitmap image files. The new application was extended in order to work with objects stored in comma separated values (csv) files. The main advantages are: easier integration with other applications (csv is a widely used, simple text file format); fewer resources consumed and improved performance (only the information of interest, the "black points", is stored); higher resolution (the point coordinates are loaded into Visual Basic double variables [2]); and the possibility of storing three-dimensional objects (e.g. the 3D Sierpinski gasket). In this version the optimized box-counting algorithm [1] was extended to the three-dimensional case.
Summary of revisions: The application interface was changed from SDI (single document interface) to MDI (multi-document interface). One form was added in order to provide a graphical user interface for the new functionalities (fractal analysis of 2D and 3D images stored in csv files).
Additional comments: User-friendly graphical interface; easy deployment mechanism.
Running time: To a first approximation, the algorithm is linear.
References:
[1] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Comput. Phys. Comm. 180 (2009) 1999-2001.
[2] F. Balena, Programming Microsoft Visual Basic 6.0, Microsoft Press, US, 1999.
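The box-counting method named under "Solution method" is easy to state in a few lines. The sketch below (in Python rather than the program's Visual Basic) estimates the dimension of a synthetic 2D point set by counting occupied boxes at several scales and fitting the log-log slope.

```python
# Minimal sketch of the box-counting estimate: count occupied boxes N(s)
# at several box sizes s and fit log N(s) ~ -D log s. The image is a
# synthetic 2D point set, not output from the program described above.
import numpy as np

rng = np.random.default_rng(2)
img = np.zeros((512, 512), dtype=bool)
xy = rng.integers(0, 512, size=(5000, 2))
img[xy[:, 0], xy[:, 1]] = True                  # the "black points"

sizes = [2, 4, 8, 16, 32, 64]
counts = []
for s in sizes:
    # tile the image into s-by-s boxes; a box counts if any pixel in it is set
    trimmed = img[:512 - 512 % s, :512 - 512 % s]
    boxes = trimmed.reshape(512 // s, s, 512 // s, s).any(axis=(1, 3))
    counts.append(boxes.sum())

D = -np.polyfit(np.log(sizes), np.log(counts), 1)[0]   # slope of the log-log fit
print(f'estimated fractal dimension: {D:.2f}')
```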
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (HP9000 SERIES 300/400 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is expected to be available on media suitable for seven different machine platforms: 1) DEC VAX computers running VMS (TK50 cartridge in VAX BACKUP format), 2) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and 7) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2.
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. User interface interactive objects include data-driven graphical objects such as dials, thermometers, and strip charts as well as menubars, option menus, file selection items, message items, push buttons, and color loggers. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, C++, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides a means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus was developed in 1989 and version 5.2 was released in 1993. TAE Plus 5.2 is expected to be available on media suitable for seven different machine platforms: 1) DEC VAX computers running VMS (TK50 cartridge in VAX BACKUP format), 2) IBM RS/6000 series workstations running AIX (.25 inch tape cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun4 (SPARC) series computers running SunOS (.25 inch tape cartridge in UNIX tar format), and 7) SGI Indigo computers running IRIX (.25 inch IRIS tape cartridge in UNIX tar format). Please contact COSMIC to obtain detailed information about the supported operating system and OSF/Motif releases required for each of these machine versions. An optional Motif Object Code License is available for the Sun4 version of TAE Plus 5.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakowatz, C.V. Jr.; Wahl, D.E.; Thompson, P.A.
1996-12-31
Wavefront curvature defocus effects can occur in spotlight-mode SAR imagery when reconstructed via the well-known polar formatting algorithm (PFA) under certain scenarios that include imaging at close range, use of very low center frequency, and/or imaging of very large scenes. The range migration algorithm (RMA), also known as seismic migration, was developed to accommodate these wavefront curvature effects. However, the along-track upsampling of the phase history data required by the original version of range migration can in certain instances represent a major computational burden. A more recent version of migration processing, the Frequency Domain Replication and Downsampling (FReD) algorithm, obviates the need to upsample, and is accordingly more efficient. In this paper the authors demonstrate that the combination of traditional polar formatting with appropriate space-variant post-filtering for refocus can be as efficient or even more efficient than FReD under some imaging conditions, as demonstrated by the computer-simulated results in this paper. The post-filter can be pre-calculated from a theoretical derivation of the curvature effect. The conclusion is that the new polar formatting with post-filtering algorithm (PF2) should be considered as a viable candidate for a spotlight-mode image formation processor when curvature effects are present.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependent functions. A dependency matrix is created to show the relationship. DeMAID uses knowledge-based techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix.
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
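The circuit-grouping step described above maps onto standard graph machinery: modules joined by feedback links form strongly connected components, and the condensation of the dependency graph contains no feedback, so its circuits can be sequenced level by level. The following minimal sketch illustrates that idea in Python with the networkx library and an invented five-module example; it is an analogy for the behavior described above, not DeMAID's knowledge-based implementation.

    # Illustrative circuit grouping (not DeMAID's implementation):
    # modules linked by feedback form strongly connected components,
    # and the condensation of the graph is feedback-free, so circuits
    # can be ordered into levels like DeMAID's multilevel display.
    import networkx as nx

    # Hypothetical modules: edge (a, b) means module b uses output of a.
    deps = nx.DiGraph([
        ("geometry", "aerodynamics"),
        ("aerodynamics", "structures"),
        ("structures", "aerodynamics"),   # feedback: an iterative circuit
        ("structures", "weights"),
        ("weights", "performance"),
    ])

    circuits = list(nx.strongly_connected_components(deps))
    levels = nx.condensation(deps, scc=circuits)    # acyclic by construction

    for node in nx.topological_sort(levels):
        print(sorted(levels.nodes[node]["members"]))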
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependent functions. A dependency matrix is created to show the relationship. DeMAID uses knowledge-based techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix.
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. L.
1994-01-01
Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependent functions. A dependency matrix is created to show the relationship. DeMAID uses knowledge-based techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix.
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
The complete biological soil crust reference list is available in three formats.
Cozendey-Silva, Eliana Napoleão; da Silva, Cintia Ribeiro; Larentis, Ariane Leites; Wasserman, Julio Cesar; Rozemberg, Brani; Teixeira, Liliane Reis
2016-09-05
Periodic assessment is one of the recommendations for improving health-care waste management worldwide. This study aimed at translating and adapting the Health-Care Waste Management - Rapid Assessment Tool (HCWM-RAT), proposed by the World Health Organization, to a Brazilian Portuguese version, and resolving its cultural and legal issues. The work focused on the evaluation of the concepts, items and semantic equivalence between the original tool and the Brazilian Portuguese version. A cross-cultural adaptation methodology was used, including: initial translation to Brazilian Portuguese; back translation to English; syntheses of these translation versions; formation of an expert committee to achieve consensus about the preliminary version; and evaluation of the target audience's comprehension. Both the translated and the original versions' concepts, items and semantic equivalence are presented. The constructs in the original instrument were considered relevant and applicable to the Brazilian context. The Brazilian version of the tool has the potential to generate indicators, build an official database, provide feedback, and inform policy decisions at many geographical and organizational levels, strengthening the monitoring and evaluation (M&E) mechanism. Moreover, the cross-cultural translation expands the usefulness of the instrument to Portuguese-speaking countries in developing regions. The translated and original versions showed concept, item and semantic equivalence and can be applied to Brazil.
Sala-Sastre, N; Herdman, M; Navarro, L; de la Prada, M; Pujol, R; Serra, C; Alonso, J; Flyvholm, M A; Giménez-Arnau, A M
2009-10-01
Eczema of the hands and urticaria are very common occupational dermatoses. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is an essential tool for the study of occupational skin diseases. The short version of the questionnaire is useful for screening and the long version is used to study risk factors. The aim of this study was to culturally adapt the long version of the NOSQ to Spanish and Catalan and to ensure comprehension, semantic validity, and equivalence with the original. The principles of the International Society for Pharmacoeconomics and Outcomes Research for good research practices were applied. A 4-phase method was used, with direct, revised translation, back translation, and cognitive interviews. After direct translation, a first version was issued by the Spanish Working Group. This version was evaluated in cognitive interviews. Modifications were made to 39 questions (68%) in the Spanish version and 27 questions (47%) in the Catalan version. Changes included addition of examples to improve understanding, reformulation of instructions, change to use of a direct question format, and addition of certain definitions. The back translation was evaluated by the original authors, leading to a further 7 changes in the Spanish version and 2 in the Catalan version. The third consensus version underwent a second round of cognitive interviews, after which the definitive version in each language was issued. The Spanish and Catalan versions of the NOSQ-2002 questionnaire are available at www.ami.dk/NOSQ and www.arbejdsmiljoforskning.dk.
NASA Technical Reports Server (NTRS)
Saltsman, J. F.
1994-01-01
TS-SRP/PACK is a set of computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). The user should be thoroughly familiar with the TS-SRP method before attempting to use any of these programs. The documentation for this program includes a theory manual as well as a detailed user's manual with a tutorial to guide the user in the proper use of TS-SRP. An extensive database has also been developed in a parallel effort. This database is an excellent source of high-temperature, creep-fatigue test data and can be used with other life-prediction methods as well. Five programs are included in TS-SRP/PACK along with the alloy database. The TABLE program is used to print the datasets, which are in NAMELIST format, in a reader-friendly format. INDATA is used to create new datasets or add to existing ones. The FAIL program is used to characterize the failure behavior of an alloy as given by the constants in the strainrange-life relations used by the total strain version of SRP (TS-SRP) and the inelastic strainrange-based version of SRP. The program FLOW is used to characterize the flow behavior (the constitutive response) of an alloy as given by the constants in the flow equations used by TS-SRP. Finally, LIFE is used to predict the life of a specified cycle, using the constants characterizing failure and flow behavior determined by FAIL and FLOW. LIFE is written in interpretive BASIC to avoid compiling and linking every time the equation constants are changed. Four of the five programs in this package are written in FORTRAN 77 for IBM PC series and compatible computers running MS-DOS and are designed to read data using the NAMELIST format statement. The fifth is written in BASIC version 3.0 for IBM PC series and compatible computers running MS-DOS version 3.10. The executables require at least 239K of memory and DOS 3.1 or higher. To compile the source, a Lahey FORTRAN compiler is required. Source code modifications will be necessary if the compiler to be used does not support NAMELIST input; probably the easiest revision is to use a list-directed READ statement. The standard distribution medium for this program is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. TS-SRP/PACK was developed in 1992.
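For readers unfamiliar with the NAMELIST record shape mentioned above, the following sketch parses one NAMELIST-style record into a Python dictionary. The group name FAIL and the variable names are invented placeholders; actual TS-SRP/PACK datasets should be handled with the package's own TABLE and INDATA programs.

    # Illustrative parser for one NAMELIST-style record of the form
    #   &GROUP NAME=value, NAME=value, ... /
    # The group and variable names are hypothetical placeholders.
    import re

    record = "&FAIL ALLOY='AISI 316', TEMPC=816.0, NPTS=12 /"

    def parse_namelist(text):
        m = re.match(r"\s*&(\w+)\s+(.*?)\s*/\s*$", text)
        if not m:
            raise ValueError("not a NAMELIST-style record")
        group, body = m.groups()
        values = {}
        for name, raw in re.findall(r"(\w+)\s*=\s*('[^']*'|[^,\s]+)", body):
            values[name] = raw.strip("'") if raw.startswith("'") else float(raw)
        return group, values

    print(parse_namelist(record))
    # ('FAIL', {'ALLOY': 'AISI 316', 'TEMPC': 816.0, 'NPTS': 12.0})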
Smelter, Andrey; Astra, Morgan; Moseley, Hunter N B
2017-03-17
The Biological Magnetic Resonance Data Bank (BMRB) is a public repository of Nuclear Magnetic Resonance (NMR) spectroscopic data of biological macromolecules. It is an important resource for many researchers using NMR to study structural, biophysical, and biochemical properties of biological macromolecules. It is primarily maintained and accessed in a flat file ASCII format known as NMR-STAR. While the format is human readable, the size of most BMRB entries makes computer readability and explicit representation a practical requirement for almost any rigorous systematic analysis. To aid in the use of this public resource, we have developed a package called nmrstarlib in the popular open-source programming language Python. The nmrstarlib's implementation is very efficient, both in design and execution. The library has facilities for reading and writing both NMR-STAR version 2.1 and 3.1 formatted files, parsing them into usable Python dictionary- and list-based data structures, making access and manipulation of the experimental data very natural within Python programs (i.e. "saveframe" and "loop" records represented as individual Python dictionary data structures). Another major advantage of this design is that data stored in original NMR-STAR can be easily converted into its equivalent JavaScript Object Notation (JSON) format, a lightweight data interchange format, facilitating data access and manipulation using Python and any other programming language that implements a JSON parser/generator (i.e., all popular programming languages). We have also developed tools to visualize assigned chemical shift values and to convert between NMR-STAR and JSONized NMR-STAR formatted files. Full API Reference Documentation, User Guide and Tutorial with code examples are also available. We have tested this new library on all current BMRB entries: 100% of all entries are parsed without any errors for both NMR-STAR version 2.1 and version 3.1 formatted files. We also compared our software to three currently available Python libraries for parsing NMR-STAR formatted files: PyStarLib, NMRPyStar, and PyNMRSTAR. The nmrstarlib package is a simple, fast, and efficient library for accessing data from the BMRB. The library provides an intuitive dictionary-based interface with which Python programs can read, edit, and write NMR-STAR formatted files and their equivalent JSONized NMR-STAR files. The nmrstarlib package can be used as a library for accessing and manipulating data stored in NMR-STAR files and as a command-line tool to convert from NMR-STAR file format into its equivalent JSON file format and vice versa, and to visualize chemical shift values. Furthermore, the nmrstarlib implementation provides a guide for effectively JSONizing other older scientific formats, improving the FAIRness of data in these formats.
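A minimal usage sketch in Python follows, assuming the read_files generator described in the package's User Guide; the file name is a placeholder. Because entries are parsed into dictionary- and list-based structures, the standard json module can serialize them directly.

    # Read a BMRB entry with nmrstarlib and write its JSONized form.
    # "bmr18569.str" is a placeholder path; per the package docs the
    # read_files generator also accepts accession numbers and directories.
    import json
    from nmrstarlib import nmrstarlib

    for starfile in nmrstarlib.read_files("bmr18569.str"):
        # StarFile objects are dict-based, so ordinary dict access and
        # the standard json module work without custom serialization.
        with open("bmr18569.json", "w") as out:
            json.dump(starfile, out, indent=2)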
ARC2D - EFFICIENT SOLUTION METHODS FOR THE NAVIER-STOKES EQUATIONS (DEC RISC ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
Biyabani, S. R.
1994-01-01
ARC2D is a computational fluid dynamics program developed at the NASA Ames Research Center specifically for airfoil computations. The program uses implicit finite-difference techniques to solve two-dimensional Euler equations and thin-layer Navier-Stokes equations. It is based on the Beam and Warming implicit approximate factorization algorithm in generalized coordinates. The methods are either time-accurate or accelerated non-time-accurate steady-state schemes. The evolution of the solution through time is physically realistic; good solution accuracy is dependent on mesh spacing and boundary conditions. The mathematical development of ARC2D begins with the strong conservation law form of the two-dimensional Navier-Stokes equations in Cartesian coordinates, which admits shock capturing. The Navier-Stokes equations can be transformed from Cartesian coordinates to generalized curvilinear coordinates in a manner that permits one computational code to serve a wide variety of physical geometries and grid systems. ARC2D includes an algebraic mixing length model to approximate the effect of turbulence. In cases of high Reynolds number viscous flows, the thin-layer approximation can be applied. ARC2D allows for a variety of solutions to stability boundaries, such as those encountered in flows with shocks. The user has considerable flexibility in assigning geometry and developing grid patterns, as well as in assigning boundary conditions. However, the ARC2D model is most appropriate for attached and mildly separated boundary layers; no attempt is made to model wake regions and widely separated flows. The techniques have been successfully used for a variety of inviscid and viscous flowfield calculations. The Cray version of ARC2D is written in FORTRAN 77 for use on Cray series computers and requires approximately 5Mb of memory. The program is fully vectorized. The tape includes variations for the COS and UNICOS operating systems. Also included is a sample routine for CONVEX computers to emulate Cray system time calls, which should be easy to modify for other machines as well. The standard distribution medium for this version is a 9-track 1600 BPI ASCII Card Image format magnetic tape. The Cray version was developed in 1987. The IBM ES/3090 version is an IBM port of the Cray version. It is written in IBM VS FORTRAN and has the capability of executing in both vector and parallel modes on the MVS/XA operating system and in vector mode on the VM/XA operating system. Various options of the IBM VS FORTRAN compiler provide new features for the ES/3090 version, including 64-bit arithmetic and up to 2 GB of virtual addressability. The IBM ES/3090 version is available only as a 9-track, 1600 BPI IBM IEBCOPY format magnetic tape. The IBM ES/3090 version was developed in 1989. The DEC RISC ULTRIX version is a DEC port of the Cray version. It is written in FORTRAN 77 for RISC-based Digital Equipment platforms. The memory requirement is approximately 7Mb of main memory. It is available in UNIX tar format on TK50 tape cartridge. The port to DEC RISC ULTRIX was done in 1990. COS and UNICOS are trademarks and Cray is a registered trademark of Cray Research, Inc. IBM, ES/3090, VS FORTRAN, MVS/XA, and VM/XA are registered trademarks of International Business Machines. DEC and ULTRIX are registered trademarks of Digital Equipment Corporation.
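The approximate-factorization structure underlying the Beam and Warming scheme can be seen on a simpler model problem: the 2D implicit operator is split into two one-dimensional factors, each needing only a tridiagonal solve per grid line. The sketch below applies that splitting to implicit 2D diffusion; it is an analogy for the factorization idea only, not ARC2D's Euler/Navier-Stokes implementation.

    # Model-problem illustration of approximate factorization:
    #   (I - a*Dxx)(I - a*Dyy) u_new = u_old
    # solved as two one-dimensional tridiagonal sweeps, the same
    # structure the Beam-Warming scheme exploits (ARC2D factors the
    # flux Jacobians of the Euler/Navier-Stokes equations instead).
    import numpy as np

    def thomas(lower, diag, upper, rhs):
        # O(n) solver for a tridiagonal system (Thomas algorithm).
        n = len(diag)
        d, r = diag.astype(float), rhs.astype(float)
        for i in range(1, n):
            w = lower[i - 1] / d[i - 1]
            d[i] = d[i] - w * upper[i - 1]
            r[i] = r[i] - w * r[i - 1]
        x = np.empty(n)
        x[-1] = r[-1] / d[-1]
        for i in range(n - 2, -1, -1):
            x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
        return x

    n, h, dt = 64, 1.0 / 63, 1e-3
    a = dt / h**2
    u = np.zeros((n, n))
    u[n // 2, n // 2] = 1.0            # initial spike of "heat"

    lower = upper = np.full(n - 1, -a)
    diag = np.full(n, 1.0 + 2.0 * a)

    for _ in range(10):
        u = np.apply_along_axis(lambda v: thomas(lower, diag, upper, v), 0, u)
        u = np.apply_along_axis(lambda v: thomas(lower, diag, upper, v), 1, u)
    print(u.sum())                     # diffused field (mass leaks at walls)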
ARC2D - EFFICIENT SOLUTION METHODS FOR THE NAVIER-STOKES EQUATIONS (CRAY VERSION)
NASA Technical Reports Server (NTRS)
Pulliam, T. H.
1994-01-01
ARC2D is a computational fluid dynamics program developed at the NASA Ames Research Center specifically for airfoil computations. The program uses implicit finite-difference techniques to solve two-dimensional Euler equations and thin-layer Navier-Stokes equations. It is based on the Beam and Warming implicit approximate factorization algorithm in generalized coordinates. The methods are either time-accurate or accelerated non-time-accurate steady-state schemes. The evolution of the solution through time is physically realistic; good solution accuracy is dependent on mesh spacing and boundary conditions. The mathematical development of ARC2D begins with the strong conservation law form of the two-dimensional Navier-Stokes equations in Cartesian coordinates, which admits shock capturing. The Navier-Stokes equations can be transformed from Cartesian coordinates to generalized curvilinear coordinates in a manner that permits one computational code to serve a wide variety of physical geometries and grid systems. ARC2D includes an algebraic mixing length model to approximate the effect of turbulence. In cases of high Reynolds number viscous flows, the thin-layer approximation can be applied. ARC2D allows for a variety of solutions to stability boundaries, such as those encountered in flows with shocks. The user has considerable flexibility in assigning geometry and developing grid patterns, as well as in assigning boundary conditions. However, the ARC2D model is most appropriate for attached and mildly separated boundary layers; no attempt is made to model wake regions and widely separated flows. The techniques have been successfully used for a variety of inviscid and viscous flowfield calculations. The Cray version of ARC2D is written in FORTRAN 77 for use on Cray series computers and requires approximately 5Mb of memory. The program is fully vectorized. The tape includes variations for the COS and UNICOS operating systems. Also included is a sample routine for CONVEX computers to emulate Cray system time calls, which should be easy to modify for other machines as well. The standard distribution medium for this version is a 9-track 1600 BPI ASCII Card Image format magnetic tape. The Cray version was developed in 1987. The IBM ES/3090 version is an IBM port of the Cray version. It is written in IBM VS FORTRAN and has the capability of executing in both vector and parallel modes on the MVS/XA operating system and in vector mode on the VM/XA operating system. Various options of the IBM VS FORTRAN compiler provide new features for the ES/3090 version, including 64-bit arithmetic and up to 2 GB of virtual addressability. The IBM ES/3090 version is available only as a 9-track, 1600 BPI IBM IEBCOPY format magnetic tape. The IBM ES/3090 version was developed in 1989. The DEC RISC ULTRIX version is a DEC port of the Cray version. It is written in FORTRAN 77 for RISC-based Digital Equipment platforms. The memory requirement is approximately 7Mb of main memory. It is available in UNIX tar format on TK50 tape cartridge. The port to DEC RISC ULTRIX was done in 1990. COS and UNICOS are trademarks and Cray is a registered trademark of Cray Research, Inc. IBM, ES/3090, VS FORTRAN, MVS/XA, and VM/XA are registered trademarks of International Business Machines. DEC and ULTRIX are registered trademarks of Digital Equipment Corporation.
Definition of the Flexible Image Transport System (FITS), version 3.0
NASA Astrophysics Data System (ADS)
Pence, W. D.; Chiappetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.
2010-12-01
The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the “standard”) was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at
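One of the newly standardized features, 64-bit integer images (BITPIX = 64), can be exercised from Python with the third-party astropy package; astropy is an independent implementation used here for illustration and is not part of the standard itself.

    # Write and re-read a 64-bit integer FITS image, a feature
    # standardized in FITS 3.0 (BITPIX = 64); requires astropy.
    import numpy as np
    from astropy.io import fits

    data = np.arange(100, dtype=np.int64).reshape(10, 10)
    fits.PrimaryHDU(data).writeto("int64_demo.fits", overwrite=True)

    with fits.open("int64_demo.fits") as hdul:
        print(hdul[0].header["BITPIX"])    # 64 for 64-bit integers
        print(hdul[0].data.dtype)          # big-endian int64 as stored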
ERIC Educational Resources Information Center
Kelly, Michael J. B.; Fallot, Lucas B.; Gustafson, Jeffrey L.; Bergdahl, B. Mikael
2016-01-01
The synthesis of alkenes using the Wittig reaction is a traditional part of many undergraduate organic chemistry teaching laboratory curricula. The aqueous medium version of the Wittig reaction presented is a reliable adaptation of this alkene formation reaction as a very safe alternative in the introductory organic chemistry laboratory. The…
ERIC Educational Resources Information Center
Jarodzka, Halszka; Janssen, Noortje; Kirschner, Paul A.; Erkens, Gijsbert
2015-01-01
This study investigated whether design guidelines for computer-based learning can be applied to computer-based testing (CBT). Twenty-two students completed a CBT exam with half of the questions presented in a split-screen format that was analogous to the original paper-and-pencil version and half in an integrated format. Results show that students…
Directory interchange format manual, version 3.0
NASA Technical Reports Server (NTRS)
1990-01-01
The Directory Interchange Format (DIF) is a data structure used to exchange directory level information about data sets among information systems. The format consists of a number of fields that describe the attributes of a directory entry and text blocks that contain a descriptive summary of and references for the directory entry. All fields and the summary are preceded by labels identifying their contents. All values are ASCII character strings. The structure is intended to be flexible, allowing for future changes in the contents of directory entries.
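The label/value structure lends itself to very simple parsing. The sketch below reads a tiny DIF-like fragment in Python; the field labels shown are illustrative examples and should be checked against the manual's actual field list.

    # Toy parser for a DIF-like "Label: value" record with a free-text
    # continuation block; the field labels are illustrative examples only.
    def parse_dif(text):
        entry, label = {}, None
        for line in text.splitlines():
            if ":" in line and not line.startswith(" "):
                label, _, value = line.partition(":")
                entry[label.strip()] = value.strip()
            elif label:    # indented continuation of the previous field
                entry[label] += " " + line.strip()
        return entry

    record = "\n".join([
        "Entry_ID: EXAMPLE-0001",
        "Entry_Title: An example directory entry",
        "Summary: A short descriptive summary of the data set,",
        "  continued on a second line.",
    ])
    print(parse_dif(record))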
Steigerwald, Celia H.; Mutschler, Felix E.; Ludington, Steve
1983-01-01
This bibliography of 1117 citations brings together references on stockwork molybdenite deposits and related topics in a format that can be sorted by topic and(or) geographic area. Each reference is preceded by a key, or keys, which may be read and sorted visually or by computer. The bibliography is available in two formats: (1) paper or microfiche hardcopy, and (2) fixed-format, computer-readable magnetic tape. A FORTRAN program is provided for sorting the magnetic tape version.
New version: GRASP2K relativistic atomic structure package
NASA Astrophysics Data System (ADS)
Jönsson, P.; Gaigalas, G.; Bieroń, J.; Fischer, C. Froese; Grant, I. P.
2013-09-01
A revised version of GRASP2K [P. Jönsson, X. He, C. Froese Fischer, I.P. Grant, Comput. Phys. Commun. 177 (2007) 597] is presented. It supports earlier non-block and block versions of codes as well as a new block version in which the njgraf library module [A. Bar-Shalom, M. Klapisch, Comput. Phys. Commun. 50 (1988) 375] has been replaced by the librang angular package developed by Gaigalas based on the theory of [G. Gaigalas, Z.B. Rudzikas, C. Froese Fischer, J. Phys. B: At. Mol. Phys. 30 (1997) 3747, G. Gaigalas, S. Fritzsche, I.P. Grant, Comput. Phys. Commun. 139 (2001) 263]. Tests have shown that errors encountered by njgraf do not occur with the new angular package. The three versions are denoted v1, v2, and v3, respectively. In addition, in v3, the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Changes in v2 include minor improvements. For example, the new version of rci2 may be used to compute quantum electrodynamic (QED) corrections only from selected orbitals. In v3, a new program, jj2lsj, reports the percentage composition of the wave function in LSJ and the program rlevels has been modified to report the configuration state function (CSF) with the largest coefficient of an LSJ expansion. The bioscl2 and bioscl3 application programs have been modified to produce a file of transition data with one record for each transition in the same format as in ATSP2K [C. Froese Fischer, G. Tachiev, G. Gaigalas, M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. All versions of the codes have been adapted for 64-bit computer architecture.
Program Summary
Program title: GRASP2K, version 1_1
Catalogue identifier: ADZL_v1_1
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZL_v1_1.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 730252
No. of bytes in distributed program, including test data, etc.: 14808872
Distribution format: tar.gz
Programming language: Fortran
Computer: Intel Xeon, 2.66 GHz
Operating system: Suse, Ubuntu, and Debian Linux 64-bit
RAM: 500 MB or more
Classification: 2.1
Catalogue identifier of previous version: ADZL_v1_0
Journal reference of previous version: Comput. Phys. Comm. 177 (2007) 597
Does the new version supersede the previous version?: Yes
Nature of problem: Prediction of atomic properties (atomic energy levels, oscillator strengths, radiative decay rates, hyperfine structure parameters, Landé gJ-factors, and specific mass shift parameters) using a multiconfiguration Dirac-Hartree-Fock approach.
Solution method: The computational method is the same as in the previous GRASP2K [1] version except that for v3 codes the njgraf library module [2] for recoupling has been replaced by librang [3,4].
Reasons for new version: New angular libraries with improved performance are available. Also, methodology for transforming from jj- to LSJ-coupling has been developed.
Summary of revisions: New angular libraries where the coefficients of fractional parentage have been extended to j=9/2, making calculations feasible for the lanthanides and actinides. Inclusion of a new program jj2lsj, which reports the percentage composition of the wave function in LSJ. Transition programs have been modified to produce a file of transition data with one record for each transition in the same format as ATSP2K [C. Froese Fischer, G. Tachiev, G. Gaigalas and M.R. Godefroid, Comput. Phys. Commun. 176 (2007) 559], which identifies each atomic state by the total energy and a label for the CSF with the largest expansion coefficient in LSJ intermediate coupling. Updated to 64-bit architecture. A comprehensive user manual in pdf format for the program package has been added.
Restrictions: The packing algorithm restricts the maximum number of orbitals to ≤214. The tables of reduced coefficients of fractional parentage used in this version are limited to subshells with j≤9/2 [5]; occupied subshells with j>9/2 are, therefore, restricted to a maximum of two electrons. Some other parameters, such as the maximum number of subshells of a CSF outside a common set of closed shells, are determined by a parameter.def file that can be modified prior to compile time.
Unusual features: The bioscl3 program reports transition data in the same format as in ATSP2K [6], and the data processing program tables of the latter package can be used. The tables program takes a name.lsj file, usually a concatenated file of all the .lsj transition files for a given atom or ion, and finds the energy structure of the levels and the multiplet transition arrays. The tables posted at the website http://atoms.vuse.vanderbilt.edu are examples of tables produced by the tables program. With the extension of coefficients of fractional parentage to j=9/2, calculations for the lanthanides and actinides become possible.
Running time: CPU time required to execute test cases: 70.5 s.
The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival
NASA Astrophysics Data System (ADS)
O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.
2016-02-01
The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, SOOP and buoys. To the degree feasible SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, QC'ing and publishing V1.5 and V2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process that would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework which is based upon standards and conventions, yet at the same time allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control can be performed in parallel; 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system. We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.
HELAC-PHEGAS: A generator for all parton level processes
NASA Astrophysics Data System (ADS)
Cafarella, Alessandro; Papadopoulos, Costas G.; Worek, Malgorzata
2009-10-01
The updated version of the HELAC-PHEGAS event generator is presented. The matrix elements are calculated through Dyson-Schwinger recursive equations using color connection representation. Phase-space generation is based on a multichannel approach, including optimization. HELAC-PHEGAS generates parton level events with all necessary information, in the most recent Les Houches Accord format, for the study of any process within the Standard Model in hadron and lepton colliders.
New version program summary
Program title: HELAC-PHEGAS
Catalogue identifier: ADMS_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMS_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 35 986
No. of bytes in distributed program, including test data, etc.: 380 214
Distribution format: tar.gz
Programming language: Fortran
Computer: All
Operating system: Linux
Classification: 11.1, 11.2
External routines: Optionally Les Houches Accord (LHA) PDF Interface library (http://projects.hepforge.org/lhapdf/)
Catalogue identifier of previous version: ADMS_v1_0
Journal reference of previous version: Comput. Phys. Comm. 132 (2000) 306
Does the new version supersede the previous version?: Yes, partly
Nature of problem: One of the most striking features of final states in current and future colliders is the large number of events with several jets. Being able to predict their features is essential. To achieve this, the calculations need to describe as accurately as possible the full matrix elements for the underlying hard processes. Even at leading order, perturbation theory based on Feynman graphs runs into computational problems, since the number of graphs contributing to the amplitude grows as n!.
Solution method: Recursive algorithms based on Dyson-Schwinger equations have been developed recently in order to overcome the computational obstacles. The calculation of the amplitude, using Dyson-Schwinger recursive equations, results in a computational cost growing asymptotically as 3^n, where n is the number of particles involved in the process. Off-shell subamplitudes are introduced, for which a recursion relation has been obtained, allowing an n-particle amplitude to be expressed in terms of subamplitudes with 1, 2, … up to (n-1) particles. The color connection representation is used in order to treat amplitudes involving colored particles. In the present version HELAC-PHEGAS can be used to efficiently obtain helicity amplitudes, total cross sections, and parton-level event samples in LHA format, for arbitrary multiparticle processes in the Standard Model in leptonic, pp¯ and pp collisions.
Reasons for new version: Substantial improvements, major functionality upgrade.
Summary of revisions: Color connection representation, efficient integration over PDF via the PARNI algorithm, interface to LHAPDF, parton level events generated in the most recent LHA format, k reweighting for Parton Shower matching, numerical predictions for amplitudes for arbitrary processes for phase-space points provided by the user, new user interface, and the possibility to run over computer clusters.
Running time: Depending on the process studied. Usually from seconds to hours.
References: A. Kanaki, C.G. Papadopoulos, Comput. Phys. Comm. 132 (2000) 306. C.G. Papadopoulos, Comput. Phys. Comm. 137 (2001) 247. URL: http://www.cern.ch/helac-phegas.
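A quick numeric check makes the scaling argument in the summary concrete: the n! growth of Feynman-graph counting overtakes the 3^n cost of the Dyson-Schwinger recursion already at modest multiplicities.

    # Compare n! (number of Feynman graphs) with 3**n (asymptotic cost
    # of the Dyson-Schwinger recursion) for growing multiplicity n.
    from math import factorial

    for n in (4, 6, 8, 10, 12):
        print(f"n={n:2d}   n! = {factorial(n):>11,}   3^n = {3**n:>7,}")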
Terrain - Umbra Package v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oppel, Fred; Hart, Brian; Rigdon, James Brian
This library contains modules that read terrain files (e.g., OpenFlight, Open Scene Graph IVE, GeoTIFF image) and that read and manage ESRI terrain datasets. All data is stored and managed in Open Scene Graph (OSG). The terrain system accesses OSG and provides elevation data and access to metadata such as soil types, and enables linears, areals, and buildings to be placed in a terrain. These geometry objects include box, point, path, polygon (region), and sector modules. Utilities have been made available for clamping objects to the terrain and accessing LOS (line-of-sight) information. This assertion includes managed C++ wrapper code (TerrainWrapper) to enable C# applications, such as OpShed and UTU, to incorporate this library.
Natural-Color Image Mosaics of Afghanistan: Digital Databases and Maps
Davis, Philip A.; Hare, Trent M.
2007-01-01
Explanation: The 50 tiled images in this dataset are natural-color renditions of the calibrated six-band Landsat mosaics created from Landsat Enhanced Thematic Mapper Plus (ETM+) data. Natural-color images depict the surface as seen by the human eye. The calibrated Landsat ETM+ maps produced by Davis (2006) are relative reflectance and need to be grounded with ground-reflectance data, but the difficulties in performing fieldwork in Afghanistan precluded ground-reflectance surveys. For natural color calibration, which involves only the blue, green, and red color bands of Landsat, we could use ground photographs, Munsell color readings of ground surfaces, or another image base that accurately depicts the surface color. Each map quadrangle is 1° of latitude by ? of longitude. The numbers assigned to each map quadrangle refer to the latitude and longitude coordinates of the lower left corner of the quadrangle. For example, quadrangle Q2960 has its lower left corner at lat 29° N., long 60° E. Each quadrangle overlaps adjacent quadrangles by 100 pixels (2.85 km). Only the 14.25-m-spatial-resolution UTM and 28.5-m-spatial-resolution WGS84 geographic GeoTIFF datasets are available in this report, to decrease the amount of space needed. The images are three-band, eight-bit GeoTIFFs with embedded georeferencing. As such, most software will not require the associated world files. An index of all available images in geographic coordinates is displayed here: Index_Geo_DD.pdf. The country of Afghanistan spans three UTM zones (41-43). Maps are stored as GeoTIFFs in their respective UTM zone projection. Indexes of all available topographic map sheets in their respective UTM zone are displayed here: Index_UTM_Z41.pdf, Index_UTM_Z42.pdf, Index_UTM_Z43.pdf.
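The quadrangle numbering convention described above lends itself to a one-line decoder. The helper below is hypothetical and assumes the pattern of the Q2960 example: two digits of latitude followed by the longitude of the lower left corner.

    # Decode a quadrangle ID such as "Q2960" into its lower left corner.
    # Hypothetical helper: assumes two digits of latitude followed by
    # the longitude, matching the Q2960 -> (29 N, 60 E) example above.
    def quad_corner(quad_id):
        digits = quad_id.lstrip("Qq")
        return int(digits[:2]), int(digits[2:])    # (lat N, long E)

    print(quad_corner("Q2960"))    # (29, 60)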
Walthouwer, Michel Jean Louis; Oenema, Anke; Lechner, Lilian; de Vries, Hein
2015-10-19
Web-based computer-tailored interventions often suffer from small effect sizes and high drop-out rates, particularly among people with a low level of education. Using videos as a delivery format can possibly improve the effects and attractiveness of these interventions. The main aim of this study was to examine the effects of a video and text version of a Web-based computer-tailored obesity prevention intervention on dietary intake, physical activity, and body mass index (BMI) among Dutch adults. A second study aim was to examine differences in appreciation between the video and text version. The final study aim was to examine possible differences in intervention effects and appreciation per educational level. A three-armed randomized controlled trial was conducted with a baseline and 6-month follow-up measurement. The intervention consisted of six sessions, lasting about 15 minutes each. In the video version, the core tailored information was provided by means of videos. In the text version, the same tailored information was provided in text format. Outcome variables were self-reported and included BMI, physical activity, energy intake, and appreciation of the intervention. Multiple imputation was used to replace missing values. The effect analyses were carried out with multiple linear regression analyses and adjusted for confounders. The process evaluation data were analyzed with independent samples t tests. The baseline questionnaire was completed by 1419 participants and the 6-month follow-up measurement by 1015 participants (71.53%). No significant interaction effects of educational level were found on any of the outcome variables. Compared to the control condition, the video version resulted in lower BMI (B=-0.25, P=.049) and lower average daily energy intake from energy-dense food products (B=-175.58, P<.001), while the text version had an effect only on energy intake (B=-163.05, P=.001). No effects on physical activity were found. Moreover, the video version was rated significantly better than the text version on feelings of relatedness (P=.041), usefulness (P=.047), and grade given to the intervention (P=.018). The video version of the Web-based computer-tailored obesity prevention intervention was the most effective intervention and most appreciated. Future research needs to examine if the effects are maintained in the long term and how the intervention can be optimized. Netherlands Trial Register: NTR3501; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3501 (Archived by WebCite at http://www.webcitation.org/6cBKIMaW1).
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
The machine-readable version of the Astronomical Netherlands Satellite ultraviolet photometry catalog is described in detail, with a byte-by-byte format description and characteristics of the data file given. The catalog is a compilation of ultraviolet photometry in five bands, within the wavelength range 155 nm to 330 nm, for 3573 mostly stellar objects. Additional cross reference data (object identification, UBV photometry and MK spectral types) are included in the catalog.
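Catalogs documented byte-by-byte like this one are typically read by slicing fixed-width columns. The sketch below shows the pattern in Python with invented byte ranges; the real ranges must be taken from the catalog's format description.

    # Fixed-width parsing pattern for a byte-by-byte documented catalog.
    # The field names and column ranges below are invented placeholders;
    # substitute the ranges from the catalog's format description.
    layout = {                     # field: 0-based half-open slice
        "ident":   (0, 10),
        "mag_155": (10, 17),
        "mag_330": (17, 24),
        "sp_type": (24, 34),
    }

    def parse_record(line):
        return {name: line[a:b].strip() for name, (a, b) in layout.items()}

    print(parse_record("HD 12345    5.234  6.789 B9 V      "))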
PDF Version of EPA Communication Product Standards Stylebook
This stylebook provides style and format guidance for most media, including print documents, audiovisual, broadcast, presentation and exhibit work. Also find templates and samples, copyright requirements, publishing information, and logo use standards.
Immunization Schedules for Adults
... color [2 pages] PDF black & white [2 pages] Spanish Version (en español) Vacunas recomendadas para adultos según ... easy-to-read format in English and/or Spanish on your website. See examples of how the ...
Rare Earth Geochemistry of Rock Core form WY Reservoirs
Quillinan, Scott; Bagdonnas, Davin; McLaughlin, J. Fred; Nye, Charles
2016-10-01
These data include major, minor, trace, and rare earth element concentrations for geologic formations in Wyoming oil and gas fields. *Note - Link below contains updated version of spreadsheet (6/14/2017)
González-Beltrán, Alejandra; Neumann, Steffen; Maguire, Eamonn; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2014-01-01
The ISA-Tab format and software suite have been developed to break the silo effect induced by technology-specific formats for a variety of data types and to better support experimental metadata tracking. Experimentalists seldom use a single technique to monitor biological signals. Providing a multi-purpose, pragmatic and accessible format that abstracts away common constructs for describing Investigations, Studies and Assays, ISA is increasingly popular. To attract further interest towards the format and extend support to ensure reproducible research and reusable data, we present the Risa package, which delivers a central component to support the ISA format by enabling effortless integration with R, the popular, open source data crunching environment. The Risa package bridges the gap between the metadata collection and curation in an ISA-compliant way and the data analysis using the widely used statistical computing environment R. The package offers functionality for: i) parsing ISA-Tab datasets into R objects; ii) augmenting annotation with extra metadata not explicitly stated in the ISA syntax; iii) interfacing with domain-specific R packages; iv) suggesting potentially useful R packages available in Bioconductor for subsequent processing of the experimental data described in the ISA format; and finally v) saving back to ISA-Tab files augmented with analysis-specific metadata from R. We demonstrate these features by presenting use cases for mass spectrometry data and DNA microarray data. The Risa package is open source (with LGPL license) and freely available through Bioconductor. By making Risa available, we aim to facilitate the task of processing experimental data, encouraging a uniform representation of experimental information and results while delivering tools for ensuring traceability and provenance tracking. The Risa package is available since Bioconductor 2.11 (version 1.0.0) and version 1.2.1 appeared in Bioconductor 2.12, both along with documentation and examples. The latest version of the code is at the development branch in Bioconductor and can also be accessed from GitHub https://github.com/ISA-tools/Risa, where the issue tracker allows users to report bugs or feature requests.
The Risa R/Bioconductor package: integrative data analysis from experimental metadata and back again
2014-01-01
Background The ISA-Tab format and software suite have been developed to break the silo effect induced by technology-specific formats for a variety of data types and to better support experimental metadata tracking. Experimentalists seldom use a single technique to monitor biological signals. Providing a multi-purpose, pragmatic and accessible format that abstracts away common constructs for describing Investigations, Studies and Assays, ISA is increasingly popular. To attract further interest towards the format and extend support to ensure reproducible research and reusable data, we present the Risa package, which delivers a central component to support the ISA format by enabling effortless integration with R, the popular, open source data crunching environment. Results The Risa package bridges the gap between metadata collection and curation in an ISA-compliant way and data analysis using the widely used statistical computing environment R. The package offers functionality for: i) parsing ISA-Tab datasets into R objects; ii) augmenting annotation with extra metadata not explicitly stated in the ISA syntax; iii) interfacing with domain-specific R packages; iv) suggesting potentially useful R packages available in Bioconductor for subsequent processing of the experimental data described in the ISA format; and finally v) saving back to ISA-Tab files augmented with analysis-specific metadata from R. We demonstrate these features by presenting use cases for mass spectrometry data and DNA microarray data. Conclusions The Risa package is open source (with LGPL license) and freely available through Bioconductor. By making Risa available, we aim to facilitate the task of processing experimental data, encouraging a uniform representation of experimental information and results while delivering tools for ensuring traceability and provenance tracking. Software availability The Risa package has been available since Bioconductor 2.11 (version 1.0.0), and version 1.2.1 appeared in Bioconductor 2.12, both along with documentation and examples. The latest version of the code is in the development branch in Bioconductor and can also be accessed from GitHub https://github.com/ISA-tools/Risa, where the issue tracker allows users to report bugs or feature requests. PMID:24564732
GFVO: the Genomic Feature and Variation Ontology.
Baran, Joachim; Durgahee, Bibi Sehnaaz Begum; Eilbeck, Karen; Antezana, Erick; Hoehndorf, Robert; Dumontier, Michel
2015-01-01
Falling costs in genomic laboratory experiments have led to a steady increase of genomic feature and variation data. Multiple genomic data formats exist for sharing these data, and whilst they are similar, they address slightly different data viewpoints and are consequently not fully compatible with each other. The fragmentation of data format specifications makes it hard to integrate and interpret data for further analysis with information from multiple data providers. As a solution, a new ontology is presented here for annotating and representing genomic feature and variation dataset contents. The Genomic Feature and Variation Ontology (GFVO) specifically addresses genomic data as it is regularly shared using the GFF3 (incl. FASTA), GTF, GVF and VCF file formats. GFVO simplifies data integration and enables linking of genomic annotations across datasets through common semantics of genomic types and relations. Availability and implementation: the latest stable release of the ontology is available via its base URI; previous and development versions are available at the ontology's GitHub repository: https://github.com/BioInterchange/Ontologies; versions of the ontology are indexed through BioPortal (without external class-/property-equivalences due to BioPortal release 4.10 limitations); examples and reference documentation are provided on a separate web page: http://www.biointerchange.org/ontologies.html. GFVO version 1.0.2 is licensed under the CC0 1.0 Universal license (https://creativecommons.org/publicdomain/zero/1.0) and is therefore de facto within the public domain; the ontology can be appropriated without attribution for commercial and non-commercial use.
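As a sketch of how an ontology such as GFVO can be put to work programmatically, the following Python fragment loads an OWL file with rdflib and lists its classes; the local file name is a placeholder for a copy fetched from the base URI or the GitHub repository.

    # Sketch: enumerating the classes of an OWL ontology with rdflib.
    from rdflib import Graph
    from rdflib.namespace import OWL, RDF, RDFS

    g = Graph()
    g.parse("gfvo.xml", format="xml")   # hypothetical local copy of the ontology

    for cls in g.subjects(RDF.type, OWL.Class):
        print(cls, g.value(cls, RDFS.label))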
Modulation of habit formation by levodopa in Parkinson's disease.
Marzinzik, Frank; Wotka, Johann; Wahl, Michael; Krugel, Lea K; Kordsachia, Catarina; Klostermann, Fabian
2011-01-01
Dopamine promotes the execution of positively reinforced actions, but its role in the formation of behaviour when feedback is unavailable remains open. To study this issue, the performance of treated/untreated patients with Parkinson's disease and controls was analysed in an implicit learning task, hypothesising dopamine-dependent adherence to hidden task rules. Sixteen patients on/off levodopa and fourteen healthy subjects engaged in a Go/NoGo paradigm comprising four equiprobable stimuli. One of the stimuli was defined as the target, which was first consistently preceded by one of the three non-target stimuli (conditioning), whereas this coupling was dissolved thereafter (deconditioning). Two task versions were presented: in a 'Go version', only the target cue required the execution of a button press, whereas non-target stimuli were not instructive of a response; in a 'NoGo version', only the target cue demanded the inhibition of the button press which was demanded upon any non-target stimulus. Levodopa influenced in which task version errors grew from conditioning to deconditioning: in unmedicated patients, just as in controls, errors rose only in the NoGo version, with an increase of incorrect responses to target cues. Contrarily, in medicated patients errors went up only in the Go version, with an increase of response omissions to target cues. The error increases during deconditioning can be understood as a perpetuation of reaction tendencies acquired during conditioning. The levodopa-mediated modulation of this carry-over effect suggests that dopamine supports habit conditioning under the task demand of response execution, but dampens it when inhibition is required. However, other than in reinforcement learning, the supporting dopaminergic actions referred to the most frequent, i.e. non-target, behaviour. Since this behaviour is passive whenever selective actions are executed against an inactive background, dopaminergic treatment could in such scenarios contribute to passive behaviour in patients with Parkinson's disease.
NASA Astrophysics Data System (ADS)
Reisel, John R.; Jablonski, Marissa; Hosseini, Hossein; Munson, Ethan
2012-06-01
A summer bridge program for incoming engineering and computer science freshmen has been used at the University of Wisconsin-Milwaukee from 2007 to 2010. The primary purpose of this program has been to improve the mathematics course placement for incoming students who initially place into a course below Calculus I on the math placement examination. The students retake the university's math placement examination after completing the bridge program to determine if they then place into a higher-level mathematics course. If the students improve their math placement, the program is considered successful for that student. The math portion of the bridge program is designed around using the ALEKS software package for targeted, self-guided learning. In the 2007 and 2008 versions of the program, both an on-line version and an on-campus version with additional instruction were offered. In 2009 and 2010, the program was exclusively in an on-campus format, and also featured a required residential component and additional engineering activities for the students. From the results of these four programs, we are able to evaluate the success of the program in its different formats. In addition, data has been collected and analysed regarding the impact of other factors on the program's success. The factors include student preparation before the beginning of the program (as measured by math ACT scores) and the amount of time the student spent working on the material during the program. Better math preparation and the amount of time spent on the program were found to be good indicators of success. Furthermore, the on-campus version of the program is more effective than the on-line version.
Sala-Sastre, Nohemi; Herdman, Mike; Navarro, Lidia; de la Prada, Miriam; Pujol, Ramón M; Serra, Consol; Alonso, Jordi; Flyvholm, Mari-Ann; Giménez-Arnau, Ana M
2009-08-01
Occupational skin diseases are among the most frequent work-related diseases in industrialized countries. The Nordic Occupational Skin Questionnaire (NOSQ-2002), developed in English, is a useful tool for screening of occupational skin diseases. The aims were to culturally adapt the NOSQ-2002 to Spanish and Catalan and to assess the clarity, comprehension, cultural relevance and appropriateness of the translated versions. The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) principles of good practice for the translation and cultural adaptation of patient-reported outcomes were followed. After translation into the target language, a first consensus version of the questionnaire was evaluated in multiple cognitive debriefing interviews. The expert panel introduced modifications in 39 (68%) and 27 (47%) items in the Spanish and Catalan versions, respectively (e.g. addition of examples and definitions, reformulation of instructions and use of direct question format). This version was back translated and submitted to the original authors, who suggested a further seven and two modifications in the Spanish and Catalan versions, respectively. A second set of cognitive interviews was performed. A consensus version of both questionnaires was obtained after final modifications based on comments by the patients. The final versions of the Spanish and Catalan NOSQ-2002 questionnaires are now available at www.NRCWE.dk/NOSQ.
Carswell, C Melody; Lio, Cindy H; Grant, Russell; Klein, Martina I; Clarke, Duncan; Seales, W Brent; Strup, Stephen
2010-12-01
Subjective workload measures are usually administered in a visual-manual format, either electronically or by paper and pencil. However, vocal responses to spoken queries may sometimes be preferable, for example when experimental manipulations require continuous manual responding or when participants have certain sensory/motor impairments. In the present study, we evaluated the acceptability of the hands-free administration of two subjective workload questionnaires - the NASA Task Load Index (NASA-TLX) and the Multiple Resources Questionnaire (MRQ) - in a surgical training environment where manual responding is often constrained. Sixty-four undergraduates performed fifteen 90-s trials of laparoscopic training tasks (five replications of 3 tasks - cannulation, ring transfer, and rope manipulation). Half of the participants provided workload ratings using a traditional paper-and-pencil version of the NASA-TLX and MRQ; the remainder used a vocal (hands-free) version of the questionnaires. A follow-up experiment extended the evaluation of the hands-free version to actual medical students in a Minimally Invasive Surgery (MIS) training facility. The NASA-TLX was scored in 2 ways - (1) the traditional procedure using participant-specific weights to combine its 6 subscales, and (2) a simplified procedure - the NASA Raw Task Load Index (NASA-RTLX) - using the unweighted mean of the subscale scores. Comparison of the scores obtained from the hands-free and written administration conditions yielded coefficients of equivalence of r=0.85 (NASA-TLX) and r=0.81 (NASA-RTLX). Equivalence estimates for the individual subscales ranged from r=0.78 ("mental demand") to r=0.31 ("effort"). Both administration formats and scoring methods were equally sensitive to task and repetition effects. For the MRQ, the coefficient of equivalence for the hands-free and written versions was r=0.96 when tested on undergraduates. However, the sensitivity of the hands-free MRQ to task demands (partial η² = 0.138) was substantially less than that for the written version (partial η² = 0.252). This potential shortcoming of the hands-free MRQ did not seem to generalize to medical students, who showed robust task effects when using the hands-free MRQ (partial η² = 0.396). A detailed analysis of the MRQ subscales also revealed differences that may be attributable to a "spillover" effect in which participants' judgments about the demands of completing the questionnaires contaminated their judgments about the primary surgical training tasks. Vocal versions of the NASA-TLX are acceptable alternatives to standard written formats when researchers wish to obtain global workload estimates. However, care should be used when interpreting the individual subscales if the object is to make comparisons between studies or conditions that use different administration modalities. For the MRQ, the vocal version was less sensitive to experimental manipulations than its written counterpart; however, when medical students rather than undergraduates used the vocal version, the instrument's sensitivity increased well beyond that obtained with any other combination of administration modality and instrument in this study. Thus, the vocal version of the MRQ may be an acceptable workload assessment technique for selected populations, and it may even be a suitable substitute for the NASA-TLX. Copyright © 2010 Elsevier Ltd. All rights reserved.
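For readers unfamiliar with the two scoring procedures compared above, the following Python sketch contrasts them; the ratings and pairwise-comparison weights are made-up example values, not data from the study.

    # NASA-TLX weighted score: subscale ratings (0-100) combined with
    # participant-specific weights from 15 pairwise comparisons.
    # NASA-RTLX: the unweighted mean of the six subscale ratings.
    SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]
    ratings = {"mental": 70, "physical": 30, "temporal": 55,
               "performance": 40, "effort": 65, "frustration": 25}  # example ratings
    wins = {"mental": 5, "physical": 1, "temporal": 3,
            "performance": 2, "effort": 4, "frustration": 0}        # pairwise wins, sum to 15

    weighted_tlx = sum(ratings[s] * wins[s] for s in SUBSCALES) / 15.0
    raw_tlx = sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)
    print(f"TLX = {weighted_tlx:.1f}, RTLX = {raw_tlx:.1f}")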
Extraction of Suspended Sediments from Landsat Imagery in the Northern Gulf of Mexico
NASA Astrophysics Data System (ADS)
Hardin, D. M.; Drewry, M.; He, M. Y.; Ebersole, S.
2011-12-01
The Sediment Analysis Network for Decision Support (SANDS) project is utilizing enhancement methods to highlight suspended sediment in remotely sensed data and imagery of the Northern Gulf of Mexico. The analysis thus far has shown that areas of suspended sediments can be extracted from Landsat imagery. In addition, although not an original goal of SANDS, the analysis techniques have revealed oil floating on the water's surface. Detection of oil floating on the surface through remotely sensed imagery can be helpful in identifying and understanding the geographic distribution and movement of oil for environmental concerns. Data from Landsat and MODIS were obtained from NASA Earth Science Data Centers by the Information Technology and Systems Center at the University of Alabama in Huntsville and prepared for analysis by subsetting to the region of interest and converting from HDF-EOS format (in the case of MODIS) to GeoTIFF. Analysts at the Geological Survey of Alabama (GSA), working initially with Landsat data, employed enhancement methods, including false color composites, spectral ratios, and other spectral enhancements based on the mineral composition of sediments, to combinations of visible and infrared bands of data. Initial results of this approach revealed suspended sediments. The analysis technique also revealed areas of oil floating on the surface of the Gulf near Chandeleur Island immediately after Hurricane Katrina in 2005. A comparison of the original true-color Landsat scene with the same region after enhancement shows the areas of floating oil clearly; the oil had washed out from oil spills on land. This paper will present the intermediate results of the SANDS project thus far.
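A band-ratio enhancement of the kind described above can be sketched in a few lines of Python with rasterio; the band file names and the specific red/near-infrared ratio are illustrative assumptions, since SANDS selected band combinations from the mineral composition of the sediments.

    # Sketch: a simple spectral ratio of two Landsat bands read from GeoTIFFs.
    import numpy as np
    import rasterio

    with rasterio.open("landsat_band3.tif") as red, \
         rasterio.open("landsat_band4.tif") as nir:   # hypothetical file names
        r = red.read(1).astype("float32")
        n = nir.read(1).astype("float32")

    # A red/NIR ratio tends to highlight turbid, sediment-laden water.
    ratio = np.where(n > 0, r / n, 0.0)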
NASA Astrophysics Data System (ADS)
Schneider, Uwe; Strack, Ruediger
1992-04-01
apART reflects the structure of an open, distributed environment. In keeping with the general trend in the area of imaging, network-capable, general purpose workstations with capabilities for open-system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); and consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.
Training system for digital mammographic diagnoses of breast cancer
NASA Astrophysics Data System (ADS)
Thomaz, R. L.; Nirschl Crozara, M. G.; Patrocinio, A. C.
2013-03-01
As technology evolves, analog mammography systems are being replaced by digital systems. The digital system uses video monitors to display mammographic images instead of the screen-film and negatoscope previously used for analog images. This change in the way mammographic images are visualized may require a different approach to training health care professionals to diagnose breast cancer with digital mammography. Thus, this paper presents a computational approach for training health care professionals that provides a smooth transition between analog and digital technology, while also training them to use the advantages of digital image processing tools in diagnosing breast cancer. This computational approach consists of software in which it is possible to open, process and diagnose a full mammogram case from a database containing the digital images of each of the mammographic views. The software communicates with a gold-standard database of digital mammogram cases. This database contains the digital images in Tagged Image File Format (TIFF) and the respective diagnoses according to BI-RADS™; these files are read by the software and shown to the user as needed. There are also digital image processing tools that can be used to provide better visualization of each single image. The software was built on a minimalist, user-friendly interface concept that might help in the smooth transition. It also has an interface for inputting diagnoses from the professional being trained, providing result feedback. The system has already been completed, but has not yet been applied to any professional training.
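As a sketch of the kind of image processing aid such training software can offer, the Python fragment below applies a linear contrast stretch to a TIFF mammogram; the file name is hypothetical, and this is an illustration rather than the system's actual code.

    # Sketch: percentile-based contrast stretch of a TIFF image.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("case001.tif"), dtype=np.float64)  # hypothetical case file
    lo, hi = np.percentile(img, (2, 98))        # clip the darkest/brightest 2% of pixels
    stretched = np.clip((img - lo) / (hi - lo), 0.0, 1.0) * 255.0
    Image.fromarray(stretched.astype(np.uint8)).show()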
Modular reweighting software for statistical mechanical analysis of biased equilibrium data
NASA Astrophysics Data System (ADS)
Sindhikara, Daniel J.
2012-07-01
Here a simple, useful, modular approach and software suite designed for statistical reweighting and analysis of equilibrium ensembles is presented. Statistical reweighting is useful and sometimes necessary for analysis of equilibrium enhanced sampling methods, such as umbrella sampling or replica exchange, and also in experimental cases where biasing factors are explicitly known. Essentially, statistical reweighting allows extrapolation of data from one or more equilibrium ensembles to another. Here, the fundamental separable steps of statistical reweighting are broken up into modules - allowing for application to the general case and avoiding the black-box nature of some “all-inclusive” reweighting programs. Additionally, the included programs are, by design, written with few dependencies. The required compilers are either pre-installed on most systems or freely available for download with minimal trouble. Examples of the use of this suite applied to umbrella sampling and replica exchange molecular dynamics simulations will be shown, along with advice on how to apply it in the general case.
New version program summary
Program title: Modular reweighting version 2
Catalogue identifier: AEJH_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJH_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License, version 3
No. of lines in distributed program, including test data, etc.: 179 118
No. of bytes in distributed program, including test data, etc.: 8 518 178
Distribution format: tar.gz
Programming language: C++, Python 2.6+, Perl 5+
Computer: Any
Operating system: Any
RAM: 50-500 MB
Supplementary material: An updated version of the original manuscript (Comput. Phys. Commun. 182 (2011) 2227) is available
Classification: 4.13
Catalogue identifier of previous version: AEJH_v1_0
Journal reference of previous version: Comput. Phys. Commun. 182 (2011) 2227
Does the new version supersede the previous version?: Yes
Nature of problem: While equilibrium reweighting is ubiquitous, there are no public programs available to perform the reweighting in the general case. Further, specific programs often suffer from many library dependencies and numerical instability.
Solution method: This package is written in a modular format that allows for easy applicability of reweighting in the general case. Modules are small, numerically stable, and require minimal libraries.
Reasons for new version: Some minor bugs were fixed, some upgrades were needed, and error analysis was added. analyzeweight.py/analyzeweight.py2 has been replaced by “multihist.py”; this new program performs all the functions of its predecessor while being versatile enough to handle other types of histograms and probability analysis. “bootstrap.py” was added; this script performs basic bootstrap resampling, allowing for error analysis of data. “avg_dev_distribution.py” was added; this program computes the averages and standard deviations of multiple distributions, making error analysis (e.g. from bootstrap resampling) easier to visualize. WRE.cpp was slightly modified, purely for cosmetic reasons. The manual was updated for clarity and to reflect version updates. Examples were removed from the manual in favor of online tutorials (packaged examples remain). Examples were updated to reflect the new format. An additional example is included to demonstrate error analysis.
Running time: Preprocessing scripts 1-5 minutes, WHAM engine <1 minute, postprocessing script ∼1-5 minutes.
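The bootstrap error analysis added in this version (cf. bootstrap.py and avg_dev_distribution.py) can be illustrated with a generic Python sketch; this is not the package's own code, and the synthetic data stand in for a sampled observable.

    # Sketch: bootstrap resampling to estimate the uncertainty of a mean.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.0, scale=0.5, size=1000)   # synthetic observable samples

    means = [rng.choice(data, size=data.size, replace=True).mean()
             for _ in range(200)]                      # 200 bootstrap resamples
    print(f"mean = {np.mean(means):.3f} +/- {np.std(means):.3f}")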
48 CFR 3452.239-70 - Internet protocol version 6 (IPv6).
Code of Federal Regulations, 2011 CFR
2011-10-01
... utilizing system packets that are formatted in accordance with commercial standards of Internet protocol (IP... of IPv4 products. (b) Specifically, any new IP product or system developed, acquired, or produced...
Rare Earth Element Geochemistry for Produced Waters, WY
Quillinan, Scott; Nye, Charles; McLing, Travis; Neupane, Hari
2016-06-30
These data represent major, minor, and trace element, isotope, and rare earth element concentrations in geologic formations and in water associated with oil and gas production. *Note - Link below contains updated version of spreadsheet (6/14/2017)
NASA Astrophysics Data System (ADS)
Baldwin, R.; Ansari, S.; Reid, G.; Lott, N.; Del Greco, S.
2007-12-01
The main goal in developing and deploying Geographic Information System (GIS) services at NOAA's National Climatic Data Center (NCDC) is to provide users with simple access to data archives while integrating new and informative climate products. Several systems at NCDC provide a variety of climatic data in GIS formats and/or map viewers. The Online GIS Map Services provide users with data discovery options which flow into detailed product selection maps, which may be queried using standard "region finder" tools or gazetteer (geographical dictionary search) functions. Each tabbed selection offers steps to help users progress through the systems. A series of additional base map layers or data types have been added to provide companion information. New map services include: Severe Weather Data Inventory, Local Climatological Data, Divisional Data, Global Summary of the Day, and Normals/Extremes products. THREDDS Data Server technology is utilized to provide access to gridded multidimensional datasets such as Model, Satellite and Radar. This access allows users to download data as a gridded NetCDF file, which is readable by ArcGIS. In addition, users may subset the data for a specific geographic region, time period, height range or variable prior to download. The NCDC Weather Radar Toolkit (WRT) is a client tool which accesses Weather Surveillance Radar 1988 Doppler (WSR-88D) data locally or remotely from the NCDC archive, NOAA FTP server or any URL or THREDDS Data Server. The WRT Viewer provides tools for custom data overlays, Web Map Service backgrounds, animations and basic filtering. The export of images and movies is provided in multiple formats. The WRT Data Exporter allows for data export in both vector polygon (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, NetCDF, GrADS) formats. As more users become accustomed to GIS, questions of better, cheaper, faster access soon follow. Expanding use and availability can best be accomplished through standards which promote interoperability. Our GIS related products provide Open Geospatial Consortium (OGC) compliant Web Map Services (WMS), Web Feature Services (WFS), Web Coverage Services (WCS) and Federal Geographic Data Committee (FGDC) metadata as a complement to the map viewers. KML/KMZ data files (soon to be compliant OGC specifications) also provide access.
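The region/time subsetting workflow described above can be sketched with the netCDF4 Python library reading over OPeNDAP from a THREDDS server; the URL and variable names below are placeholders, not actual NCDC endpoints.

    # Sketch: subset a remote gridded dataset before downloading it.
    from netCDF4 import Dataset

    ds = Dataset("https://example.gov/thredds/dodsC/some_dataset")  # hypothetical URL
    lat = ds.variables["lat"][:]
    lon = ds.variables["lon"][:]

    # Select a latitude/longitude window; only this slice is transferred.
    i = (lat >= 30) & (lat <= 40)
    j = (lon >= -95) & (lon <= -80)
    subset = ds.variables["reflectivity"][0, i, j]  # variable name is illustrative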
XTCE: XML Telemetry and Command Exchange Tutorial, XTCE Version 1
NASA Technical Reports Server (NTRS)
Rice, Kevin; Kizzort, Brad
2008-01-01
These presentation slides are a tutorial on XML Telemetry and Command Exchange (XTCE). The goal of XTCE is to provide an industry-standard mechanism for describing telemetry and command streams (particularly from satellites). It will lower cost and increase validation over traditional formats, and support exchange of native formats. XTCE is designed to describe bit streams that are typical of telemetry and command in the historic space domain.
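To make the idea of an XML telemetry description concrete, the Python sketch below emits a small XTCE-flavoured document with ElementTree; the element and attribute names are simplified illustrations, not the normative XTCE 1.0 schema.

    # Sketch: building a minimal XTCE-style telemetry description.
    import xml.etree.ElementTree as ET

    space_system = ET.Element("SpaceSystem", name="DemoSat")       # hypothetical system
    tm = ET.SubElement(space_system, "TelemetryMetaData")
    params = ET.SubElement(tm, "ParameterSet")
    ET.SubElement(params, "Parameter", name="BatteryVoltage",
                  parameterTypeRef="float32")                      # one telemetry point

    ET.dump(space_system)   # print the serialized XML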
Enhancement of CLAIM (clinical accounting information) for a localized Chinese version.
Guo, Jinqiu; Takada, Akira; Niu, Tie; He, Miao; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Takahashi, Kiwamu; Daimon, Hiroyuki; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki
2005-10-01
CLinical Accounting InforMation (CLAIM) is a standard for the exchange of data between patient accounting systems and electronic medical record (EMR) systems. It uses eXtensible Markup Language (XML) as a meta-language and was developed in Japan. CLAIM is subordinate to the Medical Markup Language (MML) standard, which allows the exchange of medical data between different medical institutions. It has inherited the basic structure of MML 2.x, and the current version, version 2.1, contains two modules and nine data definition tables. In China, no data exchange standard yet exists that links EMR systems to accounting systems. Taking advantage of CLAIM's flexibility, we created a localized Chinese version based on CLAIM 2.1. Since Chinese receipt systems differ from those of Japan, some information, such as prescription formats, also differs from that used in Japan. Two CLAIM modules were re-engineered and six data definition tables were either added or redefined. The Chinese version of CLAIM takes local needs into account, and consequently it is now possible to transfer data between the patient accounting systems and EMR systems of Chinese medical institutions effectively.
SAMICS: Input data preparation. [Solar Array Manufacturing Industry Costing Standards]
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.; Aster, R. W.
1979-01-01
The Solar Array Manufacturing Industry Costing Standards (SAMICS) provide standard formats, data, assumptions, and procedures for estimating the price that a manufacturer would have to charge for the product of a specified manufacturing process sequence. A line-by-line explanation is given of those standard formats which describe the economically important characteristics of the manufacturing processes and the technological structure of the companies and the industry. This revision provides an updated presentation of Format A Process Description, consistent with the October 1978 version of that form. A checklist of items which should be entered on Format A as direct expenses is included.
Rare Earth Element Concentration of Wyoming Thermal Waters Update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quillinan, Scott; Nye, Charles; Neupane, Hari
Updated version of the data generated from a rare earth element investigation of produced waters. These data represent major, minor, and trace element, isotope, and rare earth element concentrations in geologic formations and in water associated with oil and gas production.
Means of storage and automated monitoring of versions of text technical documentation
NASA Astrophysics Data System (ADS)
Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.
2018-03-01
The paper considers automation of the preparation, storage and version monitoring of textual design and program documentation by means of specialized software. Automated preparation of documentation is based on processing the engineering data contained in the specifications and technical documentation. Data handling assumes the existence of strictly structured electronic documents, prepared in widespread formats from templates based on industry standards, from which the textual program or design document is generated by an automated method. The further life cycle of the document, and of the engineering data it contains, is then controlled, with archival data storage carried out at each stage of the life cycle. Performance studies of the use of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.
Task 28: Web Accessible APIs in the Cloud Trade Study
NASA Technical Reports Server (NTRS)
Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun
2017-01-01
This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Form Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being the Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.
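One access pattern such a trade study weighs, reading an HDF5 granule directly out of S3, can be sketched in Python by combining s3fs with h5py; the bucket, object key, and dataset path are hypothetical.

    # Sketch: partial read of an HDF5 file stored in S3.
    import h5py
    import s3fs

    fs = s3fs.S3FileSystem(anon=True)                  # assumes a public bucket
    with fs.open("s3://example-bucket/granule.h5", "rb") as f:
        with h5py.File(f, "r") as h5:
            data = h5["/science/temperature"][0:100]   # read only a slice, not the file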
MSFC crack growth analysis computer program, version 2 (users manual)
NASA Technical Reports Server (NTRS)
Creager, M.
1976-01-01
An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
The Lagrangian particle dispersion model FLEXPART version 10
NASA Astrophysics Data System (ADS)
Pisso, Ignacio; Sollum, Espen; Grythe, Henrik; Kristiansen, Nina; Cassiani, Massimo; Eckhardt, Sabine; Thompson, Rona; Groot Zwaaftnik, Christine; Evangeliou, Nikolaos; Hamburger, Thomas; Sodemann, Harald; Haimberger, Leopold; Henne, Stephan; Brunner, Dominik; Burkhart, John; Fouilloux, Anne; Fang, Xuekun; Phillip, Anne; Seibert, Petra; Stohl, Andreas
2017-04-01
The Lagrangian particle dispersion model FLEXPART was originally designed, in its first release in 1998, for calculating the long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. The model has since evolved into a comprehensive tool for atmospheric transport modelling and analysis. Its application fields have been extended to a range of atmospheric transport processes for both atmospheric gases and aerosols, e.g. greenhouse gases, short-lived climate forcers like black carbon, volcanic ash and gases, as well as studies of the water cycle. We present the newest release, FLEXPART version 10. Since the last publication fully describing FLEXPART (version 6.2), the model code has been parallelised to allow computation to be sped up. A new, more detailed gravitational settling parametrisation for aerosols was implemented, and the wet deposition scheme for aerosols has been heavily modified and updated to provide a more accurate representation of this physical process. In addition, an optional new turbulence scheme for the convective boundary layer is available that considers the skewness in the vertical velocity distribution. Temporal variation and temperature dependence of the OH reaction are also included. Finally, user input files have been updated to a more convenient and user-friendly namelist format, and the option to produce the output files in netCDF format instead of binary format has been implemented. We present these new developments and show recent model applications. Moreover, we also introduce some tools for the preparation of the meteorological input data, as well as for the processing of FLEXPART output data.
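Reading the new netCDF output can be sketched with the netCDF4 Python library; the file and variable names below are assumptions for illustration, and the FLEXPART 10 documentation should be consulted for the actual output layout.

    # Sketch: summing a concentration field from a FLEXPART netCDF output file.
    import numpy as np
    from netCDF4 import Dataset

    ds = Dataset("grid_conc_20170401000000.nc")   # hypothetical output file name
    conc = ds.variables["spec001_mr"][:]          # species-1 field (assumed variable name)
    print("total over all cells and time steps:", float(np.sum(conc)))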
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (DEC VAX ULTRIX VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE (Transportable Applications Environment) Plus is an integrated, portable environment for developing and running interactive window, text, and graphical object-based application systems. The program allows both programmers and non-programmers to easily construct their own custom application interface and to move that interface and application to different machine environments. TAE Plus makes both the application and the machine environment transparent, with noticeable improvements in the learning curve. The main components of TAE Plus are as follows: (1) the WorkBench, a What You See Is What You Get (WYSIWYG) tool for the design and layout of a user interface; (2) the Window Programming Tools Package (WPT), a set of callable subroutines that control an application's user interface; and (3) TAE Command Language (TCL), an easy-to-learn command language that provides an easy way to develop an executable application prototype with a run-time interpreted language. The WorkBench tool allows the application developer to interactively construct the layout of an application's display screen by manipulating a set of interaction objects including input items such as buttons, icons, and scrolling text lists. Data-driven graphical objects such as dials, thermometers, and strip charts are also included. TAE Plus updates the strip chart as the data values change. The WorkBench user specifies the windows and interaction objects that will make up the user interface, then specifies the sequence of the user interface dialogue. The description of the designed user interface is then saved into resource files. For those who desire to develop the designed user interface into an operational application, the WorkBench tool also generates source code (C, Ada, and TCL) which fully controls the application's user interface through function calls to the WPTs. The WPTs are the runtime services used by application programs to display and control the user interfaces. Since the WPTs access the workbench-generated resource files during each execution, details such as color, font, location, and object type remain independent from the application code, allowing changes to the user interface without recompiling and relinking. The Silicon Graphics version of TAE Plus now has a font caching scheme and a color caching scheme to make color allocation more efficient. In addition to WPTs, TAE Plus can control interaction of objects from the interpreted TAE Command Language. TCL provides an extremely powerful means for the more experienced developer to quickly prototype an application's use of TAE Plus interaction objects and add programming logic without the overhead of compiling or linking. TAE Plus requires MIT's X Window System, Version 11 Release 4, and the Open Software Foundation's Motif Toolkit 1.1 or 1.1.1. The Workbench and WPTs are written in C++ and the remaining code is written in C. TAE Plus is available by license for an unlimited time period. The licensed program product includes the TAE Plus source code and one set of supporting documentation. Additional documentation may be purchased separately at the price indicated below. The amount of disk space required to load the TAE Plus tar format tape is between 35Mb and 67Mb depending on the machine version. The recommended minimum memory is 12Mb. Each TAE Plus platform delivery tape includes pre-built libraries and executable binary code for that particular machine, as well as source code, so users do not have to do an installation. 
Users wishing to recompile the source will need both a C compiler and either GNU's C++ Version 1.39 or later, or a C++ compiler based on AT&T 2.0 cfront. TAE Plus comes with InterViews and idraw, two software packages developed by Stanford University and integrated in TAE Plus. TAE Plus was developed in 1989 and version 5.1 was released in 1991. TAE Plus is currently available on media suitable for eight different machine platforms: 1) DEC VAX computers running VMS 5.3 or higher (TK50 cartridge in VAX BACKUP format), 2) DEC VAXstations running ULTRIX 4.1 or later (TK50 cartridge in UNIX tar format), 3) DEC RISC workstations running ULTRIX 4.1 or later (TK50 cartridge in UNIX tar format), 4) HP9000 Series 300/400 computers running HP-UX 8.0 (.25 inch HP-preformatted tape cartridge in UNIX tar format), 5) HP9000 Series 700 computers running HP-UX 8.05 (HP 4mm DDS DAT tape cartridge in UNIX tar format), 6) Sun3 series computers running SunOS 4.1.1 (.25 inch tape cartridge in UNIX tar format), 7) Sun4 (SPARC) series computers running SunOS 4.1.1 (.25 inch tape cartridge in UNIX tar format), and 8) SGI Indigo computers running IRIX 4.0.1 and IRIX/Motif 1.0.1 (.25 inch IRIS tape cartridge in UNIX tar format). An optional Motif Object Code License is available for either Sun version. TAE is a trademark of the National Aeronautics and Space Administration. X Window System is a trademark of the Massachusetts Institute of Technology. Motif is a trademark of the Open Software Foundation. DEC, VAX, VMS, TK50 and ULTRIX are trademarks of Digital Equipment Corporation. HP9000 and HP-UX are trademarks of Hewlett-Packard Co. Sun3, Sun4, SunOS, and SPARC are trademarks of Sun Microsystems, Inc. SGI and IRIS are registered trademarks of Silicon Graphics, Inc.
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (SUN3 VERSION)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
TAE+ 5.1 - TRANSPORTABLE APPLICATIONS ENVIRONMENT PLUS, VERSION 5.1 (SUN3 VERSION WITH MOTIF)
NASA Technical Reports Server (NTRS)
TAE SUPPORT OFFICE
1994-01-01
Hubble peers inside a celestial geode
NASA Astrophysics Data System (ADS)
2004-08-01
Credits: ESA/NASA, Yaël Nazé (University of Liège, Belgium) and You-Hua Chu (University of Illinois, Urbana, USA). Acknowledgment: this image was created with the help of the ESA/ESO/NASA Photoshop FITS Liberator. In this unusual image, the NASA/ESA Hubble Space Telescope captures a rare view of the celestial equivalent of a geode - a gas cavity carved by the stellar wind and intense ultraviolet radiation from a young hot star. Real geodes are handball-sized, hollow rocks that start out as bubbles in volcanic or sedimentary rock. Only when these inconspicuous round rocks are split in half by a geologist do we get a chance to appreciate the inside of the rock cavity that is lined with crystals. In the case of Hubble's 35 light-year diameter ‘celestial geode’, the transparency of its bubble-like cavity of interstellar gas and dust reveals the treasures of its interior. The object, called N44F, is being inflated by a torrent of fast-moving particles (what astronomers call a 'stellar wind') from an exceptionally hot star (the bright star just below the centre of the bubble) once buried inside a cold dense cloud. Compared with our Sun (which is losing mass through the so-called 'solar wind'), the central star in N44F is ejecting more than 100 million times more mass per second, and the hurricane of particles moves much faster, at 7 million km per hour (as opposed to less than 1.5 million km per hour for our Sun). Because the bright central star does not exist in empty space but is surrounded by an envelope of gas, the stellar wind collides with this gas, pushing it out like a snow plough. This forms a bubble, whose striking structure is clearly visible in the crisp Hubble image. The nebula N44F is one of a handful of known interstellar bubbles. Bubbles like these have been seen around evolved massive stars (called 'Wolf-Rayet stars'), and also around clusters of stars (where they are called 'super-bubbles'). But they have rarely been viewed around isolated stars, as is the case here. On closer inspection N44F harbours additional surprises. The interior wall of its gaseous cavity is lined with several four to eight light-year high finger-like columns of cool dust and gas. (The structure of these 'columns' is similar to the Eagle Nebula's iconic 'Pillars of Creation' photographed by Hubble a decade ago, and is seen in a few other nebulae as well.) The fingers are created by the blistering ultraviolet radiation from the central star. Like wind socks caught in a gale, they point in the direction of the energy flow. These pillars look small in this image only because they are much farther away from us than the Eagle Nebula's pillars. N44F is located about 160 000 light-years away in the neighbouring dwarf galaxy the Large Magellanic Cloud, in the direction of the southern constellation Dorado. 
N44F is part of the larger N44 complex, which contains a large super-bubble, blown out by the combined action of stellar winds and multiple supernova explosions. N44 itself is roughly 1000 light-years across. Several compact star-forming regions, including N44F, are found along the rim of the central super-bubble. This image was taken with Hubble's Wide Field Planetary Camera 2, using filters that isolate light emitted by sulphur (shown in blue, a 1200-second exposure) and hydrogen gas (shown in red, a 1000-second exposure).
NASA Astrophysics Data System (ADS)
Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.
2012-12-01
Expedited processing of imagery from NASA satellites for near-real time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format, which would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data is geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts it into image tiles, which are stored in a highly-optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made with the non-standard, but widely known, TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization/browse clients for a diverse variety of specific audiences. Currently, EOSDIS is providing an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other clients can also be developed using such tools as Google Earth, the Google Earth browser plugin, Esri's Adobe Flash/Flex client library, NASA World Wind, the Perceptive Pixel client, Esri's iOS client library, and OpenLayers for Mobile. The imagery browse capabilities from GIBS can be combined with other EOSDIS services (i.e. ECHO OpenSearch) via a client that ties them both together to provide an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science-quality data from the entire data record of these EOS instruments.
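Fetching a single tile over the WMTS REST interface can be sketched in a few lines of Python; the URL follows GIBS's published template, but the layer, date, tile matrix set, and tile indices are example values.

    # Sketch: retrieve one GIBS tile via the WMTS REST pattern
    # {layer}/default/{time}/{tile matrix set}/{zoom}/{row}/{col}.
    import urllib.request

    url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
           "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
           "2012-07-09/250m/2/1/1.jpg")
    urllib.request.urlretrieve(url, "tile.jpg")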
77 FR 49788 - National Committee on Foreign Medical Education and Accreditation
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
... official version of this document is the document published in the Federal Register. Free Internet access... Document Format (PDF). To use PDF, you must have Adobe Acrobat Reader, which is available free at the site...
MISR Data Product Specifications
Atmospheric Science Data Center
2016-11-25
... and usage of metadata. Improvements to MISR algorithmic software occasionally result in changes to file formats. While these changes ... (DPS). DPS Revision: Rev. S Software Version: 5.0.9 Date: September 20, 2010, updated April ...
Biological Environmental Sampling Technologies Assessment
2015-12-01
U.S. Army Edgewood Chemical Biological Center, Research and Technology Directorate, BioSensors ... format (pdf) electronic version of this report: ECBC R&T Directorate, Biosciences Division, BioSensors Branch RDCB-DRB-S ATTN: Gostomski, J
A22316 Gametophyte and sporophyte (version 2.0)
USDA-ARS's Scientific Manuscript database
Gametogenesis is the process of gamete formation, which includes micro- and megagametogenesis. Gametogenesis initiates after specialized cells in the sporophyte undergo meiosis, and subsequent mitotic divisions yield the gametophytic phase of the plant life cycle. In higher plants, microgametogenesi...
SARAH 3.2: Dirac gauginos, UFO output, and more
NASA Astrophysics Data System (ADS)
Staub, Florian
2013-07-01
SARAH is a Mathematica package optimized for the fast, efficient and precise study of supersymmetric models beyond the MSSM: a new model can be defined in a short form and all vertices are derived. This allows SARAH to create model files for FeynArts/FormCalc, CalcHep/CompHep and WHIZARD/O'Mega. The newest version of SARAH now provides the possibility to create model files in the UFO format, which is supported by MadGraph 5, MadAnalysis 5, GoSam, and soon by Herwig++. Furthermore, SARAH also calculates the mass matrices, RGEs and 1-loop corrections to the mass spectrum. This information is used to write source code for SPheno in order to create a precision spectrum generator for the given model. This spectrum-generator-generator functionality as well as the output of WHIZARD and CalcHep model files has seen further improvement in this version. Also models including Dirac gauginos are supported with the new version of SARAH, and additional checks for the consistency of the implementation of new models have been created. Program summary: Program title: SARAH Catalogue identifier: AEIB_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIB_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 322 411 No. of bytes in distributed program, including test data, etc.: 3 629 206 Distribution format: tar.gz Programming language: Mathematica. Computer: All for which Mathematica is available. Operating system: All for which Mathematica is available. Classification: 11.1, 11.6. Catalogue identifier of previous version: AEIB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 808 Does the new version supersede the previous version?: Yes, the new version includes all known features of the previous version but also provides the new features mentioned below. Nature of problem: To use MadGraph for new models it is necessary to provide the corresponding model files, which include all information about the interactions of the model. However, deriving the vertices for a given model and putting them into model files which can be used with MadGraph is usually very time consuming. Dirac gauginos are not present in the minimal supersymmetric standard model (MSSM) or many extensions of it. Dirac mass terms for vector superfields lead to new structures in the supersymmetric (SUSY) Lagrangian (a bilinear mass term between gaugino and matter fermion as well as new D-terms) and also modify the SUSY renormalization group equations (RGEs). The Dirac character of gauginos can change the collider phenomenology. In addition, they come with an extended Higgs sector for which a precise calculation of the 1-loop masses had not been performed so far. Solution method: SARAH calculates the complete Lagrangian for a given model whose gauge sector can be any direct product of SU(N) gauge groups. The chiral superfields can transform as any irreducible representation with respect to these gauge groups and it is possible to handle an arbitrary number of symmetry breakings or particle rotations. The gauge fixing is also added automatically. Using this information, SARAH derives all vertices for a model. These vertices can be exported to model files in the UFO format, which is supported by MadGraph and other codes like GoSam, MadAnalysis or ALOHA. The user can also study models with Dirac gauginos.
In that case SARAH includes all possible terms in the Lagrangian stemming from the new structures and can also calculate the RGEs. The entire impact of these terms is then taken into account in the output of SARAH to UFO, CalcHep, WHIZARD, FeynArts and SPheno. Reasons for new version: With this version, SARAH provides the possibility of creating model files in the UFO format. The UFO format is intended to become a standard format for model files which should be supported by many different tools in the future. Also, models with Dirac gauginos were not supported in earlier versions. Summary of revisions: Support of models with Dirac gauginos. Output of model files in the UFO format, speed improvement in the output of WHIZARD model files, CalcHep output supports the internal diagonalization of mass matrices, output of control files for the LHPC spectrum plotter, support of the generalized PDG numbering scheme PDG.IX, improvement of the calculation of the decay widths and branching ratios with SPheno, calculation of new low-energy observables added to the SPheno output, and significantly simplified handling of gauge-fixing terms. Restrictions: SARAH can only derive the Lagrangian in an automatized way for N=1 SUSY models, but not for those with more SUSY generators. Furthermore, SARAH supports only renormalizable operators in the output of model files in the UFO format and also for CalcHep, FeynArts and WHIZARD. Color sextets are not yet included in the model files for Monte Carlo tools. Dimension-5 operators are only supported in the calculation of the RGEs and mass matrices. Unusual features: SARAH does not need the Lagrangian of a model as input to calculate the vertices. The gauge structure, particle content and superpotential, as well as rotations stemming from gauge symmetry breaking, are sufficient. All further information is derived by SARAH on its own. Therefore, the model files are very short and the implementation of new models is fast and easy. In addition, the implementation of a model can be checked for physical and formal consistency, and SARAH can generate Fortran code for a full 1-loop analysis of the mass spectrum in the context of Dirac gauginos. Running time: Measured CPU time for the evaluation of the MSSM using a Lenovo Thinkpad X220 with i7 processor (2.53 GHz). Calculating the complete Lagrangian: 9 s. Calculating all vertices: 51 s. Output of the UFO model files: 49 s.
NASA Astrophysics Data System (ADS)
Bytev, Vladimir V.; Kniehl, Bernd A.
2016-09-01
We present a further extension of the HYPERDIRE project, which is devoted to the creation of a set of Mathematica-based program packages for manipulations with Horn-type hypergeometric functions on the basis of differential equations. Specifically, we present the implementation of the differential reduction for the Lauricella function FC of three variables. Catalogue identifier: AEPP_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPP_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 243461 No. of bytes in distributed program, including test data, etc.: 61610782 Distribution format: tar.gz Programming language: Mathematica. Computer: All computers running Mathematica. Operating system: Operating systems running Mathematica. Classification: 4.4. Does the new version supersede the previous version?: No, it significantly extends the previous version. Nature of problem: Reduction of hypergeometric function FC of three variables to a set of basis functions. Solution method: Differential reduction. Reasons for new version: The extension package allows the user to handle the Lauricella function FC of three variables. Summary of revisions: The previous version goes unchanged. Running time: Depends on the complexity of the problem.
SARAH 4: A tool for (not only SUSY) model builders
NASA Astrophysics Data System (ADS)
Staub, Florian
2014-06-01
We present the new version of the Mathematica package SARAH which provides the same features for a non-supersymmetric model as previous versions for supersymmetric models. This includes an easy and straightforward definition of the model, the calculation of all vertices, mass matrices, tadpole equations, and self-energies. Also the two-loop renormalization group equations for a general gauge theory are now included and have been validated with the independent Python code PyR@TE. Model files for FeynArts, CalcHep/CompHep, WHIZARD and in the UFO format can be written, and source code for SPheno for the calculation of the mass spectrum, a set of precision observables, and the decay widths and branching ratios of all states can be generated. Furthermore, the new version includes routines to output model files for Vevacious for both supersymmetric and non-supersymmetric models. Global symmetries are also supported with this version, and by linking Susyno the handling of Lie groups has been improved and extended.
Ferchiou, A; Todorov, L; Lajnef, M; Baudin, G; Pignon, B; Richard, J-R; Leboyer, M; Szöke, A; Schürhoff, F
2017-12-01
The main objective of the study was to explore the factorial structure of the French version of the Schizotypal Personality Questionnaire-Brief (SPQ-B) in a Likert format, in a representative sample of the general population. In addition, differences in the dimensional scores of schizotypy according to gender and age were analyzed. As the study of schizotypal traits and their determinants in the general population has recently been proposed as a way toward understanding the aetiology and pathophysiology of schizophrenia, consistent self-report tools are crucial to measure psychometric schizotypy. A shorter version of the widely used Schizotypal Personality Questionnaire (SPQ-Brief) has been extensively investigated in different countries, particularly in samples of students or clinical adolescents, and more recently, a few studies used a Likert-type scale format which allows partial endorsement of items and reduces the risk of defensive answers. A sample of 233 subjects representative of the adult population from an urban area near Paris (Créteil) was recruited using the "itinerary method". They completed the French version of the SPQ-B with a 5-point Likert-type response format (1=completely disagree; 5=completely agree). We examined the dimensional structure of the French version of the SPQ-B with a Principal Components Analysis (PCA) followed by a promax rotation. Factor selection was based on eigenvalues over 1.0 (Kaiser's criterion), Cattell's scree-plot test, and the interpretability of the factors. Items with loadings greater than 0.4 were retained for each dimension. The internal consistency of the dimensions was estimated with Cronbach's α. In order to study the influence of age and gender, we carried out a simple linear regression with the subscales as dependent variables. Our sample was composed of 131 women (mean age=52.5±18.2 years) and 102 men (mean age=53±18.1 years). SPQ-B Likert total scores ranged from 22 to 84 points (mean=43.6±13). Factor analysis resulted in a 3-factor solution that explained 47.7% of the variance. Factor 1 (disorganized; 10 items) included items related to "odd behavior", "odd speech", as well as "social anxiety", one item of "constricted affect" and one item of "ideas of reference". Factor 2 (interpersonal; 7 items) included items related to "no close friends", "constricted affect", and three of the items of "suspiciousness". Factor 3 (cognitive-perceptual; 5 items) included items related to "ideas of reference", "magical thinking", "unusual perceptual experiences" and one item of "suspiciousness". Coefficient α for the three subscales and the total scale were respectively 0.81, 0.81, 0.77 and 0.88. We found no differences in total schizotypy or the three dimension scores according to age and sex. Factor analysis of the French version of the SPQ-B in a Likert format confirmed the three-factor structure of schizotypy. We found a pure cognitive-perceptual dimension including the most representative positive features. As expected, the "suspiciousness" subscale is included in both the positive and negative dimensions, but mainly in the negative dimension. Surprisingly, the "social anxiety" subscale is included in the disorganized dimension in our analysis. The SPQ-B in a Likert format demonstrated good internal reliability for both total and subscale scores. Unlike previously published results, we did not find any influence of age or gender on the schizotypal dimensions. Copyright © 2016 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
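The extraction-and-reliability pipeline reported above can be illustrated with a short script. This is a hedged sketch only: the responses array is placeholder data, the factor_analyzer package is an assumed dependency, and the study's own analysis was not performed with this code.

# Illustrative sketch of the reported pipeline: three-factor extraction with
# promax rotation, items retained at loadings > 0.4, Cronbach's alpha per
# subscale. Placeholder data; not the authors' actual analysis.
import numpy as np
from factor_analyzer import FactorAnalyzer  # assumed dependency

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) array."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(233, 22)).astype(float)  # placeholder

fa = FactorAnalyzer(n_factors=3, rotation="promax")
fa.fit(responses)
for f in range(3):
    kept = np.where(np.abs(fa.loadings_[:, f]) > 0.4)[0]
    if len(kept) > 1:
        alpha = cronbach_alpha(responses[:, kept])
        print(f"Factor {f + 1}: items {kept.tolist()}, alpha = {alpha:.2f}")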
Williams, Bryan; Cockcroft, John R; Kario, Kazuomi; Zappe, Dion H; Cardenas, Pamela; Hester, Allen; Brunel, Patrick; Zhang, Jack
2014-02-04
Hypertension in elderly people is characterised by elevated systolic blood pressure (SBP) and increased pulse pressure (PP), which indicate large artery ageing and stiffness. LCZ696, a first-in-class angiotensin receptor neprilysin inhibitor (ARNI), is being developed to treat hypertension and heart failure. The Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study will assess the efficacy of LCZ696 versus olmesartan on aortic stiffness and central aortic haemodynamics. In this 52-week multicentre study, patients with hypertension aged ≥60 years with a mean sitting (ms) SBP ≥150 to <180 mm Hg and a PP >60 mm Hg will be randomised to once-daily LCZ696 200 mg or olmesartan 20 mg for 4 weeks, followed by a forced titration to double the initial doses for the next 8 weeks. At 12-24 weeks, if the BP target has not been attained (msSBP <140 and ms diastolic BP <90 mm Hg), amlodipine (2.5-5 mg) and subsequently hydrochlorothiazide (6.25-25 mg) can be added. The primary and secondary endpoints are changes from baseline in central aortic systolic pressure (CASP) and central aortic PP (CAPP) at week 12, respectively. Other secondary endpoints are the changes in CASP and CAPP at week 52. A sample size of 432 randomised patients is estimated to ensure a power of 90% to assess the superiority of LCZ696 over olmesartan at week 12 in the change from baseline of mean CASP, assuming an SD of 19 mm Hg, a difference of 6.5 mm Hg and a 15% dropout rate. The primary variable will be analysed using a two-way analysis of covariance. The study was initiated in December 2012 and final results are expected in 2015. The results of this study will impact the design of future phase III studies assessing cardiovascular protection. EUDract number 2012-002899-14 and ClinicalTrials.gov NCT01692301.
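The stated sample size can be approximately reproduced from the quoted assumptions with a standard two-group normal approximation. The check below is a back-of-the-envelope sketch, not the protocol's actual calculation, which may have used different software, rounding, or allocation constraints.

# Rough reproduction of the PARAMETER sample-size arithmetic:
# two-sided alpha = 0.05, power = 90%, SD = 19 mm Hg,
# difference = 6.5 mm Hg, 15% dropout. Textbook approximation only;
# scipy is an assumed dependency.
import math
from scipy.stats import norm

alpha, power = 0.05, 0.90
sd, delta, dropout = 19.0, 6.5, 0.15

z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
n_per_group = 2 * (z * sd / delta) ** 2               # about 180 per group
n_total = math.ceil(2 * n_per_group / (1 - dropout))  # about 424 before rounding
print(n_per_group, n_total)  # close to the 432 randomised patients quoted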
NHTSA data reference guide version 4. Volume 1, vehicle tests
DOT National Transportation Integrated Search
1997-04-01
This guide documents the format of magnetic media (3.5 inch high density diskettes) to be submitted to the National Highway Traffic Safety Administration (NHTSA) for vehicle crash tests. This guide is designated Volume I. NHTSA Data Reference Gui...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-01
... must be submitted electronically in machine-readable format. PDF images created by scanning a paper document may not be submitted, except in cases in which a word-processing version of the document is not...
39 CFR 3001.10 - Form and number of copies of documents.
Code of Federal Regulations, 2010 CFR
2010-07-01
... service must be printed from a text-based pdf version of the document, where possible. Otherwise, they may... generated in either Acrobat (pdf), Word, or WordPerfect, or Rich Text Format (rtf). [67 FR 67559, Nov. 6...
Atmospheric Science Data Center
2018-04-12
SSE Global Data Text files of monthly averaged data for the entire ... Version: V6 Location: Global Spatial Coverage: (90N, 90S)(180W,180E) ... File Format: ASCII Order Data: SSE Global Data: Order Data SCAR-B Block: ...
NHTSA data reference guide version 4. Volume 3, component tests
DOT National Transportation Integrated Search
1997-04-01
This guide documents the format of magnetic media (3.5 inch high density diskettes) to be submitted to the National Highway Traffic Safety Administration (NHTSA) for component tests. This guide is designated Volume III. NHTSA Data Reference Guide...
Mallinckrodt, Brent; Tekie, Yacob T
2016-11-01
The Working Alliance Inventory (WAI) has made great contributions to psychotherapy research. However, studies suggest the 7-point response format and 3-factor structure of the client version may have psychometric problems. This study used Rasch item response theory (IRT) to (a) improve WAI response format, (b) compare two brief 12-item versions (WAI-sr; WAI-s), and (c) develop a new 16-item Brief Alliance Inventory (BAI). Archival data from 1786 counseling center and community clients were analyzed. IRT findings suggested problems with crossed category thresholds. A rescoring scheme that combines neighboring responses to create 5- and 4-point scales sharply reduced these problems. Although subscale variance was reduced by 11-26%, rescoring yielded improved reliability and generally higher correlations with therapy process (session depth and smoothness) and outcome measures (residual gain symptom improvement). The 16-item BAI was designed to maximize "bandwidth" of item difficulty and preserve a broader range of WAI sensitivity than WAI-s or WAI-sr. Comparisons suggest the BAI performed better in several respects than the WAI-s or WAI-sr and equivalent to the full WAI on several performance indicators.
Sass, Rachelle; Frick, Susanne; Reips, Ulf-Dietrich; Wetzel, Eunike
2018-03-01
The multidimensional forced-choice (MFC) format has been proposed as an alternative to the rating scale (RS) response format. However, it is unclear how changing the response format may affect the response process and test motivation of participants. In Study 1, we investigated the MFC response process using the think-aloud technique. In Study 2, we compared test motivation between the RS format and different versions of the MFC format (presenting 2, 3, 4, and 5 items simultaneously). The response process to MFC item blocks was similar to the RS response process but involved an additional step of weighing the items within a block against each other. The RS and MFC response format groups did not differ in their test motivation. Thus, from the test taker's perspective, the MFC format is somewhat more demanding to respond to, but this does not appear to decrease test motivation.
A Pyramid Scheme for Constructing Geologic Maps on Geobrowsers
NASA Astrophysics Data System (ADS)
Whitmeyer, S. J.; de Paor, D. G.; Daniels, J.; Jeremy, N.; Michael, R.; Santangelo, B.
2008-12-01
Hundreds of geologic maps have been draped onto Google Earth (GE) using the ground overlay tag of Keyhole Markup Language (KML), and dozens have been published on academic and survey web pages as downloadable KML or KMZ (zipped KML) files. The vast majority of these are small KML docs that link to single, large - often very large - image files (JPEGs, TIFFs, etc.). Files that exceed 50 MB in size defeat the purpose of GE as an interactive, responsive, and therefore fast, virtual terrain medium. KML supports super-overlays (a.k.a. image pyramids), which break large graphic files into manageable tiles that load only when they are in the visible region at a sufficient level of detail (LOD), and several automatic tile-generating applications have been written. The process of exporting map data from applications such as ArcGIS® to KML format is becoming more manageable but still poses challenges. Complications arise, for example, because of differences between grid-north at a point on a map and true north at the equivalent location on the virtual globe. In our recent field season, we devised ways of overcoming many of these obstacles in order to generate responsive, panable, zoomable geologic maps in which data is layered in a pyramid structure similar to the image pyramid used for the default GE terrain. The structure of our KML code for each level of the pyramid is self-similar: (i) check whether the current tile is in the visible region, (ii) if so, render the current overlay, (iii) add the current data level, and (iv) using four network links, check the visibility and LOD of four nested tiles. By using this pyramid structure we provide the user with access to geologic and map data at multiple levels of observation. For example, when the viewpoint is distant, regional structures and stratigraphy (e.g. lithological groups and terrane boundaries) are visible. As the user zooms to lower elevations, formations and ultimately individual outcrops come into focus. The pyramid structure is ideally suited to geologic data, which tend to be unevenly exposed across the earth's surface.
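The self-similar node structure described above can be made concrete with a small generator. The sketch below is a hedged illustration rather than the authors' actual code: it emits one pyramid level as KML, with a Region carrying the LOD visibility test, a GroundOverlay for the current data level, and four NetworkLinks to the nested child tiles. The 128-pixel threshold and the tile naming scheme are assumptions.

# Hedged sketch: emit one self-similar super-overlay node as KML.
# LOD threshold and file naming are illustrative assumptions.
def tile_kml(name, n, s, e, w, depth, max_depth):
    children = ""
    if depth < max_depth:
        mid_lat, mid_lon = (n + s) / 2, (e + w) / 2
        quads = [(n, mid_lat, e, mid_lon), (n, mid_lat, mid_lon, w),
                 (mid_lat, s, e, mid_lon), (mid_lat, s, mid_lon, w)]
        for i, (qn, qs, qe, qw) in enumerate(quads):
            children += f"""
  <NetworkLink>
    <Region>
      <LatLonAltBox><north>{qn}</north><south>{qs}</south>
        <east>{qe}</east><west>{qw}</west></LatLonAltBox>
      <Lod><minLodPixels>128</minLodPixels></Lod>
    </Region>
    <Link><href>{name}_{i}.kml</href>
      <viewRefreshMode>onRegion</viewRefreshMode></Link>
  </NetworkLink>"""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Region>
    <LatLonAltBox><north>{n}</north><south>{s}</south>
      <east>{e}</east><west>{w}</west></LatLonAltBox>
    <Lod><minLodPixels>128</minLodPixels></Lod>
  </Region>
  <GroundOverlay>
    <Icon><href>{name}.png</href></Icon>
    <LatLonBox><north>{n}</north><south>{s}</south>
      <east>{e}</east><west>{w}</west></LatLonBox>
  </GroundOverlay>{children}
</Document></kml>"""

Because each child file is produced by the same function, every level of the pyramid repeats the same check-render-link pattern described in the abstract.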
NASA Astrophysics Data System (ADS)
Eberle, J.; Gerlach, R.; Hese, S.; Schmullius, C.
2012-04-01
To provide earth observation products for the Siberian region, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. This spatial data infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organisation for Standardization (ISO) for data discovery, data access, data processing and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. The region for this project covers the entire Asian part of the Russian Federation, approximately between 58°E - 170°W and 48°N - 80°N. To provide discovery, access and analysis services, a webportal was published for searching and visualising the available data. This webportal is based on current web technologies such as AJAX, the Drupal Content Management System as backend software, and a user-friendly interface with drag-and-drop and other mouse interactions. To offer a range of regularly updated earth observation products, selected products from the MODIS sensor aboard the Aqua and Terra satellites are processed. A direct connection to NASA archive servers makes it possible to download MODIS Level 3 and 4 products and integrate them into the SIB-ESS-C infrastructure. These data are delivered in the Hierarchical Data Format (HDF). For visualisation and further analysis, the data are reprojected, converted to GeoTIFF, and the global products clipped to the project area. All these steps are implemented as an automatic process chain that is executed whenever new MODIS data become available within the infrastructure; through the link to a MODIS catalogue system, the system receives new data daily. With the implemented analysis processes, time-series data can be analysed, for example to plot a trend or to compare different time series against one another. Scientists working in this area with MODIS data can use this service through the webportal instead of manually searching the NASA archive, processing the data, and downloading it for further work; the regularly updated products are ready to use.
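A single link of such a process chain can be sketched in a few lines. The example below, a hedged illustration only, uses GDAL's Python bindings to reproject one MODIS HDF subdataset to GeoTIFF; the granule name, subdataset index, and target CRS are placeholders rather than the SIB-ESS-C production settings.

# Hedged sketch: reproject one MODIS HDF subdataset to GeoTIFF with GDAL.
# File name, subdataset index, and target CRS are illustrative placeholders.
from osgeo import gdal

src = gdal.Open("MOD13A2.A2012001.h21v03.hdf")   # hypothetical granule
subdatasets = src.GetSubDatasets()               # [(name, description), ...]
first_name = subdatasets[0][0]                   # e.g. the NDVI band

gdal.Warp(
    "output.tif",
    gdal.Open(first_name),
    dstSRS="EPSG:4326",   # assumed target projection
    format="GTiff",
)
# Clipping the global product to the project area would follow as a
# further gdal.Warp/gdal.Translate step in the chain.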
Mercurio, Meagan D; Dambergs, Robert G; Herderich, Markus J; Smith, Paul A
2007-06-13
The methyl cellulose precipitable (MCP) tannin assay and a modified version of the Somers and Evans color assay were adapted to high-throughput (HTP) analysis. To improve efficiency of the MCP tannin assay, a miniaturized 1 mL format and a HTP format using 96 well plates were developed. The Somers color assay was modified to allow the standardization of pH and ethanol concentrations of wine samples in a simple one-step dilution with a buffer solution, thus removing inconsistencies between wine matrices prior to analysis and allowing for its adaptation to a HTP format. Validation studies showed that all new formats were efficient, and results were reproducible and analogous to the original formats.
Social Structure Simulation and Inference Using Artificial Intelligence Techniques
2005-06-15
Pajek [Batagelj and Mrvar, 2003] comes closest to defining a universal interchange format for social network data. The Pajek .net format is defined using a ... [ObjectStyle, 2005] and in future versions of Pajek [Batagelj and Mrvar, 2003]. GXL [Holt, Winter, and Schürr, 2000] [Taentzer, 2001] [Winter, 2001] was ... Barabási and R. Albert. Emergence of scaling in random networks. Science, 286(5439):509–512, Oct 1999. V. Batagelj and A. Mrvar. Pajek - analysis and ...
Rajmil, Luis; Robles, Noemí; Rodriguez-Arjona, Dolors; Azuara, Marta; Codina, Francisco; Raat, Hein; Ravens-Sieberer, Ulrike
2014-01-01
Background The objectives of the study were to develop web-based Spanish and Catalan versions of the KIDSCREEN, and to compare scores and psychometric properties with the paper version. Methods Internet and paper Spanish and Catalan versions of the KIDSCREEN-52 were included in a cross-sectional study in school-age children. Web-based and paper Spanish or Catalan versions of the KIDSCREEN-52 were administered to students aged 8 to 18 years from primary and secondary schools in Palafolls (Barcelona, Spain, n = 923). All students completed both web-based and paper versions during school time with an interval of at least 2 hours between administrations. The order of administration was randomized. The KIDSCREEN-52, the Strengths and Difficulties Questionnaire (SDQ), and sociodemographic variables were collected. Missing values, floor and ceiling effects, and internal consistency were compared between both versions, as well as mean score differences, level of agreement, and known-groups and construct validity. Results The participation rate was 77% (n = 715). Web-based and paper versions showed a low percentage of missing values and similarly high ceiling effects (range 0 to 44%). Mean score differences showed an effect size (ES) lower than 0.2 in all dimensions. Internal consistency ranged from 0.7 to 0.88, and the degree of agreement was excellent (intraclass correlation coefficient [ICC] range 0.75 to 0.87). Expected differences were seen by sex, age, socioeconomic status and mental health status. Conclusions The web-based KIDSCREEN-52 showed scale scores, reliability, and validity similar to the paper version. It will incorporate the child population in the assessment of quality of life, providing a more attractive format. PMID:25479465
NHTSA data reference guide version 4.b. Volume 2, biomechanical tests
DOT National Transportation Integrated Search
1999-05-01
This guide documents the format of media (3.5 inch high density diskettes or CD-ROMs) to be submitted to the National Highway Traffic Safety Administration (NHTSA) for biomechanical tests. This guide is designated Volume II. NHTSA Data Reference ...
MOVES2014 for Experienced Users, September 2014 Webinar Slides
This webinar assumes a basic knowledge of past versions of the MOtor Vehicle Emission Simulator (MOVES) and includes a demonstration of the conversion of MOVES2010b input files to MOVES2014 format, changes to the MOVES GUI, and new input options.
NHTSA data reference guide version 4. Volume 4, signal waveform generator tests
DOT National Transportation Integrated Search
1997-09-01
This guide documents the format of magnetic media (3.5 inch high density diskettes) to be submitted to the National Highway Traffic Safety Administration (NHTSA) for SWG tests. This guide is designated Volume IV. NHTSA Data Reference Guide (Signa...
48 CFR 2152.215-70 - Contractor records retention.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Contractor chooses to maintain paper documents in electronic format, the electronic version must be an exact replica of the paper document. (End of clause) [70 FR 41155, July 18, 2005] ... MANAGEMENT, FEDERAL EMPLOYEES GROUP LIFE INSURANCE FEDERAL ACQUISITION REGULATION CLAUSES AND FORMS...
48 CFR 2152.215-70 - Contractor records retention.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Contractor chooses to maintain paper documents in electronic format, the electronic version must be an exact replica of the paper document. (End of clause) [70 FR 41155, July 18, 2005] ... MANAGEMENT, FEDERAL EMPLOYEES GROUP LIFE INSURANCE FEDERAL ACQUISITION REGULATION CLAUSES AND FORMS...
48 CFR 2152.215-70 - Contractor records retention.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Contractor chooses to maintain paper documents in electronic format, the electronic version must be an exact replica of the paper document. (End of clause) [70 FR 41155, July 18, 2005] ... MANAGEMENT, FEDERAL EMPLOYEES GROUP LIFE INSURANCE FEDERAL ACQUISITION REGULATION CLAUSES AND FORMS...
48 CFR 2152.215-70 - Contractor records retention.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Contractor chooses to maintain paper documents in electronic format, the electronic version must be an exact replica of the paper document. (End of clause) [70 FR 41155, July 18, 2005] ... MANAGEMENT, FEDERAL EMPLOYEES GROUP LIFE INSURANCE FEDERAL ACQUISITION REGULATION CLAUSES AND FORMS...
MISR Level 3 Radiance Versioning
Atmospheric Science Data Center
2016-11-04
... ESDT Product File Name Prefix Current Quality Designations MIL3DRD, MIL3MRD, MIL3QRD, and MIL3YRD ... Data Product Specification Rev K (PDF). Update to work with new format of the input PGE 1 files. F02_0007 ...
OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale
NASA Astrophysics Data System (ADS)
Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason
2015-03-01
The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based tool for reading and converting proprietary file formats, and OMERO, an enterprise data management platform, both under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.
NASA AVOSS Fast-Time Models for Aircraft Wake Prediction: User's Guide (APA3.8 and TDP2.1)
NASA Technical Reports Server (NTRS)
Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew J.; Limon Duparcmeur, Fanny M.
2016-01-01
NASA's current distribution of fast-time wake vortex decay and transport models includes APA (Version 3.8) and TDP (Version 2.1). This User's Guide provides detailed information on the model inputs, file formats, and model outputs. A brief description of the Memphis 1995, Dallas/Fort Worth 1997, and the Denver 2003 wake vortex datasets is given along with the evaluation of models. A detailed bibliography is provided which includes publications on model development, wake field experiment descriptions, and applications of the fast-time wake vortex models.
SMP: A solid modeling program version 2.0
NASA Technical Reports Server (NTRS)
Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.
1986-01-01
The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1981-01-01
The machine-readable version of the N30 catalog available on magnetic tape from the Astronomical Data Center is described. Numerical representations of some data fields in the original catalog were changed to conform more closely to formats being used for star-catalog data, and all records having asterisks indicating footnotes in the published catalog now have corresponding remarks entries in a second tape file; that is, the footnotes in the published catalog were computerized and are contained in a second file of the tape.
Software For Calibration Of Polarimetric SAR Data
NASA Technical Reports Server (NTRS)
Van Zyl, Jakob; Zebker, Howard; Freeman, Anthony; Holt, John; Dubois, Pascale; Chapman, Bruce
1994-01-01
POLCAL (Polarimetric Radar Calibration) software tool intended to assist in calibration of synthetic-aperture radar (SAR) systems. In particular, calibrates Stokes-matrix-format data produced as standard product by NASA/Jet Propulsion Laboratory (JPL) airborne imaging synthetic aperture radar (AIRSAR). Version 4.0 of POLCAL is upgrade of version 2.0. New options include automatic absolute calibration of 89/90 data, distributed-target analysis, calibration of nearby scenes with corner reflectors, altitude or roll-angle corrections, and calibration of errors introduced by known topography. Reduces crosstalk and corrects phase calibration without use of ground calibration equipment. Written in FORTRAN 77.
USAID Expands eMODIS Coverage for Famine Early Warning
NASA Astrophysics Data System (ADS)
Jenkerson, C.; Meyer, D. J.; Evenson, K.; Merritt, M.
2011-12-01
Food security in countries at risk is monitored by the U.S. Agency for International Development (USAID) through its Famine Early Warning Systems Network (FEWS NET) using many methods, including Moderate Resolution Imaging Spectroradiometer (MODIS) data processed by the U.S. Geological Survey (USGS) into eMODIS Normalized Difference Vegetation Index (NDVI) products. Near-real time production is used comparatively with trends derived from the eMODIS archive to operationally monitor vegetation anomalies indicating threatened cropland and rangeland conditions. eMODIS production over Central America and the Caribbean (CAMCAR) began in 2009, and processes 10-day NDVI composites every 5 days from surface reflectance inputs produced using predicted spacecraft and climatology information at the Land and Atmosphere Near real time Capability for Earth Observing Systems (EOS) (LANCE). These expedited eMODIS composites are backed by a parallel archive of precision-based NDVI calculated from surface reflectance data ordered through the Level 1 and Atmosphere Archive and Distribution System (LAADS). Success in the CAMCAR region led to the recent expansion of eMODIS production to include Africa in 2010 and Central Asia in 2011. Near-real time 250-meter products are available for each region on the last day of an acquisition interval (generally before midnight) from an anonymous file transfer protocol (FTP) distribution site (ftp://emodisftp.cr.usgs.gov/eMODIS). The FTP site concurrently hosts the regional historical collections (2000 to present), which are also searchable using the USGS Earth Explorer (http://edcsns17.cr.usgs.gov/NewEarthExplorer). As eMODIS coverage continues to grow, these geographically gridded, georeferenced tagged image file format (GeoTIFF) NDVI composites increase their utility as effective tools for operational monitoring of near-real time vegetation data against historical trends.
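For readers unfamiliar with the product, the NDVI and the compositing step can be sketched generically. The snippet below illustrates the standard NDVI formula and a 10-day maximum-value composite, the usual technique for suppressing cloud contamination; the actual eMODIS algorithm and its quality-assurance handling are more involved than this illustration.

# Generic sketch of NDVI and 10-day maximum-value compositing; not the
# eMODIS production code, whose QA screening is more sophisticated.
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / np.clip(nir + red, 1e-9, None)

# ten daily scenes -> one composite keeping each pixel's maximum NDVI,
# which tends to select the least cloud-contaminated observation
daily = np.stack([ndvi(np.random.rand(4, 4), np.random.rand(4, 4))
                  for _ in range(10)])
composite = daily.max(axis=0)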
cp-R, an interface to the R programming language for clinical laboratory method comparisons.
Holmes, Daniel T
2015-02-01
Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4 with R scripts performing the regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of the JPG, PNG, TIFF, or BMP formats at any desired resolution, or in the PS and PDF vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of the regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word-processing applications. We present a simple and intuitive open source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
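For readers unfamiliar with errors-in-both-variables regression, simple (unweighted) Deming regression can be written compactly. The sketch below is a generic illustration of the method, not cp-R's implementation; delta is the ratio of the y- to x-measurement error variances, with 1.0 giving the orthogonal fit.

# Compact sketch of simple Deming regression (errors in both variables).
# Generic illustration only, not cp-R's implementation.
import numpy as np

def deming(x, y, delta=1.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = ((x - mx) ** 2).mean()
    syy = ((y - my) ** 2).mean()
    sxy = ((x - mx) * (y - my)).mean()  # assumed nonzero for correlated methods
    slope = ((syy - delta * sxx) +
             np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx       # slope, intercept

x = np.array([1.0, 2.1, 2.9, 4.2, 5.1])
y = np.array([1.1, 1.9, 3.2, 3.9, 5.3])
print(deming(x, y))  # approximately (1.0, 0.0) for near-identical methods

Bootstrapped confidence bands, as cp-R produces, amount to repeating this estimate over resampled (x, y) pairs and taking percentile envelopes of the fitted lines.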
1982-07-15
[OCR-garbled tabular output from the original report; the recoverable values are a maximum mechanical efficiency of 91.2238, a minimum of 90.1712, and an average of 90.6795.]
NASA Technical Reports Server (NTRS)
1999-01-01
"TerrAvoid" and "Position Integrity" combine Global Positioning Satellite (GPS) data with high-resolution maps of the Earth's topography. Dubbs & Severino, Inc., based in Irvine, California, has developed software that allows the system to be run on a battery-powered laptop in the cockpit. The packages, designed primarily for military sponsors and now positioned to hit the consumer market in coming months, came about as the result of the Jet Propulsion Laboratory's Technology Affiliates Program. Intended to give American industry assistance from NASA experts and to facilitate business use of intellectual property developed for the space program, the Technology Affiliates Program introduced the start-up company of Dubbs & Severino to JPL's Dr. Nevin Bryant four years ago. GeoTIFF is now in the public domain, and its use for commercial product development has evolved into an industry standard over the last year.
The Hazards Data Distribution System update
Jones, Brenda K.; Lamb, Rynn M.
2010-01-01
After a major disaster, a satellite image or a collection of aerial photographs of the event is frequently the fastest, most effective way to determine its scope and severity. The U.S. Geological Survey (USGS) Emergency Operations Portal provides emergency first responders and support personnel with easy access to imagery and geospatial data, geospatial Web services, and a digital library focused on emergency operations. Imagery and geospatial data are accessed through the Hazards Data Distribution System (HDDS). HDDS historically provided data access and delivery services through nongraphical interfaces that allow emergency response personnel to select and obtain pre-event baseline data and (or) event/disaster response data. First responders are able to access full-resolution GeoTIFF images or JPEG images at medium- and low-quality compressions through ftp downloads. USGS HDDS home page: http://hdds.usgs.gov/hdds2/
Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
The formation of planetary systems during the evolution of close binary stars
NASA Astrophysics Data System (ADS)
Tutukov, A. V.
1991-08-01
Modern scenarios of the formation of planetary systems around single stars and products of merging close binaries are described. The frequencies of the realization of different scenarios in the Galaxy are estimated. It is concluded that the modern theory of the early stages of the evolution of single stars and the theory of the evolution of close binaries offer several possible versions for the origin of planetary systems, while the scenario dating back to Kant and Laplace remains the likeliest.
AIMSsim Version 2.3.4 - User Manual
2008-01-01
will be able to use the system effectively with minimal training, a prototype human-machine interface (HMI) was developed ... to use the sensor suite effectively with minimal training, a prototype human-machine interface (HMI) was developed to ... the AIMSsim research ... offer the experimenter a level of simulation detailed enough to conduct human performance analyses, which provide ...
Guidance for Product Category Rule Development, Version 1.0
Environmental claims based on life cycle assessment (LCA) can provide quantitative, full life cycle information on products in a format that can permit comparisons and thereby inform purchasing decisions. In recent years, a number of standards and guides have emerged for making b...
EPA Releases Update to Popular School Integrated Pest Management Publication
An updated version reflects recent innovations in school IPM, provides links to new information, and has been redesigned into an easily printable format. It provides an overview of IPM and details the steps a school can follow to establish an IPM program.
ERIC Educational Resources Information Center
Tenopir, Carol; Barry, Jeff
1997-01-01
Profiles 25 database distribution and production companies, all of which responded to a 1997 survey with information on 54 separate online, Web-based, or CD-ROM systems. Highlights increased competition, distribution formats, Web versions versus local area networks, full-text delivery, and pricing policies. Tables present a sampling of customers…
VOTable Format Definition Version 1.3
NASA Astrophysics Data System (ADS)
Ochsenbein, Francois; Taylor, Mark; Williams, Roy; Davenhall, Clive; Demleitner, Markus; Durand, Daniel; Fernique, Pierre; Giaretta, David; Hanisch, Robert; McGlynn, Tom; Szalay, Alex; Wicenec, Andreas; Ochsenbein, Francois; Taylor, Mark
2013-09-01
This document describes the structures making up the VOTable standard. The main part of this document describes the adopted part of the VOTable standard; it is followed by appendices presenting extensions which have been proposed and/or discussed, but which are not part of the standard.
Terrestrial Investigation Model, TIM, has several appendices to its user guide. This is the appendix that includes an example input file in its preserved format. Both parameters and comments defining them are included.
The GNAT: A new tool for processing NMR data.
Castañar, Laura; Poggetto, Guilherme Dal; Colbourne, Adam A; Morris, Gareth A; Nilsson, Mathias
2018-06-01
The GNAT (General NMR Analysis Toolbox) is a free and open-source software package for processing, visualising, and analysing NMR data. It supersedes the popular DOSY Toolbox, which has a narrower focus on diffusion NMR. Data import of most common formats from the major NMR platforms is supported, as well as a GNAT generic format. Key basic processing of NMR data (e.g., Fourier transformation, baseline correction, and phasing) is catered for within the program, as well as more advanced techniques (e.g., reference deconvolution and pure shift FID reconstruction). Analysis tools include DOSY and SCORE for diffusion data, ROSY T1/T2 estimation for relaxation data, and PARAFAC for multilinear analysis. The GNAT is written for the MATLAB® language and comes with a user-friendly graphical user interface. The standard version is intended to run with a MATLAB installation, but completely free-standing compiled versions for Windows, Mac, and Linux are also freely available. © 2018 The Authors Magnetic Resonance in Chemistry Published by John Wiley & Sons Ltd.
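The basic processing steps named above can be illustrated generically. The following snippet, a conceptual sketch unrelated to GNAT's MATLAB code, applies exponential apodization, zero-filling, Fourier transformation, and a zero-order phase correction to a synthetic one-line FID.

# Generic NMR processing sketch on a synthetic FID; conceptual only,
# not GNAT's implementation.
import numpy as np

n, sw = 4096, 5000.0                     # points, spectral width (Hz)
t = np.arange(n) / sw
fid = np.exp(2j * np.pi * 400.0 * t) * np.exp(-t / 0.5)  # one line at 400 Hz

lb = 1.0                                  # exponential line broadening (Hz)
fid = fid * np.exp(-np.pi * lb * t)       # apodization
spec = np.fft.fftshift(np.fft.fft(fid, 2 * n))  # zero-fill + FT
ph0 = 0.0                                 # zero-order phase (degrees), tunable
spec = spec * np.exp(-1j * np.deg2rad(ph0))
freq = np.fft.fftshift(np.fft.fftfreq(2 * n, d=1 / sw))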
VOTable JAVA Streaming Writer and Applications.
NASA Astrophysics Data System (ADS)
Kulkarni, P.; Kembhavi, A.; Kale, S.
2004-07-01
Virtual Observatory related tools use a new standard for data transfer called the VOTable format. This is a variant of the XML format that enables easy transfer of data over the web. We describe a streaming interface that can bridge the VOTable format, through a user-friendly graphical interface, with the FITS and ASCII formats, which are commonly used by astronomers. A streaming interface is important for efficient use of memory because of the large size of catalogues. The tools are developed in JAVA to provide a platform-independent interface. We have also developed a stand-alone version that can be used to convert data stored in ASCII or FITS format on a local machine. The streaming writer is successfully being used in VOPlot (see Kale et al 2004 for a description of VOPlot). We present the test results of converting huge FITS and ASCII data into the VOTable format on machines that have only limited memory.
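The same FITS/ASCII-to-VOTable conversion can be sketched with today's astropy, an assumed and unrelated toolchain (the tools described above are Java):

# Whole-table FITS-to-VOTable conversion with astropy (assumed dependency).
from astropy.table import Table

table = Table.read("catalogue.fits")          # or an ASCII file
table.write("catalogue.vot", format="votable")
# For catalogues too large for memory, a streaming/chunked writer such as
# the one described above is needed instead of this whole-table approach.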
Alteration in cellular acetylcholine influences dauer formation in Caenorhabditis elegans.
Lee, Jeeyong; Kim, Kwang-Youl; Paik, Young-Ki
2014-02-01
Altered acetylcholine (Ach) homeostasis is associated with loss of viability in flies, developmental defects in mice, and cognitive deficits in humans. Here, we assessed the importance of Ach in Caenorhabditis elegans development, focusing on the role of Ach during dauer formation. We found that dauer formation was disturbed in choline acetyltransferase (cha-1) and acetylcholinesterase (ace) mutants defective in Ach biosynthesis and degradation, respectively. When we examined the potential role of G-proteins in dauer formation, goa-1 and egl-30 mutant worms, which express mutated versions of the mammalian G(o) and G(q) homologs, respectively, showed some abnormalities in dauer formation. Using quantitative mass spectrometry, we also found that dauer larvae had lower Ach content than did reproductively grown larvae. In addition, a proteomic analysis of acetylcholinesterase mutant worms, which have excessive levels of Ach, showed differential expression of metabolic genes. Collectively, these results indicate that alterations in Ach release may influence dauer formation in C. elegans.
Method and apparatus for production of subsea hydrocarbon formations
Blandford, Joseph W.
1995-01-01
A system for controlling, separating, processing and exporting well fluids produced from subsea hydrocarbon formations is disclosed. The subsea well tender system includes a surface buoy supporting one or more decks above the water surface for accommodating equipment to process oil, gas and water recovered from the subsea hydrocarbon formation. The surface buoy includes a surface-piercing central flotation column connected to one or more external flotation tanks located below the water surface. The surface buoy is secured to the seabed by one or more tendons which are anchored to a foundation with piles embedded in the seabed. The system accommodates multiple versions of the surface buoy configuration.
SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Wang, L.
1994-01-01
SPLICER is a genetic algorithm tool which can be used to solve search and optimization problems. Genetic algorithms are adaptive search procedures (i.e. problem solving methods) based loosely on the processes of natural selection and Darwinian "survival of the fittest." SPLICER provides the underlying framework and structure for building a genetic algorithm application. These algorithms apply genetically-inspired operators to populations of potential solutions in an iterative fashion, creating new populations while searching for an optimal or near-optimal solution to the problem at hand. SPLICER 1.0 was created using a modular architecture that includes a Genetic Algorithm Kernel, interchangeable Representation Libraries, Fitness Modules and User Interface Libraries, and well-defined interfaces between these components. The architecture supports portability, flexibility, and extensibility. SPLICER comes with all source code and several examples. For instance, a "traveling salesperson" example searches for the minimum distance through a number of cities visiting each city only once. Stand-alone SPLICER applications can be used without any programming knowledge. However, to fully utilize SPLICER within new problem domains, familiarity with C language programming is essential. SPLICER's genetic algorithm (GA) kernel was developed independent of representation (i.e. problem encoding), fitness function or user interface type. The GA kernel comprises all functions necessary for the manipulation of populations. These functions include the creation of populations and population members, the iterative population model, fitness scaling, parent selection and sampling, and the generation of population statistics. In addition, miscellaneous functions are included in the kernel (e.g., random number generators). Different problem-encoding schemes and functions are defined and stored in interchangeable representation libraries. This allows the GA kernel to be used with any representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. 
The standard distribution media for the Macintosh version is a set of three 3.5 inch Macintosh format diskettes. The standard distribution medium for the UNIX version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
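The kernel concepts described above (population creation, fitness-proportionate parent selection, crossover, mutation) fit in a few lines. The sketch below is a conceptual Python illustration with a toy bit-matching fitness function, not SPLICER's C interface.

# Minimal genetic-algorithm loop: population, fitness-proportionate
# selection, one-point crossover, mutation. Conceptual only, not SPLICER.
import random

TARGET = 0b101010101010   # 12-bit pattern the GA should rediscover
BITS, POP, GENS = 12, 50, 100

def fitness(genome):      # number of bits matching the target
    return BITS - bin(genome ^ TARGET).count("1")

population = [random.getrandbits(BITS) for _ in range(POP)]
for _ in range(GENS):
    weights = [fitness(g) + 1 for g in population]
    parents = random.choices(population, weights=weights, k=POP)
    nxt = []
    for a, b in zip(parents[::2], parents[1::2]):
        cut = random.randrange(1, BITS)        # one-point crossover
        mask = (1 << cut) - 1
        for child in ((a & mask) | (b & ~mask), (b & mask) | (a & ~mask)):
            if random.random() < 0.05:         # mutation: flip one random bit
                child ^= 1 << random.randrange(BITS)
            nxt.append(child)
    population = nxt

best = max(population, key=fitness)
print(bin(best), fitness(best))   # typically converges to the target pattern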
Parkhurst, David L.; Appelo, C.A.J.
1999-01-01
PHREEQC version 2 is a computer program written in the C programming language that is designed to perform a wide variety of low-temperature aqueous geochemical calculations. PHREEQC is based on an ion-association aqueous model and has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations involving reversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and irreversible reactions, which include specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters, within specified compositional uncertainty limits.New features in PHREEQC version 2 relative to version 1 include capabilities to simulate dispersion (or diffusion) and stagnant zones in 1D-transport calculations, to model kinetic reactions with user-defined rate expressions, to model the formation or dissolution of ideal, multicomponent or nonideal, binary solid solutions, to model fixed-volume gas phases in addition to fixed-pressure gas phases, to allow the number of surface or exchange sites to vary with the dissolution or precipitation of minerals or kinetic reactants, to include isotope mole balances in inverse modeling calculations, to automatically use multiple sets of convergence parameters, to print user-defined quantities to the primary output file and (or) to a file suitable for importation into a spreadsheet, and to define solution compositions in a format more compatible with spreadsheet programs. This report presents the equations that are the basis for chemical equilibrium, kinetic, transport, and inverse-modeling calculations in PHREEQC; describes the input for the program; and presents examples that demonstrate most of the program's capabilities.
HIGH PRESSURE COAL COMBUSTION KINETICS PROJECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chris Guenther; Bill Rogers
2001-09-15
The HPCCK project was initiated with a kickoff meeting held on June 12, 2001 in Morgantown, WV, which was attended by all project participants. SRI's existing g-RCFR reactor was reconfigured to an SRT-RCFR geometry (Task 1.1). This new design is suitable for performing the NBFZ experiments of Task 1.2. It was decided that the SRT-RCFR apparatus could be modified and used for the HPBO experiments. The purchase, assembly, and testing of required instrumentation and hardware are nearly complete (Tasks 1.1 and 1.2). Initial samples of PBR coal have been shipped from FWC to SRI (Task 1.1). The ECT device for coal flow measurements used at FWC will not be used in the SRI apparatus, and a screw-type feeder has been suggested instead (Task 5.1). NEA has completed an upgrade of an existing Fluent simulator for SRI's RCFR to a version that is suitable for interpreting results from tests in the NBFZ configuration (Task 1.3); this upgrade includes finite-rate submodels for devolatilization, secondary volatiles pyrolysis, volatiles combustion, and char oxidation. Plans for an enhanced version of CBK have been discussed and development of this enhanced version has begun (Task 2.5). A developmental framework for implementing pressure and oxygen effects on ash formation in an ash formation model (Task 3.3) has begun.
Paper to Electronic Questionnaires: Effects on Structured Questionnaire Forms
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.
2009-01-01
With the use of computers, paper questionnaires are being replaced by electronic questionnaires. The formats of traditional paper questionnaires have been found to affect a subject's rating. Consequently, the transition from paper to electronic format can subtly change results. The research presented begins to determine how electronic questionnaire formats change subjective ratings. For formats in which subjects used a flow chart to arrive at their rating, starting at the worst or middle rating of the flow chart was the most accurate, but subjects took slightly more time to arrive at their answers. Except for the electronic paper format, starting at the worst rating was the most preferred. The paper and electronic paper versions had the worst accuracy. Therefore, for flowchart-type questionnaires, flowcharts should start at the worst rating and work their way up to better ratings.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander
2017-04-01
For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. It is now generally accepted that client applications integrated into such an infrastructure should be built on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for the comprehensive study of ongoing and possible future climate change and the analysis of its implications, providing full information and computing support for the study of economic, political, and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts: 1. A server-side part comprising PHP applications of the SDI geoportal, which implements interaction with the computational core backend and the WMS/WFS/WPS cartographical services, and exposes an open API for browser-based client software; this part provides a limited set of procedures accessible via a standard HTTP interface. 2. A front-end part, a Web GIS client developed as a "single page application" based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs), and GeoExt (http://geoext.org/); it implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig and QuantumGIS. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development. In accordance with general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, and the display of map legends and corresponding metadata. The specialized Web GIS client contains three basic tiers:
• a tier of NetCDF metadata in JSON format;
• a middleware tier of JavaScript objects implementing methods to work with the NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services;
• a graphical user interface tier of JavaScript objects realizing the general application business logic.
The developed Web GIS provides launching of computational processing services to support tasks in the area of environmental monitoring, and presents calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape), and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and in disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No. 16-19-10257.
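As a concrete illustration of consuming the WMS layers such a system publishes, the following sketch requests a single layer as GeoTIFF over plain HTTP; the endpoint URL, layer name, and bounding box are hypothetical placeholders rather than values from the paper.

    import requests

    # Standard WMS 1.1.1 GetMap request; endpoint and layer are hypothetical.
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": "climate:air_temperature_anomaly",  # hypothetical layer name
        "bbox": "60,50,110,80",                       # lon/lat bounds (hypothetical)
        "srs": "EPSG:4326",
        "width": 800,
        "height": 480,
        "format": "image/geotiff",
    }
    resp = requests.get("https://example.org/geoserver/wms", params=params, timeout=60)
    resp.raise_for_status()
    with open("layer.tif", "wb") as f:
        f.write(resp.content)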
Storm Water Management Model User’s Manual Version 5.1 - manual
SWMM 5 provides an integrated environment for editing study area input data, running hydrologic, hydraulic and water quality simulations, and viewing the results in a variety of formats. These include color-coded drainage area and conveyance system maps, time series graphs and ta...
Sustainable Materials Management: At Your Fingertips
EPA recently announced the release of the beta version of EPA's Materials Management Wizard web application (or "M-Wiz," for short), which puts a wealth of materials management knowledge at your fingertips in a guided, easy-to-use format you can tailor to your specific needs.
Environmental Profiles of Paper vs. Electronic UC-CEAS Annual Reports
In 2010, the University of Cincinnati College of Engineering and Applied Sciences (UC-CEAS) created a new electronic format for the Annual Report that could be distributed through the college’s website to replace the prior print version. In order to determine the environmental co...
Choosing the Right Database Management Program.
ERIC Educational Resources Information Center
Vockell, Edward L.; Kopenec, Donald
1989-01-01
Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)
Telemetry Attributes Transfer Standard (TMATS) Handbook
2015-07-01
[Front-matter extraction residue: the handbook's table of contents lists an example in Chapter 6 and an Appendix A on Extensible Markup Language (XML) TMATS differences; its acronym list includes TG (Telemetry Group), TM (telemetry), TMATS (Telemetry Attributes Transfer Standard), and XML (eXtensible Markup Language). The surviving text refers to the initial version of the standard by the Range Commanders Council Telemetry Group.]
LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Manning, R. M.
1994-01-01
The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.
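The log-normal attenuation statistics at the core of LeRC-SLAM lend themselves to a compact numerical illustration. The sketch below, a minimal approximation in which the median attenuation and log-standard-deviation are invented stand-ins for values that would normally come from the model's internal rain-rate database, evaluates the probability that attenuation exceeds selected fade depths.

    import numpy as np
    from scipy import stats

    # Assumed (invented) log-normal parameters for the attenuation process:
    # median attenuation in dB and standard deviation of ln(attenuation).
    median_db = 2.0
    sigma_ln = 0.9

    # scipy parameterization: s = sigma of the underlying normal, scale = median.
    atten = stats.lognorm(s=sigma_ln, scale=median_db)

    for fade_db in (1.0, 3.0, 6.0, 10.0):
        # Probability that the link attenuation exceeds this fade depth.
        print(f"P(A > {fade_db:4.1f} dB) = {atten.sf(fade_db):.4f}")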
A computer program (MACPUMP) for interactive aquifer-test analysis
Day-Lewis, F. D.; Person, M.A.; Konikow, Leonard F.
1995-01-01
This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
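For a sense of the type-curve mathematics behind such packages, the sketch below evaluates the classical Theis solution for a nonleaky confined aquifer, one of the analytical solutions the report says MACPUMP encodes; the pumping rate and aquifer parameters are invented for illustration.

    import numpy as np
    from scipy.special import exp1  # exponential integral E1(u) = Theis well function W(u)

    def theis_drawdown(r, t, Q, T, S):
        """Drawdown s = Q/(4*pi*T) * W(u), with u = r**2 * S / (4*T*t)."""
        u = (r**2 * S) / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    # Invented example values: Q [m^3/d], T [m^2/d], S [-], r [m].
    t = np.logspace(-3, 1, 5)  # times in days
    print(theis_drawdown(r=30.0, t=t, Q=500.0, T=200.0, S=1e-4))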
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.
This report provides the results of comparisons of the cited and latest versions of ANS, ASME, AWS, and NFPA standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.
National Hydropower Plant Dataset, Version 2 (FY18Q3)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samu, Nicole; Kao, Shih-Chieh; O'Connor, Patrick
The National Hydropower Plant Dataset, Version 2 (FY18Q3) is a geospatially comprehensive point-level dataset containing locations and key characteristics of U.S. hydropower plants that are currently either in the hydropower development pipeline (pre-operational), operational, withdrawn, or retired. These data are provided in GIS and tabular formats with corresponding metadata for each. In addition, we include access to download two versions of the National Hydropower Map, which was produced with these data (Map 1 displays the geospatial distribution and characteristics of all operational hydropower plants; Map 2 displays the geospatial distribution and characteristics of operational hydropower plants with pumped storage and mixed capabilities only). This dataset is a subset of ORNL's Existing Hydropower Assets data series, updated quarterly as part of ORNL's National Hydropower Asset Assessment Program.
Zholdikova, Z I; Kharchevnikova, N V
2006-01-01
A version of a logical-combinatorial JSM-type intelligent system was used to predict the presence and the degree of a carcinogenic effect. This version was based on a combined description of chemical substances including both structural and numeric parameters. The new version allows for the fact that the toxicity and danger caused by chemical substances often depend on their biological activation in the organism. The authors substantiate classifying chemicals according to their carcinogenic activity, and illustrate the use of the system to predict the carcinogenicity of polycyclic aromatic hydrocarbons using a model of bioactivation via the formation of diol epoxides, and the carcinogenicity of halogenated alkanes using a model of bioactivation via oxidative dehalogenation. The paper defines the boundary level of an energetic parameter whose exceedance correlates with the inhibition of halogenated alkanes' metabolism and the absence of carcinogenic activity.
Forde, Arnell S.; Flocks, James G.; Wiese, Dana S.; Fredericks, Jake J.
2016-03-29
The archived trace data are in standard SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in American Standard Code for Information Interchange (ASCII) format instead of Extended Binary Coded Decimal Interchange Code (EBCDIC) format. The SEG Y files are available on the DVD version of this report or online, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using the GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. The Web version of this archive does not contain the SEG Y trace files. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The printable profiles are provided as Graphics Interchange Format (GIF) images, processed and gained using SU software, and can be viewed from the Profiles page or by using the links located on the trackline maps; refer to the Software page for links to example SU processing scripts.
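Since the archive's distinguishing detail is an ASCII (rather than EBCDIC) 3,200-byte card-image header, a few lines of Python suffice to inspect it; the file name is a placeholder, and a full SEG Y rev. 0 reader would go on to parse the 400-byte binary reel header that follows.

    # Read the 3,200-byte textual ("card image") header of a SEG Y file.
    # Per this archive's convention it is ASCII; classic SEG Y rev. 0 uses
    # EBCDIC, in which case the decode would use "cp037" instead.
    with open("line01.sgy", "rb") as f:   # placeholder file name
        header = f.read(3200)

    # The header is 40 "cards" of 80 characters each.
    for i in range(40):
        print(header[i * 80:(i + 1) * 80].decode("ascii", errors="replace"))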
MDplot: Visualise Molecular Dynamics.
Margreitter, Christian; Oostenbrink, Chris
2017-05-10
The MDplot package provides plotting functions to allow for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The graphs that are supported range from the standard, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to the less standard, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing are separated in all cases, i.e. the respective functions can be used independently. Thus, data manipulation and the integration of additional file formats is fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license.
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampornpan, Teerapat; Fisher, Forest W.
2010-01-01
Version 5.0 of the AutoGen software has been released. Previous versions, variously denoted Autogen and autogen, were reported in two articles: "Automated Sequence Generation Process and Software" (NPO-30746), Software Tech Briefs (Special Supplement to NASA Tech Briefs), September 2007, page 30, and "Autogen Version 2.0" (NPO-41501), NASA Tech Briefs, Vol. 31, No. 10 (October 2007), page 58. To recapitulate: AutoGen (now signifying "automatic sequence generation") automates the generation of sequences of commands in a standard format for uplink to spacecraft. AutoGen requires fewer workers than are needed for older manual sequence-generation processes, and greatly reduces sequence-generation times. The sequences are embodied in spacecraft activity sequence files (SASFs). AutoGen automates generation of SASFs by use of another previously reported program called APGEN. AutoGen encodes knowledge of different mission phases and of how the resultant commands must differ among the phases. AutoGen also provides means for customizing sequences through use of configuration files. The approach followed in developing AutoGen has involved encoding the behaviors of a system into a model and encoding algorithms for context-sensitive customizations of the modeled behaviors. This version of AutoGen addressed the MRO (Mars Reconnaissance Orbiter) primary science phase (PSP); on previous Mars missions this phase has more commonly been referred to as the mapping phase. This version addressed the unique aspects of sequencing orbital operations, specifically the mission-specific adaptation of orbital operations for MRO. This version also includes capabilities for MRO's role in Mars relay support for UHF relay communications with the MER rovers and the Phoenix lander.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
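For reference, the steady-state multigroup neutron diffusion equation solved by codes of this family can be written, in a standard textbook form (not quoted from the VENTURE documentation), as

    $$ -\nabla \cdot D_g \nabla \phi_g + \Sigma_{r,g}\,\phi_g \;=\; \sum_{g' \neq g} \Sigma_{s,\,g' \to g}\,\phi_{g'} \;+\; \frac{\chi_g}{k_{\mathrm{eff}}} \sum_{g'} \nu\Sigma_{f,g'}\,\phi_{g'} $$

where \phi_g is the group-g scalar flux, D_g the diffusion coefficient, \Sigma_{r,g} the removal cross section, \Sigma_{s,g'\to g} the scattering transfer cross section, \chi_g the fission spectrum, and \nu\Sigma_{f,g'} the fission production cross section.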
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
A detailed description of the machine-readable revised catalog as it is currently being distributed from the Astronomical Data Center is given. This catalog of star images was compiled from imagery obtained by the Naval Research Laboratory (NRL) Far-Ultraviolet Camera/Spectrograph (Experiment S201) operated from 21 to 23 April 1972 on the lunar surface during the Apollo 16 mission. The documentation includes a detailed data format description, a table of the characteristics of the magnetic tape file, and a sample listing of data records exactly as they are presented in the machine-readable version.
Geoinformation web-system for processing and visualization of large archives of geo-referenced data
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.
2010-12-01
A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system allows processing and analysis of large archives of geophysical data obtained both from observations and from modeling. The accumulated experience of developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al., 2007; Okladnikov et al., 2008; Titov et al., 2009). The functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing, and visualization of data. At present, five archives of data are available for processing: the 1st and 2nd editions of the NCEP/NCAR Reanalysis, the ECMWF ERA-40 Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, and the NOAA-CIRES 20th Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for the climate change indices approved by WMO is available. A special module providing visualization of results and export to Encapsulated PostScript, GeoTIFF, and ESRI shape files was also developed. As a technological basis for the representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. Integration of GIS functionality with web-portal software has been performed to provide a basis for the web portal's development as part of the geoinformation web-system. Such a geoinformation web-system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming knowledge and to concentrate on solving their specific tasks. The system would be of special importance for education in the climate change domain. This work is partially supported by RFBR grant #10-07-00547, SB RAS Basic Program Projects 4.31.1.5 and 4.31.2.7, and SB RAS Integration Projects 4 and 9.
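As a generic illustration of the kind of geo-referenced raster export such a module performs (a sketch assuming the rasterio package and an invented one-degree global grid, not the system's actual server-side code), a gridded field can be written to GeoTIFF as follows:

    import numpy as np
    import rasterio
    from rasterio.transform import from_origin

    # Invented 1-degree global grid of float values.
    data = np.random.rand(180, 360).astype("float32")

    # Affine transform: upper-left corner at (-180, 90), 1-degree pixels.
    transform = from_origin(-180.0, 90.0, 1.0, 1.0)

    with rasterio.open(
        "example.tif", "w", driver="GTiff",
        height=data.shape[0], width=data.shape[1],
        count=1, dtype="float32", crs="EPSG:4326", transform=transform,
    ) as dst:
        dst.write(data, 1)  # band 1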
75 FR 36066 - Promise Neighborhoods Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-24
... official version of this document is the document published in the Federal Register. Free Internet access... Federal Relay Service, toll free, at 1-800-877-8339. Electronic Access to This Document: You can view this... Adobe Portable Document Format (PDF) on the Internet at the following site: http://www.ed.gov/news...
Are These Books, or What? CD-ROM and the Literary Industry.
ERIC Educational Resources Information Center
Lyall, Sarah
1994-01-01
Considers the concept of print books versus newer electronic formats, including CD-ROM and online versions. Topics discussed include changes in the publishing industry; a focus on content; reference books, including encyclopedias and dictionaries; children's books; multimedia publishers versus traditional book publishers; and production and…
Assessing Reading Strategies of Engineering Students: Think Aloud Approach
ERIC Educational Resources Information Center
Nalliveettil, George Mathew
2014-01-01
Literacy in reading and understanding printed words is significant for all undergraduate students to succeed in their academic career. Developments in digital technology improved the quality of academic texts in terms of design, format and layout. Further, availability of academic related English language resources in electronic versions gave…
The American Indian: A Multimedia Encyclopedia.
ERIC Educational Resources Information Center
Carter, Christina E.
1993-01-01
Reviews "The American Indian: A Multimedia Encyclopedia," Version 1.0 (New York, Facts on File, Inc., 1993). This electronic product (compact disk) presents a great amount of material on American Indians from various formats, but its effectiveness is limited by the dated nature of some materials. Software design and searching features are…
CD-ROM in a High School Library Media Center.
ERIC Educational Resources Information Center
Barlow, Diane; And Others
1987-01-01
Describes the experiences of high school students using microcomputers to access an electronic version of an encyclopedia in the school's media center. The topics discussed include hardware and software requirements of the CD-ROM format, information seeking strategies and problems observed, student satisfaction with the system, and recommendations…
Inordinate Fondness: The Feds and the Internet.
ERIC Educational Resources Information Center
Morehead, Joe
1997-01-01
Examines the move to make U.S. government information available solely in an electronic format. Discusses the inability of general purpose search engines to access the information; the shift of cost to the consumer; the online version of the "Monthly Catalog of United States Government Publications"; federal statistics; agency Web sites; and a…
ERIC Educational Resources Information Center
Schoel, Jim
2002-01-01
The evolution of Project Adventure's Full Value Contract from its original No Discount format is described. Although wording varies among groups, all versions ask the group to create safe and respectful behavioral norms under which it will operate, to commit to those norms, and to accept a shared responsibility for their maintenance. (TD)
Nuclear Wallet Cards, 8th Edition (2011). The current version is available in PDF format and via Android Market download; related editions cover Radioactive Nuclides (Homeland Security) and the Nuclear Materials Management & Safeguards System.
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (AMDAHL VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The NASA Device Independent Graphics Library, NASADIG, can be used with many computer-based engineering and management applications. The library gives the user the opportunity to translate data into effective graphic displays for presentation. The software offers many features which allow the user flexibility in creating graphics. These include two-dimensional plots, subplot projections in 3D-space, surface contour line plots, and surface contour color-shaded plots. Routines for three-dimensional plotting, wireframe surface plots, surface plots with hidden line removal, and surface contour line plots are provided. Other features include polar and spherical coordinate plotting, world map plotting utilizing either cylindrical equidistant or Lambert equal area projection, plot translation, plot rotation, plot blowup, splines and polynomial interpolation, area blanking control, multiple log/linear axes, legends and text control, curve thickness control, and multiple text fonts (18 regular, 4 bold). NASADIG contains several groups of subroutines. Included are subroutines for plot area and axis definition; text set-up and display; area blanking; line style set-up, interpolation, and plotting; color shading and pattern control; legend, text block, and character control; device initialization; mixed alphabets setting; and other useful functions. The usefulness of many routines is dependent on the prior definition of basic parameters. The program's control structure uses a serial-level construct with each routine restricted for activation at some prescribed level(s) of problem definition. NASADIG provides the following output device drivers: Selanar 100XL, VECTOR Move/Draw ASCII and PostScript files, Tektronix 40xx, 41xx, and 4510 Rasterizer, DEC VT-240 (4014 mode), IBM AT/PC compatible with SmartTerm 240 emulator, HP Lasergrafix Film Recorder, QMS 800/1200, DEC LN03+ Laserprinters, and HP LaserJet (Series III). NASADIG is written in FORTRAN and is available for several platforms. NASADIG 5.7 is available for DEC VAX series computers running VMS 5.0 or later (MSC-21801), Cray X-MP and Y-MP series computers running UNICOS (COS-10049), and Amdahl 5990 mainframe computers running UTS (COS-10050). NASADIG 5.1 is available for UNIX-based operating systems (MSC-22001). The UNIX version has been successfully implemented on Sun4 series computers running SunOS, SGI IRIS computers running IRIX, Hewlett Packard 9000 computers running HP-UX, and Convex computers running Convex OS (MSC-22001). The standard distribution medium for MSC-21801 is a set of two 6250 BPI 9-track magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. The standard distribution medium for COS-10049 and COS-10050 is a 6250 BPI 9-track magnetic tape in UNIX tar format. Other distribution media and formats may be available upon request. The standard distribution medium for MSC-22001 is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. With minor modification, the UNIX source code can be ported to other platforms including IBM PC/AT series computers and compatibles. NASADIG is also available bundled with TRASYS, the Thermal Radiation Analysis System (COS-10026, DEC VAX version; COS-10040, CRAY version).
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The program description, features, supported platforms, and distribution media are identical to those given in the AMDAHL version entry above.
A Python library for FAIRer access and deposition to the Metabolomics Workbench Data Repository.
Smelter, Andrey; Moseley, Hunter N B
2018-01-01
The Metabolomics Workbench Data Repository is a public repository of mass spectrometry and nuclear magnetic resonance data and metadata derived from a wide variety of metabolomics studies. The data and metadata for each study are deposited, stored, and accessed via files in the domain-specific 'mwTab' flat file format. In order to improve the accessibility, reusability, and interoperability of the data and metadata stored in 'mwTab' formatted files, we implemented a Python library and package. This Python package, named 'mwtab', is a parser for the domain-specific 'mwTab' flat file format, which provides facilities for reading, accessing, and writing 'mwTab' formatted files. Furthermore, the package provides facilities to validate both the format and required metadata elements of a given 'mwTab' formatted file. In order to develop the 'mwtab' package we used the official 'mwTab' format specification. We used Git version control along with the Python unit-testing framework and a continuous integration service to run those tests on multiple versions of Python. Package documentation was developed using the Sphinx documentation generator. The 'mwtab' package provides both Python programmatic library interfaces and command-line interfaces for reading, writing, and validating 'mwTab' formatted files. Data and associated metadata are stored within Python dictionary- and list-based data structures, enabling straightforward, 'pythonic' access and manipulation of data and metadata. Also, the package provides facilities to convert 'mwTab' files into a JSON formatted equivalent, enabling easy reusability of the data by all modern programming languages that implement JSON parsers. The 'mwtab' package implements its metadata validation functionality based on a pre-defined JSON schema that can be easily specialized for specific types of metabolomics studies. The library also provides a command-line interface for interconversion between 'mwTab' and JSONized formats in raw text and a variety of compressed binary file formats. The 'mwtab' package is an easy-to-use Python package that provides FAIRer utilization of the Metabolomics Workbench Data Repository. The source code is freely available on GitHub and via the Python Package Index. Documentation includes a 'User Guide', 'Tutorial', and 'API Reference'. The GitHub repository also provides 'mwtab' package unit-tests via a continuous integration service.
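A minimal usage sketch, based on the package's documented read/convert workflow (the file path is a placeholder, and exact call signatures should be checked against the current 'mwtab' documentation):

    import mwtab

    # read_files() yields MWTabFile objects for one or more mwTab sources.
    for mwfile in mwtab.read_files("ST000017_AN000035.txt"):  # placeholder path
        print(mwfile.study_id, mwfile.analysis_id)

        # Convert the internal dictionary-based representation to JSON.
        with open("ST000017_AN000035.json", "w") as outfile:
            mwfile.write(outfile, file_format="json")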
A New Source Biasing Approach in ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevill, Aaron M; Mosher, Scott W
2012-01-01
The ADVANTG code has been developed at Oak Ridge National Laboratory to generate biased sources and weight window maps for MCNP using the CADIS and FW-CADIS methods. In preparation for an upcoming RSICC release, a new approach for generating a biased source has been developed. This improvement streamlines user input and improves reliability. Previous versions of ADVANTG generated the biased source from ADVANTG input, writing an entirely new general fixed-source definition (SDEF). Because volumetric sources were translated into SDEF format as a finite set of points, the user had to perform a convergence study to determine whether the number of source points used accurately represented the source region. Further, the large number of points that had to be written in SDEF format made the MCNP input and output files excessively long and difficult to debug. ADVANTG now reads SDEF-format distributions and generates corresponding source biasing cards, eliminating the need for a convergence study. Many problems of interest use complicated source regions that are defined using cell rejection: the source distribution in space is defined using an arbitrarily complex cell and a simple bounding region, and source positions are sampled within the bounding region but accepted only if they fall within the cell; otherwise, the position is resampled entirely. When biasing in space is applied to sources that use rejection sampling, current versions of MCNP do not account for the rejection when setting the source weight of histories, resulting in an 'unfair game'. This problem was circumvented in previous versions of ADVANTG by translating volumetric sources into a finite set of points, which does not alter the mean history weight (w̄). To use biasing parameters without otherwise modifying the original cell-rejection SDEF-format source, ADVANTG users now apply a correction factor for w̄ in post-processing. A stratified-random sampling approach in ADVANTG is under development to automatically report the correction factor with estimated uncertainty. This study demonstrates the use of ADVANTG's new source biasing method, including the application of w̄.
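To make the weight-correction idea concrete: for a cell-rejection source, the acceptance fraction of the rejection sampling can itself be estimated by Monte Carlo, along with its statistical uncertainty. The sketch below does this for a toy spherical "cell" inside a box bounding region; the geometry, sample count, and the sphere-test stand-in for a real MCNP cell are all invented for illustration.

    import numpy as np

    rng = np.random.default_rng(42)

    def inside_cell(p):
        # Toy stand-in for an arbitrarily complex MCNP cell:
        # a sphere of radius 0.8 centered in the bounding box.
        return np.sum(p**2, axis=1) <= 0.8**2

    # Bounding region: the cube [-1, 1]^3, sampled uniformly.
    n = 100_000
    points = rng.uniform(-1.0, 1.0, size=(n, 3))
    accepted = inside_cell(points)

    frac = accepted.mean()                       # acceptance fraction
    stderr = accepted.std(ddof=1) / np.sqrt(n)   # one-sigma statistical uncertainty
    print(f"acceptance fraction = {frac:.4f} +/- {stderr:.4f}")
    # Analytic check for this toy geometry: (4/3)*pi*0.8**3 / 8 ≈ 0.2681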
Walthouwer, Michel Jean Louis; Oenema, Anke; Lechner, Lilian; de Vries, Hein
2015-09-25
Many Web-based computer-tailored interventions are characterized by high dropout rates, which limit their potential impact. This study had 4 aims: (1) examining if the use of a Web-based computer-tailored obesity prevention intervention can be increased by using videos as the delivery format, (2) examining if the delivery of intervention content via participants' preferred delivery format can increase intervention use, (3) examining if intervention effects are moderated by intervention use and matching or mismatching intervention delivery format preference, and (4) identifying which sociodemographic factors and intervention appreciation variables predict intervention use. Data were used from a randomized controlled study into the efficacy of a video and text version of a Web-based computer-tailored obesity prevention intervention, consisting of a baseline measurement and a 6-month follow-up measurement. The intervention consisted of 6 weekly sessions and could be used for 3 months. ANCOVAs were conducted to assess differences in use between the video and text versions and between participants allocated to a matching and a mismatching intervention delivery format. Potential moderation by intervention use and matching/mismatching delivery format on self-reported body mass index (BMI), physical activity, and energy intake was examined using regression analyses with interaction terms. Finally, regression analysis was performed to assess determinants of intervention use. In total, 1419 participants completed the baseline questionnaire (follow-up response = 71.53%, 1015/1419). Intervention use declined rapidly over time; the first 2 intervention sessions were completed by approximately half of the participants, and only 10.9% (104/956) of the study population completed all 6 sessions of the intervention. There were no significant differences in use between the video and text versions. Intervention use was significantly higher among participants who were allocated to an intervention condition that matched their preferred intervention delivery format. There were no significant interaction terms for any of the outcome variables; a match and more intervention use did not result in better intervention effects. Participants with a high BMI and participants who felt involved and supported by the intervention were more likely to use the intervention more often. Video delivery of tailored feedback does not increase the use of Web-based computer-tailored interventions. However, intervention use can potentially be increased by delivering intervention content via participants' preferred intervention delivery format and by creating feelings of relatedness. Because more intervention use was not associated with better intervention outcomes, more research is needed to examine the optimum number of intervention sessions in terms of maximizing use and effects. Nederlands Trial Register: NTR3501; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3501 (Archived by WebCite at http://www.webcitation.org/6b2tsH8Pk).
Global Mapping Project - Applications and Development of Version 2 Dataset
NASA Astrophysics Data System (ADS)
Ubukawa, T.; Nakamura, T.; Otsuka, T.; Iimura, T.; Kishimoto, N.; Nakaminami, K.; Motojima, Y.; Suga, M.; Yatabe, Y.; Koarai, M.; Okatani, T.
2012-07-01
The Global Mapping Project aims to develop basic geospatial information covering the whole land area of the globe, named Global Map, through the cooperation of National Mapping Organizations (NMOs) around the world. The Global Map data can serve as a base for global geospatial infrastructure and are composed of eight layers: Boundaries, Drainage, Transportation, Population Centers, Elevation, Land Use, Land Cover, and Vegetation. Global Map Version 1 was released in 2008, and Version 2 will be released in 2013, as the data are to be updated every five years. In 2009, the International Steering Committee for Global Mapping (ISCGM) adopted new Specifications for developing Global Map Version 2, with a change of format so that it is compatible with the international standards ISO 19136 and ISO 19115. With the support of the secretariat of ISCGM, the participating countries are accelerating their data development toward completion of global coverage in 2013, and some countries have already been releasing their Global Map Version 2 datasets since 2010. Global Map data are available on the Internet free of charge for non-commercial purposes and can be used to predict, assess, prepare for, and cope with global issues when combined with other spatial data. Global Map has many applications in various fields, and further utilization is expected. This paper summarises the activities toward the development of Global Map Version 2 as well as some examples of Global Map applications.
Cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese.
Rossi, Natalia Freitas; Lindau, Tâmara de Andrade; Gillam, Ronald Bradley; Giacheti, Célia Maria
To accomplish the translation and cultural adaptation of the Test of Narrative Language (TNL) into Brazilian Portuguese. The TNL is a formal instrument which assesses narrative comprehension and oral narration of children between the ages of 5-0 and 11-11 (years-months). The TNL translation and adaptation process had the following steps: (1) translation into the target language; (2) summary of the translated versions; (3) back-translation; (4) checking of the conceptual, semantic, and cultural equivalence; and (5) a pilot study (56 children within the test age range and from both genders). The adapted version maintained the same structure as the original version: number of tasks (three comprehension and three oral narration), narrative formats (no picture, sequenced pictures, and single picture), and scoring system. There were no adjustments to the pictures. The "McDonald's Story" was replaced by the "Snack Bar Story" to meet the semantic and experiential equivalence of the target population. The other stories had semantic and grammatical adjustments. A statistically significant difference was found when comparing the raw scores (comprehension, narration, and total) of age groups for the adapted version. Adjustments were required to meet the equivalence between the original and the translated versions. The adapted version showed that it has the potential to identify differences in the oral narratives of children in the age range covered by the test. Measurement equivalence for validation and test standardization is in progress and will supplement the study outcomes.
Alexander, Marcalee S; New, Peter W; Biering-Sørensen, Fin; Courtois, Frederique; Popolo, Giulio Del; Elliott, Stacy; Kiekens, Carlotte; Vogel, Lawrence; Previnaire, Jean G
2017-01-01
Data set review and modification. To describe modifications in the International Spinal Cord Injury (SCI) Male Sexual Function Basic Data Set Version 2.0 and the International SCI Female Sexual and Reproductive Function Basic Data Set Version 2.0. International expert work group using online communication. An international team of experts was compiled to review and revise the International SCI Male Sexual Function and Female Sexual and Reproductive Function Basic Data Sets Version 1.0. The group adapted Version 1.0 based upon review of published research, suggestions from concerned individuals, and online work group consensus. The revised data sets were then posted on the International Spinal Cord Society (ISCoS) and American Spinal Injury Association (ASIA) websites for 2 months for review. Subsequently, the data sets were approved by the ISCoS Scientific and Executive Committees and the ASIA board of directors. The data sets were modified to a self-report format. They were reviewed for appropriateness for the pediatric age group and adapted to include a new variable to address the issue of sexual orientation. A clarification of the difference between the data sets and the autonomic standards was also developed. Sexuality is a continuously evolving topic, and modifications were needed to address it in a comprehensive fashion. It is recommended that Version 2.0 of these data sets be used for ongoing documentation of sexual status in the medical record and for documentation of sexual concerns during ongoing research.
NASA Technical Reports Server (NTRS)
Cullimore, B.
1994-01-01
SINDA, the Systems Improved Numerical Differencing Analyzer, is a software system for solving lumped parameter representations of physical problems governed by diffusion-type equations. SINDA was originally designed for analyzing thermal systems represented in electrical analog, lumped parameter form, although its use may be extended to include other classes of physical systems which can be modeled in this form. As a thermal analyzer, SINDA can handle such interrelated phenomena as sublimation, diffuse radiation within enclosures, transport delay effects, and sensitivity analysis. FLUINT, the FLUid INTegrator, is an advanced one-dimensional fluid analysis program that solves arbitrary fluid flow networks. The working fluids can be single phase vapor, single phase liquid, or two phase. The SINDA'85/FLUINT system permits the mutual influences of thermal and fluid problems to be analyzed. The SINDA system consists of a programming language, a preprocessor, and a subroutine library. The SINDA language is designed for working with lumped parameter representations and finite difference solution techniques. The preprocessor accepts programs written in the SINDA language and converts them into standard FORTRAN. The SINDA library consists of a large number of FORTRAN subroutines that perform a variety of commonly needed actions. The use of these subroutines can greatly reduce the programming effort required to solve many problems. A complete run of a SINDA'85/FLUINT model is a four step process. First, the user's desired model is run through the preprocessor, which writes out data files for the processor to read and translates the user's program code. Second, the translated code is compiled. The third step requires linking the user's code with the processor library. Finally, the processor is executed. SINDA'85/FLUINT program features include 20,000 nodes, 100,000 conductors, 100 thermal submodels, and 10 fluid submodels. SINDA'85/FLUINT can also model two phase flow, capillary devices, user defined fluids, gravity and acceleration body forces on a fluid, and variable volumes. SINDA'85/FLUINT offers the following numerical solution techniques. The finite-difference formulation of the explicit method is the forward-difference explicit approximation. The formulation of the implicit method is the Crank-Nicolson approximation. The program allows simulation of non-uniform heating and facilitates modeling thin-walled heat exchangers. The ability to model non-equilibrium behavior within two-phase volumes is included. Recent improvements to the program were made in modeling real evaporator-pumps and other capillary-assist evaporators. SINDA'85/FLUINT is available by license for a period of ten (10) years to approved licensees. The licensed program product includes the source code and one copy of the supporting documentation. Additional copies of the documentation may be purchased separately at any time. SINDA'85/FLUINT is written in FORTRAN 77. Version 2.3 has been implemented on Cray series computers running UNICOS, CONVEX computers running CONVEX OS, and DEC RISC computers running ULTRIX. Binaries are included with the Cray version only. The Cray version of SINDA'85/FLUINT also contains SINGE, an additional graphics program developed at Johnson Space Center. Both source and executable code are provided for SINGE. Users wishing to create their own SINGE executable will also need the NASA Device Independent Graphics Library (NASADIG, previously known as SMDDIG; UNIX version, MSC-22001).
The Cray and CONVEX versions of SINDA'85/FLUINT are available on 9-track 1600 BPI UNIX tar format magnetic tapes. The CONVEX version is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format. The DEC RISC ULTRIX version is available on a TK50 magnetic tape cartridge in UNIX tar format. SINDA was developed in 1971, and first had fluid capability added in 1975. SINDA'85/FLUINT version 2.3 was released in 1990.
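A minimal sketch of the forward-difference explicit scheme the abstract names, applied to a two-node lumped-parameter thermal network (the capacitances, conductances, heat load, and time step are invented; real SINDA models are far larger and are written in the SINDA language, not Python):

    import numpy as np

    # Two-node network: node 1 carries a heat load, node 2 couples to ambient.
    # Forward-difference explicit update:
    #   C_i * (T_i^{n+1} - T_i^n) / dt = sum_j G_ij * (T_j^n - T_i^n) + Q_i
    C = np.array([500.0, 800.0])   # J/K, invented capacitances
    G12 = 2.0                      # W/K, conductor between nodes 1 and 2
    G2a = 1.0                      # W/K, node 2 to ambient
    T_amb = 293.0                  # K
    Q = np.array([50.0, 0.0])      # W, heat load on node 1
    T = np.array([293.0, 293.0])   # K, initial temperatures

    dt = 10.0  # s; explicit stability requires dt < C_i / sum_j(G_ij) for each node
    for step in range(3600):
        flows = np.array([
            G12 * (T[1] - T[0]),
            G12 * (T[0] - T[1]) + G2a * (T_amb - T[1]),
        ])
        T = T + dt * (flows + Q) / C

    print(f"After 10 h: node1 = {T[0]:.1f} K, node2 = {T[1]:.1f} K")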
Quantum games of opinion formation based on the Marinatto-Weber quantum game scheme
NASA Astrophysics Data System (ADS)
Deng, Xinyang; Deng, Yong; Liu, Qi; Shi, Lei; Wang, Zhen
2016-06-01
Quantization has become a new way to investigate classical game theory since quantum strategies and quantum games were proposed. In existing studies, many typical game models, such as the prisoner's dilemma, the battle of the sexes, and the Hawk-Dove game, have been extensively explored using the quantization approach. Following a similar method, several game models of opinion formation are quantized here on the basis of the Marinatto-Weber quantum game scheme, a frequently used scheme for converting classical games to quantum versions. Our results show that quantization can fascinatingly change the properties of some classical opinion formation game models so as to generate win-win outcomes.
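The Marinatto-Weber scheme itself is compact enough to sketch numerically: players share an entangled two-qubit state and each mixes the identity with a bit flip. The sketch below computes expected payoffs for a generic 2x2 game; the coordination-style payoff matrix is invented for illustration and is not one of the paper's opinion formation models.

    import numpy as np

    I2 = np.eye(2)
    C = np.array([[0.0, 1.0], [1.0, 0.0]])   # bit-flip operator

    # Initial entangled state |psi> = a|00> + b|11>, |a|^2 + |b|^2 = 1.
    a, b = np.sqrt(0.5), np.sqrt(0.5)
    psi = np.zeros(4)
    psi[0], psi[3] = a, b
    rho = np.outer(psi, psi)

    def final_state(p, q):
        # Players apply the identity with probabilities p and q, else the flip C.
        terms = [
            (p * q,             np.kron(I2, I2)),
            (p * (1 - q),       np.kron(I2, C)),
            ((1 - p) * q,       np.kron(C, I2)),
            ((1 - p) * (1 - q), np.kron(C, C)),
        ]
        return sum(w * U @ rho @ U.T for w, U in terms)

    # Invented coordination payoffs on basis order 00, 01, 10, 11.
    payoff_A = np.diag([3.0, 0.0, 0.0, 2.0])
    payoff_B = np.diag([3.0, 0.0, 0.0, 2.0])

    rho_f = final_state(p=0.7, q=0.4)
    print("E[payoff A] =", np.trace(payoff_A @ rho_f))
    print("E[payoff B] =", np.trace(payoff_B @ rho_f))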
Method and apparatus for production of subsea hydrocarbon formations
Blandford, J.W.
1995-01-17
A system for controlling, separating, processing, and exporting well fluids produced from subsea hydrocarbon formations is disclosed. The subsea well tender system includes a surface buoy supporting one or more decks above the water surface for accommodating equipment to process oil, gas, and water recovered from the subsea hydrocarbon formation. The surface buoy includes a surface-piercing central flotation column connected to one or more external flotation tanks located below the water surface. The surface buoy is secured to the sea bed by one or more tendons which are anchored to a foundation with piles embedded in the sea bed. The system accommodates multiple versions of the surface buoy configuration. 20 figures.
Shakir'yanova, Yu P; Leonov, S V; Pinchuk, P V; Sukhareva, M A
This article shares the experience gained with three-dimensional modeling for the purpose of situational expertise intended to reconstruct the circumstances of an occurrence and to check alternative investigative leads concerning the formation of potential injuries to a specific person. Simulation was performed using a dimensionally scaled model of the place of occurrence, together with models of the human head and body fully consistent with the anthropometric characteristics of the victim. The results of this work made it possible to reject several potential mechanisms for the formation of injuries to the victim and to identify the most probable version.
IOS: PDP 11/45 formatted input/output task stacker and processor [in MACRO-11]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koschik, J.
1974-07-08
IOS allows the programmer to perform formatted input/output at the assembly-language level to and from any peripheral device. It runs under DOS versions V8-08 or V9-19, reading and writing DOS-compatible files. Additionally, IOS will run, with total transparency, in an environment with memory management enabled. The minimum hardware required is a 16K PDP 11/45, a keyboard device, a disk (DK, DF, or DC), and a line-frequency clock. The source language is MACRO-11 (3.3K decimal words).
Montana geoenvironmental explorer
Lee, Greg K.
2001-01-01
This report is the result of a multidisciplinary effort to assess relative potential for acidic, metal-rich drainage in the State of Montana; evaluate alternative GIS-based modeling strategies; and provide the statewide digital spatial data produced and compiled for the project. The CD is usable on various computer systems (Windows 95, 98, NT, and 2000; MacOS 7.1 or later; many versions of UNIX and Linux; and OS/2). This report and maps are in PDF format, and the data have been provided in various GIS formats. Software for viewing the report and data is included.
Balan, Daniela; Adolfsson, Hans
2002-04-05
The direct formation of alpha-methylene-beta-amino acid derivatives is achieved using the aza version of the Baylis-Hillman protocol. The products are readily formed in a three-component one-pot reaction between arylaldehydes, sulfonamides, and alpha,beta-unsaturated carbonyl compounds. The reaction is efficiently catalyzed by titanium isopropoxide and 2-hydroxyquinuclidine in the presence of molecular sieves. The protocol allows for structural variation of the substrates, tolerating electron-poor and electron-rich arylaldehydes and various Michael acceptors.
Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.
Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C
2004-01-01
Interface software was developed to generate the input file for running the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution, generated in Interfile format with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.
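Interfile 3.3 headers are plain-text "key := value" lines pointing at a separate raw data file, so a minimal reader is short. The sketch below is a rough illustration only; the file name is hypothetical, the data type is an assumption, and exact header keys vary between writers.

```python
import numpy as np

def read_interfile_header(path):
    """Parse a plain-text Interfile header (key := value lines) into a dict.
    Keys used below are typical Interfile 3.3 keys but vary between writers."""
    hdr = {}
    with open(path) as f:
        for line in f:
            if ":=" in line:
                key, _, val = line.partition(":=")
                hdr[key.strip().lstrip("!").lower()] = val.strip()
    return hdr

hdr = read_interfile_header("phantom.hv")              # hypothetical file name
nx, ny = int(hdr["matrix size [1]"]), int(hdr["matrix size [2]"])
raw = np.fromfile(hdr["name of data file"], dtype=np.float32)  # dtype assumed
volume = raw.reshape(-1, ny, nx)                       # one 2D plane per slice
```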
Pick_sw: a program for interactive picking of S-wave data, version 2.00
Ellefsen, Karl J.
2002-01-01
Program pick_sw is used to interactively pick travel times from S-wave data. It is assumed that the data are collected using 2 shots of opposite polarity at each shot location. The traces must be in either the SEG-2 format or the SU format. The program is written in the IDL and C programming languages, and the program is executed under the Windows operating system. (The program may also execute under other operating systems like UNIX if the C language functions are re-compiled).
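For context, SU ("Seismic Unix") traces are stored as a 240-byte binary header followed by the samples as 4-byte floats, with the sample count ns at header offset 114. A minimal reader might look like the following sketch; the endianness is an assumption, and real files may differ.

```python
import numpy as np

def read_su_traces(path, endian="<"):
    """Minimal SU reader sketch: each trace is a 240-byte binary header
    followed by ns 4-byte float samples; ns sits at header offset 114.
    Endianness is an assumption -- check against the acquisition system."""
    traces = []
    with open(path, "rb") as f:
        while True:
            hdr = f.read(240)
            if len(hdr) < 240:
                break
            ns = int(np.frombuffer(hdr, dtype=endian + "u2",
                                   count=1, offset=114)[0])
            traces.append(np.frombuffer(f.read(4 * ns), dtype=endian + "f4"))
    return traces
```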
Bekenstein bounds, Penrose inequalities, and black hole formation
NASA Astrophysics Data System (ADS)
Jaracz, Jaroslaw S.; Khuri, Marcus A.
2018-06-01
A universal geometric inequality for bodies relating energy, size, angular momentum, and charge is naturally implied by Bekenstein's entropy bounds. We establish versions of this inequality for axisymmetric bodies satisfying appropriate energy conditions, thus lending credence to the most general form of Bekenstein's bound. Similar techniques are then used to prove a Penrose-like inequality in which the ADM energy is bounded from below in terms of horizon area, angular momentum, and charge. Lastly, new criteria for the formation of black holes are presented involving concentration of angular momentum, charge, and nonelectromagnetic matter energy.
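For reference, the entropy bound in question is usually stated, for a body of energy E enclosed in a sphere of radius R, as below, with the basic Penrose inequality (in geometrized units, A the horizon area) alongside. These are the standard textbook forms, not the paper's generalized versions including angular momentum and charge.

```latex
S \;\le\; \frac{2\pi k_B \, E R}{\hbar c},
\qquad
M_{\mathrm{ADM}} \;\ge\; \sqrt{\frac{A}{16\pi}} .
```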
1992-05-01
especially true for friend-enemy or danger-safe designations. Dots, dashes, shapes, and video effects are recommended. Care must be taken to avoid visual...MAY 92 10.3 Screen Design - Format 10.3.1.4 Use of Contrasting Features Use contrasting features such as inverse video and color to call attention to...captions. Do not use reverse video or highlighting for labels. 13.2.3.2 Formatting For single fields, locate the caption to the left of the entry fields
Noreen, Eric
2000-01-01
These images were processed from a raw format using Integrated Software for Images and Spectrometers (ISIS) to perform radiometric corrections and projection. All the images were projected in a sinusoidal projection using a center longitude of 0 degrees. There are two versions of the mosaic, one unfiltered (sinusmos.tif), and one produced with all images processed through a box filter with an averaged pixel tone of 7.5 (sinusmosflt.tif). Both mosaics are ArcView-ArcInfo ready in TIF format with associated world files (*.tfw).
Noreen, Eric
2000-01-01
These images were processed from a raw format using Integrated Software for Images and Spectrometers (ISIS) to perform radiometric corrections and projection. All the images were projected in a sinusoidal projection using a center longitude of 70 degrees. There are two versions of the mosaic, one unfiltered (vallesmos.tif), and one produced with all images processed through a box filter with an averaged pixel tone of 7.699 (vallesmosflt.tif). Both mosaics are ArcView-ArcInfo ready in TIF format with associated world files (*.tfw).
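Both mosaic records above ship with ESRI world files (*.tfw), which are just six plain-text lines tying pixel coordinates to map coordinates. A minimal writer for a north-up raster is sketched below; the numeric values and file name are hypothetical, not the actual grid parameters of these mosaics.

```python
def write_world_file(path, pixel_size, ulx, uly):
    """Write a six-line ESRI world file (*.tfw) for a north-up raster:
    x pixel size, two rotation terms (zero here), negative y pixel size,
    then the map coordinates of the center of the upper-left pixel."""
    lines = [pixel_size, 0.0, 0.0, -pixel_size, ulx, uly]
    with open(path, "w") as f:
        f.write("\n".join(f"{v:.10f}" for v in lines) + "\n")

# Hypothetical values -- a real mosaic defines its own grid.
write_world_file("sinusmos.tfw", pixel_size=231.0,
                 ulx=-1000000.0, uly=500000.0)
```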
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1983-01-01
A description of the machine-readable catalog, including detailed format and tape file characteristics, is given. The machine file comprises computed mean values for position and magnitude at a mean epoch of observation for each unique star in the Oxford, Paris, Bordeaux, Toulouse and Northern Hemisphere Algiers zones. The format was changed to effect more efficient data searching by position, and additional duplicate entries were removed. The final catalog contains data for 997,311 stars.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-08-01
Version 00. COG LibMaker contains various utilities to convert common data formats into a format usable by the COG Multi-particle Monte Carlo Code System package (C00777MNYCP01). Utilities included: ACEtoCOG - ACE formatted neutron data: currently ENDFB7R0.BNL, ENDFB7R1.BNL, JEFF3.1, JEFF3.1.1, JEFF3.1.2, MCNP.50c, MCNP.51c, MCNP.55c, MCNP.66c, and MCNP.70c. ACEUtoCOG - ACEU formatted photonuclear data: currently PN.MCNP.30c and PN.MCNP.70u. ACTLtoCOG - Creates a COG library from ENDL formatted activation data. EDDLtoCOG - Creates a COG library from ENDL formatted LLNL deuteron data. ENDLtoCOG - Creates a COG library from ENDL formatted LLNL neutron data. EPDLtoCOG - Creates a COG library from ENDL formatted LLNL photon data. LEX - Creates a COG dictionary file. SAB.ACEtoCOG - Creates a COG library from ACE formatted S(a,b) data. SABtoCOG - Creates a COG library from ENDF6 formatted S(a,b) data. URRtoCOG - Creates a COG library from ACE formatted probability table data. This package also includes library checking and bit swapping capability.
The development and validation of the Questionnaire on Anticipated Discrimination (QUAD).
Gabbidon, Jheanell; Brohan, Elaine; Clement, Sarah; Henderson, R Claire; Thornicroft, Graham
2013-11-07
The anticipation of mental health-related discrimination is common amongst people with mental health problems and can have serious adverse effects. This study aimed to develop and validate a measure assessing the extent to which people with mental health problems anticipate that they will personally experience discrimination across a range of contexts. The items and format for the Questionnaire on Anticipated Discrimination (QUAD) were developed from previous versions of the Discrimination and Stigma Scale (DISC), focus groups and cognitive debriefing interviews which were used to further refine the content and format. The resulting provisional version of the QUAD was completed by 117 service users in an online survey and reliability, validity, precision and acceptability were assessed. A final version of the scale was agreed and analyses re-run using the online survey data and data from an independent sample to report the psychometric properties of the finalised scale. The provisional version of the QUAD had 17 items, good internal consistency (alpha = 0.86) and adequate convergent validity as supported by the significant positive correlations with the Stigma Scale (SS) (r = 0.40, p < 0.001) and the Internalised Stigma of Mental Illness Scale (ISMI) (r = 0.40, p < 0.001). Three items were removed due to low endorsements, high inter-correlation or conceptual concerns. The finalised 14 item QUAD had good internal consistency (alpha = 0.86), good test re-test reliability (ρ(c) = 0.81) and adequate convergent validity: correlations with the ISMI (r = 0.45, p < 0.001) and with the SS (r = 0.39, p < 0.001). Reading ease scores indicated good acceptability for general adult populations. Cross-replication in an independent sample further indicated good internal consistency (alpha = 0.88), adequate convergent validity and revealed two factors summarised by institutions/services and interpersonal/professional relationships. The QUAD expanded upon previous versions of the DISC. It is a reliable, valid and acceptable measure which can be used to identify key life areas in which people may personally anticipate discrimination, and an overall tendency to anticipate discrimination. It may also be useful in planning interventions aimed at reducing the stigma of mental illness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartin, Corinne A.; Fine, Rana A.; Kamenkovich, Igor
2014-01-28
Average formation rates for Subantarctic Mode (SAMW) and Antarctic Intermediate Waters (AAIW) in the South Pacific are calculated from the National Center for Atmospheric Research Community Climate System Model version 4 (NCAR-CCSM4), using chlorofluorocarbon inventories. CFC-12 inventories and formation rates are compared to ocean observations. CCSM4 accurately simulates the southeast Pacific as the main formation region for SAMW and AAIW. CCSM4 formation rates for SAMW are 3.4 Sv, about half of the observational rate. Shallow mixed layers and a thinner SAMW in CCSM4 are responsible for lower formation rates. A formation rate of 8.1 Sv for AAIW in CCSM4 is higher than observations. Higher inventories in CCSM4 in the southwest and central Pacific, and higher surface concentrations, are the main reasons for higher formation rates of AAIW. This comparison of model and observations is useful for understanding the uptake and transport of other gases, e.g., CO2, by the model.
Webster, Kate E.; Feller, Julian A.
2018-01-01
Background: The Anterior Cruciate Ligament Return to Sport After Injury (ACL-RSI) scale was developed to measure an athlete’s psychological readiness to return to sport after anterior cruciate ligament (ACL) injury and reconstruction surgery. The scale is being used with increasing frequency in both research and clinical settings. Purpose: To generate and validate a short version of the ACL-RSI scale. Study Design: Cohort study (diagnosis); Level of evidence, 2. Methods: The ACL-RSI scale was administered to 535 patients who had undergone ACL reconstruction surgery. Reliability (Cronbach alpha) was determined and factor analysis of the full scale was undertaken along with a process of item selection and elimination. A second group of 250 patients participated in a predictive validation analysis. This group completed the ACL-RSI scale at 6 months and reported return-to-sport outcomes 12 months following ACL reconstruction surgery. The predictive validity of both scales (full and short versions) was assessed by use of receiver operating characteristic (ROC) curve statistics. Results: The scale was found to have high internal consistency (Cronbach alpha, 0.96), which suggested that item redundancy was present. After an item selection process, the scale was reduced to a 6-item format. Cronbach alpha for the short version was 0.92, and factor analysis confirmed the presence of 1 factor accounting for 71% of the total variance. Scores for the short version were significantly different between patients who had and those who had not returned to sport. Six-month ACL-RSI scores for both the full and short versions had fair to good predictive ability for 12-month return-to-sport outcomes (full version: area under ROC curve, 0.77 [95% CI, 0.7-0.8]; short version: area under ROC curve, 0.75 [95% CI, 0.7-0.8]). Conclusion: A 6-item short version of the ACL-RSI scale was developed from a large cohort of patients undergoing ACL reconstruction. The short version appears to be as robust as the full version for discriminating between and predicting return-to-sport outcomes. The short version of the ACL-RSI may be of use in busy clinical settings to help identify athletes who may find return to sport challenging. PMID:29662909
NASA Technical Reports Server (NTRS)
Carlson, H. W.
1994-01-01
This code was developed to aid design engineers in the selection and evaluation of aerodynamically efficient wing-canard and wing-horizontal-tail configurations that may employ simple hinged-flap systems. Rapid estimates of the longitudinal aerodynamic characteristics of conceptual airplane lifting surface arrangements are provided. The method is particularly well suited to configurations which, because of high speed flight requirements, must employ thin wings with highly swept leading edges. The code is applicable to wings with either sharp or rounded leading edges. The code provides theoretical pressure distributions over the wing, the canard or horizontal tail, and the deflected flap surfaces as well as estimates of the wing lift, drag, and pitching moments which account for attainable leading edge thrust and leading edge separation vortex forces. The wing planform information is specified by a series of leading edge and trailing edge breakpoints for a right hand wing panel. Up to 21 pairs of coordinates may be used to describe both the leading edge and the trailing edge. The code has been written to accommodate 2000 right hand panel elements, but can easily be modified to accommodate a larger or smaller number of elements depending on the capacity of the target computer platform. The code provides solutions for wing surfaces composed of all possible combinations of leading edge and trailing edge flap settings provided by the original deflection multipliers and by the flap deflection multipliers. Up to 25 pairs of leading edge and trailing edge flap deflection schedules may thus be treated simultaneously. The code also provides for an improved accounting of hinge-line singularities in determination of wing forces and moments. To determine lifting surface perturbation velocity distributions, the code provides for a maximum of 70 iterations. The program is constructed so that successive runs may be made with a given code entry. To make additional runs, it is necessary only to add an identification record and the namelist data that are to be changed from the previous run. This code was originally developed in 1989 in FORTRAN V on a CDC 6000 computer system, and was later ported to an MS-DOS environment. Both versions are available from COSMIC. There are only a few differences between the PC version (LAR-14458) and CDC version (LAR-14178) of AERO2S distributed by COSMIC. The CDC version has one main source code file while the PC version has two files which are easier to edit and compile on a PC. The PC version does not require a FORTRAN compiler which supports NAMELIST because a special INPUT subroutine has been added. The CDC version includes two MODIFY decks which can be used to improve the code and prevent the possibility of some infrequently occurring errors while PC-version users will have to make these code changes manually. The PC version includes an executable which was generated with the Ryan McFarland/FORTRAN compiler and requires 253K RAM and an 80x87 math co-processor. Using this executable, the sample case requires about four hours to execute on an 8MHz AT-class microcomputer with a co-processor. The source code conforms to the FORTRAN 77 standard except that it uses variables longer than six characters. With two minor modifications, the PC version should be portable to any computer with a FORTRAN compiler and sufficient memory. The CDC version of AERO2S is available in CDC NOS Internal format on a 9-track 1600 BPI magnetic tape. 
The PC version is available on a set of two 5.25 inch 360K MS-DOS format diskettes. IBM AT is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. CDC is a registered trademark of Control Data Corporation. NOS is a trademark of Control Data Corporation.
Goncalves, Daniel Maffasioli; Cloninger, C Robert
2010-07-01
The Temperament and Character Inventory (TCI) was first described in 1993. It was designed to measure the character and temperament dimensions of Cloninger's model of personality using a true-false response format. The revised TCI (TCI-R) uses a five-point-Likert format and has multiple subscales for persistence to improve its reliability. We tested the clinical validity of an original Brazilian-Portuguese translation of the TCI-R. The 595 volunteers completed the BrP version of TCI-R, Beck Depression Inventory (BDI), Beck Anxiety Inventory (BAI), and Satisfaction with Life Scale (SWLS). The internal consistency was satisfactory for all dimensions (Cronbach alpha coefficients above 0.7). The cumulative variances for temperament and character were 58% and 60%. BAI was positively correlated with harm avoidance and negatively with persistence, self-directedness and cooperativeness. SWLS was correlated negatively with harm avoidance and positively with self-directedness and cooperativeness. The congruence coefficients between each facet of BrP TCI-R and the US TCI-R original data were 95% or higher (except NS1). The main limitation of this study is the convenience sampling. The BrP version of the TCI-R had good psychometric properties regardless of the cultural and educational backgrounds of subjects. The present study supported the validity of the BrP translation of the TCI-R, which encourages its use in both clinical and general community samples.
MARD—A moving average rose diagram application for the geosciences
NASA Astrophysics Data System (ADS)
Munro, Mark A.; Blenkinsop, Thomas G.
2012-12-01
MARD 1.0 is a computer program for generating smoothed rose diagrams by using a moving average, designed for use across the wide range of disciplines encompassed within the Earth Sciences. Available in MATLAB®, Microsoft® Excel and GNU Octave formats, the program is fully compatible with both Microsoft® Windows and Macintosh operating systems. Each version has been implemented in a user-friendly way that requires no prior programming experience with the software. MARD applies a moving-average smoothing, a form of low-pass signal-processing filter, to the raw circular data according to a set of pre-defined conditions selected by the user. This filter smooths the angular dataset, emphasising significant circular trends whilst reducing background noise. Customisable parameters include whether the data are uni- or bi-directional, the angular range (or aperture) over which the data are averaged, and whether an unweighted or weighted moving average is applied. In addition to the uni- and bi-directional options, the MATLAB® and Octave versions also possess a function for plotting 2-dimensional dips/pitches in a single, lower, hemisphere. The rose diagrams from each version are exportable in a selection of common graphical formats. Frequently employed statistical measures that determine the vector mean, mean resultant (or length), circular standard deviation and circular variance are also included. MARD's scope is demonstrated via its application to a variety of datasets within the Earth Sciences.
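The core of such a smoothing is a circular (wrap-around) moving average over binned directional data. The numpy sketch below reproduces the idea under stated assumptions (unweighted kernel, aperture a multiple of twice the bin width); it illustrates the technique, not the published MARD code.

```python
import numpy as np

def smoothed_rose(angles_deg, bin_width=10, aperture=30, bidirectional=True):
    """Moving-average rose smoothing in the spirit of MARD (a sketch, not
    the published code): bin the angles, then average each bin with its
    neighbours over the chosen aperture, wrapping around the circle."""
    period = 180 if bidirectional else 360
    angles = np.asarray(angles_deg) % period
    nbins = period // bin_width
    counts, _ = np.histogram(angles, bins=nbins, range=(0, period))
    half = aperture // (2 * bin_width)               # neighbour bins per side
    kernel = np.ones(2 * half + 1) / (2 * half + 1)  # unweighted average
    # Circular convolution: pad by wrapping so the window crosses 0/period.
    padded = np.concatenate([counts[-half:], counts, counts[:half]])
    return np.convolve(padded, kernel, mode="valid")

# Example: 200 synthetic strike measurements clustered around 045 degrees.
smoothed = smoothed_rose(np.random.normal(45, 15, 200))
```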
76 FR 27309 - Committee on Measures of Student Success
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-11
... version of this document is the document published in the Federal Register. Free Internet access to the... text or Adobe Portable Document Format (PDF) on the Internet at the following site: http://www.ed.gov/news/fed-register/index.html . To use PDF you must have Adobe Acrobat Reader, which is available free...
75 FR 16763 - Ready-to-Learn Television Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... official version of this document is the document published in the Federal Register. Free Internet access... Service, toll free, at 1-800-877-8339. Electronic Access to This Document: You can view this document, as... Portable Document Format (PDF) on the Internet at the following site: http://www.ed.gov/news/fedregister...
77 FR 37893 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-25
... official versions of these documents are the documents published in the Federal Register. Free Internet... telephone (TTY), call the Federal Relay Service (FRS), toll free, at 1- 800-877-8339. Individuals with... Portable Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at...
75 FR 38797 - Predominantly Black Institutions Formula Grant Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-06
... official version of this document is the document published in the Federal Register. Free Internet access... (FRS), toll free, at 1-800-877-8339. Electronic Access to This Document: You can view this document, as... Portable Document Format (PDF) on the Internet at the following site: http://www.ed.gov/news/fedregister...
76 FR 50198 - Committee on Measures of Student Success
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
...: The official version of this document is the document published in the Federal Register. Free Internet... Federal Register, in text or Adobe Portable Document Format (PDF) on the Internet at the following site... is available free at this site. If you have questions about using PDF, call the U.S. Government...
Measuring Eating Competence: Psychometric Properties and Validity of the ecSatter Inventory
ERIC Educational Resources Information Center
Lohse, Barbara; Satter, Ellyn; Horacek, Tanya; Gebreselassie, Tesfayi; Oakland, Mary Jane
2007-01-01
Objective: Assess validity of the ecSatter Inventory (ecSI) to measure eating competence (EC). Design: Concurrent administration of ecSI with validated measures of eating behaviors using on-line and paper-pencil formats. Setting: The on-line survey was completed by 370 participants; 462 completed the paper version. Participants: Participants…
MISR Near Real Time Products Available
Atmospheric Science Data Center
2014-09-04
... containing both Ellipsoid- and Terrain-projected radiance information, and the L2 Cloud Motion Vector (CMV) product containing ... The NRT versions of MISR data products employ the same retrieval algorithms as standard production, yielding equivalent science ... product is available in HDFEOS and BUFR format. For more information, please consult the MISR CMV DPS and Documentation for the ...
1992-03-01
[Report documentation page fragment (OCR residue). Recoverable information: sponsoring agency beginning "Naval Facilities" (Port Hueneme, CA 93043-5003) and the section headings: Calculation Procedure; Marine Crane Rating Chart Format; Rating for List; Pendulation; References.]
19 CFR 351.303 - Filing, format, translation, service, and certification of documents.
Code of Federal Regulations, 2011 CFR
2011-04-01
... to the Secretary of Commerce, Attention: Import Administration, APO/Dockets Unit, Room 1870, U.S... service list by personal service or first class mail. (ii) Service of public versions or a party's own... be by first class airmail. (ii) Request for review. In addition to the certificate of service...
Summary of atmospheric wind design criteria for wind energy conversion system development
NASA Technical Reports Server (NTRS)
Frost, W.; Turner, R. E.
1979-01-01
Basic design values of significant wind criteria are presented in graphical format for use in the design and development of wind turbine generators for energy research. The document is a condensed version of portions of the Engineering Handbook on the Atmospheric Environmental Guidelines for Use in Wind Turbine Generator Development.
NASA Technical Reports Server (NTRS)
1980-01-01
Detailed software and hardware documentation for the Cardiopulmonary Data Acquisition System is presented. General wiring and timing diagrams are given including those for the LSI-11 computer control panel and interface cables. Flowcharts and complete listings of system programs are provided along with the format of the floppy disk file.
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
48 CFR 752.7005 - Submission requirements for development experience documents.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., technologies, management, research, results and experience as outlined in the Agency's ADS Chapter 540, section... paragraph (b)(1)(i) of this clause. (2) Format. (i) Descriptive information is required for all Contractor... descriptive information: (A) Name and version of the application software used to create the file, e.g., Word...
Sources of Kant's Cosmopolitanism: Basedow, Rousseau, and Cosmopolitan Education
ERIC Educational Resources Information Center
Cavallar, Georg
2014-01-01
The goal of this essay is to analyse the influence of Johann Bernhard Basedow and Rousseau on Kant's cosmopolitanism and concept of cosmopolitan education. It argues that both Basedow and Kant defined cosmopolitan education as non-denominational moral formation or "Bildung", encompassing--in different forms--a thin version of moral…
MODELING THE ATMOSPHERE FORMATION OF REACTIVE MERCURY IN FLORIDA AND THE GREAT LAKES
Reactive mercury in the troposphere is affected by a complex mix of local emissions, global-scale transport, and gas and aqueous-phase chemistry. Here, we describe a modified version of the EPA model for urban/regional air quality (CMAQ) to include the chemistry of mercury, and m...
75 FR 18170 - Ready-to-Learn Television Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... official version of this document is the document published in the Federal Register. Free Internet access... Federal Relay Service, toll free, at 1-800-877-8339. Electronic Access to This Document: You can view this... Adobe Portable Document Format (PDF) on the Internet at the following site: http://www.ed.gov/news...
Sesame IO Library User Manual Version 8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, Hilary; Young, Ginger Ann
This document is a user manual for SES_IO, a low-level library for reading and writing Sesame files. The purpose of the SES_IO library is to provide a simple user interface for accessing and creating Sesame files that does not change across Sesame format types (such as binary, ASCII, and XML).
ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs
2011-01-01
Background Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938
A database of georeferenced nutrient chemistry data for mountain lakes of the Western United States
Williams, Jason; Labou, Stephanie G.
2017-01-01
Human activities have increased atmospheric nitrogen and phosphorus deposition rates relative to pre-industrial background. In the Western U.S., anthropogenic nutrient deposition has increased nutrient concentrations and stimulated algal growth in at least some remote mountain lakes. The Georeferenced Lake Nutrient Chemistry (GLNC) Database was constructed to create a spatially-extensive lake chemistry database needed to assess atmospheric nutrient deposition effects on Western U.S. mountain lakes. The database includes nitrogen and phosphorus water chemistry data spanning 1964–2015, with 148,336 chemistry results from 51,048 samples collected across 3,602 lakes in the Western U.S. Data were obtained from public databases, government agencies, scientific literature, and researchers, and were formatted into a consistent table structure. All data are georeferenced to a modified version of the National Hydrography Dataset Plus version 2. The database is transparent and reproducible; R code and input files used to format data are provided in an appendix. The database will likely be useful to those assessing spatial patterns of lake nutrient chemistry associated with atmospheric deposition or other environmental stressors. PMID:28509907
NASA Technical Reports Server (NTRS)
Dreher, Joseph G.
2009-01-01
For expedience in delivering dispersion guidance in the diversity of operational situations, National Weather Service Melbourne (MLB) and the Spaceflight Meteorology Group (SMG) are becoming increasingly reliant on the PC-based version of the HYSPLIT model run through a graphical user interface (GUI). While the GUI offers unique advantages when compared to traditional methods, it is difficult for forecasters to run and manage in an operational environment. To alleviate the difficulty in providing scheduled real-time trajectory and concentration guidance, the Applied Meteorology Unit (AMU) configured a Linux version of the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model that ingests National Centers for Environmental Prediction (NCEP) guidance, such as the North American Mesoscale (NAM) and Rapid Update Cycle (RUC) models. The AMU configured the HYSPLIT system to automatically download the NCEP model products, convert the meteorological grids into HYSPLIT binary format, run the model from several pre-selected latitude/longitude sites, and post-process the data to create output graphics. In addition, the AMU configured several software programs to convert local Weather Research and Forecasting (WRF) model output into HYSPLIT format.
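A scheduled pipeline of this shape (download, convert, run, plot) can be expressed compactly. In the sketch below, the URL and converter name are hypothetical stand-ins; hyts_std and trajplot are HYSPLIT's standard trajectory and plotting executables, and the CONTROL layout follows HYSPLIT's documented trajectory format, but both should be checked against the local installation.

```python
import subprocess
import urllib.request

def scheduled_hysplit_run(lat, lon):
    # 1. Download the latest NCEP guidance (hypothetical URL).
    urllib.request.urlretrieve("https://example.gov/ncep/nam_latest.grib",
                               "nam.grib")
    # 2. Convert the grids to HYSPLIT's packed binary (ARL) format;
    #    the converter name is a placeholder for the site's GRIB-to-ARL tool.
    subprocess.run(["grib_to_arl", "nam.grib", "nam.arl"], check=True)
    # 3. Write the CONTROL file: start time, release point (lat, lon, height),
    #    run length, vertical motion, model top, and met file location.
    control = "\n".join(["00 00 00 00", "1", f"{lat:.2f} {lon:.2f} 10.0",
                         "24", "0", "10000.0", "1", "./", "nam.arl",
                         "./", "tdump", ""])
    with open("CONTROL", "w") as f:
        f.write(control)
    # 4. Run the trajectory model, then post-process to a graphic.
    subprocess.run(["hyts_std"], check=True)
    subprocess.run(["trajplot", "-itdump", "-otraj.ps"], check=True)

scheduled_hysplit_run(28.5, -80.6)   # illustrative release point
```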
Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
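As a concrete illustration of the XML encoding these specifications define, the sketch below builds and serializes a minimal Level 3 Version 1 model using the libsbml Python bindings (assuming the python-libsbml package is installed; the model content itself is arbitrary).

```python
import libsbml  # python-libsbml, the reference SBML library

doc = libsbml.SBMLDocument(3, 1)            # SBML Level 3, Version 1
model = doc.createModel()
model.setId("minimal_example")

comp = model.createCompartment()            # every species needs a compartment
comp.setId("cell")
comp.setConstant(True)

sp = model.createSpecies()
sp.setId("S1")
sp.setCompartment("cell")
# Level 3 requires these species attributes to be set explicitly:
sp.setConstant(False)
sp.setBoundaryCondition(False)
sp.setHasOnlySubstanceUnits(False)

print(libsbml.writeSBMLToString(doc))       # serialize the model to SBML/XML
```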
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Williams, Bryan; Cockcroft, John R; Kario, Kazuomi; Zappe, Dion H; Cardenas, Pamela; Hester, Allen; Brunel, Patrick; Zhang, Jack
2014-01-01
Introduction Hypertension in elderly people is characterised by elevated systolic blood pressure (SBP) and increased pulse pressure (PP), which indicate large artery ageing and stiffness. LCZ696, a first-in-class angiotensin receptor neprilysin inhibitor (ARNI), is being developed to treat hypertension and heart failure. The Prospective comparison of Angiotensin Receptor neprilysin inhibitor with Angiotensin receptor blocker MEasuring arterial sTiffness in the eldERly (PARAMETER) study will assess the efficacy of LCZ696 versus olmesartan on aortic stiffness and central aortic haemodynamics. Methods and analysis In this 52-week multicentre study, patients with hypertension aged ≥60 years with a mean sitting (ms) SBP ≥150 to <180 and a PP>60 mm Hg will be randomised to once daily LCZ696 200 mg or olmesartan 20 mg for 4 weeks, followed by a forced-titration to double the initial doses for the next 8 weeks. At 12–24 weeks, if the BP target has not been attained (msSBP <140 and ms diastolic BP <90 mm Hg), amlodipine (2.5–5 mg) and subsequently hydrochlorothiazide (6.25–25 mg) can be added. The primary and secondary endpoints are changes from baseline in central aortic systolic pressure (CASP) and central aortic PP (CAPP) at week 12, respectively. Other secondary endpoints are the changes in CASP and CAPP at week 52. A sample size of 432 randomised patients is estimated to ensure a power of 90% to assess the superiority of LCZ696 over olmesartan at week 12 in the change from baseline of mean CASP, assuming an SD of 19 mm Hg, the difference of 6.5 mm Hg and a 15% dropout rate. The primary variable will be analysed using a two-way analysis of covariance. Ethics and dissemination The study was initiated in December 2012 and final results are expected in 2015. The results of this study will impact the design of future phase III studies assessing cardiovascular protection. Clinical trials identifier EUDract number 2012-002899-14 and ClinicalTrials.gov NCT01692301. PMID:24496699
'Lyell' Panorama inside Victoria Crater (False Color)
NASA Technical Reports Server (NTRS)
2008-01-01
During four months prior to the fourth anniversary of its landing on Mars, NASA's Mars Exploration Rover Opportunity examined rocks inside an alcove called 'Duck Bay' in the western portion of Victoria Crater. The main body of the crater appears in the upper right of this stereo panorama, with the far side of the crater lying about 800 meters (half a mile) away. Bracketing that part of the view are two promontories on the crater's rim at either side of Duck Bay. They are 'Cape Verde,' about 6 meters (20 feet) tall, on the left, and 'Cabo Frio,' about 15 meters (50 feet) tall, on the right. The rest of the image, other than sky and portions of the rover, is ground within Duck Bay. Opportunity's targets of study during the last quarter of 2007 were rock layers within a band exposed around the interior of the crater, about 6 meters (20 feet) from the rim. Bright rocks within the band are visible in the foreground of the panorama. The rover science team assigned informal names to three subdivisions of the band: 'Steno,' 'Smith,' and 'Lyell.' This view combines many images taken by Opportunity's panoramic camera (Pancam) from the 1,332nd through 1,379th Martian days, or sols, of the mission (Oct. 23 to Dec. 11, 2007). Images taken through Pancam filters centered on wavelengths of 753 nanometers, 535 nanometers and 432 nanometers were mixed to produce this view, which is presented in a false-color stretch to bring out subtle color differences in the scene. Some visible patterns in dark and light tones are the result of combining frames that were affected by dust on the front sapphire window of the rover's camera. Opportunity landed on Jan. 25, 2004, Universal Time (Jan. 24, Pacific Time) inside a much smaller crater about 6 kilometers (4 miles) north of Victoria Crater, to begin a surface mission designed to last 3 months and drive about 600 meters (0.4 mile).
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
Siegel, Michael; Kurland, Rachel P; Castrini, Marisa; Morse, Catherine; de Groot, Alexander; Retamozo, Cynthia; Roberts, Sarah P; Ross, Craig S; Jernigan, David H
No previous paper has examined alcohol advertising on the internet versions of television programs popular among underage youth. This study assessed the volume of alcohol advertising on the web sites of television networks that stream television programs popular among youth. Multiple viewers analyzed the product advertising appearing on 12 television programs that are available in full-episode format on the internet. During a baseline period of one week, six coders analyzed all 12 programs. For the nine programs that contained alcohol advertising, three underage coders (ages 10, 13, and 18) analyzed the programs to quantify the extent of that advertising over a four-week period. Alcohol advertisements are highly prevalent on these programs, with nine of the 12 shows carrying alcohol ads, and six programs averaging at least one alcohol ad per episode. There was no difference in alcohol ad exposure for underage and legal-age viewers. There is substantial potential for youth exposure to alcohol advertising on the internet through internet-based versions of television programs. The Federal Trade Commission should require alcohol companies to report the underage youth and adult audiences for internet versions of television programs on which they advertise.
The Clinical Interview Schedule-Revised (CIS-R)-Malay Version, Clinical Validation.
Subramaniam, Kavitha; Krishnaswamy, Saroja; Jemain, Abdul Aziz; Hamid, Abdul; Patel, Vikram
2006-01-01
Use of instruments or questionnaires in different cultural settings without proper validation can result in inaccurate results. Issues like reliability, validity, feasibility and acceptability should be considered in the use of an instrument. The study aims to determine the usefulness of the CIS-R Malay version in detecting common mental health problems, specifically to establish its validity. The CIS-R instrument (PROQSY* format) was translated into Malay through the back-translation process. Inter-rater reliability was established for raters who were medical students. Cases and controls for the study were psychiatric inpatients and outpatients, and relatives or friends accompanying the patients to the clinic or visiting the inpatients. The Malay version of the CIS-R was administered to all cases and controls. All cases and controls involved in the study were rated by psychiatrists for psychiatric morbidity using the SCID as a guideline. Specificity and sensitivity of the CIS-R relative to the psychiatrists' assessment were determined. The Malay version of the CIS-R showed 100% sensitivity and 96.15% specificity at a cut-off score of 9. The CIS-R can be a useful instrument for clinical and research use in the Malaysian population for diagnosing common mental disorders like depression and anxiety.
SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)
NASA Technical Reports Server (NTRS)
Coe, H. H.
1994-01-01
The SHABERTH computer program was developed to predict operating characteristics of bearings in a multibearing load support system. Lubricated and non-lubricated bearings can be modeled. SHABERTH calculates the loads, torques, temperatures, and fatigue life for ball and/or roller bearings on a single shaft. The program also allows for an analysis of the system reaction to the termination of lubricant supply to the bearings and other lubricated mechanical elements. SHABERTH has proven to be a valuable tool in the design and analysis of shaft bearing systems. The SHABERTH program is structured with four nested calculation schemes. The thermal scheme performs steady state and transient temperature calculations which predict system temperatures for a given operating state. The bearing dimensional equilibrium scheme uses the bearing temperatures, predicted by the temperature mapping subprograms, and the rolling element raceway load distribution, predicted by the bearing subprogram, to calculate bearing diametral clearance for a given operating state. The shaft-bearing system load equilibrium scheme calculates bearing inner ring positions relative to the respective outer rings such that the external loading applied to the shaft is brought into equilibrium by the rolling element loads which develop at each bearing inner ring for a given operating state. The bearing rolling element and cage load equilibrium scheme calculates the rolling element and cage equilibrium positions and rotational speeds based on the relative inner-outer ring positions, inertia effects, and friction conditions. The ball bearing subprograms in the current SHABERTH program have several model enhancements over similar programs. These enhancements include an elastohydrodynamic (EHD) film thickness model that accounts for thermal heating in the contact area and lubricant film starvation; a new model for traction combined with an asperity load sharing model; a model for the hydrodynamic rolling and shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. 
SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64-bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1 MB of RAM and an 80386 or 80486 processor machine with an 80x87 math co-processor. The standard distribution medium for the IBM PC version is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The standard distribution medium for the CRAY version is also a 5.25 inch 360K MS-DOS format diskette, but alternate distribution media and formats are available upon request. The original version of SHABERTH was developed in FORTRAN IV at Lewis Research Center for use on a UNIVAC 1100 series computer. The CRAY version was released in 1988 and was updated in 1990 to incorporate fluid rheological data for Rocket Propellant 1 (RP-1), thereby allowing the analysis of bearings lubricated with RP-1. The PC version is a port of the 1990 CRAY version and was developed in 1992 by SRS Technologies under contract to NASA Marshall Space Flight Center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, B.G.; Richards, R.E.; Reece, W.J.
1992-10-01
This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR comprises hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu-driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.
Burn, K W; Daffara, C; Gualdrini, G; Pierantoni, M; Ferrari, P
2007-01-01
The question of Monte Carlo simulation of radiation transport in voxel geometries is addressed. Patched versions of the MCNP and MCNPX codes are developed, aimed at transporting radiation both in the standard geometry mode and in the voxel geometry treatment. The patched code reads an unformatted FORTRAN file derived from DICOM format data and uses special subroutines to handle voxel-to-voxel radiation transport. The various phases of the development of the methodology are discussed, together with the new input options. Examples are given of the employment of the code in internal and external dosimetry, and comparisons with results from other groups are reported.
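The abstract does not specify the record layout of the unformatted FORTRAN file, so the sketch below (assuming pydicom and NumPy are available; file names illustrative) covers only the upstream step of assembling a DICOM series into a voxel array and dumping it as raw binary; the patched code's own reader would expect its particular record structure on top of this.

    import glob
    import numpy as np
    import pydicom  # assumed available

    # Read a CT series, order the slices along z, and stack into a 3D array.
    slices = [pydicom.dcmread(f) for f in sorted(glob.glob("ct_series/*.dcm"))]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    voxels = np.stack([s.pixel_array for s in slices]).astype(np.int16)

    # Raw dump; a real Fortran unformatted file also carries record markers.
    voxels.tofile("voxels.raw")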
NASA Technical Reports Server (NTRS)
Jiang, Bo-Nan; Sonnad, Vijay
1991-01-01
A p-version of the least squares finite element method, based on the velocity-pressure-vorticity formulation, is developed for solving steady-state incompressible viscous flow problems. The resulting system of symmetric and positive definite linear equations can be solved satisfactorily with the conjugate gradient method. In conjunction with the use of rapid operator application, which avoids the formation of either element or global matrices, it is possible to achieve a highly compact and efficient solution scheme for the incompressible Navier-Stokes equations. Numerical results are presented for two-dimensional flow over a backward-facing step. The effectiveness of simple outflow boundary conditions is also demonstrated.
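Rapid operator application means the solver only ever needs the action x -> Ax. A generic matrix-free conjugate gradient sketch (not the authors' code) shows how that fits together:

    import numpy as np

    def conjugate_gradient(apply_A, b, tol=1e-10, max_iter=1000):
        """Solve A x = b for symmetric positive definite A, given only
        the operator action apply_A(x) -> A @ x (no assembled matrix)."""
        x = np.zeros_like(b, dtype=float)
        r = b - apply_A(x)
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = apply_A(p)
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x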
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
The machine-readable version of the Atlas as it is currently being distributed from the Astronomical Data Center is described. The data were obtained with the Oke multichannel scanner on the 5-meter Hale reflector for purposes of synthesizing galaxy spectra, and the digitized Atlas contains normalized spectral energy distributions, computed colors, scan line and continuum indices for 175 selected stars covering the complete ranges of spectral type and luminosity class. The documentation includes a byte-by-byte format description, a table of the indigenous characteristics of the magnetic tape file, and a sample listing of logical records exactly as they are recorded on the tape.
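A byte-by-byte format description of this kind lets the logical records be parsed with plain slicing. A generic sketch, with entirely hypothetical field names, offsets, and record length (the Atlas's actual layout is given in its documentation):

    RECORD_LENGTH = 80  # hypothetical logical record size

    def parse_record(rec: bytes) -> dict:
        text = rec.decode("ascii")
        return {
            "star_id": text[0:8].strip(),        # hypothetical offsets
            "spectral_type": text[8:14].strip(),
            "flux": float(text[14:24]),
        }

    records = []
    with open("atlas.dat", "rb") as f:
        while rec := f.read(RECORD_LENGTH):
            records.append(parse_record(rec))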
NASA Astrophysics Data System (ADS)
Wang, Z. B.; Hu, M.; Mogensen, D.; Yue, D. L.; Zheng, J.; Zhang, R. Y.; Liu, Y.; Yuan, B.; Li, X.; Shao, M.; Zhou, L.; Wu, Z. J.; Wiedensohler, A.; Boy, M.
2013-11-01
Simulations of sulfuric acid concentration and new particle formation are performed by using the zero-dimensional version of the model MALTE (Model to predict new Aerosol formation in the Lower TropospherE) and measurements from the Campaign of Air Quality Research in Beijing and Surrounding areas (CAREBeijing) in 2008. Chemical reactions from the Master Chemical Mechanism version 3.2 (MCM v3.2) are used in the model. High correlation (slope = 0.72, R = 0.74) between the modelled and observed sulfuric acid concentrations is found during daytime (06:00-18:00). The aerosol dynamics are simulated by the University of Helsinki Multicomponent Aerosol (UHMA) model including several nucleation mechanisms. The results indicate that the model is able to predict the onset and offset of new particle formation in an urban atmosphere in China. In addition, the number concentrations of newly formed particles in kinetic-type nucleation, including homogeneous homomolecular (J = K[H2SO4]^2) and homogeneous heteromolecular nucleation involving organic vapours (J = Khet[H2SO4][Org]), are in satisfactory agreement with the observations. However, the specific organic compounds that possibly participate in the nucleation process should be investigated in further studies. For the particle growth, only a small fraction of the oxidized total organics condense onto the particles in polluted environments. Meanwhile, the OH and O3 oxidation mechanisms contribute 5.5% and 94.5%, respectively, to the volume concentration of small particles, indicating that particle growth is controlled more by the precursor gases and their oxidation by O3.
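The two kinetic nucleation rates quoted above are simple products of vapour concentrations. A small sketch; the prefactors are fitted coefficients, so the defaults below are placeholders, not values from the study:

    def j_homomolecular(h2so4, K=1e-12):
        """J = K [H2SO4]^2, homogeneous homomolecular nucleation."""
        return K * h2so4**2

    def j_heteromolecular(h2so4, org, K_het=1e-13):
        """J = K_het [H2SO4][Org], nucleation involving organic vapours."""
        return K_het * h2so4 * org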
AAPG-B - Committee offers revised exchange format for transferring geologic and petroleum data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waller, H.O.; Guinn, D.; Herkommer, M.
1990-04-01
Comments received since the publication of Exchange Format for Transfer of Geologic and Petroleum Data revealed the need for more flexibility with the AAPG-A Format (Shaw and Waller, 1989). This discussion resulted in the proposed AAPG-B version, which has an unlimited number of data fields per record and an unlimited number of records. Comment lines can appear anywhere, including in data records, to help document data transfer. Data dictionary hooks have been added. The American Petroleum Institute has assisted by supplying an ANSI envelope for this format, which will permit the electronic transfer, with verification, of data sets between any two ANSI installations. The American Association of Petroleum Geologists Database Standards Subcommittee invites comments on the proposed revisions, and will review the suggestions when it meets June 2 in San Francisco.
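Because the abstract does not give the actual AAPG-B syntax, the reader below is only a shape: it assumes a comment marker ('!') and a field delimiter (',') purely for illustration, while reflecting the two stated properties, comment lines allowed anywhere and records with any number of fields.

    def read_records(path, comment="!", delimiter=","):
        """Yield data records, skipping blank and comment lines."""
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith(comment):
                    continue  # comments may document the transfer anywhere
                yield line.split(delimiter)  # unlimited fields per record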
Sex differences in effects of testing medium and response format on a visuospatial task.
Cherney, Isabelle D; Rendell, Jariel A
2010-06-01
Sex differences on visuospatial tests are among the most reliably replicated. It is unclear to what extent these performance differences reflect underlying differences in skills or testing factors. To assess whether testing medium and response format affect visuospatial sex differences, performances of introductory psychology students (100 men, 104 women) were examined on a visuospatial task presented in paper-and-pencil and tablet computer forms. Both sexes performed better when tested on paper, although men outperformed women. The introduction of an open-ended component to the visuospatial task eliminated sex differences when prior spatial experiences were controlled, but men outperformed women when prior spatial experiences were not considered. In general, the open-ended version and computerized format of the test diminished performance, suggesting that response format and medium are testing factors that influence visuospatial abilities.
MDplot: Visualise Molecular Dynamics
Margreitter, Christian; Oostenbrink, Chris
2017-01-01
The MDplot package provides plotting functions to allow for automated visualisation of molecular dynamics simulation output. It is especially useful in cases where plot generation is rather tedious due to complex file formats or when a large number of plots are generated. The graphs that are supported range from standard ones, such as RMSD/RMSF (root-mean-square deviation and root-mean-square fluctuation, respectively), to less standard ones, such as thermodynamic integration analysis and hydrogen bond monitoring over time. All told, they address many commonly used analyses. In this article, we set out the MDplot package's functions, give examples of the function calls, and show the associated plots. Plotting and data parsing are separated in all cases, i.e. the respective functions can be used independently. Thus, data manipulation and the integration of additional file formats are fairly easy. Currently, the loading functions support GROMOS, GROMACS, and AMBER file formats. Moreover, we also provide a Bash interface that allows simple embedding of MDplot into Bash scripts as the final analysis step. Availability: The package can be obtained in the latest major version from CRAN (https://cran.r-project.org/package=MDplot) or in the most recent version from the project's GitHub page at https://github.com/MDplot/MDplot, where feedback is also most welcome. MDplot is published under the GPL-3 license. PMID:28845302
Masquillier, Caroline; Wouters, Edwin; Loos, Jasna; Nöstlinger, Christiana
2012-01-01
Background and Objectives: Access to antiretroviral treatment among adolescents living with HIV (ALH) is increasing. Health-related quality of life (HRQOL) is relevant for monitoring the impact of the disease on both well-being and treatment outcomes. However, adequate screening tools to assess HRQOL in low-resource settings are scarce. This study aims to fill this research gap by 1) assessing the psychometric properties and reliability of an Eastern African English version of a European HRQOL scale for adolescents (KIDSCREEN) and 2) determining which version of the KIDSCREEN (52-, 27- and 10-item version) is most suitable for low-resource settings. Methods: The KIDSCREEN was translated into Eastern African English, Luganda (Uganda) and Dholuo (Kenya) according to standard procedures. The reconciled version was administered in 2011 to ALH aged 13–17 in Kenya (n = 283) and Uganda (n = 299). All three KIDSCREEN versions were fitted to the data with confirmatory factor analysis (CFA). After comparison, the most suitable version was adapted based on the CFA outcomes, utilizing the results of previous formative research. In order to develop a general HRQOL factor, a second-order measurement model was fitted to the data. Results: The CFA results showed that without adjustments, the KIDSCREEN cannot be used for measuring the HRQOL of HIV-positive adolescents. After comparison, the most suitable version for low-resource settings - the 27-item version - was adapted further. The introduction of a negative wording factor was required for the Dholuo model. The Dholuo (CFI: 0.93; RMSEA: 0.039) and the Luganda model (CFI: 0.90; RMSEA: 0.052) showed a good fit. All Cronbach's alpha values for the factors were 0.70 or above. The alpha values of the Dholuo and Luganda HRQOL second-order factors were 0.84 and 0.87, respectively. Conclusions: The study showed that the adapted KIDSCREEN-27 is an adequate tool for measuring HRQOL in low-resource settings with high HIV prevalence. PMID:22815776
Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects
NASA Astrophysics Data System (ADS)
Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.
2013-04-01
This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended for working with four-dimensional objects stored in comma separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.
New version program summary
Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
Catalogue identifier: AEEG_v3_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 745761
No. of bytes in distributed program, including test data, etc.: 12544491
Distribution format: tar.gz
Programming language: MS Visual Basic 6.0
Computer: PC
Operating system: MS Windows 98 or later
RAM: 100M
Classification: 14
Catalogue identifier of previous version: AEEG_v2_0
Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
Does the new version supersede the previous version? Yes
Nature of problem: Estimating the fractal dimension of 4D images.
Solution method: Optimized implementation of the 4D box-counting algorithm.
Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1,2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects stored in comma separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df = ln(5)/ln(2) (Fig. 1). The algorithm could be extended, with minimum effort, to a higher number of dimensions. Other features: easy integration with other applications by using the very simple comma separated values file format for storing multi-dimensional images; implementation of a χ2 test as a criterion for deciding whether an object is fractal or not; user-friendly graphical interface.
Fig. 1: Hyper-Fractal Analysis test on the Sierpinski hypertetrahedron 4D gasket (Df = ln(5)/ln(2) ≅ 2.32).
Running time: In a first approximation, the algorithm is linear [2].
References:
[1] I.V. Grossu, D. Felea, C. Besliu, Al. Jipa, C.C. Bordeianu, E. Stan, T. Esanu, Computer Physics Communications 181 (2010) 831-832.
[2] I.V. Grossu, C. Besliu, M.V. Rusu, Al. Jipa, C.C. Bordeianu, D. Felea, Computer Physics Communications 180 (2009) 1999-2001.
[3] J. Ruiz de Miras, J. Navas, P. Villoslada, F.J. Esteban, Computer Methods and Programs in Biomedicine 104(3) (2011) 452-460.
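A generic NumPy sketch of the 4D box-counting idea (the application itself is Visual Basic; this is not its code): count occupied boxes at a series of box sizes and fit the slope of log N against log(1/s).

    import numpy as np

    def box_counting_dimension_4d(occupancy, max_level=6):
        """Estimate the fractal dimension of a boolean 4D array."""
        sizes, counts = [], []
        for level in range(max_level):
            s = 2 ** level
            if s > min(occupancy.shape):
                break
            # Trim so every axis is divisible by s, then block-reduce.
            trimmed = occupancy[tuple(slice(0, (d // s) * s)
                                      for d in occupancy.shape)]
            blocks = trimmed.reshape(
                trimmed.shape[0] // s, s, trimmed.shape[1] // s, s,
                trimmed.shape[2] // s, s, trimmed.shape[3] // s, s)
            counts.append(blocks.any(axis=(1, 3, 5, 7)).sum())
            sizes.append(s)
        # Slope of log N(s) versus log(1/s) estimates the dimension.
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                              np.log(counts), 1)
        return slope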
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-21
...The U.S. Nuclear Regulatory Commission (NRC or Commission) is issuing a revision to regulatory guide (RG) 3.39, ``Standard Format and Content of License Applications for Mixed Oxide Fuel Fabrication Facilities.'' This guide endorses the standard format and content for license applications and integrated safety analysis (ISA) summaries described in the current version of NUREG-1718, ``Standard Review Plan for the Review of an Application for a Mixed Oxide (MOX) Fuel Fabrication Facility,'' as a method that the NRC staff finds acceptable for meeting the regulatory requirements of Title 10 of the Code of Federal Regulations (10 CFR) part 70, ``Domestic Licensing of Special Nuclear Material'' for mixed oxide fuel fabrication facilities.
VisBOL: Web-Based Tools for Synthetic Biology Design Visualization.
McLaughlin, James Alastair; Pocock, Matthew; Mısırlı, Göksel; Madsen, Curtis; Wipat, Anil
2016-08-19
VisBOL is a Web-based application that allows the rendering of genetic circuit designs, enabling synthetic biologists to visually convey designs in the SBOL Visual format. VisBOL designs can be exported to formats including PNG and SVG images to be embedded in Web pages, presentations and publications. The VisBOL tool enables the automated generation of visualizations from designs specified using the Synthetic Biology Open Language (SBOL) version 2.0, as well as a range of well-known bioinformatics formats including GenBank and Pigeoncad notation. VisBOL is provided both as a user-accessible Web site and as an open-source (BSD) JavaScript library that can be used to embed diagrams within other content and software.
CanSIS Regional Soils Data in Vector Format
NASA Technical Reports Server (NTRS)
Monette, Bryan; Knapp, David; Hall, Forrest G. (Editor)
2000-01-01
This data set is the original vector data set received from Canada Soil Information System (CanSIS). The data include the provinces of Saskatchewan and Manitoba. Attribute tables provide the various soil data for the polygons; there is one attribute table for Saskatchewan and one for Manitoba. The data are stored in ARC/INFO export format files. Based on agreements made with Agriculture Canada, these data are available only to individuals and groups that have an official relationship with the BOREAS project. These data are not included on the BOReal Ecosystem-Atmosphere Study (BOREAS) CD-ROM set. A raster version of this data set titled 'BOREAS Regional Soils Data in Raster Format and AEAC Projection' is publicly available and is included on the BOREAS CD-ROM set.
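ARC/INFO export (.e00) files can generally be opened with standard GIS toolkits; one hedged example, assuming GDAL's Python bindings and its AVCE00 Arc/Info coverage driver are available, with an illustrative file name:

    from osgeo import ogr

    ds = ogr.Open("sask_soils.e00")  # illustrative name
    for i in range(ds.GetLayerCount()):
        layer = ds.GetLayer(i)
        print(layer.GetName(), layer.GetFeatureCount())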
Communicating Scientific Research to Non-Specialists
NASA Astrophysics Data System (ADS)
Holman, Megan
Public outreach to effectively communicate current scientific advances is an essential component of the scientific process. The challenge in making this information accessible lies in forming a clear, accurate, and concise version of material from a variety of sources, so that it is understandable and compelling to non-specialists in the general public. We are preparing a magazine article about planetary system formation. This article will include background information about star formation and different theories and observations of planet formation to provide context. We will then discuss the latest research and theories describing how planetary systems may be forming in different areas of the universe. We present here the original professional-level scientific work alongside our public-level explanations and original graphics to demonstrate our editorial process.
Peebles, P. J. E.
1998-01-01
It is argued that within the standard Big Bang cosmological model the bulk of the mass of the luminous parts of the large galaxies likely had been assembled by redshift z ∼ 10. Galaxy assembly this early would be difficult to fit in the widely discussed adiabatic cold dark matter model for structure formation, but it could agree with an isocurvature version in which the cold dark matter is the remnant of a massive scalar field frozen (or squeezed) from quantum fluctuations during inflation. The squeezed field fluctuations would be Gaussian with zero mean, and the distribution of the field mass therefore would be the square of a random Gaussian process. This offers a possibly interesting new direction for the numerical exploration of models for cosmic structure formation. PMID:9419326
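A toy numerical illustration of the statistical point (not Peebles's calculation): squaring a zero-mean Gaussian field yields a non-negative, strongly skewed mass distribution, quite unlike the Gaussian case.

    import numpy as np

    rng = np.random.default_rng(0)
    phi = rng.standard_normal((256, 256))  # zero-mean Gaussian field
                                           # (white, for simplicity)
    rho = phi**2                           # squared-Gaussian mass field
    print(rho.mean(), rho.min())           # mean near 1, never negative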
NASA Technical Reports Server (NTRS)
Saltsman, James F.
1992-01-01
This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature, creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This users manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.
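As a hedged sketch of the kind of relation TS-SRP is built around, total strainrange is commonly expressed as a sum of elastic and inelastic power laws in cyclic life, Δε_t = B·N_f^b + C'·N_f^c, and life follows by inverting that sum numerically. The coefficients below are placeholders, not values from the accompanying database.

    from scipy.optimize import brentq

    def life_from_total_strain(d_eps_t, B=0.005, b=-0.08, C=0.3, c=-0.6):
        """Invert d_eps_t = B*Nf**b + C*Nf**c for cycles to failure Nf."""
        f = lambda Nf: B * Nf**b + C * Nf**c - d_eps_t
        return brentq(f, 1.0, 1e9)  # assumed bracketing interval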
Thistlethwaite, Jill; Dallest, Kathy; Moran, Monica; Dunston, Roger; Roberts, Chris; Eley, Diann; Bogossian, Fiona; Forman, Dawn; Bainbridge, Lesley; Drynan, Donna; Fyfe, Sue
2016-07-01
The individual Teamwork Observation and Feedback Tool (iTOFT) was devised by a consortium of seven universities in recognition of the need for a means of observing and giving feedback to individual learners undertaking an interprofessional teamwork task. It was developed through a literature review of the existing teamwork assessment tools, a discussion of accreditation standards for the health professions, Delphi consultation and field-testing with an emphasis on its feasibility and acceptability for formative assessment. There are two versions: the Basic tool is for use with students who have little clinical teamwork experience and lists 11 observable behaviours under two headings: 'shared decision making' and 'working in a team'. The Advanced version is for senior students and junior health professionals and has 10 observable behaviours under four headings: 'shared decision making', 'working in a team', 'leadership', and 'patient safety'. Both versions include a comprehensive scale and item descriptors. Further testing is required to focus on its validity and educational impact.
A formative evaluation of the SWITCH® obesity prevention program: print versus online programming.
Welk, Gregory J; Chen, Senlin; Nam, Yoon Ho; Weber, Tara E
2015-01-01
SWITCH® is an evidence-based childhood obesity prevention program that works through schools to impact parenting practices. The present study was designed as a formative evaluation to test whether an online version of SWITCH® would work equivalently to the established print version. Ten elementary schools were matched by socio-economic status and randomly assigned to receive either the print (n = 5) or online (n = 5) version. A total of 211 children from 22 third-grade classrooms were guided through the 4-month program by a team of program leaders working in cooperation with the classroom teachers. Children were tasked with completing weekly SWITCH® Trackers with their parents to monitor goal-setting efforts in showing positive Do (≥60 minutes of moderate-to-vigorous physical activity), View (≤2 hours of screen time), and Chew (≥5 servings of fruits and vegetables) behaviors each day. A total of 91 parents completed a brief survey to assess project-specific interactions with their child and the impact on their behaviors. The majority of parents (93.2%) reported satisfactory experiences with either the online or print SWITCH® program. The return rate for the SWITCH® Trackers was higher (42.5% ± 11%) for the print schools than for the online schools (27.4% ± 10.9%). District program managers rated the level of teacher engagement with program facilitation, and the results showed a higher Tracker return rate in the highly engaged schools (38.5% ± 13.3%) than in the less engaged schools (28.6% ± 11.9%). No significant differences were observed in parent/child interactions or reported behavior change (ps > .05), suggesting equivalence of intervention effect between the print and online versions of the SWITCH® program. The findings support the utility of the online SWITCH® platform, but school-based modules are needed to facilitate broader engagement by classroom teachers and PE teachers.
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders
2018-01-01
Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
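A minimal sketch of the layout Exdir describes: directories for the hierarchy, YAML for metadata, NumPy binary files for datasets. This mimics the structure with the standard library, NumPy, and PyYAML rather than using the exdir reference implementation; all names are illustrative.

    import os
    import numpy as np
    import yaml  # PyYAML, assumed installed

    group = os.path.join("experiment.exdir", "session1")
    os.makedirs(group, exist_ok=True)

    # Human-readable metadata lives beside the data it describes.
    with open(os.path.join(group, "attributes.yaml"), "w") as f:
        yaml.safe_dump({"subject": "rat-01", "sampling_rate_hz": 30000}, f)

    # Datasets are plain binary NumPy files, easy to version-control.
    np.save(os.path.join(group, "spikes.npy"), np.zeros(1000))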
76 FR 5363 - Intent To Compromise Claim Against the State of Oklahoma Department of Education
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-31
... must have Adobe Acrobat Reader, which is available free at this site. Note: The official version of this document is the document published in the Federal Register. Free Internet access to the official... Document Format (PDF) on the Internet at the following site: http://frwebgate.access.gpo.gov/cgi-bin...
77 FR 48974 - Applications for New Awards; Comprehensive Centers Program (CFDA 84.283B); Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-15
... version of this document is the document published in the Federal Register. Free Internet access to the... Federal Relay Service (FRS), toll free, at 1- 800-877-8339. SUPPLEMENTARY INFORMATION: We make the... Document Format (PDF). To use PDF you must have Adobe Acrobat Reader, which is available free at the site...
Containing Energy, Sustaining Agency-Drama in Middle Years.
ERIC Educational Resources Information Center
Block, Lee Anne
2003-01-01
Describes the author's experience working on a reader's theatre version of a radio play based on the Legend of Sleepy Hollow. Reflects on how the grade 8 students created meaning for themselves and for their audience. Notes the limitations of the script and format, and how her work within those limitations became the structure the group needed, a container…
Earth resources sensor data handling system: NASA JSC version
NASA Technical Reports Server (NTRS)
1974-01-01
The design of the NASA JSC data handling system is presented. Data acquisition parameters and computer display formats and the flow of image data through the system, with recommendations for improving system efficiency are discussed along with modifications to existing data handling procedures which will allow utilization of data duplication techniques and the accurate identification of imagery.
Creativity as a Question of Bildung
ERIC Educational Resources Information Center
Hammershoj, Lars Geer
2009-01-01
The aim of the article is to contribute to the conceptualization of creativity in education. The article makes use of the self-Bildung perspective, which is an up-to-date version of the Neo-humanistic notion of the formation of the personality in order to interpret the original notion of the "four stages" of the creative process. The conclusion is…
Helping Adolescents with Health Problems to Become Socially Competent
ERIC Educational Resources Information Center
Drozdikova-Zaripova, Albina R.; Kostyunina, Nadezhda Yu.
2016-01-01
The purpose of the article is to present and analyze the results of experimental work to verify the efficiency of the developed and approved program aimed at the formation of social competence in adolescents with physical problems. The leading method in the study of this problem is a consequent version of the pedagogical experiment. The results of…