NASA Technical Reports Server (NTRS)
Chimiak, Reine; Harris, Bernard; Williams, Phillip
2013-01-01
Basic Common Data Format (CDF) tools (e.g., cdfedit) provide no specific support for creating International Solar-Terrestrial Physics/Space Physics Data Facility (ISTP/SPDF) standard files. While someone familiar with the ISTP/SPDF metadata guidelines can create compliant files using just the basic tools (such as cdfedit and skeletoncdf), the process is error-prone and unreasonable for someone without ISTP/SPDF expertise. The key problem is the lack of a tool with specific support for creating files that comply with the ISTP/SPDF guidelines. The SPDF ISTP CDF skeleton editor addresses this problem: it is a cross-platform, Java-based graphical user interface (GUI) application that allows someone with only a basic understanding of the ISTP/SPDF guidelines to easily create and edit guideline-compliant skeleton CDF files. The editor consists of the following components: a Swing-based Java GUI program, a JavaHelp-based manual/tutorial, image/icon files, and an HTML Web page for distribution. The editor is available as a traditional Java desktop application as well as a Java Network Launching Protocol (JNLP) application. Once started, it functions like a typical Java GUI file editor application for creating/editing application-unique files.
WHO Expert Committee on Specifications for Pharmaceutical Preparations.
2011-01-01
The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use: procedure for adoption of International Chemical Reference Substances; WHO good practices for pharmaceutical microbiology laboratories; good manufacturing practices: main principles for pharmaceutical products; good manufacturing practices for blood establishments (jointly with the Expert Committee on Biological Standardization); guidelines on good manufacturing practices for heating, ventilation and air-conditioning systems for non-sterile pharmaceutical dosage forms; good manufacturing practices for sterile pharmaceutical products; guidelines on transfer of technology in pharmaceutical manufacturing; good pharmacy practice: standards for quality of pharmacy services (joint FIP/WHO); model guidance for the storage and transport of time- and temperature-sensitive pharmaceutical products (jointly with the Expert Committee on Biological Standardization); procedure for prequalification of pharmaceutical products; guide on submission of documentation for prequalification of innovator finished pharmaceutical products approved by stringent regulatory authorities; prequalification of quality control laboratories: procedure for assessing the acceptability, in principle, of quality control laboratories for use by United Nations agencies; guidelines for preparing a laboratory information file; guidelines for drafting a site master file; guidelines on submission of documentation for a multisource (generic) finished product: general format: preparation of product dossiers in common technical document format.
Interpreting and Presenting Data to Management. Air Professional File Number 36.
ERIC Educational Resources Information Center
Clagett, Craig A.
Guidelines are offered to institutional researchers and planning analysts for presenting research results in formats and levels of sophistication that are accessible to top management. Fundamental principles include: (1) know what is needed; (2) know when the information is needed; (3) match format to analytical sophistication and learning…
The GuideLine Interchange Format
Ohno-Machado, Lucila; Gennari, John H.; Murphy, Shawn N.; Jain, Nilesh L.; Tu, Samson W.; Oliver, Diane E.; Pattison-Gordon, Edward; Greenes, Robert A.; Shortliffe, Edward H.; Barnett, G. Octo
1998-01-01
Objective: To allow exchange of clinical practice guidelines among institutions and computer-based applications. Design: The GuideLine Interchange Format (GLIF) specification consists of the GLIF model and the GLIF syntax. The GLIF model is an object-oriented representation that consists of a set of classes for guideline entities, attributes for those classes, and data types for the attribute values. The GLIF syntax specifies the format of the text file that contains the encoding. Methods: Researchers from the InterMed Collaboratory at Columbia University, Harvard University (Brigham and Women's Hospital and Massachusetts General Hospital), and Stanford University analyzed four existing guideline systems to derive a set of requirements for guideline representation. The GLIF specification is a consensus representation developed through a brainstorming process. Four clinical guidelines were encoded in GLIF to assess its expressivity and to study the variability that occurs when two people from different sites encode the same guideline. Results: The encoders reported that GLIF was adequately expressive. A comparison of the encodings revealed substantial variability. Conclusion: GLIF was sufficient to model the guidelines for the four conditions that were examined. GLIF needs improvement in standard representation of medical concepts, criterion logic, temporal information, and uncertainty. PMID:9670133
U.S. Army Research Laboratory (ARL) Corporate Dari Document Transcription and Translation Guidelines
2012-10-01
text file format. Subject terms: transcription, translation, guidelines, ground truth, optical character recognition (OCR), machine translation (MT). ...foreign language into a target language in order to train, test, and evaluate optical character recognition (OCR) and machine translation (MT) embedded... graphic element and should not be transcribed. Elements that are not part of the primary text, such as handwritten annotations or stamps, should not be
NASA Technical Reports Server (NTRS)
Carlson, P. A.
1983-01-01
This document is a set of guidelines to aid a programmer in making the various decisions necessary for a clear user-programmer dialogue. Its goal is to promote an effective and efficient transfer of information between programmer and user. These guidelines are divided into four sections: (1) Format, (2) Sequence, (3) Audience, and (4) Aim. Format, in terms of this study, means the spatial and structural presentation of information. Sequence deals with the procedural aspects of multiple panel displays. This section looks at the issues of timeliness of presentation, modularization of information, and patterns of user behavior. Audience looks at the relationship among programmer, user, and message. It covers the issues of analyzing the audience's knowledge, attitudes, and needs, anticipating the audience's inferences, and identifying textual ambiguities. The programmer's aim or intention shows up in everything from tone to format. Aim considers the programmer's purpose.
STEP: What Is It and Should It Be Used for KSC's ISE/CEE Project in the Near Future?
NASA Technical Reports Server (NTRS)
Bareiss, Catherine C.
2000-01-01
The ability to exchange information between different engineering software (e.g., CAD, CAE, CAM) is necessary to aid in collaborative engineering. There are a number of different ways to accomplish this goal. One popular method is to transfer data via different file formats. However, this method can lose data and becomes complex as more file formats are added. Another method is to use a standard protocol. STEP is one such standard. This paper gives an overview of STEP, provides a list of where to access more information, and develops guidelines to aid the reader in deciding if STEP is appropriate for his/her use.
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper, two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started, whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated.
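The approach lends itself to a short illustration. The sketch below, in Python rather than the project's actual tooling, shows the general pattern of automated handling of researcher Excel files: gather per-compound workbooks from a folder and pool them into one standardized table that downstream analysis scripts can consume. The folder layout and column names are assumptions made for illustration.

```python
import glob

import pandas as pd  # assumed available; reading .xlsx needs an engine such as openpyxl

# Hypothetical layout: one Excel workbook per compound in raw_data/, each
# with columns "concentration" and "response" on its first sheet.
frames = []
for path in sorted(glob.glob("raw_data/*.xlsx")):
    df = pd.read_excel(path, usecols=["concentration", "response"])
    df["source_file"] = path  # keep provenance for later auditing
    frames.append(df)

# One standardized table that automated analysis scripts can consume
# without any per-file special-casing.
pd.concat(frames, ignore_index=True).to_csv("standardized.csv", index=False)
```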
Preparation of digital movie clips for online journal publication.
Yam, Chun-Shan
2006-07-01
This article presents general guidelines for preparing movie clips for online journal publication. As more and more radiology journals establish an online presence, radiologists wishing to submit journal articles with movie clips need to understand the electronic submission process. Viewing a movie clip via an online journal is different from viewing one with PowerPoint using a local desktop computer because the movie file must first be downloaded onto the client computer before it can be displayed. Users thus should be cautious in selecting movie format and compression when creating movie clips for online journals. This article provides step-by-step demonstrations and general guidelines for movie format and compression selections.
mzML2ISA & nmrML2ISA: generating enriched ISA-Tab metadata files from metabolomics XML data
Larralde, Martin; Lawson, Thomas N.; Weber, Ralf J. M.; Moreno, Pablo; Haug, Kenneth; Rocca-Serra, Philippe; Viant, Mark R.; Steinbeck, Christoph; Salek, Reza M.
2017-01-01
Summary: Submission to the MetaboLights repository for metabolomics data currently places the burden of reporting instrument and acquisition parameters in ISA-Tab format on users, who have to do it manually, a process that is time consuming and prone to user input error. Since the large majority of these parameters are embedded in instrument raw data files, an opportunity exists to capture this metadata more accurately. Here we report a set of Python packages that can automatically generate ISA-Tab metadata file stubs from raw XML metabolomics data files. The parsing packages are separated into mzML2ISA (encompassing mzML and imzML formats) and nmrML2ISA (nmrML format only). Overall, the use of mzML2ISA & nmrML2ISA reduces the time needed to capture metadata substantially (capturing 90% of metadata on assay and sample levels), is much less prone to user input errors, improves compliance with minimum information reporting guidelines and facilitates more finely grained data exploration and querying of datasets. Availability and implementation: mzML2ISA & nmrML2ISA are available under version 3 of the GNU General Public Licence at https://github.com/ISA-tools. Documentation is available from http://2isa.readthedocs.io/en/latest/. Contact: reza.salek@ebi.ac.uk or isatools@googlegroups.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28402395
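For orientation, the sketch below shows why such automation is feasible: mzML stores most acquisition metadata as controlled-vocabulary cvParam elements that can be read mechanically. This is a generic standard-library illustration, not the mzML2ISA package's actual API, and the input file name is hypothetical.

```python
import xml.etree.ElementTree as ET

# Illustrative only: mzML records metadata as <cvParam> elements carrying
# controlled-vocabulary terms in "name"/"value" attributes. mzML2ISA maps
# such terms onto ISA-Tab fields; this sketch merely lists them.
def list_cv_params(mzml_path):
    for _, elem in ET.iterparse(mzml_path):
        if elem.tag.endswith("cvParam"):
            name = elem.get("name")
            if name:
                print(f"{name}: {elem.get('value', '')}")
        elem.clear()  # keep memory bounded on large files

# list_cv_params("sample.mzML")  # hypothetical input file
```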
Medina-Aunon, J. Alberto; Martínez-Bartolomé, Salvador; López-García, Miguel A.; Salazar, Emilio; Navajas, Rosana; Jones, Andrew R.; Paradela, Alberto; Albar, Juan P.
2011-01-01
The development of the HUPO-PSI's (Proteomics Standards Initiative) standard data formats and MIAPE (Minimum Information About a Proteomics Experiment) guidelines should improve proteomics data sharing within the scientific community. Proteomics journals have encouraged the use of these standards and guidelines to improve the quality of experimental reporting and ease the evaluation and publication of manuscripts. However, there is an evident lack of bioinformatics tools specifically designed to create and edit standard file formats and reports, or embed them within proteomics workflows. In this article, we describe a new web-based software suite (The ProteoRed MIAPE web toolkit) that performs several complementary roles related to proteomic data standards. First, it can verify that the reports fulfill the minimum information requirements of the corresponding MIAPE modules, highlighting inconsistencies or missing information. Second, the toolkit can convert several XML-based data standards directly into human readable MIAPE reports stored within the ProteoRed MIAPE repository. Finally, it can also perform the reverse operation, allowing users to export from MIAPE reports into XML files for computational processing, data sharing, or public database submission. The toolkit is thus the first application capable of automatically linking the PSI's MIAPE modules with the corresponding XML data exchange standards, enabling bidirectional conversions. This toolkit is freely available at http://www.proteored.org/MIAPE/. PMID:21983993
mzResults: An Interactive Viewer for Interrogation and Distribution of Proteomics Results*
Webber, James T.; Askenazi, Manor; Marto, Jarrod A.
2011-01-01
The growing use of mass spectrometry in the context of biomedical research has been accompanied by an increased demand for distribution of results in a format that facilitates rapid and efficient validation of claims by reviewers and other interested parties. However, the continued evolution of mass spectrometry hardware, sample preparation methods, and peptide identification algorithms complicates standardization and creates hurdles related to compliance with journal submission requirements. Moreover, the recently announced Philadelphia Guidelines (1, 2) suggest that authors provide native mass spectrometry data files in support of their peer-reviewed research articles. These trends highlight the need for data viewers and other tools that work independently of manufacturers' proprietary data systems and seamlessly connect proteomics results with original data files to support user-driven data validation and review. Based upon our recently described API-based framework for mass spectrometry data analysis (3, 4), we created an interactive viewer (mzResults) that is built on established database standards and enables efficient distribution and interrogation of results associated with proteomics experiments, while also providing a convenient mechanism for authors to comply with data submission standards as described in the Philadelphia Guidelines. In addition, the architecture of mzResults supports in-depth queries of the native mass spectrometry files through our multiplierz software environment. We use phosphoproteomics data to illustrate the features and capabilities of mzResults. PMID:21266631
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
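To make the structure concrete, here is a minimal SED-ML Level 1 Version 2 skeleton emitted from Python. Element names follow the published schema; the model source, time-course settings, and algorithm choice (the KiSAO term for CVODE) are placeholder values, and a complete experiment would also declare data generators and outputs.

```python
# A minimal SED-ML L1V2 document, written with the standard library only.
SEDML = """<?xml version="1.0" encoding="UTF-8"?>
<sedML xmlns="http://sed-ml.org/sed-ml/level1/version2" level="1" version="2">
  <listOfModels>
    <model id="model1" language="urn:sedml:language:sbml" source="model.xml"/>
  </listOfModels>
  <listOfSimulations>
    <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                       outputEndTime="100" numberOfPoints="1000">
      <algorithm kisaoID="KISAO:0000019"/> <!-- CVODE -->
    </uniformTimeCourse>
  </listOfSimulations>
  <listOfTasks>
    <task id="task1" modelReference="model1" simulationReference="sim1"/>
  </listOfTasks>
</sedML>
"""

with open("experiment.sedml", "w") as f:
    f.write(SEDML)
```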
Improving File System Performance by Striping
NASA Technical Reports Server (NTRS)
Lam, Terance L.; Kutler, Paul (Technical Monitor)
1998-01-01
This document discusses the performance and advantages of striped file systems on the SGI AD workstations. Performance of several striped file system configurations are compared and guidelines for optimal striping are recommended.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van de Velde, Joris; Audenaert, Emmanuel
Purpose: To develop contouring guidelines for the brachial plexus (BP) using anatomically validated cadaver datasets. Magnetic resonance imaging (MRI) and computed tomography (CT) were used to obtain detailed visualizations of the BP region, with the goal of achieving maximal inclusion of the actual BP in a small contoured volume while also accommodating for anatomic variations. Methods and Materials: CT and MRI were obtained for 8 cadavers positioned for intensity modulated radiation therapy. 3-dimensional reconstructions of soft tissue (from MRI) and bone (from CT) were combined to create 8 separate enhanced CT project files. Dissection of the corresponding cadavers anatomically validated the reconstructions created. Seven enhanced CT project files were then automatically fitted, separately in different regions, to obtain a single dataset of superimposed BP regions that incorporated anatomic variations. From this dataset, improved BP contouring guidelines were developed. These guidelines were then applied to the 7 original CT project files and also to 1 additional file, left out from the superimposing procedure. The percentage of BP inclusion was compared with the published guidelines. Results: The anatomic validation procedure showed a high level of conformity for the BP regions examined between the 3-dimensional reconstructions generated and the dissected counterparts. Accurate and detailed BP contouring guidelines were developed, which provided corresponding guidance for each level in a clinical dataset. An average margin of 4.7 mm around the anatomically validated BP contour is sufficient to accommodate for anatomic variations. Using the new guidelines, 100% inclusion of the BP was achieved, compared with a mean inclusion of 37.75% when published guidelines were applied. Conclusion: Improved guidelines for BP delineation were developed using combined MRI and CT imaging with validation by anatomic dissection.
76 FR 43679 - Filing via the Internet; Notice of Additional File Formats for efiling
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-21
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. RM07-16-000] Filing via the Internet; Notice of Additional File Formats for efiling Take notice that the Commission has added to its list of acceptable file formats the four-character file extensions for Microsoft Office 2007/2010...
Alkasab, Tarik K; Bizzo, Bernardo C; Berland, Lincoln L; Nair, Sujith; Pandharipande, Pari V; Harvey, H Benjamin
2017-09-01
Decreasing unnecessary variation in radiology reporting and producing guideline-concordant reports is fundamental to radiology's success in value-based payment models and good for patient care. In this article, we present an open authoring system for point-of-care clinical decision support tools integrated into the radiologist reporting environment referred to as the computer-assisted reporting and decision support (CAR/DS) framework. The CAR/DS authoring system, described herein, includes: (1) a definition format for representing radiology clinical guidelines as structured, machine-readable Extensible Markup Language documents and (2) a user-friendly reference implementation to test the fidelity of the created definition files with the clinical guideline. The proposed definition format and reference implementation will enable content creators to develop CAR/DS tools that voice recognition software (VRS) vendors can use to extend the commercial tools currently in use. In making the definition format and reference implementation software freely available, we hope to empower individual radiologists, expert groups such as the ACR, and VRS vendors to develop a robust ecosystem of CAR/DS tools that can further improve the quality and efficiency of the patient care that our field provides. We hope that this initial effort can serve as the basis for a community-owned open standard for guideline definition that the imaging informatics and VRS vendor communities will embrace and strengthen. To this end, the ACR Assist™ initiative is intended to make the College's clinical content, including the Incidental Findings Committee White Papers, available for decision support tool creation based upon the herein described CAR/DS framework.
Chao, Tian-Jy; Kim, Younghun
2015-02-03
Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
Inventions and Patents: A practical tutorial
Tidwell, J. Lille; Liotta, Lance A.
2017-01-01
Patents are designed to protect and encourage creativity and innovation. Patenting a biomedical discovery can be a requirement before a pharma or biotech entity will invest in the lengthy and costly clinical testing necessary to achieve patient benefit. Although scientists and clinicians are well versed in research publication requirements, patent descriptions and claims are formatted in a manner quite different from a research paper. Patents require a) a series of logical statements clearly delineating the boundaries of the novel aspects of the invention and b) sufficient disclosure of the invention so that it can be reproduced by others. Patents are granted only for inventions that meet three conditions: novelty, non-obviousness and usefulness. This chapter provides basic guidelines and definitions for inventions, inventorship, and patent filing which are summarized using a question and answer format. PMID:22081360
DOE Office of Scientific and Technical Information (OSTI.GOV)
Temple, Brian Allen; Armstrong, Jerawan Chudoung
This document is a mid-year report on a deliverable for the PYTHON Radiography Analysis Tool (PyRAT) for project LANL12-RS-107J in FY15. The deliverable is number 2 in the work package and is titled “Add the ability to read in more types of image file formats in PyRAT”. Currently, PyRAT can read only uncompressed TIFF files. Expanding the set of file formats PyRAT can read will make it easier to use in more situations. The file formats added include JPEG (jpeg/jpg), PNG, and formatted ASCII files.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thoreson, Gregory G
PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. PCF is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. A PCF file can contain multiple spectra along with information about each spectrum, such as its energy calibration. This document outlines the file format in enough detail to allow one to write a computer program that parses and writes such files.
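As a generic illustration of what parsing such a file involves (not the actual PCF layout, which the report itself defines), the sketch below unpacks a hypothetical fixed-size binary record with Python's struct module. Every field name, type, and offset here is invented for illustration only.

```python
import struct

# Entirely hypothetical record layout -- the real PCF layout is specified
# in the report. The point is that a fixed-size binary record followed by
# a channel array can be parsed with struct.unpack.
RECORD = struct.Struct("<64s f f I")  # title, live time, real time, channel count

def read_record(f):
    raw = f.read(RECORD.size)
    if len(raw) < RECORD.size:
        return None  # end of file
    title, live_time, real_time, n_channels = RECORD.unpack(raw)
    # Channel data assumed to follow as n_channels little-endian floats.
    spectrum = struct.unpack(f"<{n_channels}f", f.read(4 * n_channels))
    return (title.rstrip(b"\x00").decode("ascii", "replace"),
            live_time, real_time, spectrum)
```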
Data files from the Grays Harbor Sediment Transport Experiment Spring 2001
Landerman, Laura A.; Sherwood, Christopher R.; Gelfenbaum, Guy; Lacy, Jessica; Ruggiero, Peter; Wilson, Douglas; Chisholm, Tom; Kurrus, Keith
2005-01-01
This publication consists of two DVD-ROMs, both of which are presented here. This report describes data collected during the Spring 2001 Grays Harbor Sediment Transport Experiment and provides additional information needed to interpret the data. Two DVDs accompany this report; both contain documentation in html format that assists the user in navigating through the data. DVD-ROM-1 contains a digital version of this report in .pdf format, raw Aquatec acoustic backscatter (ABS) data in .zip format, Sonar data files in .avi format, and coastal processes and morphology data in ASCII format. ASCII data files are provided in .zip format; bundled coastal processes ASCII files are separated by deployment and instrument; bundled morphology ASCII files are separated into monthly data collection efforts containing the beach profiles collected (or extracted from the surface map) at that time; weekly surface maps are also bundled together. DVD-ROM-2 contains a digital version of this report in .pdf format, the binary data files collected by the SonTek instrumentation, calibration files for the pressure sensors, and Matlab m-files for loading the ABS data into Matlab and cleaning up the optical backscatter (OBS) burst time-series data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraus, Terrence D.
2017-04-01
This report specifies the electronic file format agreed upon for normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from other sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), Monitoring and Sampling]. The CSV file format also is suitable for the normalized radiological data because this normalized data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by NA-84's Consequence Management Program.
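A minimal sketch of writing such a CSV file in Python follows. The column names and example values are hypothetical, since the agreed schema is specified in the report itself, but they show how one flat format can carry activity-concentration, exposure-rate, or dose-rate rows side by side.

```python
import csv

# Hypothetical column set for illustration; the TI project's actual CSV
# schema is defined in the report. Each row carries its own quantity
# type, units, and source system, which is what makes CSV flexible here.
rows = [
    {"timestamp": "2017-04-01T12:00:00Z", "latitude": 35.05, "longitude": -106.54,
     "quantity": "exposure_rate", "value": 1.2e-2, "units": "mR/h", "source": "AMS"},
    {"timestamp": "2017-04-01T12:05:00Z", "latitude": 35.06, "longitude": -106.55,
     "quantity": "dose_rate", "value": 0.8e-2, "units": "mSv/h", "source": "RAMS"},
]

with open("normalized.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
```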
77 FR 59692 - 2014 Diversity Immigrant Visa Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-28
... the E-DV system. The entry will not be accepted and must be resubmitted. Group or family photographs... must be in the Joint Photographic Experts Group (JPEG) format. Image File Size: The maximum file size...). Image File Format: The image must be in the Joint Photographic Experts Group (JPEG) format. Image File...
76 FR 18238 - Wind Turbine Guidelines Advisory Committee; Announcement of Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
...] Wind Turbine Guidelines Advisory Committee; Announcement of Public Meeting AGENCY: Fish and Wildlife... (Service), will host a Wind Turbine Guidelines Advisory Committee (Committee) meeting on April 27, 2011... Turbine Guidelines Advisory Committee. [FR Doc. 2011-7788 Filed 3-31-11; 8:45 am] BILLING CODE 4310-55-P ...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-11
...- Inventor-to-File Provisions of the Leahy-Smith America Invents Act; Reopening of the Period for Comments...-inventor-to-file (FITF) provisions of the Leahy-Smith America Invents Act (AIA). The USPTO also conducted a... of the AIA. See Changes to Implement the First Inventor to File Provisions of the Leahy-Smith America...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. George L Mesina
Our ultimate goal is to create and maintain RELAP5-3D as the best software tool available to analyze nuclear power plants. This begins with writing excellent code and requires thorough testing. This document covers development of RELAP5-3D software, the behavior of the RELAP5-3D program that must be maintained, and code testing. RELAP5-3D must perform in a manner consistent with previous code versions with backward compatibility for the sake of the users. Thus file operations, code termination, input and output must remain consistent in form and content while adding appropriate new files, input and output as new features are developed. As computer hardware, operating systems, and other software change, RELAP5-3D must adapt and maintain performance. The code must be thoroughly tested to ensure that it continues to perform robustly on the supported platforms. The coding must be written in a consistent manner that makes the program easy to read to reduce the time and cost of development, maintenance and error resolution. The programming guidelines presented here are intended to institutionalize a consistent way of writing FORTRAN code for the RELAP5-3D computer program that will minimize errors and rework. A common format and organization of program units creates a unifying look and feel to the code. This in turn increases readability and reduces time required for maintenance, development and debugging. It also aids new programmers in reading and understanding the program. Therefore, when undertaking development of the RELAP5-3D computer program, the programmer must write computer code that follows these guidelines. This set of programming guidelines creates a framework of good programming practices, such as initialization, structured programming, and vector-friendly coding. It sets out formatting rules for lines of code, such as indentation, capitalization, spacing, etc. It creates limits on program units, such as subprograms, functions, and modules. It establishes documentation guidance on internal comments. The guidelines apply to both existing and new subprograms. They are written for both FORTRAN 77 and FORTRAN 95. The guidelines are not so rigorous as to inhibit a programmer’s unique style, but do restrict the variations in acceptable coding to create sufficient commonality that new readers will find the coding in each new subroutine familiar. It is recognized that this is a “living” document and must be updated as languages, compilers, and computer hardware and software evolve.
NAVAIR Portable Source Initiative (NPSI) Standard for Reusable Source Dataset Metadata (RSDM) V2.4
2012-09-26
defining a raster file format: <RasterFileFormat> <FormatName>TIFF</FormatName> <Order>BIP</Order> <DataType>8-BIT_UNSIGNED</DataType> ...interleaved by line (BIL); band interleaved by pixel (BIP). Element RasterFileFormatType/DataType: type restriction of xsd:string, with facets
An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files
Chan, Anthony; Gropp, William; Lusk, Ewing
2008-01-01
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
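The sketch below illustrates the underlying idea in Python, under our own simplifying assumption (not the paper's exact layout) that the trace is split into frames annotated with their start times: a binary search over frame start times locates the frames overlapping a query window, so lookup cost scales with the window rather than with the whole file.

```python
import bisect

# One entry per frame: its start time and its byte offset in the file.
# These index arrays would themselves be stored in the trace file.
frame_starts = [0.0, 1.0, 2.0, 3.0, 4.0]
frame_offsets = [0, 4096, 9120, 13010, 17550]

def frames_for_window(t0, t1):
    """Return byte offsets of the frames overlapping [t0, t1]."""
    lo = max(bisect.bisect_right(frame_starts, t0) - 1, 0)
    hi = bisect.bisect_left(frame_starts, t1)
    return frame_offsets[lo:hi]  # seek to each offset and read only these

print(frames_for_window(1.5, 2.5))  # -> [4096, 9120]: frames covering the window
```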
Carle, S.F.; Glen, J.M.; Langenheim, V.E.; Smith, R.B.; Oliver, H.W.
1990-01-01
The report presents the principal facts for gravity stations compiled for Yellowstone National Park and vicinity. The gravity data were compiled from three sources: Defense Mapping Agency, University of Utah, and U.S. Geological Survey. Part A of the report is a paper copy describing how the compilation was done and presenting the data in tabular format as well as a map; part B is a 5-1/4 inch floppy diskette containing only the data files in ASCII format. Requirements for part B: IBM PC or compatible, DOS v. 2.0 or higher. Files contained on this diskette: DOD.ISO -- File containing the principal facts of the 514 gravity stations obtained from the Defense Mapping Agency. The data are in Plouff format (see file PFTAB.TXT). UTAH.ISO -- File containing the principal facts of 153 gravity stations obtained from the University of Utah. Data are in Plouff format. USGS.ISO -- File containing the principal facts of 27 gravity stations collected by the U.S. Geological Survey in July 1987. Data are in Plouff format. PFTAB.TXT -- File containing an explanation of the principal fact format. ACC.TXT -- File containing an explanation of the accuracy codes.
Mapping DICOM to OpenDocument format
NASA Astrophysics Data System (ADS)
Yu, Cong; Yao, Zhihong
2009-02-01
In order to enhance the readability, extensibility and sharing of DICOM files, we have introduced XML into the DICOM file system (SPIE Volume 5748)[1] and the multilayer tree structure into DICOM (SPIE Volume 6145)[2]. In this paper, we proposed mapping DICOM to ODF (OpenDocument Format), for it is also based on XML. As a result, the new format realizes the separation of content (including text content and image) and display style. Meanwhile, since OpenDocument files take the format of a ZIP compressed archive, the new kind of DICOM files can benefit from ZIP's lossless compression to reduce file size. Moreover, this open format can also guarantee long-term access to data without legal or technical barriers, making medical images accessible to various fields.
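Because ODF is a ZIP archive, the separation of content from style and the lossless compression the authors exploit can be inspected with a few lines of Python; the input file name below is hypothetical, and the DICOM-to-ODF mapping itself is the paper's contribution and is not reproduced here.

```python
import zipfile

# An OpenDocument file is a ZIP archive: content (content.xml), styles
# (styles.xml), and any embedded images are separate, losslessly
# compressed members. This sketch just inspects an existing document.
with zipfile.ZipFile("document.odt") as odf:  # hypothetical input file
    for info in odf.infolist():
        print(info.filename, info.file_size, "->", info.compress_size)
    content = odf.read("content.xml")  # document content, kept apart from style
```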
18 CFR 50.3 - Applications/pre-filing; rules and format.
Code of Federal Regulations, 2010 CFR
2010-04-01
... filings must be signed in compliance with § 385.2005 of this chapter. (e) The Commission will conduct a... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Applications/pre-filing... INTERSTATE ELECTRIC TRANSMISSION FACILITIES § 50.3 Applications/pre-filing; rules and format. (a) Filings are...
Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data
NASA Technical Reports Server (NTRS)
Maine, Richard E.
1987-01-01
This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
Arkansas and Louisiana Aeromagnetic and Gravity Maps and Data - A Website for Distribution of Data
Bankey, Viki; Daniels, David L.
2008-01-01
This report contains digital data, image files, and text files describing data formats for aeromagnetic and gravity data used to compile the State aeromagnetic and gravity maps of Arkansas and Louisiana. The digital files include grids, images, ArcInfo, and Geosoft compatible files. In some of the data folders, ASCII files with the extension 'txt' describe the format and contents of the data files. Read the 'txt' files before using the data files.
A mass spectrometry proteomics data management platform.
Sharma, Vagisha; Eng, Jimmy K; MacCoss, Michael J; Riffle, Michael
2012-09-01
Mass spectrometry-based proteomics is increasingly being used in biomedical research. These experiments typically generate a large volume of highly complex data, and the volume and complexity are only increasing with time. There exist many software pipelines for analyzing these data (each typically with its own file formats), and as technology improves, these file formats change and new formats are developed. Files produced from these myriad software programs may accumulate on hard disks or tape drives over time, with older files being rendered progressively more obsolete and unusable with each successive technical advancement and data format change. Although initiatives exist to standardize the file formats used in proteomics, they do not address the core failings of a file-based data management system: (1) files are typically poorly annotated experimentally, (2) files are "organically" distributed across laboratory file systems in an ad hoc manner, (3) files formats become obsolete, and (4) searching the data and comparing and contrasting results across separate experiments is very inefficient (if possible at all). Here we present a relational database architecture and accompanying web application dubbed Mass Spectrometry Data Platform that is designed to address the failings of the file-based mass spectrometry data management approach. The database is designed such that the output of disparate software pipelines may be imported into a core set of unified tables, with these core tables being extended to support data generated by specific pipelines. Because the data are unified, they may be queried, viewed, and compared across multiple experiments using a common web interface. Mass Spectrometry Data Platform is open source and freely available at http://code.google.com/p/msdapl/.
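A minimal sketch of the core-plus-extension table design follows, using SQLite for brevity; the table and column names are illustrative rather than the platform's actual schema.

```python
import sqlite3

# Sketch of the paper's design idea: unified core tables that
# pipeline-specific tables extend, so queries can span experiments and
# pipelines through the core rows. Names here are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE experiment (
    id INTEGER PRIMARY KEY,
    description TEXT,
    run_date TEXT
);
CREATE TABLE search_result (          -- core, pipeline-independent fields
    id INTEGER PRIMARY KEY,
    experiment_id INTEGER REFERENCES experiment(id),
    peptide TEXT,
    charge INTEGER
);
CREATE TABLE sequest_result (         -- extends a core row for one pipeline
    result_id INTEGER PRIMARY KEY REFERENCES search_result(id),
    xcorr REAL,
    delta_cn REAL
);
""")
# A cross-pipeline comparison now only needs to join through search_result.
```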
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
..., entitled ``Numerical Guidelines Applicable to Volatile Market Opens'' with a new paragraph, entitled...) of Rule 2128 to eliminate the ability of the Exchange to deviate from the Numerical Guidelines... Numerical Guidelines or Reference Prices in various ``Unusual Circumstances.'' The Exchange proposes to...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... existing paragraph (b)(4) of the Rule, entitled ``Numerical Guidelines Applicable to Volatile Market Opens... existing paragraph (b)(2), which provides flexibility to FINRA to use different Numerical Guidelines or... of paragraph (b)(4) (``Numerical Guidelines Applicable to Volatile Market Opens'') of the existing...
Mass spectrometer output file format mzML.
Deutsch, Eric W
2010-01-01
Mass spectrometry is an important technique for analyzing proteins and other biomolecular compounds in biological samples. Each of the vendors of these mass spectrometers uses a different proprietary binary output file format, which has hindered data sharing and the development of open source software for downstream analysis. The solution has been to develop, with the full participation of academic researchers as well as software and hardware vendors, an open XML-based format for encoding mass spectrometer output files, and then to write software to use this format for archiving, sharing, and processing. This chapter presents the various components and information available for this format, mzML. In addition to the XML schema that defines the file structure, a controlled vocabulary provides clear terms and definitions for the spectral metadata, and a semantic validation rules mapping file allows the mzML semantic validator to insure that an mzML document complies with one of several levels of requirements. Complete documentation and example files insure that the format may be uniformly implemented. At the time of release, there already existed several implementations of the format and vendors have committed to supporting the format in their products.
Qian, Li Jun; Zhou, Mi; Xu, Jian Rong
2008-07-01
The objective of this article is to explain an easy and effective approach for managing radiologic files in portable document format (PDF) using iTunes. PDF files are widely used as a standard file format for electronic publications as well as for medical online documents. Unfortunately, there is a lack of powerful software to manage numerous PDF documents. In this article, we explain how to use the hidden function of iTunes (Apple Computer) to manage PDF documents as easily as managing music files.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-23
... recommends not more than 32 characters). DO NOT convert Word files or Excel files into PDF format. Converting... not allow HUD to enter data from the Excel files into a database. DO NOT save your logic model in .xlsm format. If necessary save as an Excel 97-2003 .xls format. Using the .xlsm format can result in a...
12 CFR 335.801 - Inapplicable SEC regulations; FDIC substituted regulations; additional information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... a continuing hardship exemption under these rules may file the forms with the FDIC in paper format... these rules may file the appropriate forms with the FDIC in paper format. Instructions for continuing...) Previously filed exhibits, whether in paper or electronic format, may be incorporated by reference into an...
12 CFR 335.801 - Inapplicable SEC regulations; FDIC substituted regulations; additional information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... a continuing hardship exemption under these rules may file the forms with the FDIC in paper format... these rules may file the appropriate forms with the FDIC in paper format. Instructions for continuing...) Previously filed exhibits, whether in paper or electronic format, may be incorporated by reference into an...
12 CFR 335.801 - Inapplicable SEC regulations; FDIC substituted regulations; additional information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... a continuing hardship exemption under these rules may file the forms with the FDIC in paper format... these rules may file the appropriate forms with the FDIC in paper format. Instructions for continuing...) Previously filed exhibits, whether in paper or electronic format, may be incorporated by reference into an...
12 CFR 335.801 - Inapplicable SEC regulations; FDIC substituted regulations; additional information.
Code of Federal Regulations, 2011 CFR
2011-01-01
... a continuing hardship exemption under these rules may file the forms with the FDIC in paper format... these rules may file the appropriate forms with the FDIC in paper format. Instructions for continuing...) Previously filed exhibits, whether in paper or electronic format, may be incorporated by reference into an...
ERIC Educational Resources Information Center
American Association of School Personnel Administrators, Seven Hills, OH.
These guidelines are intended to provide personnel administrators with a means of evaluating their current practices and procedures in teacher selection. The guidelines cover recruitment, hiring criteria, employment interviews, and the follow-up to selection. A suggested personnel selection procedure outlines application, file preparation, and the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... ``Numerical Guidelines Applicable to Volatile Market Opens'' with a new paragraph, entitled ``Individual Stock... to eliminate the ability of the Exchange to deviate from the Numerical Guidelines contained in... existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical Guidelines...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-06
... replacing existing paragraph (a)(2)(C)(iv) of Rule 3312, entitled ``Numerical Guidelines Applicable to... Exchange to deviate from the Numerical Guidelines contained in paragraph (a)(2)(C)(i) when deciding which... Numerical Guidelines or Reference Prices in various ``Unusual Circumstances.'' The Exchange proposes to...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... proposes replacing existing paragraph (c)(4) of Rule 128, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under...)(2), which provides flexibility to the Exchange to use different Numerical Guidelines or Reference...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
....'' Second, NASDAQ replacing existing paragraph (C)(4) of Rule 11890, entitled ``Numerical Guidelines... NASDAQ to deviate from the Numerical Guidelines contained in paragraph (C)(1) (other than under limited... provides flexibility to NASDAQ to use different Numerical Guidelines or Reference Prices in various...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
... proposes replacing existing paragraph (c)(4) of Rule 7.10, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under...)(2), which provides flexibility to the Exchange to use different Numerical Guidelines or Reference...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
..., entitled ``Numerical Guidelines Applicable to Volatile Market Opens'' with a new paragraph, entitled... (g) of Rule 128 to eliminate the ability of the Exchange to deviate from the Numerical Guidelines... flexibility to the Exchange to use different Numerical Guidelines or Reference Prices in various ``Unusual...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Exchange replacing existing paragraph (c)(4) of Rule 11.19, entitled ``Numerical Guidelines Applicable to... the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1) (other than under... provides flexibility to the Exchange to use different Numerical Guidelines or Reference Prices in various...
21 CFR 350.60 - Guidelines for effectiveness testing of antiperspirant drug products.
Code of Federal Regulations, 2010 CFR
2010-04-01
.... These guidelines are on file in the Dockets Management Branch (HFA-305), Food and Drug Administration... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Guidelines for effectiveness testing of antiperspirant drug products. 350.60 Section 350.60 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-18
... Public Meeting on the International Maritime Organization Guidelines for Exhaust Gas Cleaning Systems for... that a public meeting on the International Maritime Organization guidelines for exhaust gas cleaning..., 2011. F.J. Strum, Acting Director of Commercial Regulations and Standards. [FR Doc. 2011-1024 Filed 1...
Developing Database Files for Student Use.
ERIC Educational Resources Information Center
Warner, Michael
1988-01-01
Presents guidelines for creating student database files that supplement classroom teaching. Highlights include determining educational objectives, planning the database with computer specialists and subject area specialists, data entry, and creating student worksheets. Specific examples concerning elements of the periodic table and…
Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1991-01-01
The Transferable Output ASCII Data (TOAD) Gateway, release 1.0, is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.
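The Gateway's hub-and-spoke design can be suggested in a few lines of Python: every format converts to and from one common representation, so supporting N formats needs 2N converters rather than one per format pair. The reader and writer below are illustrative stand-ins, not the TOAD implementation, and the file names are hypothetical.

```python
import csv

def read_csv(path):
    """Spoke: one of the per-format readers feeding the common hub table."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return rows[0], rows[1:]  # (column names, data rows)

def write_whitespace(path, header, rows):
    """Spoke: one of the per-format writers consuming the hub table."""
    with open(path, "w") as f:
        f.write(" ".join(header) + "\n")
        for row in rows:
            f.write(" ".join(row) + "\n")

# Any reader can be paired with any writer through the common table.
header, rows = read_csv("input.csv")          # hypothetical input
write_whitespace("output.dat", header, rows)  # hypothetical output
```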
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.
2016-04-01
The format of the TSUNAMI-A sensitivity data file produced by SAMS for cases with deterministic transport solutions is given in Table 6.3.A.1. The occurrence of each entry in the data file is followed by an identification of the data contained on each line of the file and the FORTRAN edit descriptor denoting the format of each line. A brief description of each line is also presented. A sample of the TSUNAMI-A data file for the Flattop-25 sample problem is provided in Figure 6.3.A.1. Here, only two profiles out of the 130 computed are shown.
NASA Technical Reports Server (NTRS)
Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.
1993-01-01
The Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements a format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to the TOAD format standard is called a TOAD file. The TOAD Editor is an interactive software tool for manipulating the contents of TOAD files. It is commonly used to extract filtered subsets of data for visualization of results of computation. It also offers such user-oriented features as on-line help, clear English error messages, a startup file, macroinstructions defined by the user, command history, user variables, UNDO features, and a full complement of mathematical, statistical, and conversion functions. A companion program, the TOAD Gateway (LAR-14484), converts data files from a variety of other file formats to that of TOAD. The TOAD Editor is written in FORTRAN 77.
78 FR 17233 - Notice of Opportunity To File Amicus Briefs
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
.... Any commonly-used word processing format or PDF format is acceptable; text formats are preferable to image formats. Briefs may also be filed with the Office of the Clerk of the Board, Merit Systems...
Displaying Composite and Archived Soundings in the Advanced Weather Interactive Processing System
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Volkmer, Matthew R.; Blottman, Peter F.; Sharp, David W.
2008-01-01
In a previous task, the Applied Meteorology Unit (AMU) developed spatial and temporal climatologies of lightning occurrence based on eight atmospheric flow regimes. The AMU created climatological, or composite, soundings of wind speed and direction, temperature, and dew point temperature at four rawinsonde observation stations at Jacksonville, Tampa, Miami, and Cape Canaveral Air Force Station, for each of the eight flow regimes. The composite soundings were delivered to the National Weather Service (NWS) Melbourne (MLB) office for display using the National version of the Skew-T Hodograph analysis and Research Program (NSHARP) software program. The NWS MLB requested the AMU make the composite soundings available for display in the Advanced Weather Interactive Processing System (AWIPS), so they could be overlaid on current observed soundings. This will allow the forecasters to compare the current state of the atmosphere with climatology. This presentation describes how the AMU converted the composite soundings from NSHARP Archive format to Network Common Data Form (NetCDF) format, so that the soundings could be displayed in AWIPS. NetCDF is a set of data formats, programming interfaces, and software libraries used to read and write scientific data files. In AWIPS, each meteorological data type, such as soundings or surface observations, has a unique NetCDF format. Each format is described by a NetCDF template file. Although NetCDF files are in binary format, they can be converted to a text format called the network Common Data form Language (CDL). A software utility called ncgen is used to create a NetCDF file from a CDL file, while the ncdump utility is used to create a CDL file from a NetCDF file. AWIPS receives soundings in Binary Universal Form for the Representation of Meteorological data (BUFR) format (http://dss.ucar.edu/docs/formats/bufr/), and then decodes them into NetCDF format. Only two sounding files are generated in AWIPS per day. One file contains all of the soundings received worldwide between 0000 UTC and 1200 UTC, and the other includes all soundings between 1200 UTC and 0000 UTC. In order to add the composite soundings into AWIPS, a procedure was created to configure, or localize, AWIPS. This involved modifying and creating several configuration text files. A unique four-character site identifier was created for each of the 32 soundings so each could be viewed separately. The first three characters were based on the site identifier of the observed sounding, while the last character was based on the flow regime. While researching the localization process for soundings, the AMU discovered a method of archiving soundings so old soundings would not get purged automatically by AWIPS. This method could provide an alternative way of localizing AWIPS for composite soundings. In addition, this would allow forecasters to use archived soundings in AWIPS for case studies. A test sounding file in NetCDF format was written in order to verify the correct format for soundings in AWIPS. After the file was viewed successfully in AWIPS, the AMU wrote a software program in the Tool Command Language/Tool Kit (Tcl/Tk) language to convert the 32 composite soundings from NSHARP Archive to CDL format. The ncgen utility was then used to convert the CDL file to a NetCDF file. The NetCDF file could then be read and displayed in AWIPS.
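The ncdump/ncgen round trip described above is easily scripted. A minimal sketch in Python, assuming the standard NetCDF utilities are on the PATH; the file names are hypothetical:

import subprocess

# ncdump converts a binary NetCDF file to its text (CDL) form;
# ncgen performs the reverse conversion.
with open("sounding.cdl", "w") as cdl:
    subprocess.run(["ncdump", "sounding.nc"], stdout=cdl, check=True)
subprocess.run(["ncgen", "-o", "sounding_new.nc", "sounding.cdl"], check=True)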
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
... nature of the pleading will be included in the body of the electronic mail message. c. The pleading shall..., Department of Defense. Proposed New Order for Electronic Filing of Pleadings Effective (date), all pleadings... Electronic Filing of Pleadings 1. Scope. The United States Court of Appeals for the Armed Forces adopts the...
SEGY to ASCII Conversion and Plotting Program 2.0
Goldman, Mark R.
2005-01-01
INTRODUCTION SEGY has long been a standard format for storing seismic data and header information. Almost every seismic processing package can read and write seismic data in SEGY format. In the data processing world, however, ASCII format is the 'universal' standard format. Very few general-purpose plotting or computation programs will accept data in SEGY format. The software presented in this report, referred to as SEGY to ASCII (SAC), converts seismic data written in SEGY format (Barry et al., 1975) to an ASCII data file, and then creates a PostScript file of the seismic data using a general plotting package (GMT, Wessel and Smith, 1995). The resulting PostScript file may be plotted by any standard PostScript plotting program. There are two versions of SAC: one version for plotting a SEGY file that contains a single gather, such as a stacked CDP or migrated section, and a second version for plotting multiple gathers from a SEGY file containing more than one gather, such as a collection of shot gathers. Note that if a SEGY file has multiple gathers, then each gather must have the same number of traces per gather, and each trace must have the same sample interval and number of samples per trace. SAC will read several common standards of SEGY data, including SEGY files with sample values written in either IBM or IEEE floating-point format. In addition, utility programs are provided to convert non-standard Seismic Unix (.sux) SEGY files and PASSCAL (.rsy) SEGY files to standard SEGY files. SAC allows complete user control over all plotting parameters including label size and font, tick mark intervals, trace scaling, and the inclusion of a title and descriptive text. SAC shell scripts create a PostScript image of the seismic data in vector rather than bitmap format, using GMT's pswiggle command. Although this can produce a very large PostScript file, the image quality is generally superior to that of a bitmap image, and commercial programs such as Adobe Illustrator can manipulate the image more efficiently.
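As an illustration of why SEGY requires dedicated readers, the sketch below pulls three fields from the 400-byte binary header that follows the 3200-byte EBCDIC text header in a standard SEGY file. The byte offsets follow the published SEG-Y layout; the file name is hypothetical.

import struct

# Standard SEGY: 3200-byte EBCDIC text header, then a 400-byte
# big-endian binary header. Offsets below are 0-based file offsets.
with open("line1.sgy", "rb") as f:  # hypothetical input file
    f.seek(3200)
    binhdr = f.read(400)

sample_interval_us = struct.unpack(">h", binhdr[16:18])[0]  # bytes 3217-3218
samples_per_trace = struct.unpack(">h", binhdr[20:22])[0]   # bytes 3221-3222
format_code = struct.unpack(">h", binhdr[24:26])[0]         # 1=IBM float, 5=IEEE

print(sample_interval_us, samples_per_trace, format_code)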
Tools for Requirements Management: A Comparison of Telelogic DOORS and the HiVe
2006-07-01
types DOORS deals with are text files, spreadsheets, FrameMaker, rich text, Microsoft Word and Microsoft Project. 2.5.1 Predefined file formats DOORS...during the export. DOORS exports FrameMaker files in an incomplete format, meaning DOORS-exported files will have to be opened in FrameMaker and saved
76 FR 10405 - Federal Copyright Protection of Sound Recordings Fixed Before February 15, 1972
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... file in either the Adobe Portable Document File (PDF) format that contains searchable, accessible text (not an image); Microsoft Word; WordPerfect; Rich Text Format (RTF); or ASCII text file format (not a..., comments may be delivered in hard copy. If hand delivered by a private party, an original [[Page 10406...
The prevalence of encoded digital trace evidence in the nonfile space of computer media.
Garfinkel, Simson L
2014-09-01
Forensically significant digital trace evidence is frequently present in sectors of digital media not associated with allocated or deleted files. Modern digital forensic tools generally do not decompress such data unless a specific file with a recognized file type is first identified, potentially resulting in missed evidence. Email addresses are encoded differently for different file formats. As a result, trace evidence can be categorized as Plain in File (PF), Encoded in File (EF), Plain Not in File (PNF), or Encoded Not in File (ENF). The tool bulk_extractor finds all of these formats, but other forensic tools do not. A study of 961 storage devices purchased on the secondary market shows that 474 contained encoded email addresses that were not in files (ENF). Different encoding formats are the result of different application programs that processed different kinds of digital trace evidence. Specific encoding formats explored include BASE64, GZIP, PDF, HIBER, and ZIP. Published 2014. This article is a U.S. Government work and is in the public domain in the USA. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.
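The Encoded Not in File (ENF) category can be made concrete with a small sketch: scan a raw buffer for BASE64-looking runs, decode them, and search the decoded bytes for email addresses. This is a toy illustration of the idea behind bulk_extractor's decoding, not its actual implementation.

import base64
import re

EMAIL = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
B64RUN = re.compile(rb"[A-Za-z0-9+/]{16,}={0,2}")  # plausible BASE64 runs

def find_encoded_emails(buf):
    # Toy ENF detector: decode BASE64-looking runs, then grep for emails.
    hits = []
    for run in B64RUN.finditer(buf):
        chunk = run.group()
        chunk = chunk[:len(chunk) - len(chunk) % 4]  # length must be a multiple of 4
        try:
            decoded = base64.b64decode(chunk)
        except Exception:
            continue
        hits.extend(EMAIL.findall(decoded))
    return hits

sector = base64.b64encode(b"contact: alice@example.org, see attachment")
print(find_encoded_emails(sector))  # [b'alice@example.org']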
DOT National Transportation Integrated Search
2001-02-01
The Minnesota data system includes the following basic files: Accident data (Accident File, Vehicle File, Occupant File); Roadlog File; Reference Post File; Traffic File; Intersection File; Bridge (Structures) File; and RR Grade Crossing File. For ea...
PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.
Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza
2014-12-01
The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions, such as nucleic acids, water, ions, and drug molecules, which can therefore also be described in the PDB format and have been deposited in the PDB database. A PDB file is machine generated and is not in a human-readable format; a computational tool is needed to interpret it. The objective of our present study was to develop free online software for the retrieval, visualization, and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., to convert the information in the PDB file into readable sentences. The software displays all available information from a PDB file, including the 3D structure. Programming and scripting languages including Perl, CSS, JavaScript, Ajax, and HTML were used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home, with no log-in required.
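Because the PDB format is column-oriented, a reader slices fixed character ranges rather than splitting on whitespace; this is essentially the first step a viewer like PDB Explorer must perform. A minimal sketch using the standard ATOM/HETATM column positions (the file name is hypothetical):

def read_atoms(path):
    # PDB ATOM/HETATM records are fixed-column: atom name in columns
    # 13-16, residue name in 18-20, chain in 22, x/y/z in 31-54.
    atoms = []
    with open(path) as fh:
        for line in fh:
            if line.startswith(("ATOM", "HETATM")):
                atoms.append({
                    "name": line[12:16].strip(),
                    "res": line[17:20].strip(),
                    "chain": line[21].strip(),
                    "xyz": (float(line[30:38]),
                            float(line[38:46]),
                            float(line[46:54])),
                })
    return atoms

for atom in read_atoms("1abc.pdb")[:3]:  # hypothetical PDB file
    print(atom["name"], atom["res"], atom["xyz"])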
NoSQL: collection document and cloud by using a dynamic web query form
NASA Astrophysics Data System (ADS)
Abdalla, Hemn B.; Lin, Jinzhao; Li, Guoquan
2015-07-01
MongoDB (from "humongous") is an open-source document database and the leading NoSQL database. NoSQL ("Not Only SQL") refers to a class of next-generation databases, being non-relational, distributed, open-source, and horizontally scalable, that provide a mechanism for the storage and retrieval of documents. Previously, we stored and retrieved data using SQL queries; here we use MongoDB, meaning we do not use MySQL or SQL queries. Documents are imported directly into a folder (drive) and retrieved from it without SQL queries, using Java's IO BufferedReader and BufferedWriter: a BufferedReader for importing document files into the folder, and a BufferedWriter for retrieving document files from that folder or drive. Security is needed for the stored files because a document kept in a local folder could otherwise be viewed and modified by anyone. To prevent this, the original document files are converted to another format; in this paper, a binary format is used. After conversion to binary format, a document is stored directly in a folder, and the storage component provides a private key for accessing that file. Any user who tries to open the document file sees only binary data; the document's owner views the original format by using the personal key, received as a secret key from the cloud.
The Design and Usage of the New Data Management Features in NASTRAN
NASA Technical Reports Server (NTRS)
Pamidi, P. R.; Brown, W. K.
1984-01-01
Two new data management features are installed in the April 1984 release of NASTRAN. These two features are the Rigid Format Data Base and the READFILE capability. The Rigid Format Data Base is stored on external files in card image format and can be easily maintained and expanded by the use of standard text editors. This data base provides the user and the NASTRAN maintenance contractor with an easy means for making changes to a Rigid Format or for generating new Rigid Formats without unnecessary compilations and link editing of NASTRAN. Each Rigid Format entry in the data base contains the Direct Matrix Abstraction Program (DMAP), along with the associated restart, DMAP sequence subset and substructure control flags. The READFILE capability allows a user to reference an external secondary file from the NASTRAN primary input file and to read data from this secondary file. There is no limit to the number of external secondary files that may be referenced and read.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sublet, J.-Ch.; Koning, A.J.; Forrest, R.A.
The reasons for the conversion of the European Activation File, EAF, into ENDF-6 format are threefold. First, it significantly enhances the JEFF-3.0 release by the addition of an activation file. Second, it considerably increases its usage by using a recognized, official file format, allowing existing plug-in processes to be effective. Third, it moves towards a universal nuclear data file, in contrast to the current separate general- and special-purpose files. The format chosen for the JEFF-3.0/A file uses reaction cross sections (MF-3), cross sections (MF-10), and multiplicities (MF-9). Having the data in ENDF-6 format allows the ENDF suite of utilities and checker codes to be used alongside many other utility, visualizing, and processing codes. It is based on the EAF activation file used for many applications from fission to fusion, including dosimetry, inventories, depletion-transmutation, and geophysics. JEFF-3.0/A takes advantage of four generations of EAF files. Extensive benchmarking activities on these files provide feedback and validation with integral measurements. These, in parallel with a detailed graphical analysis based on EXFOR, have been applied, stimulating new measurements and significantly increasing the quality of this activation file. The next step is to include the EAF uncertainty data for all channels into JEFF-3.0/A.
FRS Geospatial Return File Format
The Geospatial Return File Format describes the format that must be used to submit latitude and longitude coordinates for use in Envirofacts mapping applications. These coordinates are stored in the Geospatial Reference Tables.
SEDIMENT DATA - COMMENCEMENT BAY HYLEBOS WATERWAY - TACOMA, WA - PRE-REMEDIAL DESIGN PROGRAM
Event 1A/1B Data Files URL address: http://www.epa.gov/r10earth/datalib/superfund/hybos1ab.htm. Sediment Chemistry Data (Database Format): HYBOS1AB.EXE is a self-extracting file which expands to the single-value per record .DBF format database file HYBOS1AB.DBF. This file contai...
76 FR 5431 - Released Rates of Motor Common Carriers of Household Goods
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-31
... may be submitted either via the Board's e-filing format or in traditional paper format. Any person using e-filing should attach a document and otherwise comply with the instructions at the E-FILING link on the Board's website at http://www.stb.dot.gov . Any person submitting a filing in the traditional...
75 FR 52054 - Assessment of Mediation and Arbitration Procedures
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-24
...: Comments may be submitted either via the Board's e-filing format or in the traditional paper format. Any person using e-filing should attach a document and otherwise comply with the instructions at the E-FILING link on the Board's Web site, at http://www.stb.dot.gov . Any person submitting a filing in the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-01
... need to submit a photo for a child who is already a U.S. citizen or a Legal Permanent Resident. Group... Joint Photographic Experts Group (JPEG) format; it must have a maximum image file size of two hundred... (dpi); the image file format in Joint Photographic Experts Group (JPEG) format; the maximum image file...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... already a U.S. citizen or a Lawful Permanent Resident, but you will not be penalized if you do. Group... specifications: Image File Format: The image must be in the Joint Photographic Experts Group (JPEG) format. Image... in the Joint Photographic Experts Group (JPEG) format. Image File Size: The maximum image file size...
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-05
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
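Since Photon-HDF5 builds on HDF5, photon streams can be written with any HDF5 library. The sketch below uses h5py with a deliberately simplified layout; the group and field names are illustrative only, and compliant files should be produced with the phconvert reference library against the official schema.

import h5py
import numpy as np

# Simplified illustration of timestamp-based photon data in HDF5.
# Group/field names are illustrative, not the official Photon-HDF5
# schema; use phconvert to produce compliant files.
timestamps = np.sort(np.random.randint(0, 10**8, size=1000))
detectors = np.random.randint(0, 2, size=1000).astype(np.uint8)

with h5py.File("photons_demo.h5", "w") as f:
    grp = f.create_group("photon_data")
    grp.create_dataset("timestamps", data=timestamps)
    grp.create_dataset("detectors", data=detectors)
    grp["timestamps"].attrs["clock_frequency_hz"] = 80e6  # illustrative metadata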
Ingargiola, A.; Laurence, T. A.; Boutelle, R.; ...
2015-12-23
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert) to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale
NASA Astrophysics Data System (ADS)
Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason
2015-03-01
The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based tool for converting proprietary file formats, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities make OMERO and Bio-Formats especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy, which routinely create large datasets that must be managed and analyzed.
Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.
Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory
2016-06-13
Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
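The kind of check Keemei automates can be sketched in a few lines: given a QIIME-style sample metadata mapping file (tab-separated, first header cell #SampleID), verify the header and the uniqueness of sample IDs. This toy validator only hints at Keemei's actual rule set.

import csv

def validate_mapping(path):
    # Toy QIIME mapping-file check: header starts with #SampleID and
    # sample IDs are unique. Not Keemei's actual validation rules.
    errors = []
    with open(path, newline="") as fh:
        rows = list(csv.reader(fh, delimiter="\t"))
    if not rows or not rows[0] or rows[0][0] != "#SampleID":
        errors.append("first header cell must be #SampleID")
    ids = [r[0] for r in rows[1:] if r]
    dupes = sorted({i for i in ids if ids.count(i) > 1})
    if dupes:
        errors.append("duplicate sample IDs: %s" % dupes)
    return errors

print(validate_mapping("mapping.tsv"))  # hypothetical file; [] means it passed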
A Python library for FAIRer access and deposition to the Metabolomics Workbench Data Repository.
Smelter, Andrey; Moseley, Hunter N B
2018-01-01
The Metabolomics Workbench Data Repository is a public repository of mass spectrometry and nuclear magnetic resonance data and metadata derived from a wide variety of metabolomics studies. The data and metadata for each study is deposited, stored, and accessed via files in the domain-specific 'mwTab' flat file format. In order to improve the accessibility, reusability, and interoperability of the data and metadata stored in 'mwTab' formatted files, we implemented a Python library and package. This Python package, named 'mwtab', is a parser for the domain-specific 'mwTab' flat file format, which provides facilities for reading, accessing, and writing 'mwTab' formatted files. Furthermore, the package provides facilities to validate both the format and required metadata elements of a given 'mwTab' formatted file. In order to develop the 'mwtab' package we used the official 'mwTab' format specification. We used Git version control along with Python unit-testing framework as well as continuous integration service to run those tests on multiple versions of Python. Package documentation was developed using sphinx documentation generator. The 'mwtab' package provides both Python programmatic library interfaces and command-line interfaces for reading, writing, and validating 'mwTab' formatted files. Data and associated metadata are stored within Python dictionary- and list-based data structures, enabling straightforward, 'pythonic' access and manipulation of data and metadata. Also, the package provides facilities to convert 'mwTab' files into a JSON formatted equivalent, enabling easy reusability of the data by all modern programming languages that implement JSON parsers. The 'mwtab' package implements its metadata validation functionality based on a pre-defined JSON schema that can be easily specialized for specific types of metabolomics studies. The library also provides a command-line interface for interconversion between 'mwTab' and JSONized formats in raw text and a variety of compressed binary file formats. The 'mwtab' package is an easy-to-use Python package that provides FAIRer utilization of the Metabolomics Workbench Data Repository. The source code is freely available on GitHub and via the Python Package Index. Documentation includes a 'User Guide', 'Tutorial', and 'API Reference'. The GitHub repository also provides 'mwtab' package unit-tests via a continuous integration service.
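Based on the interfaces described in this abstract, reading a study file and converting it to JSON with the 'mwtab' package might look like the sketch below. The entry-point names follow the package's documentation but should be treated as assumptions, and the study file name is hypothetical.

import mwtab

# Read a mwTab-formatted study file and write its JSON equivalent.
# read_files()/write() follow the package docs; treat as assumptions.
for mwfile in mwtab.read_files("ST000001.txt"):
    print(mwfile.study_id)
    with open("ST000001.json", "w") as outfile:
        mwfile.write(outfile, file_format="json")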
Accelerating Malware Detection via a Graphics Processing Unit
2010-09-01
Acronyms: GPU, Graphics Processing Unit; PE, Portable Executable; COFF, Common Object File Format. ...operating systems for the future [Szo05]. The PE format is an updated version of the common object file format (COFF) [Mic06]. Microsoft released a new... [NAs02]. These alerts can be costly in terms of time and resources for individuals and organizations to investigate each misidentified file [YWL07] [Vak10
Smelter, Andrey; Astra, Morgan; Moseley, Hunter N B
2017-03-17
The Biological Magnetic Resonance Data Bank (BMRB) is a public repository of Nuclear Magnetic Resonance (NMR) spectroscopic data of biological macromolecules. It is an important resource for many researchers using NMR to study structural, biophysical, and biochemical properties of biological macromolecules. It is primarily maintained and accessed in a flat file ASCII format known as NMR-STAR. While the format is human readable, the size of most BMRB entries makes computer readability and explicit representation a practical requirement for almost any rigorous systematic analysis. To aid in the use of this public resource, we have developed a package called nmrstarlib in the popular open-source programming language Python. The nmrstarlib's implementation is very efficient, both in design and execution. The library has facilities for reading and writing both NMR-STAR version 2.1 and 3.1 formatted files, parsing them into usable Python dictionary- and list-based data structures, making access and manipulation of the experimental data very natural within Python programs (i.e. "saveframe" and "loop" records represented as individual Python dictionary data structures). Another major advantage of this design is that data stored in original NMR-STAR can be easily converted into its equivalent JavaScript Object Notation (JSON) format, a lightweight data interchange format, facilitating data access and manipulation using Python and any other programming language that implements a JSON parser/generator (i.e., all popular programming languages). We have also developed tools to visualize assigned chemical shift values and to convert between NMR-STAR and JSONized NMR-STAR formatted files. Full API Reference Documentation, User Guide and Tutorial with code examples are also available. We have tested this new library on all current BMRB entries: 100% of all entries are parsed without any errors for both NMR-STAR version 2.1 and version 3.1 formatted files. We also compared our software to three currently available Python libraries for parsing NMR-STAR formatted files: PyStarLib, NMRPyStar, and PyNMRSTAR. The nmrstarlib package is a simple, fast, and efficient library for accessing data from the BMRB. The library provides an intuitive dictionary-based interface with which Python programs can read, edit, and write NMR-STAR formatted files and their equivalent JSONized NMR-STAR files. The nmrstarlib package can be used as a library for accessing and manipulating data stored in NMR-STAR files and as a command-line tool to convert from NMR-STAR file format into its equivalent JSON file format and vice versa, and to visualize chemical shift values. Furthermore, the nmrstarlib implementation provides a guide for effectively JSONizing other older scientific formats, improving the FAIRness of data in these formats.
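A usage sketch based on the interface described in this abstract: iterate over the saveframes of a BMRB entry parsed by nmrstarlib. The read_files() generator follows the library's documentation, but the exact names should be treated as assumptions, and the entry file name is hypothetical.

from nmrstarlib import nmrstarlib

# Each parsed StarFile behaves like a nested dictionary whose keys
# are saveframe names; values hold the saveframe's fields and loops.
for starfile in nmrstarlib.read_files("bmr18569.str"):
    for saveframe_name in starfile:
        print(saveframe_name)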
Ischaemic stroke management at Al-Shifa Hospital in the Gaza Strip: a clinical audit.
Abukaresh, Amir; Al-Abadlah, Rami; Böttcher, Bettina; El-Essi, Khamis
2018-02-21
In the 2014 Palestinian annual health report, cerebrovascular accident was ranked as the third leading cause of death in the occupied Palestinian territory. Cerebrovascular accident is also one of the most common causes of disability worldwide. Good management decreases mortality and morbidity. The aim of this study was to assess the current management of patients with ischaemic stroke at the Al-Shifa Hospital and to compare this with international guidelines. For this clinical audit, we used simple random sampling to select files of patients admitted with the diagnosis of ischaemic stroke to the Al-Shifa Hospital. Data collection sheets were completed, and clinical practice was compared with the 2013 American Stroke Association guidelines. Between January and June, 2016, 254 patients were admitted with ischaemic stroke, haemorrhagic stroke, or transient ischaemic attack. We selected 55 patient files. The International Classification of Diseases coding for cerebral infarction in patient files was relatively good, with 92% of files correctly coded. However, we found a substantial weakness in the documentation of duration, progression of symptoms (documented in 20% of files only), and physiotherapy assessment. Most essential acute investigations were done on time (for all [100%] patients needing blood count, renal function tests, and CT scan and for 42 [76%] patients needing ECG). However, thrombolytic drugs were not used because they were not available. Long-term antiplatelet therapy was provided properly to 51 (92%) patients discharged from hospital. However, the initial doses of antiplatelet therapy were generally lower than the international recommendations. Findings also showed a marked nonconformity in blood pressure management, especially with respect to the treatment decision and the choice of antihypertensive drug. No local guidelines exist. Furthermore, the lack of availability of thrombolysis medication and the marked deviation in blood pressure management show a lack of evidence-based practice. These findings point to the urgent need for the development of local, evidence-based guidelines. None. Copyright © 2018 Elsevier Ltd. All rights reserved.
Digital geologic map of the Butler Peak 7.5' quadrangle, San Bernardino County, California
Miller, Fred K.; Matti, Jonathan C.; Brown, Howard J.; digital preparation by Cossette, P. M.
2000-01-01
Open-File Report 00-145 is a digital geologic map database of the Butler Peak 7.5' quadrangle that includes (1) ARC/INFO (Environmental Systems Research Institute) version 7.2.1 Patch 1 coverages and associated tables, (2) a Portable Document Format (.pdf) file of the Description of Map Units, Correlation of Map Units chart, and an explanation of symbols used on the map, btlrpk_dcmu.pdf, (3) a Portable Document Format file of this Readme, btlrpk_rme.pdf (the Readme is also included as an ASCII file in the data package), and (4) a PostScript plot file of the map, Correlation of Map Units, and Description of Map Units on a single sheet, btlrpk.ps. No paper map is included in the Open-File report, but the PostScript plot file (number 4 above) can be used to produce one. The PostScript plot file generates a map, peripheral text, and diagrams in the editorial format of USGS Geologic Investigation Series (I-series) maps.
Filing Reprints: A Simple System For The Family Physician
Berner, Mark
1978-01-01
This flexible method of filing medical literature without using cards is based on the International Classification of Health Problems in Primary Care. Articles, reprints, notes of lectures and rounds, etc. are filed in manila folders according to a few simple guidelines. This system has proved to be practical and efficient, can be modified for individual needs, and once established requires little time to maintain. PMID:20469289
MXA: a customizable HDF5-based data format for multi-dimensional data sets
NASA Astrophysics Data System (ADS)
Jackson, M.; Simmons, J. P.; De Graef, M.
2010-09-01
A new digital file format is proposed for the long-term archival storage of experimental data sets generated by serial sectioning instruments. The format is known as the multi-dimensional eXtensible Archive (MXA) format and is based on the public domain Hierarchical Data Format (HDF5). The MXA data model, its description by means of an eXtensible Markup Language (XML) file with associated Document Type Definition (DTD) are described in detail. The public domain MXA package is available through a dedicated web site (mxa.web.cmu.edu), along with implementation details and example data files.
NASA Astrophysics Data System (ADS)
Northup, E. A.; Kusterer, J.; Quam, B.; Chen, G.; Early, A. B.; Beach, A. L., III
2015-12-01
The current ICARTT file format standards were developed to fulfill the data management needs of the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) campaign in 2004. The goal of the ICARTT file format was to establish a common and simple-to-use data file format to promote data exchange and collaboration among science teams with similar science objectives. ICARTT has been the NASA standard since 2010, and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Despite its level of acceptance, there are a number of issues with the current ICARTT format, especially concerning machine readability. To enhance usability, the ICARTT Refresh Earth Science Data Systems Working Group (ESDSWG) was established to provide a platform for atmospheric science data producers, users (e.g., modelers), and data managers to collaborate on developing criteria for this file format. Ultimately, this is a cross-agency effort to improve and aggregate the metadata records being produced. After conducting a survey to identify deficiencies in the current format, we determined which are considered most important to the various communities. Numerous recommendations were made to improve upon the file format while maintaining backward compatibility. The recommendations made to date and their advantages and limitations will be discussed.
75 FR 29908 - Prothioconazole; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-28
....gpoaccess.gov/ecfr . To access the harmonized test guidelines referenced in this document electronically, please go http://www.epa.gov/ocspp and select ``Test Methods and Guidelines.'' C. Can I File an Objection... dietary exposures and all other exposures for which there is reliable information.'' This includes...
NASA Standard for Airborne Data: ICARTT Format ESDS-RFC-019
NASA Astrophysics Data System (ADS)
Thornhill, A.; Brown, C.; Aknan, A.; Crawford, J. H.; Chen, G.; Williams, E. J.
2011-12-01
Airborne field studies generate a plethora of data products in the effort to study atmospheric composition and processes. Data file formats for airborne field campaigns are designed to present data in an understandable and organized way to support collaboration and to document relevant and important meta data. The ICARTT file format was created to facilitate data management during the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) campaign in 2004 that involved government-agencies and university participants from five countries. Since this mission the ICARTT format has been used in subsequent field campaigns such as Polar Study Using Aircraft Remote Sensing, Surface Measurements and Models of Climates, Chemistry, Aerosols, and Transport (POLARCAT) and the first phase of Deriving Information on Surface Conditions from COlumn and VERtically Resolved Observations Relevant to Air Quality (DISCOVER-AQ). The ICARTT file format has been endorsed as a standard format for airborne data by the Standard Process Group (SPG), one of the Earth Science Data Systems Working Groups (ESDSWG) in 2010. The detailed description of the ICARTT format can be found at http://www-air.larc.nasa.gov/missions/etc/ESDS-RFC-019-v1.00.pdf. The ICARTT data format is an ASCII, comma delimited format that was based on the NASA Ames and GTE file formats. The file header is detailed enough to fully describe the data for users outside of the instrument group and includes a description of the meta data. The ICARTT scanning tools, format structure, implementations, and examples will be presented.
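Because an ICARTT file announces its own header length on line 1 (the header line count, then the format index, comma-delimited) and by convention ends the header with a comma-delimited list of column names, skipping to the data is mechanical. A minimal reader sketch; the file name is hypothetical:

def read_icartt(path):
    # Line 1: "NLHEAD, FFI". Data begin on line NLHEAD + 1; by ICARTT
    # convention the last header line names the data columns.
    with open(path) as fh:
        lines = fh.read().splitlines()
    n_header = int(lines[0].split(",")[0])
    columns = [c.strip() for c in lines[n_header - 1].split(",")]
    data = [[float(v) for v in row.split(",")]
            for row in lines[n_header:] if row.strip()]
    return columns, data

cols, rows = read_icartt("DISCOVERAQ-O3_20110701.ict")  # hypothetical file
print(cols[:3], len(rows))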
Dependency Tree Annotation Software
2015-11-01
formats, and it provides numerous options for customizing how dependency trees are displayed. Built entirely in Java, it can run on a wide range of...tree can be saved as an image, .mxe (an mxGraph editing file), a .conll file, and several other file formats. DTE uses the open source Java version
76 FR 68161 - Marine Mammals; File Nos. 16163, 16160, and 15569
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-03
... guidelines and regulations through the Soundwatch program. Research methods would include close vessel... for Whale Research (CWR; Kenneth C. Balcomb III, Responsible Party) [File No. 15569], PO Box 1577, Friday Harbor, WA 98250, have applied in due form for permits to conduct research on marine mammals...
Representation of thermal infrared imaging data in the DICOM using XML configuration files.
Ruminski, Jacek
2007-01-01
The DICOM standard has become a widely accepted and implemented format for the exchange and storage of medical imaging data. Different imaging modalities are supported; however, there is no dedicated solution for thermal infrared imaging in medicine. In this article we propose new ideas and improvements to the final proposal of the new DICOM Thermal Infrared Imaging structures and services. Additionally, we designed, implemented, and tested software packages for universal conversion of existing thermal imaging files to the DICOM format using XML configuration files. The proposed solution works fast and requires a minimal number of user interactions. The XML configuration file makes it possible to compose a set of attributes for any source file format of a thermal imaging camera.
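The role of the XML configuration file can be illustrated with a short sketch: parse a mapping of source-file fields to DICOM attribute tags and apply it during conversion. The XML layout and field names below are invented for illustration; the actual configuration schema is defined by the authors' software.

import xml.etree.ElementTree as ET

# Hypothetical configuration mapping source fields to DICOM tags.
CONFIG = """<mapping>
  <attr tag="(0010,0010)" source="patient_name"/>
  <attr tag="(0008,0060)" source="modality"/>
</mapping>"""

source_fields = {"patient_name": "DOE^JOHN", "modality": "TG"}  # TG = thermography

root = ET.fromstring(CONFIG)
dicom_attrs = {a.get("tag"): source_fields[a.get("source")]
               for a in root.findall("attr")}
print(dicom_attrs)  # {'(0010,0010)': 'DOE^JOHN', '(0008,0060)': 'TG'}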
Lapham, W.W.; Wilde, F.D.; Koterba, M.T.
1997-01-01
This is the first of a two-part report to document guidelines and standard procedures of the U.S. Geological Survey for the acquisition of data in ground-water-quality studies. This report provides guidelines and procedures for the selection and installation of wells for water-quality studies, and the required or recommended supporting documentation of these activities. Topics include (1) documentation needed for well files, field folders, and electronic files; (2) criteria and information needed for the selection of water-supply and observation wells, including site inventory and data collection during field reconnaissance; and (3) criteria and preparation for installation of monitoring wells, including the effects of equipment and materials on the chemistry of ground-water samples, a summary of drilling and coring methods, and information concerning well completion, development, and disposition.
77 FR 56782 - Bifenthrin; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
..., hay. Interregional Research Project Number 4 (IR-4) requested these tolerances under the Federal Food... OCSPP test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select ``Test Methods and Guidelines.'' C. How can I file an objection or hearing request? Under...
78 FR 55635 - Prometryn; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-11
.... Interregional Research Project Number 4 (IR-4) requested these tolerances under the Federal Food, Drug, and... test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select ``Test Methods and Guidelines.'' C. How can I file an objection or hearing request? Under FFDCA...
Creating databases for biological information: an introduction.
Stein, Lincoln
2002-08-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, and relational databases, as well as ACeDB. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system.
Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine
Liu, Xiaobing
2016-09-21
This calculation engine estimates the technoeconomic feasibility of transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files reside in the same folder as geotool.exe. To use the tool, prepare an input file containing adequate information for the case in the format explained below and place it in the same folder as geotool.exe. Then execute geotool.exe, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
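Given the fixed input.txt/output.txt convention described above, the tool can be driven from a script. A minimal sketch, assuming geotool.exe and a prepared input.txt already sit in the (hypothetical) install folder:

import subprocess
from pathlib import Path

workdir = Path("C:/tge")  # hypothetical folder holding geotool.exe and input.txt
# The tool reads input.txt and writes output.txt in its own folder,
# so run it with that folder as the working directory.
subprocess.run([str(workdir / "geotool.exe")], cwd=workdir, check=True)
print((workdir / "output.txt").read_text())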
Devale, Madhuri R; Mahesh, M C; Bhandary, Shreetha
2017-01-01
Stresses generated during root canal instrumentation have been reported to cause apical cracks. The smaller, less pronounced defects like cracks can later propagate into vertical root fracture, when the tooth is subjected to repeated stresses from endodontic or restorative procedures. This study evaluated occurrence of apical cracks with stainless steel hand files, rotary NiTi RaCe and K3 files at two different instrumentation lengths. In the present in vitro study, 60 mandibular premolars were mounted in resin blocks with simulated periodontal ligament. Apical 3 mm of the root surfaces were exposed and stained using India ink. Preoperative images of root apices were obtained at 100x using stereomicroscope. The teeth were divided into six groups of 10 each. First two groups were instrumented with stainless steel files, next two groups with rotary NiTi RaCe files and the last two groups with rotary NiTi K3 files. The instrumentation was carried out till the apical foramen (Working Length-WL) and 1 mm short of the apical foramen (WL-1) with each file system. After root canal instrumentation, postoperative images of root apices were obtained. Preoperative and postoperative images were compared and the occurrence of cracks was recorded. Descriptive statistical analysis and Chi-square tests were used to analyze the results. Apical root cracks were seen in 30%, 35% and 20% of teeth instrumented with K-files, RaCe files and K3 files respectively. There was no statistical significance among three instrumentation systems in the formation of apical cracks (p=0.563). Apical cracks were seen in 40% and 20% of teeth instrumented with K-files; 60% and 10% of teeth with RaCe files and 40% and 0% of teeth with K3 files at WL and WL-1 respectively. For groups instrumented with hand files there was no statistical significance in number of cracks at WL and WL-1 (p=0.628). But for teeth instrumented with RaCe files and K3 files significantly more number of cracks were seen at WL than WL-1 (p=0.057 for RaCe files and p=0.087 for K3 files). There was no statistical significance between stainless steel hand files and rotary files in terms of crack formation. Instrumentation length had a significant effect on the formation of cracks when rotary files were used. Using rotary instruments 1 mm short of apical foramen caused lesser crack formation. But, there was no statistically significant difference in number of cracks formed with hand files at two instrumentation levels.
15 CFR 995.26 - Conversion of NOAA ENC ® files to other formats.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Conversion of NOAA ENC files to other formats—(1) Content. CEVAD may provide NOAA ENC data in forms other... data files without degradation to positional accuracy or informational content. (2) Software certification. Conversion of NOAA ENC data to other formats must be accomplished within the constraints of IHO...
Image Size Variation Influence on Corrupted and Non-viewable BMP Image
NASA Astrophysics Data System (ADS)
Azmi, Tengku Norsuhaila T.; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Hamid, Isredza Rahmi A.; Chai Wen, Chuah
2017-08-01
Images are one of the evidence components sought in digital forensics. The Joint Photographic Experts Group (JPEG) format is the most popular on the Internet because JPEG compression is lossy and produces small files, which speeds up transmission. However, corrupted JPEG images are hard to recover due to the complexity of determining the corruption point. Nowadays, Bitmap (BMP) images are preferred in image processing over other formats because a BMP image contains all of the image information in a simple format. Therefore, in order to investigate the corruption point in a JPEG, the file must first be converted into BMP format. Nevertheless, many things can corrupt a BMP image, such as changes to the image size that make the file non-viewable. In this paper, the experiment indicates that the size recorded in a BMP file influences changes in the image itself through three conditions: deletion, replacement, and insertion. From the experiment, we learned that correcting the file size can produce a viewable, if partial, file. The file can then be investigated further to identify the corruption point.
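The "correcting the file size" repair described here targets the 4-byte little-endian size field at offset 2 of the BMP file header. A minimal sketch; the image name is hypothetical:

import os
import struct

def fix_bmp_size(path):
    # Rewrite the BMP header's file-size field (offset 2, uint32 LE)
    # to match the file's actual on-disk size.
    actual = os.path.getsize(path)
    with open(path, "r+b") as f:
        if f.read(2) != b"BM":
            raise ValueError("not a BMP file")
        f.seek(2)
        f.write(struct.pack("<I", actual))
    return actual

print(fix_bmp_size("evidence.bmp"))  # hypothetical corrupted image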
75 FR 4284 - Triticonazole; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... harmonized test guidelines referenced in this document electronically, please go to http://www.epa.gov/oppts and select ``Test Methods & Guidelines'' on the left-side navigation menu. C. Can I File an Objection..., P.O. Box 13528, Research Triangle Park, NC 27709-3528. The petition requested that 40 CFR 180.583 be...
78 FR 78740 - Isopyrazam; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
... access the OCSPP test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select ``Test Methods and Guidelines.'' C. How can I file an objection or hearing... validity, completeness, and reliability as well as the relationship of the results of the studies to human...
UFO (UnFold Operator) default data format
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kissel, L.; Biggs, F.; Marking, T.R.
The default format for the storage of x,y data for use with the UFO code is described. The format assumes that the data stored in a file is a matrix of values; two columns of this matrix are selected to define a function of the form y = f(x). This format is specifically designed to allow for easy importation of data obtained from other sources, or easy entry of data using a text editor, with a minimum of reformatting. This format is flexible and extensible through the use of inline directives stored in the optional header of the file. A special extension of the format implements encoded data, which significantly reduces the storage required as compared with the unencoded form. UFO supports several extensions to the file specification that implement execute-time operations, such as transformation of the x and/or y values, selection of specific columns of the matrix for association with the x and y values, input of data directly from other formats (e.g., DAMP and PFF), and a simple type of library-structured file format. Several examples of the use of the format are given.
McDonald, Daniel; Clemente, Jose C; Kuczynski, Justin; Rideout, Jai Ram; Stombaugh, Jesse; Wendel, Doug; Wilke, Andreas; Huse, Susan; Hufnagle, John; Meyer, Folker; Knight, Rob; Caporaso, J Gregory
2012-07-12
We present the Biological Observation Matrix (BIOM, pronounced "biome") format: a JSON-based file format for representing arbitrary observation by sample contingency tables with associated sample and observation metadata. As the number of categories of comparative omics data types (collectively, the "ome-ome") grows rapidly, a general format to represent and archive this data will facilitate the interoperability of existing bioinformatics tools and future meta-analyses. The BIOM file format is supported by an independent open-source software project (the biom-format project), which initially contains Python objects that support the use and manipulation of BIOM data in Python programs, and is intended to be an open development effort where developers can submit implementations of these objects in other programming languages. The BIOM file format and the biom-format project are steps toward reducing the "bioinformatics bottleneck" that is currently being experienced in diverse areas of biological sciences, and will help us move toward the next phase of comparative omics where basic science is translated into clinical and environmental applications. The BIOM file format is currently recognized as an Earth Microbiome Project Standard, and as a Candidate Standard by the Genomic Standards Consortium.
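Because BIOM 1.0 is JSON-based, a minimal table can be assembled with nothing but the standard library. The sketch below follows the published BIOM 1.0 field names, with made-up observations and samples:

import json
from datetime import datetime, timezone

# A minimal sparse BIOM 1.0-style table; ids and counts are made up.
table = {
    "id": "demo-table",
    "format": "Biological Observation Matrix 1.0.0",
    "format_url": "http://biom-format.org",
    "type": "OTU table",
    "generated_by": "example script",
    "date": datetime.now(timezone.utc).isoformat(),
    "matrix_type": "sparse",
    "matrix_element_type": "int",
    "shape": [2, 3],  # 2 observations x 3 samples
    "rows": [{"id": "OTU1", "metadata": None},
             {"id": "OTU2", "metadata": None}],
    "columns": [{"id": "S1", "metadata": None},
                {"id": "S2", "metadata": None},
                {"id": "S3", "metadata": None}],
    "data": [[0, 0, 5], [1, 2, 3]],  # [row, column, count] triplets
}

with open("demo.biom", "w") as fh:
    json.dump(table, fh)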
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-26
... applications or print-to-PDF format, and not in a scanned format, at http://www.ferc.gov/docs-filing/efiling....3d 1342 (DC Cir. 2009). Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... applications or print-to-PDF format and not in a scanned format. Commenters filing electronically do not need...
File Server-Based CD-ROM Networking: Using SCSI Express.
ERIC Educational Resources Information Center
McQueen, Howard
1992-01-01
Provides guidelines for evaluating SCSI Express Novell 386, a new product allowing CD-ROM drives to be attached to a Netware 3.11 file server, increasing CD-ROM networking capability. Specific limitations concerning software, hardware, and human resources are outlined, as well as its unique features and potential for future networking uses. (EA)
WHO Expert Committee on specifications for pharmaceutical preparations.
2010-01-01
The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use: good practices for pharmaceutical quality control laboratories; supplementary guidelines for active pharmaceutical ingredients; good manufacturing practices for pharmaceutical products containing hazardous substances; good manufacturing practices for sterile pharmaceutical products; good distribution practices for pharmaceutical products; guidelines on the requalification of prequalified dossiers; and guidelines for the preparation of a contract research organization master file.
NetpathXL - An Excel Interface to the Program NETPATH
Parkhurst, David L.; Charlton, Scott R.
2008-01-01
NetpathXL is a revised version of NETPATH that runs under Windows® operating systems. NETPATH is a computer program that uses inverse geochemical modeling techniques to calculate net geochemical reactions that can account for changes in water composition between initial and final evolutionary waters in hydrologic systems. The inverse models also can account for the isotopic composition of waters and can be used to estimate radiocarbon ages of dissolved carbon in ground water. NETPATH relies on an auxiliary database program, DB, to enter the chemical analyses and to perform speciation calculations that define total concentrations of elements, charge balance, and redox state of aqueous solutions that are then used in inverse modeling. Instead of DB, NetpathXL relies on Microsoft Excel® to enter the chemical analyses. The speciation calculation formerly included in DB is implemented within the program NetpathXL. A program DBXL can be used to translate files from the old DB format (.lon files) to NetpathXL spreadsheets, or to create new NetpathXL spreadsheets. Once users have a NetpathXL spreadsheet with the proper format, new spreadsheets can be generated by copying or saving NetpathXL spreadsheets. In addition, DBXL can convert NetpathXL spreadsheets to PHREEQC input files. New capabilities in PHREEQC (version 2.15) allow solution compositions to be written to a .lon file, and inverse models developed in PHREEQC to be written as NetpathXL .pat and model files. NetpathXL can open NetpathXL spreadsheets, NETPATH-format path files (.pat files), and NetpathXL-format path files (.pat files). Once the speciation calculations have been performed on a spreadsheet file or a .pat file has been opened, the NetpathXL calculation engine is identical to the original NETPATH. Development of models and viewing results in NetpathXL rely on keyboard entry as in NETPATH.
17 CFR 232.202 - Continuing hardship exemption.
Code of Federal Regulations, 2010 CFR
2010-04-01
... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...
17 CFR 232.202 - Continuing hardship exemption.
Code of Federal Regulations, 2013 CFR
2013-04-01
... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...
17 CFR 232.202 - Continuing hardship exemption.
Code of Federal Regulations, 2012 CFR
2012-04-01
... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...
17 CFR 232.202 - Continuing hardship exemption.
Code of Federal Regulations, 2014 CFR
2014-04-01
... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...
17 CFR 232.202 - Continuing hardship exemption.
Code of Federal Regulations, 2011 CFR
2011-04-01
... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...
Data Science Bowl Launched to Improve Lung Cancer Screening | Division of Cancer Prevention
[[{"fid":"2078","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Data Science Bowl Logo","field_file_image_title_text[und][0][value]":"Data Science Bowl Logo","field_folder[und]":"76"},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"Data Science Bowl
Code of Federal Regulations, 2014 CFR
2014-04-01
... submit a public version of a database in pdf format. The public version of the database must be publicly... interested party that files with the Department a request for an expedited antidumping review, an..., whichever is later. If the interested party that files the request is unable to locate a particular exporter...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2010 CFR
2010-10-01
... Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign file numbers to electronic... information, see The International Bureau Filing System File Number Format Public Notice, DA-04-568 (released... 47 Telecommunication 1 2010-10-01 2010-10-01 false What are IBFS file numbers? 1.10008 Section 1...
47 CFR 1.10008 - What are IBFS file numbers?
Code of Federal Regulations, 2011 CFR
2011-10-01
... Bureau Filing System § 1.10008 What are IBFS file numbers? (a) We assign file numbers to electronic... information, see The International Bureau Filing System File Number Format Public Notice, DA-04-568 (released... 47 Telecommunication 1 2011-10-01 2011-10-01 false What are IBFS file numbers? 1.10008 Section 1...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-22
... print-to-PDF format and not in a scanned format. Mail/Hand Delivery: Commenters unable to file comments.... FERC, 564 F.3d 1342 (DC Cir. 2009). 3. In March 2007, the Commission issued Order No. 693, evaluating... should be filed in native applications or print-to-PDF format and not in a scanned format. Commenters...
Code of Federal Regulations, 2010 CFR
2010-10-01
... recording under § 67.200 may be submitted in portable document format (.pdf) as an attachment to electronic... submitted for filing in .pdf format pertains to a vessel that is not a currently documented vessel, a... with the National Vessel Documentation Center or must be submitted in .pdf format with the instrument...
Creating databases for biological information: an introduction.
Stein, Lincoln
2013-06-01
The essence of bioinformatics is dealing with large quantities of information. Whether it be sequencing data, microarray data files, mass spectrometric data (e.g., fingerprints), the catalog of strains arising from an insertional mutagenesis project, or even large numbers of PDF files, there inevitably comes a time when the information can simply no longer be managed with files and directories. This is where databases come into play. This unit briefly reviews the characteristics of several database management systems, including flat file, indexed file, relational databases, and NoSQL databases. It compares their strengths and weaknesses and offers some general guidelines for selecting an appropriate database management system. Copyright 2013 by John Wiley & Sons, Inc.
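To make the flat-file versus relational trade-off the unit describes concrete, here is a hedged sketch contrasting a linear scan over a CSV with the equivalent SQL query; the file and column names are invented for illustration.

```python
import csv, sqlite3

# Flat-file approach: answering "which strains carry gene abcA?" means a linear scan.
with open("strains.csv", "w", newline="") as fh:
    csv.writer(fh).writerows([("strain", "gene"), ("S1", "abcA"), ("S2", "xyzB")])
with open("strains.csv") as fh:
    flat_hits = [row["strain"] for row in csv.DictReader(fh) if row["gene"] == "abcA"]

# Relational approach: the same question becomes a declarative, indexable query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE strains (strain TEXT, gene TEXT)")
db.executemany("INSERT INTO strains VALUES (?, ?)", [("S1", "abcA"), ("S2", "xyzB")])
sql_hits = [r[0] for r in db.execute("SELECT strain FROM strains WHERE gene = ?", ("abcA",))]

print(flat_hits, sql_hits)   # both: ['S1']
```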
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-19
... consolidated last sale price by the same specified percentage. The numerical percentage proposed to be used in the Trading Collar price calculations will be equal to the appropriate ``numerical guideline... Collars will automatically be adjusted to match any future changes in the numerical guidelines in NYSE...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
...." Second, the Exchange proposes replacing existing paragraph (c)(4) of Rule 52.4, entitled "Numerical... eliminate the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1... existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical Guidelines...
75 FR 33190 - Trifloxystrobin; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-11
....gpoaccess.gov/ecfr . To access the harmonized test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select "Test Methods and Guidelines." C. How Can I File an... (PP 8F7487) by Bayer CropScience, 2 T.W. Alexander Drive, P.O. Box 12014, Research Triangle Park, NC...
ERIC Educational Resources Information Center
McClain, Tim
This paper provides easy-to-understand guidelines for citing online information in student bibliographies. Citation structure and examples are provided for each type of Internet source. Guidelines are included for the following Internet sources: electronic mail; Gopher; File Transfer Protocol (FTP); Telnet; World Wide Web; Usenet Newsgroups;…
NIH Seeks Input on In-patient Clinical Research Areas | Division of Cancer Prevention
[[{"fid":"2476","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of the National Institutes of Health Clinical Center (Building 10) in Bethesda, Maryland.","field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"Aerial view of
Pancreatic Cancer Detection Consortium (PCDC) | Division of Cancer Prevention
[[{"fid":"2256","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"A 3-dimensional image of a human torso highlighting the pancreas.","field_file_image_title_text[und][0][value]":false},"type":"media","field_deltas":{"1":{"format":"default","field_file_image_alt_text[und][0][value]":"A 3-dimensional image of a human torso
Reprocessing of multi-channel seismic-reflection data collected in the Beaufort Sea
Agena, W.F.; Lee, Myung W.; Hart, P.E.
2000-01-01
Contained on this set of two CD-ROMs are stacked and migrated multi-channel seismic-reflection data for 65 lines recorded in the Beaufort Sea by the United States Geological Survey in 1977. All data were reprocessed by the USGS using updated processing methods resulting in improved interpretability. Each of the two CD-ROMs contains the following files: 1) 65 files containing the digital seismic data in standard, SEG-Y format; 2) 1 file containing navigation data for the 65 lines in standard SEG-P1 format; 3) an ASCII text file with cross-reference information for relating the sequential trace numbers on each line to cdp numbers and shotpoint numbers; 4) 2 small scale graphic images (stacked and migrated) of a segment of line 722 in Adobe Acrobat® PDF format; 5) a graphic image of the location map, generated from the navigation file; 6) PlotSeis, an MS-DOS application that allows PC users to interactively view the SEG-Y files; 7) a PlotSeis documentation file; and 8) an explanation of the processing used to create the final seismic sections (this document).
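SEG-Y files like those on these CD-ROMs follow a fixed outer layout (a 3200-byte EBCDIC textual header, then a 400-byte binary header, then the traces), which makes a quick inspection straightforward. The byte offsets below follow the standard SEG-Y layout and should be verified against the actual files.

```python
import struct

def segy_summary(path):
    """Print the first card of the textual header and a few binary-header
    fields of a SEG-Y file. Offsets follow the standard SEG-Y layout."""
    with open(path, "rb") as fh:
        text_hdr = fh.read(3200).decode("cp037", errors="replace")  # EBCDIC card images
        bin_hdr = fh.read(400)
    sample_interval, = struct.unpack(">h", bin_hdr[16:18])  # bytes 3217-3218, microseconds
    n_samples, = struct.unpack(">h", bin_hdr[20:22])        # bytes 3221-3222
    fmt_code, = struct.unpack(">h", bin_hdr[24:26])         # bytes 3225-3226, 1 = IBM float
    print(text_hdr[:80])                                    # first 80-character "card"
    print(f"dt={sample_interval} us, samples/trace={n_samples}, format code={fmt_code}")
```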
Manoukis, Nicholas C
2007-07-01
There has been a great increase in both the number of population genetic analysis programs and the size of data sets being studied with them. Since the file formats required by the most popular and useful programs are variable, automated reformatting or conversion between them is desirable. formatomatic is an easy-to-use program that can read allelic data files in genepop, raw (csv) or convert formats and create data files in nine formats: raw (csv), arlequin, genepop, immanc/bayesass+, migrate, newhybrids, msvar, baps and structure. Use of formatomatic should greatly reduce time spent reformatting data sets and avoid unnecessary errors.
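formatomatic's own conversion code is not shown in the abstract; the sketch below merely illustrates the general shape of such a reformatting step, turning a raw (csv) genotype table into a genepop-style file. The CSV layout is assumed, and the exact genepop dialect varies.

```python
import csv

def csv_to_genepop(csv_path, out_path, title="converted by sketch"):
    """Convert an assumed raw CSV layout (pop, individual, then one column per
    locus with alleles coded like '0102') into a genepop-style file:
    a title line, one locus name per line, then 'POP'-delimited populations."""
    with open(csv_path) as fh:
        rows = list(csv.reader(fh))
    header, body = rows[0], rows[1:]
    loci = header[2:]
    with open(out_path, "w") as out:
        out.write(title + "\n")
        out.write("\n".join(loci) + "\n")
        last_pop = None
        for pop, indiv, *alleles in body:
            if pop != last_pop:              # start a new population block
                out.write("POP\n")
                last_pop = pop
            out.write(f"{indiv} , " + " ".join(alleles) + "\n")
```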
File formats commonly used in mass spectrometry proteomics.
Deutsch, Eric W
2012-12-01
The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-07
.../enviro/guidelines.asp . \\1\\ The attachments referenced in this notice will not appear in the Federal... at Commission's Web site: http://www.ferc.gov/docs-filing/efiling.asp . With eFiling you can provide... the wider net of project types subject to the Plan and Procedure requirements. The revisions clarify...
Getting Uncle Sam to Enforce Your Civil Rights. Clearinghouse Publication 59.
ERIC Educational Resources Information Center
Hartley, Mary Elizabeth
This booklet was written to help persons who believe that their civil rights have been denied and who wish to file a complaint with the Federal government. In regard to discrimination because of race, color, sex, religion, or national origin, guidelines are presented for filing complaints about denial of opportunities in the areas of credit,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorokine, Alexandre
2011-10-01
The Simple Ontology Format (SOFT) library and file format specification provide a set of simple tools for developing and maintaining ontologies. The library, implemented as a Perl module, supports parsing and verification of files in SOFT format, operations on ontologies (adding, removing, or filtering entities), and conversion of ontologies into other formats. SOFT allows users to quickly create an ontology using only a basic text editor, verify it, and portray it in a graph layout system using customized styles.
2012-01-01
Background We present the Biological Observation Matrix (BIOM, pronounced “biome”) format: a JSON-based file format for representing arbitrary observation by sample contingency tables with associated sample and observation metadata. As the number of categories of comparative omics data types (collectively, the “ome-ome”) grows rapidly, a general format to represent and archive this data will facilitate the interoperability of existing bioinformatics tools and future meta-analyses. Findings The BIOM file format is supported by an independent open-source software project (the biom-format project), which initially contains Python objects that support the use and manipulation of BIOM data in Python programs, and is intended to be an open development effort where developers can submit implementations of these objects in other programming languages. Conclusions The BIOM file format and the biom-format project are steps toward reducing the “bioinformatics bottleneck” that is currently being experienced in diverse areas of biological sciences, and will help us move toward the next phase of comparative omics where basic science is translated into clinical and environmental applications. The BIOM file format is currently recognized as an Earth Microbiome Project Standard, and as a Candidate Standard by the Genomic Standards Consortium. PMID:23587224
76 FR 47606 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-05
... the following formats: One hard copy with original signature, and one electronic copy via e- mail (acceptable file formats are Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or rich text file...
Measles, Mumps, and Rubella (MMR) Vaccination: What Everyone Should Know
78 FR 19152 - Revisions to Modeling, Data, and Analysis Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-29
... processing software should be filed in native applications or print-to-PDF format and not in a scanned format...,126 (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (D.C. Cir. 2009). 3. In March 2007, the... print-to-PDF format and not in a scanned format. Commenters filing electronically do not need to make a...
76 FR 75898 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... following formats: One hard copy with original signature, and one electronic copy via email (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Please submit your statement to Douglas Hobbs, Council Coordinator (see FOR FURTHER...
14 CFR 221.195 - Requirement for filing printed material.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS TARIFFS Electronically Filed Tariffs § 221.195 Requirement for filing printed material. (a) Any tariff, or revision thereto, filed in paper format which accompanies....190(b). Further, such paper tariff, or revision thereto, shall be filed in accordance with the...
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...
18 CFR 35.7 - Electronic filing requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electronic filing... § 35.7 Electronic filing requirements. (a) General rule. All filings made in proceedings initiated... declarations or statements and electronic signatures. (c) Format requirements for electronic filing. The...
NIMBUS 7 Earth Radiation Budget (ERB) Matrix User's Guide. Volume 2: Tape Specifications
NASA Technical Reports Server (NTRS)
Ray, S. N.; Vasanth, K. L.
1984-01-01
The ERB MATRIX tape is generated by an IBM 3081 computer program and is a 9-track, 1600 BPI tape. The gross format of the tape, given on page 1, shows an initial standard header file followed by data files. The standard header file contains two standard header records. A trailing documentation file (TDF) is the last file on the tape. Pages 9 through 17 describe, in detail, the standard header file and the TDF. The data files contain data for 37 different ERB parameters. Each file has data based on either a daily, 6 day cyclic, or monthly time interval. There are three types of physical records in the data files; namely, the world grid physical record, the documentation mercator/polar map projection physical record, and the monthly calibration physical record. The manner in which the data for the 37 ERB parameters are stored in the physical records comprising the data files is given in the gross format section.
Extracting the Data From the LCM vk4 Formatted Output File
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, James G.
These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data, directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
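The slides describe locating blocks of image data through offsets stored near the start of the binary file. The sketch below shows that general pattern in Python rather than MATLAB; the offset positions and field layout here are hypothetical placeholders, not the real vk4 structure.

```python
import struct
import numpy as np

def read_vk4_height(path, offset_table_pos=12):
    """Illustrative reader: fetch an offset from a table near the file start,
    seek to it, read width/height, then read the height image as uint32.
    All positions and field layouts here are hypothetical placeholders."""
    with open(path, "rb") as fh:
        fh.seek(offset_table_pos)
        height_offset, = struct.unpack("<I", fh.read(4))   # little-endian 32-bit offset
        fh.seek(height_offset)
        width, height = struct.unpack("<II", fh.read(8))
        data = np.frombuffer(fh.read(4 * width * height), dtype="<u4")
    return data.reshape(height, width)
```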
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders
2018-01-01
Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
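Because Exdir is a directory-layout specification rather than a binary container, an Exdir-style hierarchy can be emulated with standard tools. The sketch below writes such a hierarchy by hand (directories for groups, a YAML metadata sidecar, a NumPy binary dataset) without using the exdir package itself, so the exact file naming is approximate.

```python
import os
import numpy as np
import yaml   # PyYAML

def write_exdir_like(root):
    """Create a minimal Exdir-style hierarchy: directories as groups,
    .npy files as datasets, YAML sidecar files as metadata."""
    group = os.path.join(root, "session1")
    os.makedirs(group, exist_ok=True)
    np.save(os.path.join(group, "spikes.npy"), np.random.rand(100))
    with open(os.path.join(group, "attributes.yaml"), "w") as fh:
        yaml.safe_dump({"experimenter": "example", "sampling_rate_hz": 30000}, fh)

write_exdir_like("mydata.exdir")
```

Every piece of this layout is inspectable with a file browser or a text editor, which is precisely the advantage over a monolithic HDF5 file that the paper argues for.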
Five Tips to Help Prevent Infections
ISA-TAB-Nano: a specification for sharing nanomaterial research data in spreadsheet-based format.
Thomas, Dennis G; Gaheen, Sharon; Harper, Stacey L; Fritts, Martin; Klaessig, Fred; Hahn-Dantona, Elizabeth; Paik, David; Pan, Sue; Stafford, Grace A; Freund, Elaine T; Klemm, Juli D; Baker, Nathan A
2013-01-14
The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption.
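Since the ISA-TAB-Nano files are tab-delimited spreadsheets, generating one is mechanical. The column headers in this sketch are invented for illustration; the normative ISA-TAB-Nano templates define the real ones.

```python
import csv

# Hypothetical column headers; the ISA-TAB-Nano templates define the real ones.
MATERIAL_COLUMNS = ["Material Name", "Material Type", "Characteristic[core size]", "Unit"]

with open("material.tsv", "w", newline="") as fh:
    writer = csv.writer(fh, delimiter="\t")
    writer.writerow(MATERIAL_COLUMNS)
    writer.writerow(["AuNP-1", "gold nanoparticle", "15", "nm"])
```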
ISA-TAB-Nano: A Specification for Sharing Nanomaterial Research Data in Spreadsheet-based Format
2013-01-01
Background and motivation The high-throughput genomics communities have been successfully using standardized spreadsheet-based formats to capture and share data within labs and among public repositories. The nanomedicine community has yet to adopt similar standards to share the diverse and multi-dimensional types of data (including metadata) pertaining to the description and characterization of nanomaterials. Owing to the lack of standardization in representing and sharing nanomaterial data, most of the data currently shared via publications and data resources are incomplete, poorly-integrated, and not suitable for meaningful interpretation and re-use of the data. Specifically, in its current state, data cannot be effectively utilized for the development of predictive models that will inform the rational design of nanomaterials. Results We have developed a specification called ISA-TAB-Nano, which comprises four spreadsheet-based file formats for representing and integrating various types of nanomaterial data. Three file formats (Investigation, Study, and Assay files) have been adapted from the established ISA-TAB specification; while the Material file format was developed de novo to more readily describe the complexity of nanomaterials and associated small molecules. In this paper, we have discussed the main features of each file format and how to use them for sharing nanomaterial descriptions and assay metadata. Conclusion The ISA-TAB-Nano file formats provide a general and flexible framework to record and integrate nanomaterial descriptions, assay data (metadata and endpoint measurements) and protocol information. Like ISA-TAB, ISA-TAB-Nano supports the use of ontology terms to promote standardized descriptions and to facilitate search and integration of the data. The ISA-TAB-Nano specification has been submitted as an ASTM work item to obtain community feedback and to provide a nanotechnology data-sharing standard for public development and adoption. PMID:23311978
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-28
... via the Board's e-filing format or in the traditional paper format. Any person using e-filing should attach a document and otherwise comply with the instructions at the E-FILING link on the Board's Web site....S.C. 554(e). DRGHF requests that the Board issue an order declaring that municipal zoning law is...
Johnsen Lind, Andreas; Helge Johnsen, Bjorn; Hill, Labarron K; Sollers Iii, John J; Thayer, Julian F
2011-01-01
The aim of the present manuscript is to present a user-friendly and flexible platform for transforming Kubios HRV output files to the .xls file format used by MS Excel. The program utilizes either native or bundled Java and is platform-independent and mobile. This means that it can run without being installed on a computer. It also offers continuous transfer of data, meaning it can run in the background while Kubios produces output files. The program checks for changes in the file structure and automatically updates the .xls output file.
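A rough equivalent of that transformation in Python (the abstract does not show the Java source); the Kubios output is assumed here to be a whitespace-delimited text table, which pandas can rewrite as an Excel workbook.

```python
import pandas as pd

def kubios_txt_to_excel(txt_path, xlsx_path):
    """Read a whitespace-delimited results table (assumed layout) and write it
    to an Excel workbook. Requires pandas plus an Excel writer backend such
    as openpyxl."""
    df = pd.read_csv(txt_path, sep=r"\s+")
    df.to_excel(xlsx_path, index=False)

kubios_txt_to_excel("hrv_results.txt", "hrv_results.xlsx")
```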
5 CFR 1201.14 - Electronic filing procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... (PDF), and image files (files created by scanning). A list of formats allowed can be found at e-Appeal..., or by uploading the supporting documents in the form of one or more PDF files in which each...
C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes
NASA Astrophysics Data System (ADS)
Rutter, M. J.
2018-04-01
The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis; manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which makes it easy to extend to working directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats useable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.
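Among the formats c2x reads and writes is the 'Gaussian cube' volumetric format, whose plain-text header is simple to parse. A minimal sketch of the standard cube header layout follows; this is not c2x's own code.

```python
def read_cube_header(path):
    """Parse the header of a Gaussian cube file: two comment lines, a line with
    the atom count and grid origin, then three voxel-axis lines (count + vector)."""
    with open(path) as fh:
        comments = [fh.readline().rstrip() for _ in range(2)]
        parts = fh.readline().split()
        natoms, origin = int(parts[0]), [float(v) for v in parts[1:4]]
        axes = []
        for _ in range(3):
            p = fh.readline().split()
            axes.append((int(p[0]), [float(v) for v in p[1:4]]))  # (npoints, axis vector)
    return comments, natoms, origin, axes
```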
Versloot, Judith; Grudniewicz, Agnes; Chatterjee, Ananda; Hayden, Leigh; Kastner, Monika; Bhattacharyya, Onil
2015-06-01
We present simple formatting rules derived from an extensive literature review that can improve the format of clinical practice guidelines (CPGs), and potentially increase the likelihood of their being used. We recently conducted a review of the literature from medicine, psychology, design, and human factors engineering on characteristics of guidelines that are associated with their use in practice, covering both the creation and communication of content. The formatting rules described in this article are derived from that review. The formatting rules are grouped into three categories that can be easily applied to CPGs: first, Vivid: make it stand out; second, Intuitive: match it to the audience's expectations; and third, Visual: use alternatives to text. We highlight rules supported by our broad literature review and provide specific 'how-to' recommendations for individuals and groups developing evidence-based materials for clinicians. The way text documents are formatted influences their accessibility and usability. Optimizing the formatting of CPGs is a relatively inexpensive intervention and can be used to facilitate the dissemination of evidence in healthcare. Applying simple formatting principles to make documents more vivid, intuitive, and visual is a practical approach that has the potential to influence the usability of guidelines and to influence the extent to which guidelines are read, remembered, and used in practice.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Exchange is proposing to replace existing paragraph (c)(4) of Rule 11.13, entitled "Numerical Guidelines... the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (c)(1... of existing paragraph (c)(2), which provides flexibility to the Exchange to use different Numerical...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... Securities." Second, the Exchange is replacing existing paragraph (C)(4) of Rule 11890, entitled "Numerical... the ability of the Exchange to deviate from the Numerical Guidelines contained in paragraph (C)(1... flexibility to the Exchange to use different Numerical Guidelines or Reference Prices in various "Unusual...
75 FR 47624 - Sport Fishing and Boating Partnership Council
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... Coordinator in both of the following formats: One hard copy with original signature, and one electronic copy via e- mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). In order to attend this meeting, you must register by...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-12
... these guidelines, DTC must, among other things, maintain a Total Risk-Based Capital Ratio of at least 10... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-63254; File No. SR-DTC-2010-14] Self-Regulatory... Preferred Stock November 5, 2010. Pursuant to Section 19(b)(1) of the Securities Exchange Act of 1934 ("Act...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administration DEPARTMENT OF JUSTICE (CONTINUED) REGULATIONS RELATING TO THE BANKRUPTCY REFORM ACTS OF 1978 AND... Reimbursement of Expenses Filed Under 11 U.S.C. 330 (a) General Information. (1) The Bankruptcy Reform Act of... thus reflect standards and procedures articulated in section 330 of the Code and Rule 2016 of the...
Performance regression manager for large scale systems
Faraj, Daniel A.
2017-10-17
System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
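A minimal sketch of the comparison step the claim describes, assuming each output file holds simple metric=value lines; the patent does not specify the concrete predefined format, so this layout is invented.

```python
def compare_metric(file_a, file_b, metric="elapsed_seconds", tolerance=0.05):
    """Compare one named metric across two runs and report a regression if the
    newer value exceeds the older by more than `tolerance` (fractional)."""
    def read_metrics(path):
        with open(path) as fh:
            pairs = (line.strip().split("=", 1) for line in fh if "=" in line)
            return {k: float(v) for k, v in pairs}
    old, new = read_metrics(file_a)[metric], read_metrics(file_b)[metric]
    change = (new - old) / old
    print(f"{metric}: {old} -> {new} ({change:+.1%})",
          "REGRESSION" if change > tolerance else "ok")
```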
Performance regression manager for large scale systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faraj, Daniel A.
Methods comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
Efficient stereoscopic contents file format on the basis of ISO base media file format
NASA Astrophysics Data System (ADS)
Kim, Kyuheon; Lee, Jangwon; Suh, Doug Young; Park, Gwang Hoon
2009-02-01
A lot of 3D content has been widely used for multimedia services; however, real 3D video content has been adopted only for limited applications such as specially designed 3D cinemas. This is because of the difficulty of capturing real 3D video content and the limitations of the display devices available in the market. However, diverse types of display devices for stereoscopic video content have recently been released in the market. In particular, a mobile phone with a stereoscopic camera has been released, which allows a user, as a consumer, to have more realistic experiences without glasses and, as a content creator, to take stereoscopic images or record stereoscopic video. However, users can only store and display these acquired stereoscopic contents on their own devices, because no common file format for these contents exists. This limitation prevents users from sharing their content with other users, which makes it difficult for the market for stereoscopic content to expand. Therefore, this paper proposes a common file format, based on the ISO base media file format, for stereoscopic contents, which enables users to store and exchange pure stereoscopic content. This technology is also currently under development as an international standard of MPEG, called the stereoscopic video application format.
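The ISO base media file format the proposal builds on has a simple outer structure: a sequence of boxes, each beginning with a 4-byte big-endian size and a 4-character type code. A sketch of walking the top-level boxes of any such file:

```python
import struct

def list_boxes(path):
    """Walk the top-level boxes of an ISO base media file (MP4 family) and
    print each box's four-character type and its size in bytes."""
    with open(path, "rb") as fh:
        while True:
            header = fh.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            if size == 1:                      # 64-bit "largesize" follows the type
                size, = struct.unpack(">Q", fh.read(8))
                body = size - 16
            elif size == 0:                    # box extends to the end of the file
                print(box_type.decode("ascii", "replace"), "(to EOF)")
                break
            else:
                body = size - 8
            print(box_type.decode("ascii", "replace"), size)
            fh.seek(body, 1)                   # skip over the box payload
```

A stereoscopic application format such as the one proposed would define which boxes carry the left/right view signalling; the walker above only shows the container mechanics.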
75 FR 5066 - Commission Information Collection Activities (FERC Form 60,1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... corresponding dockets and collection numbers.) Comments may be filed either electronically or in paper format. Those persons filing electronically do not need to make a paper filing. Documents filed electronically... acknowledgement to the sender's e- mail address upon receipt of comments. For paper filings, the comments should...
NASA-IGES Translator and Viewer
NASA Technical Reports Server (NTRS)
Chou, Jin J.; Logan, Michael A.
1995-01-01
NASA-IGES Translator (NIGEStranslator) is a batch program that translates a general IGES (Initial Graphics Exchange Specification) file to a NASA-IGES-Nurbs-Only (NINO) file. IGES is the most popular geometry exchange standard among Computer Aided Geometric Design (CAD) systems. NINO format is a subset of IGES, implementing the simple and yet the most popular NURBS (Non-Uniform Rational B-Splines) representation. NIGEStranslator converts a complex IGES file to the simpler NINO file to simplify the tasks of CFD grid generation for models in CAD format. The NASA-IGES Viewer (NIGESview) is an Open-Inventor-based, highly interactive viewer/editor for NINO files. Geometry in the IGES files can be viewed, copied, transformed, deleted, and inquired. Users can use NIGEStranslator to translate IGES files from CAD systems to NINO files. The geometry then can be examined with NIGESview. Extraneous geometries can be interactively removed, and the cleaned model can be written to an IGES file, ready to be used in grid generation.
Nakamura, R; Sasaki, M; Oikawa, H; Harada, S; Tamakawa, Y
2000-03-01
To use an intranet technique to develop an information system that simultaneously supports both diagnostic reports and radiotherapy planning images. Using a file server as the gateway, a radiation oncology LAN was connected to an already operative RIS LAN. Dose-distribution images were saved in tagged-image-file format by way of a screen dump to the file server. X-ray simulator images and portal images were saved in encapsulated postscript format in the file server and automatically converted to portable document format. The files on the file server were automatically registered to the Web server by the search engine and were available for searching and browsing using the Web browser. It took less than a minute to register planning images. For clients, searching and browsing the file took less than 3 seconds. Over 150,000 reports and 4,000 images from a six-month period were accessible. Because the intranet technique was used, construction and maintenance were completed without specialized expertise. Prompt access to essential information about radiotherapy has been made possible by this system. It promotes public access to radiotherapy planning that may improve the quality of treatment.
77 FR 60138 - Trinity Adaptive Management Working Group; Public Teleconference/Web-Based Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-02
... statements must be supplied to Elizabeth Hadley in one of the following formats: One hard copy with original... file formats are Adobe Acrobat PDF, MS Word, PowerPoint, or rich text file). Registered speakers who...
E-submission chronic toxicology study supplemental files
The formats and instructions in these documents are designed to be used as an example or guide for registrants to format electronic files for submission of animal toxicology data to OPP for review in support of registration and reevaluation of pesticides.
Bradley, Anthony R; Rose, Alexander S; Pavelka, Antonín; Valasatava, Yana; Duarte, Jose M; Prlić, Andreas; Rose, Peter W
2017-06-01
Recent advances in experimental techniques have led to a rapid growth in complexity, size, and number of macromolecular structures that are made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format, MMTF, as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, keep the whole PDB archive in memory or parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in MMTF file format through web services and data that are updated on a weekly basis.
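MMTF encodes its fields with MessagePack, so a downloaded .mmtf file can be opened with a generic msgpack reader even without the official MMTF libraries. The top-level field names printed below (structureId, numAtoms, numModels) are from the MMTF specification as commonly documented; verify them against the current spec.

```python
import msgpack   # pip install msgpack

def inspect_mmtf(path):
    """Decode the MessagePack envelope of an MMTF file and show a few
    top-level fields. Many numeric arrays remain binary-encoded inside the
    envelope and need the spec's codecs to decode fully."""
    with open(path, "rb") as fh:
        doc = msgpack.unpackb(fh.read(), raw=False)
    for key in ("structureId", "numAtoms", "numModels"):
        print(key, doc.get(key))
```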
ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.
Karthikeyan, Muthukumarasamy; Vyas, Renu
2016-01-01
Digital access to chemical journals resulted in a vast array of molecular information that is now available in the supplementary material files in PDF format. However, extracting this molecular information, generally from a PDF document, is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publisher's resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java based application, namely ChemEngine. This program recognizes textual patterns from the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinates data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large scale conversion of molecular information from supplementary files available in the PDF format into a collection of ready-to-compute molecular data to create an automated workflow for advanced computational processes. Software along with source codes and instructions available at https://sourceforge.net/projects/chemengine/files/?source=navbar.
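A toy version of the harvesting idea: the sketch assumes the coordinates appear as plain 'element x y z' lines in the text layer extracted from a PDF. Real supplementary files vary widely in layout, which is exactly the problem ChemEngine's pattern recognition addresses.

```python
import re

COORD = re.compile(r"^\s*([A-Z][a-z]?)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s+(-?\d+\.\d+)\s*$")

def harvest_xyz(text):
    """Pull 'element x y z' lines out of text extracted from a PDF and
    return them as (symbol, x, y, z) tuples. The line pattern is assumed."""
    atoms = []
    for line in text.splitlines():
        m = COORD.match(line)
        if m:
            sym, x, y, z = m.groups()
            atoms.append((sym, float(x), float(y), float(z)))
    return atoms
```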
Network Configuration Analysis for Formation Flying Satellites
NASA Technical Reports Server (NTRS)
Knoblock, Eric J.; Wallett, Thomas M.; Konangi, Vijay K.; Bhasin, Kul B.
2001-01-01
The performance of two networks to support autonomous multi-spacecraft formation flying systems is presented. Both systems are comprised of a ten-satellite formation, with one of the satellites designated as the central or 'mother ship.' All data is routed through the mother ship to the terrestrial network. The first system uses a TCP/IP over ATM protocol architecture within the formation, and the second system uses the IEEE 802.11 protocol architecture within the formation. The simulations consist of file transfers using either the File Transfer Protocol (FTP) or the Simple Automatic File Exchange (SAFE) Protocol. The results compare the IP queuing delay, IP queue size and IP processing delay at the mother ship as well as end-to-end delay for both systems. In all cases, using IEEE 802.11 within the formation yields less delay. Also, the throughput exhibited by SAFE is better than FTP.
NASA Astrophysics Data System (ADS)
Foster, K.
1994-09-01
This document is a description of a computer program called Format MEDIC Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... can file your comments electronically using the eFiling feature located on the Commission's Web site ( www.ferc.gov ) under the Documents & Filings link. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must first create an...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... meeting. Written statements should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... their consideration. Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
... statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide electronic...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-21
... be supplied to the DFO in the following formats: One hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format). Submitters are requested to provide two...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-30
... supplied to the DFO in the following formats: One hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format). Submitters are requested to provide two versions...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-11
... supplied to the DFO in the following formats: one hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, MS Word, WordPerfect, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to provide versions of each...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... statements should be supplied to the DFO in the following formats: One hard copy with original signature and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are requested to provide...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-08
.... Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/Windows 98/2000/XP format). Submitters are asked to...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... their consideration. Written statements should be supplied to the DFO in the following formats: one hard copy with original signature, and one electronic copy via e-mail (acceptable file format: Adobe Acrobat PDF, WordPerfect, MS Word, MS PowerPoint, or Rich Text files in IBM-PC/ Windows 98/2000/XP format...
Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.
2000-01-01
This CD-ROM contains stacked, migrated, 2-dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve-Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small-scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.
17 CFR 232.14 - Paper filings not accepted without exemption.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false Paper filings not accepted... COMMISSION REGULATION S-T-GENERAL RULES AND REGULATIONS FOR ELECTRONIC FILINGS General § 232.14 Paper filings not accepted without exemption. The Commission will not accept in paper format any filing required to...
Atmospheric Science Data Center
2013-12-19
UAEMIAAE Aerosol product. ( File version details ) File version F07_0015 has better ... properties. File version F08_0016 has improved cloud screening procedure resulting in better aerosol optical depth. ... Coverage: August - October 2004 File Format: HDF-EOS Tools: FTP Access: Data Pool ...
Performance regression manager for large scale systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faraj, Daniel A.
System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
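The comparison step that the claim describes can be illustrated with a short sketch. This is only a hedged illustration: the "metric=value" file layout, the file names, and the regression tolerance below are assumptions made for the example, not the predefined format referred to in the abstract.

    # Sketch of comparing per-metric values between two output files.
    # Assumes each file holds one "metric=value" pair per line.
    def read_metrics(path):
        metrics = {}
        with open(path) as f:
            for line in f:
                name, _, value = line.strip().partition("=")
                if value:
                    metrics[name] = float(value)
        return metrics

    def compare(first_path, second_path, tolerance=0.05):
        first, second = read_metrics(first_path), read_metrics(second_path)
        for name in sorted(first.keys() & second.keys()):
            change = (first[name] - second[name]) / second[name]
            # Assumes higher is better; real metrics may invert this.
            status = "REGRESSION" if change < -tolerance else "OK"
            print(f"{name}: {second[name]} -> {first[name]} ({status})")

    compare("run_new.txt", "run_baseline.txt")   # hypothetical file names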
NASA Astrophysics Data System (ADS)
Yamagishi, Y.; Yanaka, H.; Tsuboi, S.
2009-12-01
We have developed a conversion tool, called the KML generator, that converts seismic tomography data into KML, and have made it available on the web site (http://www.jamstec.go.jp/pacific21/google_earth). The KML generator enables us to display vertical and horizontal cross sections of a model on Google Earth in a three-dimensional manner, which is useful for understanding the Earth's interior. The previous generator accepts text files of grid-point data giving longitude, latitude, and seismic velocity anomaly. Each data file contains the data for one depth. Metadata, such as the bibliographic reference, grid-point interval, and depth, are described in a separate information file. We did not allow users to upload their own tomographic models to the web application, because there was no standard format for representing a tomographic model. Recently, the European seismology research project NERIES (Network of Research Infrastructures for European Seismology) has advocated that seismic tomography data should be standardized. It proposes a new format based on JSON (JavaScript Object Notation), one of the common data-interchange formats, as a standard for tomography. This format consists of two parts: metadata and grid-point data values. The JSON format is well suited to handling and analyzing tomographic models, because the structure of the format is fully defined by JavaScript objects, so the elements are directly accessible from a script. In addition, JSON libraries exist for several programming languages. The International Federation of Digital Seismograph Networks (FDSN) adopted this format as an FDSN standard format for seismic tomographic models. This format may well be accepted not only by European seismologists but also as a world standard. We have therefore improved our KML generator for seismic tomography to accept data files in the JSON format, and have improved the web application of the generator so that JSON-formatted data files can be uploaded. Users can convert any tomographic model data to KML. The KML obtained through the new generator should provide an arena for comparing various tomographic models and other geophysical observations on Google Earth, which may act as a common platform for geoscience browsing.
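The two-part structure described above (metadata plus grid-point values) is easy to sketch. The layout below is a hedged illustration in Python: the field names are invented for the example and are not the actual NERIES/FDSN schema.

    import json

    # Hypothetical JSON layout for one depth slice of a tomographic model;
    # key names are illustrative only.
    model = {
        "metadata": {
            "reference": "Example et al. (2009)",
            "depth_km": 100.0,
            "grid_interval_deg": 2.0,
            "parameter": "dVs_percent",
        },
        "grid": [
            {"lon": 130.0, "lat": 35.0, "value": -1.2},
            {"lon": 132.0, "lat": 35.0, "value": 0.4},
        ],
    }

    with open("model_100km.json", "w") as f:
        json.dump(model, f, indent=2)

    # Because the structure is plain JSON, every element is directly
    # accessible from a script, as the abstract notes:
    with open("model_100km.json") as f:
        loaded = json.load(f)
    for point in loaded["grid"]:
        print(point["lon"], point["lat"], point["value"])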
Smith, Steven M.
1997-01-01
The National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program produced a large amount of geochemical data. To fully understand how these data were generated, it is recommended that you read the History of the NURE HSSR Program for a summary of the entire program. By the time the NURE program had ended, the HSSR data consisted of 894 separate data files stored in 47 different formats. Many files contained duplication of data found in other files. The University of Oklahoma's Information Systems Programs of the Energy Resources Institute (ISP) was contracted by the Department of Energy to enhance the accessibility and usefulness of the NURE HSSR data. ISP created a single standard-format master file to replace the 894 original files. ISP converted 817 of the 894 original files before its funding apparently ran out. The ISP-reformatted NURE data files have been released by the USGS on CD-ROM (Lower 48 States, Hoffman and Buttleman, 1994; Alaska, Hoffman and Buttleman, 1996). A description of each NURE database field, derived from a draft NURE HSSR data format manual (unpubl. commun., Stan Moll, ISP, Oct 7, 1988), was included in a readme file on each CD-ROM. That original manual was incomplete and assumed that the reformatting process had gone to completion. Much vital information was not included. Efforts to correct that manual and the NURE data revealed a large number of problems and missing data. As a result of the frustrating process of cleaning and re-cleaning data from the ISP-reformatted NURE files, a new NURE HSSR data format was developed. This work represents a totally new attempt to reformat the original NURE files into two consistent database structures, one for water samples and a second for sediment samples, on a quadrangle-by-quadrangle basis. Although this USGS-reformatted NURE HSSR data format differs from that created by the ISP, many of the ISP's ideas were incorporated and expanded in this effort. All of the data from each quadrangle are being examined thoroughly in an attempt to eliminate problems, to combine partial or duplicate records, to convert all coding to a common scheme, and to identify problems even if they cannot be solved at this time.
SnopViz, an interactive snow profile visualization tool
NASA Astrophysics Data System (ADS)
Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank
2016-04-01
SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as to offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch, where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement for the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well, and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser-based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML-based standard for vector graphics, was chosen because of its easy interaction with JavaScript and good software support (Adobe Illustrator, Inkscape) for manipulating graphs outside SnopViz for publication purposes. SnopViz provides new visualizations for SNOWPACK timeline output as well as time-series input and output. The actual output format for SNOWPACK timelines was retained, while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as CAAML files. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard for exchanging snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).
Effect of reciprocating file motion on microcrack formation in root canals: an SEM study.
Ashwinkumar, V; Krithikadatta, J; Surendran, S; Velmurugan, N
2014-07-01
To compare dentinal microcrack formation whilst using Ni-Ti hand K-files, ProTaper hand and rotary files, and the WaveOne reciprocating file. One hundred and fifty mandibular first molars were selected. Thirty teeth were left unprepared and served as controls, and the remaining 120 teeth were divided into four groups. Ni-Ti hand K-files, ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files were used to prepare the mesial canals. Roots were then sectioned 3, 6 and 9 mm from the apex, and the cut surface was observed under a scanning electron microscope (SEM) and checked for the presence of dentinal microcracks. The control and Ni-Ti hand K-file groups were not associated with microcracks. In roots prepared with ProTaper hand files, ProTaper rotary files and WaveOne Primary reciprocating files, dentinal microcracks were present. There was a significant difference between the control/Ni-Ti hand K-file groups and the ProTaper hand file/ProTaper rotary file/WaveOne Primary reciprocating file groups (P < 0.001), with ProTaper rotary files producing the most microcracks. No significant difference was observed between teeth prepared with ProTaper hand files and WaveOne Primary reciprocating files. ProTaper rotary files were associated with significantly more microcracks than ProTaper hand files and WaveOne Primary reciprocating files. Ni-Ti hand K-files did not produce microcracks at any level inside the root canals. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.
DICOM to print, 35-mm slides, web, and video projector: tutorial using Adobe Photoshop.
Gurney, Jud W
2002-10-01
Preparing images for publication has traditionally dealt with film and the photographic process. With picture archiving and communication systems, many departments will no longer produce film. This will change how images are produced for publication. DICOM, the file format for radiographic images, has to be converted and then prepared for traditional publication, 35-mm slides, the newest techniques of video projection, and the World Wide Web. Tagged Image File Format (TIFF) is the common format for traditional print publication, whereas Joint Photographic Experts Group (JPEG) is the current file format for the World Wide Web. Each medium has specific requirements that can be met with a common image-editing program such as Adobe Photoshop (Adobe Systems, San Jose, CA). High-resolution images are required for print, a process that requires interpolation. However, the Internet requires images with a small file size for rapid transmission. The resolution of each output differs, and the image resolution must be optimized to match the output of the publishing medium.
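The conversion path the tutorial describes (DICOM in, TIFF out for print, JPEG out for the Web) can also be scripted. The sketch below uses the pydicom and Pillow libraries rather than Photoshop, and reduces window/level handling to a simple linear rescale; the file names are hypothetical.

    import numpy as np
    import pydicom
    from PIL import Image

    ds = pydicom.dcmread("chest.dcm")          # hypothetical input file
    pixels = ds.pixel_array.astype(np.float32)

    # Crude rescale to 8 bits; a real radiographic workflow would apply
    # proper window width/level values before export.
    pixels -= pixels.min()
    pixels /= max(pixels.max(), 1.0)
    img = Image.fromarray((pixels * 255).astype(np.uint8))

    img.save("chest_print.tif")            # TIFF for print publication
    img.save("chest_web.jpg", quality=85)  # small JPEG for the Web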
TADPLOT program, version 2.0: User's guide
NASA Technical Reports Server (NTRS)
Hammond, Dana P.
1991-01-01
The TADPLOT program, Version 2.0, is described. TADPLOT is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication- and viewgraph-quality plots. The user interface was designed to be independent of any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be accessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to a file format will be transparent, while additional file formats can be integrated with minimal impact on the user interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.
A New Archive of UKIRT Legacy Data at CADC
NASA Astrophysics Data System (ADS)
Bell, G. S.; Currie, M. J.; Redman, R. O.; Purves, M.; Jenness, T.
2014-05-01
We describe a new archive of legacy data from the United Kingdom Infrared Telescope (UKIRT) at the Canadian Astronomy Data Centre (CADC) containing all available data from the Cassegrain instruments. The desire was to archive the raw data in as close to the original format as possible, so where the data followed our current convention of having a single data file per observation, they were archived without alteration, except for minor fixes to headers of data in FITS format to allow them to pass fitsverify and be accepted by CADC. Some of the older data comprised multiple integrations in separate files per observation, stored in either Starlink NDF or Figaro DST format. These were placed inside HDS container files, and DST files were rearranged into NDF format. The metadata describing the observations are ingested into the CAOM-2 repository via an intermediate MongoDB header database, which will also be used to guide the ORAC-DR pipeline in generating reduced data products.
Converting CSV Files to RKSML Files
NASA Technical Reports Server (NTRS)
Trebi-Ollennu, Ashitey; Liebersbach, Robert
2009-01-01
A computer program converts, into a format suitable for processing on Earth, files of downlinked telemetric data pertaining to the operation of the Instrument Deployment Device (IDD), which is a robot arm on either of the Mars Exploration Rovers (MERs). The raw downlinked data files are in comma-separated-value (CSV) format. The present program converts the files into Rover Kinematics State Markup Language (RKSML), which is an Extensible Markup Language (XML) format that facilitates representation of operations of the IDD and enables analysis of the operations by means of the Rover Sequencing Validation Program (RSVP), which is used to build sequences of commanded operations for the MERs. After conversion by means of the present program, the downlinked data can be processed by RSVP, enabling the MER downlink operations team to play back the actual IDD activity represented by the telemetric data against the planned IDD activity. Thus, the present program enhances the diagnosis of anomalies that manifest themselves as differences between actual and planned IDD activities.
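The general shape of such a CSV-to-XML conversion can be sketched in a few lines. This is a hedged illustration only: the RKSML element names, attribute names, and CSV columns below are invented for the example, not the schema actually used by RSVP.

    import csv
    import xml.etree.ElementTree as ET

    root = ET.Element("RoverKinematicsState")   # hypothetical root element
    with open("idd_telemetry.csv", newline="") as f:
        for row in csv.DictReader(f):           # assumes a header row
            state = ET.SubElement(root, "State", time=row["time"])
            for joint in ("joint1", "joint2", "joint3"):  # assumed columns
                ET.SubElement(state, "Joint", name=joint).text = row[joint]

    ET.ElementTree(root).write("idd_telemetry.rksml", xml_declaration=True)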
Fortran Program for X-Ray Photoelectron Spectroscopy Data Reformatting
NASA Technical Reports Server (NTRS)
Abel, Phillip B.
1989-01-01
A FORTRAN program has been written for use on an IBM PC/XT or AT or compatible microcomputer (personal computer, PC) that converts a column of ASCII-format numbers into a binary-format file suitable for interactive analysis on a Digital Equipment Corporation (DEC) computer running the VGS-5000 Enhanced Data Processing (EDP) software package. The incompatible floating-point number representations of the two computers were compared, and a subroutine was created to correctly store floating-point numbers on the IBM PC that can be directly read by the DEC computer. Any file-transfer protocol having provision for binary data can be used to transmit the resulting file from the PC to the DEC machine. The data-file header required by the EDP programs for an x-ray photoelectron spectrum is also written to the file. The user is prompted for the relevant experimental parameters, which are then properly coded into the format used internally by all of the VGS-5000-series EDP packages.
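The core of such a conversion is straightforward to sketch in a modern language. The sketch below uses Python's struct module to write IEEE 754 little-endian floats behind a simple count header; the original program's extra steps, re-encoding values into the DEC floating-point representation and writing the full EDP header, are deliberately omitted because their exact layouts are not given here.

    import struct

    # Convert a column of ASCII numbers into a simple binary file.
    with open("spectrum.txt") as text_file, open("spectrum.bin", "wb") as bin_file:
        values = [float(line) for line in text_file if line.strip()]
        bin_file.write(struct.pack("<i", len(values)))        # count header
        bin_file.write(struct.pack(f"<{len(values)}f", *values))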
COSPAR/PRBEM international working group activities report
NASA Astrophysics Data System (ADS)
Bourdarie, S.; Blake, B.; Cao, J. B.; Friedel, R.; Miyoshi, Y.; Panasyuk, M.; Underwood, C.
It is now clear to everybody that the current standard AE8/AP8 models for ionising-particle specification in the radiation belts must be updated. But such an objective is quite difficult to reach: as a reminder, developing the AE8/AP8 models in the seventies took 10 persons full time for ten years. It is clear that world-wide efforts must be combined, because no individual group has the human resources to produce these new models by itself. Under the COSPAR umbrella, an international group of experts, well distributed around the world, has been created to set up a common framework for everybody involved in this field. Planned activities of the international group of experts are to: define users' needs; provide guidelines for a standard file format for ionising-particle measurements; set up guidelines to process in-situ data on a common basis; decide in which form the new models will have to be; centralise all progress made world-wide to advise the community; and try to organise world-wide activities as a project to ensure complementarity and greater efficiency between all efforts. Activities of this working group since its creation will be reported, as well as future plans.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-30
... file your comments electronically using the eFiling feature on the Commission's Web site ( www.ferc.gov ) under the link to Documents and Filings. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must first create an account by...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
... on a project; (2) You can file your comments electronically using the eFiling feature located on the Commission's Web site ( www.ferc.gov ) under the Documents & Filings link. With eFiling, you can provide comments in a variety of formats by attaching them as a file with your submission. New eFiling users must...
Implementation of study results in guidelines and adherence to guidelines in clinical practice.
Waldfahrer, Frank
2016-01-01
Guidelines were introduced in hospital- and practice-based otorhinolaryngology in the 1990s, and have been undergoing further development ever since. There are currently 20 guidelines on file at the German Society of Oto-Rhino-Laryngology, Head & Neck Surgery. The society has cooperated in a further 34 guidelines. The quality of the guidelines has been continuously improved by concrete specifications put forward by the Association of the Scientific Medical Societies in Germany (Arbeitsgemeinschaft der Wissenschaftlichen Medizinischen Fachgesellschaften e.V., AWMF). Since increasing digitalization has made access to scientific publications quicker and simpler, relevant study results can be incorporated in guidelines more easily today than in the analog world. S2e and S3 guidelines must be based on a formal literature search with subsequent evaluation of the evidence. The consensus procedure for S2k guidelines is also regulated. However, the implementation of guidelines in routine medical practice must still be considered inadequate, and there is still a considerable need for improvement in adherence to these guidelines.
A convertor and user interface to import CAD files into worldtoolkit virtual reality systems
NASA Technical Reports Server (NTRS)
Wang, Peter Hor-Ching
1996-01-01
Virtual Reality (VR) is a rapidly developing human-to-computer interface technology. VR can be considered a three-dimensional, computer-generated Virtual World (VW) which can sense particular aspects of a user's behavior, allow the user to manipulate objects interactively, and render the VW in real time accordingly. The user is totally immersed in the virtual world and feels a sense of being transported into that VW. NASA/MSFC Computer Application Virtual Environments (CAVE) has been developing space-related VR applications since 1990. The VR systems in the CAVE lab are based on the VPL RB2 system, which consists of a VPL RB2 control tower, an LX eyephone, an Isotrak Polhemus sensor, two Fastrak Polhemus sensors, a Flock of Birds sensor, and two VPL DG2 DataGloves. A dynamics animator called Body Electric from VPL is used as the control system to interface with all the input/output devices and to provide network communications as well as the VR programming environment. RB2 Swivel 3D is used as the modelling program to construct the VWs. A severe limitation of the VPL VR system is the use of RB2 Swivel 3D, which restricts files to a maximum of 1020 objects and lacks advanced graphics texture mapping. The other limitation is that the VPL VR system is a turn-key system which does not provide the flexibility for the user to add new sensors or a C language interface. Recently, the NASA/MSFC CAVE lab has provided VR systems built on Sense8 WorldToolKit (WTK), a C library for creating VR development environments. WTK provides device drivers for most of the sensors and eyephones available on the VR market. WTK accepts several CAD file formats, such as the Sense8 Neutral File Format, the AutoCAD DXF and 3D Studio file formats, the Wavefront OBJ file format, the VideoScape GEO file format, and the Intergraph EMS and CATIA stereolithography (STL) file formats. WTK functions are object-oriented in their naming convention, are grouped into classes, and provide an easy C language interface. Using a CAD or modelling program to build a VW for WTK VR applications, we typically construct the stationary universe with all the geometric objects except the dynamic objects, and create each dynamic object in an individual file.
NASA Astrophysics Data System (ADS)
Haran, T. M.; Brodzik, M. J.; Nordgren, B.; Estilow, T.; Scott, D. J.
2015-12-01
An increasing number of new Earth science datasets are being produced by data providers in self-describing, machine-independent file formats including Hierarchical Data Format version 5 (HDF5) and Network Common Data Form version 4 (netCDF-4). Furthermore, data providers may be producing netCDF-4 files that follow the conventions for Climate and Forecast metadata version 1.6 (CF 1.6) which, for datasets mapped to a projected raster grid covering all or a portion of the earth, include the Coordinate Reference System (CRS) used to define how latitude and longitude are mapped to grid coordinates, i.e., columns and rows, and vice versa. One problem that users may encounter is that their preferred visualization and analysis tool may not yet include support for one of these newer formats. Moreover, data distributors such as NASA's NSIDC DAAC may not yet include support for on-the-fly conversion of data files for all data sets produced in a new format to a preferred older distributed format. There do exist open source solutions to this dilemma in the form of software packages that can translate files in one of the new formats to one of the preferred formats. However, these software packages require that the file to be translated conform to the specifications of its respective format. Although an online CF-convention compliance checker is available from cfconventions.org, a recent NSIDC user services incident described here in detail involved an NSIDC-supported data set that passed the (then current) CF Checker version 2.0.6 but was in fact lacking two variables necessary for conformance. This problem was not detected until GDAL, a software package which relied on the missing variables, was employed by a user in an attempt to translate the data into a different file format, namely GeoTIFF. This incident indicates that testing a candidate data product with one or more software products written to accept the advertised conventions is proposed as a practice which improves interoperability. Differences between data file contents and software package expectations are exposed, affording an opportunity to improve conformance of software, data, or both. The incident can also serve as a demonstration that data providers, distributors, and users can work together to improve data product quality and interoperability.
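A check of the kind that would have caught this incident can be sketched with the netCDF4 Python library: confirm that every variable referenced through the CF grid_mapping attribute actually exists in the file. The file name below is hypothetical.

    from netCDF4 import Dataset

    with Dataset("example_grid.nc") as nc:
        for name, var in nc.variables.items():
            if "grid_mapping" in var.ncattrs():
                crs_name = var.getncattr("grid_mapping")
                if crs_name not in nc.variables:
                    print(f"{name}: missing CRS variable '{crs_name}'")
                else:
                    print(f"{name}: grid_mapping '{crs_name}' present")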
Code of Federal Regulations, 2010 CFR
2010-07-01
... that time, you must file your travel claim in the format prescribed by your agency. If the prescribed... travel claim in a specific format and must the claim be signed? 301-52.3 Section 301-52.3 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES...
Code of Federal Regulations, 2013 CFR
2013-07-01
... that time, you must file your travel claim in the format prescribed by your agency. If the prescribed... travel claim in a specific format and must the claim be signed? 301-52.3 Section 301-52.3 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES...
Code of Federal Regulations, 2011 CFR
2011-07-01
... that time, you must file your travel claim in the format prescribed by your agency. If the prescribed... travel claim in a specific format and must the claim be signed? 301-52.3 Section 301-52.3 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES...
Code of Federal Regulations, 2014 CFR
2014-07-01
... that time, you must file your travel claim in the format prescribed by your agency. If the prescribed... travel claim in a specific format and must the claim be signed? 301-52.3 Section 301-52.3 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES...
Code of Federal Regulations, 2012 CFR
2012-07-01
... that time, you must file your travel claim in the format prescribed by your agency. If the prescribed... travel claim in a specific format and must the claim be signed? 301-52.3 Section 301-52.3 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-16
... to Docket No. IC10-542-001. Comments may be filed either electronically or in paper format. Those persons filing electronically do not need to make a paper filing. Documents filed electronically via the... sender's e-mail address upon receipt of comments. For paper filings, the comments should be submitted to...
Canadian physicians' attitudes about and preferences regarding clinical practice guidelines.
Hayward, R S; Guyatt, G H; Moore, K A; McKibbon, K A; Carter, A O
1997-06-15
To assess Canadian physicians' confidence in, attitudes about and preferences regarding clinical practice guidelines. Cross-sectional, self-administered mailed survey. Stratified random sample of 3000 Canadian physicians; 1878 (62.6%) responded. Canada. Physicians' use of various information sources; familiarity with and confidence in guidelines; attitudes about guidelines and their effect on medical care; rating of importance of guidelines and other sources of information in clinical decision-making; rating of importance of various considerations in deciding whether to adopt a set of guidelines; and rating of usefulness of different formats for presenting guidelines. In all, 52% of the respondents reported using guidelines at least monthly, substantially less frequently than traditional information sources. Most of the respondents expressed confidence in guidelines issued by various physician organizations, but 51% to 77% were not confident in guidelines issued by federal or provincial health ministries or by health insurance plans. The respondents were generally positive about guidelines (e.g., over 50% strongly agreed that they are a convenient source of advice and good educational tools); however, 22% to 26% had concerns about loss of autonomy, the rigidity of guidelines and decreased satisfaction with medical practice. Endorsement by respected colleagues or major organizations was identified as very important by 78% and 62% of the respondents respectively in deciding whether to adopt a set of guidelines in their practice. User friendliness of the guidelines format was thought to be very important by 62%; short pamphlets, manuals summarizing a number of guidelines, journal articles and pocket cards summarizing guidelines were the preferred formats (identified as most useful by 50% to 62% of the respondents). Canadian physicians, although generally positive about guidelines and confident in those developed by clinicians, have not yet integrated the use of guidelines into their practices to a large extent. Our results suggest that respected organizations and opinion leaders should be involved in the development of guidelines and that the acceptability of any proposed format and medium for guidelines presentation should be pretested.
Data File Standard for Flow Cytometry, version FCS 3.1.
Spidlen, Josef; Moore, Wayne; Parks, David; Goldberg, Michael; Bray, Chris; Bierre, Pierre; Gorombey, Peter; Hyun, Bill; Hubbard, Mark; Lange, Simon; Lefebvre, Ray; Leif, Robert; Novo, David; Ostruszka, Leo; Treister, Adam; Wood, James; Murphy, Robert F; Roederer, Mario; Sudar, Damir; Zigon, Robert; Brinkman, Ryan R
2010-01-01
The flow cytometry data file standard provides the specifications needed to completely describe flow cytometry data sets within the confines of the file containing the experimental data. In 1984, the first Flow Cytometry Standard format for data files was adopted as FCS 1.0. This standard was modified in 1990 as FCS 2.0 and again in 1997 as FCS 3.0. We report here on the next-generation flow cytometry standard data file format. FCS 3.1 is a minor revision based on suggested improvements from the community. The unchanged goal of the standard is to provide a uniform file format that allows files created by one type of acquisition hardware and software to be analyzed by any other type. The FCS 3.1 standard retains the basic FCS file structure and most features of previous versions of the standard. Changes included in FCS 3.1 address potential ambiguities in the previous versions and provide a more robust standard. The major changes include simplified support for international characters and improved support for storing compensation. The major additions are support for preferred display scale, a standardized way of capturing the sample volume, information about originality of the data file, and support for plate and well identification in high-throughput, plate-based experiments. Please see the normative version of the FCS 3.1 specification in Supporting Information for this manuscript (or at http://www.isac-net.org/ in the Current standards section) for a complete list of changes.
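The basic file structure that FCS 3.1 retains (a short ASCII header of byte offsets followed by a delimiter-separated TEXT segment of keyword/value pairs) can be read with a few lines of Python. This is a simplified sketch based on the published layout; it ignores FCS 3.1 refinements such as escaped delimiters inside values, and the input file name is hypothetical.

    def read_fcs_text(path):
        with open(path, "rb") as f:
            header = f.read(58)
            version = header[0:6].decode("ascii")    # e.g. "FCS3.1"
            text_start = int(header[10:18])          # ASCII byte offsets
            text_end = int(header[18:26])
            f.seek(text_start)
            text = f.read(text_end - text_start + 1).decode("ascii", "replace")
        delim = text[0]                              # first byte is the delimiter
        fields = text[1:].split(delim)
        return version, dict(zip(fields[::2], fields[1::2]))

    version, keywords = read_fcs_text("sample.fcs")
    print(version, keywords.get("$TOT"), keywords.get("$PAR"))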
Moretti, Loris; Sartori, Luca
2016-09-01
In the field of Computer-Aided Drug Discovery and Development (CADDD), the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities, and graphical rendering of the structural outcomes. To showcase the potential of this approach, the performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Adding Data Management Services to Parallel File Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, Scott
2015-03-04
The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files. Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file-based ecosystem; (3) common optimizations, e.g., indexing and caching, are readily supported across several file formats, avoiding effort duplication; and (4) performance improves significantly, as data processing is integrated more tightly with data storage. Our key contributions are: SciHadoop, which explores changes to MapReduce assumptions by taking advantage of the semantics of structured data while preserving MapReduce's failure and resource management; DataMods, which extends common abstractions of parallel file systems so they become programmable, such that they can be extended to natively support a variety of data models and can be hooked into emerging distributed runtimes such as Stanford's Legion; and Miso, which combines Hadoop and relational data warehousing to minimize time to insight, taking into account the overhead of ingesting data into data warehousing.
Deep PDF parsing to extract features for detecting embedded malware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munson, Miles Arthur; Cross, Jesse S.
2011-09-01
The number of PDF files with embedded malicious code has risen significantly in the past few years. This is due to the portability of the file format, the ways Adobe Reader recovers from corrupt PDF files, the addition of many multimedia and scripting extensions to the file format, and many format properties the malware author may use to disguise the presence of malware. Current research focuses on executable, MS Office, and HTML formats. In this paper, several features and properties of PDF files are identified. Features are extracted using an instrumented open source PDF viewer. The feature descriptions of benign and malicious PDFs can be used to construct a machine learning model for detecting possible malware in future PDF files. The detection rate of PDF malware by current antivirus software is very low. A PDF file is easy to edit and manipulate because it is a text format, providing a low barrier to malware authors. Analyzing PDF files for malware is nonetheless difficult because of (a) the complexity of the formatting language, (b) the parsing idiosyncrasies in Adobe Reader, and (c) undocumented correction techniques employed in Adobe Reader. In May 2011, Esparza demonstrated that PDF malware could be hidden from 42 of 43 antivirus packages by combining multiple obfuscation techniques [4]. One reason current antivirus software fails is the ease of varying byte sequences in PDF malware, thereby rendering conventional signature-based virus detection useless. The compression and encryption functions produce sequences of bytes that are each functions of multiple input bytes. As a result, padding the malware payload with some whitespace before compression/encryption can change many of the bytes in the final payload. In this study we analyzed a corpus of 2591 benign and 87 malicious PDF files. While this corpus is admittedly small, it allowed us to test a system for collecting indicators of embedded PDF malware. We will call these indicators features throughout the rest of this report. The features are extracted using an instrumented PDF viewer, and are the inputs to a prediction model that scores the likelihood of a PDF file containing malware. The prediction model is constructed from a sample of labeled data by a machine learning algorithm (specifically, decision tree ensemble learning). Preliminary experiments show that the model is able to detect half of the PDF malware in the corpus with zero false alarms. We conclude the report with suggestions for extending this work to detect a greater variety of PDF malware.
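The pipeline the report describes (extract features per file, then train a decision tree ensemble on labeled examples) can be sketched with scikit-learn. The byte-level token counts below are illustrative stand-ins for the viewer-instrumented features used in the study, and the file paths and labels are hypothetical.

    from sklearn.ensemble import RandomForestClassifier

    # Tokens often associated with active PDF content; illustrative only.
    TOKENS = [b"/JavaScript", b"/JS", b"/OpenAction", b"/Launch", b"/EmbeddedFile"]

    def features(path):
        data = open(path, "rb").read()
        return [data.count(t) for t in TOKENS] + [len(data)]

    # A labeled corpus such as the one described above (1 = malicious).
    paths = ["benign1.pdf", "benign2.pdf", "malicious1.pdf"]   # hypothetical
    labels = [0, 0, 1]

    model = RandomForestClassifier(n_estimators=100).fit(
        [features(p) for p in paths], labels)
    print(model.predict_proba([features("unknown.pdf")]))      # malware score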
12 CFR 749.4 - Format for vital records preservation.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Format for vital records preservation. 749.4... RECORDS PRESERVATION PROGRAM AND APPENDICES-RECORD RETENTION GUIDELINES; CATASTROPHIC ACT PREPAREDNESS GUIDELINES § 749.4 Format for vital records preservation. Preserved records may be in any format that can be...
2014-12-01
format for the orientation of a body. It further recommends supporting data be stored in a text PCK. These formats are used by the SPICE system... INTRODUCTION These file formats were developed for and are used by the SPICE system, developed by the Navigation and Ancillary Information Facility (NAIF...of NASA's Jet Propulsion Laboratory (JPL). Most users will want to use either the SPICE libraries or CALCEPH, developed by the Institut de mécanique
Personalization of structural PDB files.
Woźniak, Tomasz; Adamiak, Ryszard W
2013-01-01
The PDB format is the one most commonly used by programs to define the three-dimensional structures of biomolecules. However, the programs often use different versions of the format. Thus far, no comprehensive solution for unifying the PDB formats has been developed. Here we present an open-source, Python-based tool called PDBinout for processing and conversion of various versions of the PDB file format for biostructural applications. Moreover, PDBinout allows one to create one's own PDB versions. PDBinout is freely available under the LGPL licence at http://pdbinout.ibch.poznan.pl.
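Much of what a tool like PDBinout must do reduces to fixed-column parsing of ATOM/HETATM records. The sketch below follows the column positions of the published PDB specification; the input file name is hypothetical.

    def parse_atoms(path):
        atoms = []
        with open(path) as f:
            for line in f:
                if line.startswith(("ATOM", "HETATM")):
                    atoms.append({
                        "serial": int(line[6:11]),
                        "name": line[12:16].strip(),
                        "resname": line[17:20].strip(),
                        "chain": line[21],
                        "resseq": int(line[22:26]),
                        "x": float(line[30:38]),
                        "y": float(line[38:46]),
                        "z": float(line[46:54]),
                    })
        return atoms

    for atom in parse_atoms("1abc.pdb")[:3]:
        print(atom["name"], atom["x"], atom["y"], atom["z"])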
Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya
2004-01-01
We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semi-formal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.
14 CFR 221.30 - Passenger fares and charges.
Code of Federal Regulations, 2010 CFR
2010-01-01
... PROCEEDINGS) ECONOMIC REGULATIONS TARIFFS Manner of Filing Tariffs § 221.30 Passenger fares and charges. (a... necessary to carry out the purposes of this part, the applicant carrier to file fare tariffs in a paper format. Such waivers shall only be considered where electronic filing, compared to paper filing, is...
GEWEX-RFA Data File Format and File Naming Convention
Atmospheric Science Data Center
2016-05-20
... documentation, will be stored for each data product. Each time data is added to, removed from, or modified in the file set for a product, ... including 29 days in leap-year Februaries. Time series files containing 15-minute data should start at the top of an hour to ...
TOLNet Data Format for Lidar Ozone Profile & Surface Observations
NASA Astrophysics Data System (ADS)
Chen, G.; Aknan, A. A.; Newchurch, M.; Leblanc, T.
2015-12-01
The Tropospheric Ozone Lidar Network (TOLNet) is an interagency initiative started by NASA, NOAA, and EPA in 2011. TOLNet currently has six lidars and one ozonesonde station. TOLNet provides high-resolution spatio-temporal measurements of tropospheric (surface to tropopause) ozone and aerosol vertical profiles to address fundamental air-quality science questions. The TOLNet data format was developed by TOLNet members as a community standard for reporting ozone profile observations. The development of this new format was primarily based on the existing NDACC (Network for the Detection of Atmospheric Composition Change) format and the ICARTT (International Consortium for Atmospheric Research on Transport and Transformation) format. The main goal is to present the lidar observations in self-describing and easy-to-use data files. The TOLNet format is an ASCII format containing a general file header, individual profile headers, and the profile data. The last two components repeat for all profiles recorded in the file. The TOLNet format is both human and machine readable, as it adopts standard metadata entries and fixed variable names. In addition, software has been developed to check for format compliance. A detailed description of the TOLNet format protocol and the scanning software will be presented.
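A format-compliance check of the kind mentioned above can be sketched as a scan for required metadata entries. Every key name and the file name below are assumptions for illustration; the actual required entries are defined in the TOLNet format protocol.

    # Hypothetical required metadata keys, not the real TOLNet entries.
    REQUIRED_KEYS = {"PI_NAME", "INSTRUMENT", "LOCATION",
                     "START_TIME", "ALTITUDE_RESOLUTION"}

    def check(path):
        keys = set()
        with open(path) as f:
            for line in f:
                if "=" in line:                 # assumed key=value header lines
                    keys.add(line.split("=", 1)[0].strip())
        missing = REQUIRED_KEYS - keys
        return "compliant" if not missing else f"missing: {sorted(missing)}"

    print(check("TOLNet_example.dat"))          # hypothetical file name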
46 CFR 535.701 - General requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Washington, DC 20573-0001. A copy of the Monitoring Report form in Microsoft Word and Excel format may be... Monitoring Reports in the Commission's prescribed electronic format, either on diskette or CD-ROM. (e)(1) The... filed by this subpart may be filed by direct electronic transmission in lieu of hard copy. Detailed...
Standard Electronic Format Specification for Tank Characterization Data Loader Version 3.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
ADAMS, M.R.
2001-01-31
The purpose of this document is to describe the standard electronic format for data files that will be sent for entry into the Tank Characterization Database (TCD). There are two different file types needed for each data load: (1) Analytical Results and (2) Sample Descriptions.
47 CFR 1.913 - Application and notification forms; electronic and manual filing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... notifications whenever possible. The files, other than the ASCII table of contents, should be in Adobe Acrobat... possible. The attachment should be uploaded via ULS in Adobe Acrobat Portable Document Format (PDF... the table of contents, should be in Adobe Acrobat Portable Document Format (PDF) whenever possible...
9 CFR 124.30 - Filing, format, and content of petitions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... RESTORATION Due Diligence Petitions § 124.30 Filing, format, and content of petitions. (a) Any interested... diligence in seeking APHIS approval of the product during the regulatory review period. (b) The petition... subpart. (c) The petition must allege that the applicant failed to act with due diligence sometime during...
Viewing Files — EDRN Public Portal
In addition to standard HTML Web pages, our Web site contains other file formats. You may need additional software or browser plug-ins to view some of the information available on our site. This document lists each format, along with links to the corresponding freely available plug-ins or viewers.
Painless File Extraction: The A(rc)--Z(oo) of Internet Archive Formats.
ERIC Educational Resources Information Center
Simmonds, Curtis
1993-01-01
Discusses extraction programs needed to postprocess software downloaded from the Internet that has been archived and compressed for the purposes of storage and file transfer. Archiving formats for DOS, Macintosh, and UNIX operating systems are described; and cross-platform compression utilities are explained. (LRW)
SiLK: A Tool Suite for Unsampled Network Flow Analysis at Scale
2014-06-01
Özyürek, Taha; Tek, Vildan; Yılmaz, Koray; Uslu, Gülşah
2017-11-01
To determine the incidence of crack formation and propagation in apical root dentin after retreatment procedures performed using ProTaper Universal Retreatment (PTR), Mtwo-R, ProTaper Next (PTN), and Twisted File Adaptive (TFA) systems. The study consisted of 120 extracted mandibular premolars. One millimeter from the apex of each tooth was ground perpendicular to the long axis of the tooth, and the apical surface was polished. Twenty teeth served as the negative control group. One hundred teeth were prepared, obturated, and then divided into 5 retreatment groups. The retreatment procedures were performed using the following files: PTR, Mtwo-R, PTN, TFA, and hand files. After filling material removal, apical enlargement was done using apical size 0.50 mm ProTaper Universal (PTU), Mtwo, PTN, TFA, and hand files. Digital images of the apical root surfaces were recorded before preparation, after preparation, after obturation, after filling removal, and after apical enlargement using a stereomicroscope. The images were then inspected for the presence of new apical cracks and crack propagation. Data were analyzed with χ² tests using SPSS 21.0 software. New cracks and crack propagation occurred in all the experimental groups during the retreatment process. Nickel-titanium rotary file systems caused significantly more apical crack formation and propagation than the hand files. The PTU system caused significantly more apical cracks than the other groups after the apical enlargement stage. This study showed that retreatment procedures and apical enlargement after the use of retreatment files can cause crack formation and propagation in apical dentin.
Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A
2017-06-01
Our objective was to identify and describe published frameworks for the adaptation of clinical, public health, and health services guidelines. We included reports describing methods of guideline adaptation in sufficient detail to allow their reproduction. We searched the Medline and EMBASE databases. We also searched personal files, as well as manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guideline adaptation: ADAPTE, Adapted ADAPTE, the Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, the Royal College of Nursing (RCN) method, and the Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context, taking into consideration the needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones. In addition, they proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain to a local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks, applied in 22 guidelines in the context of a national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach, in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, retrospectively derived from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica. The RCN outlines a five-step strategy for the adaptation of guidelines to the local context. The SGR method consists of nine steps and takes into consideration both methodological gaps and context-specific normative issues in source guidelines. Through searching personal files, we also identified two abandoned methods. We identified and described eight proposed frameworks for the adaptation of health-related guidelines. There is a need to evaluate these different frameworks to assess the rigor, efficiency, and transparency of their proposed processes. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2010-01-01
A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes solution sequence 146, using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections, and load coefficients as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom-written FORTRAN program. The text-formatted data files are stored, and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor, and load modeling. Altitude-varying root-locus and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.
Extract and visualize geolocation from any text file
NASA Astrophysics Data System (ADS)
Boustani, M.
2015-12-01
There is a variety of text file formats, such as PDF and HTML, that contain words about locations (countries, cities, regions, and more). GeoParser was developed as a sub-project under DARPA Memex to help find geolocation information in crawled website data. It is a web application that uses Apache Tika to extract locations from any text file format and visualize the geolocations on a map. https://github.com/MBoustani/GeoParser https://github.com/chrismattmann/tika-python http://www.darpa.mil/program/memex
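As a rough illustration of the extraction step (not GeoParser's actual code), the tika-python library linked above can pull plain text out of nearly any file format; the gazetteer lookup below is a deliberately naive stand-in for GeoParser's location matching, and the file name and place names are hypothetical:

    # Sketch: extract text with Apache Tika, then scan it for known place names.
    # tika-python talks to a Tika server, which it can start on demand.
    from tika import parser

    GAZETTEER = {"Paris", "Pasadena", "Nairobi"}    # hypothetical mini-gazetteer

    parsed = parser.from_file("crawled_page.html")  # hypothetical input file
    text = parsed.get("content") or ""

    words = {w.strip(".,;:") for w in text.split()}
    print("Possible locations:", sorted(words & GAZETTEER))

A real deployment would replace the set lookup with a proper named-entity recognizer and gazetteer database.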
1999-12-01
... In addition, the data files saved in the POINT format can include an optional header which is compatible with Amtec Engineering's 2-D and 3-D visualization ... ".DAT" file so that the file can be used directly by Amtec Engineering's 2-D and 3-D visualization package Tecplot©. The ARRAY and POINT formats are
Can ASCII data files be standardized for Earth Science?
NASA Astrophysics Data System (ADS)
Evans, K. D.; Chen, G.; Wilson, A.; Law, E.; Olding, S. W.; Krotkov, N. A.; Conover, H.
2015-12-01
NASA's Earth Science Data Systems Working Groups (ESDSWG) were created over 10 years ago. The role of the ESDSWG is to make recommendations relevant to NASA's Earth science data systems based on user experiences. Each group works independently, focusing on a unique topic. Participants in ESDSWG groups come from a variety of NASA-funded science and technology projects, such as MEaSUREs, as well as NASA information technology experts, affiliated contractors, staff, and other interested community members from academia and industry. Recommendations from the ESDSWG groups will enhance NASA's efforts to develop long-term data products. Each year, the ESDSWG has a face-to-face meeting to discuss recommendations and future efforts. Last year's (2014) ASCII for Science Data Working Group (ASCII WG) completed its goals and made recommendations on a minimum set of information that is needed to make ASCII files at least human readable and usable for the foreseeable future. The 2014 ASCII WG created a table of ASCII files and their components as a means of understanding what kinds of ASCII formats exist and what components they have in common. Using this table and adding information from other ASCII file formats, we will discuss the advantages and disadvantages of a standardized format. For instance, space geodesy scientists have been using the same RINEX/SINEX ASCII formats for decades. Astronomers mostly archive their data in the FITS format. Yet Earth scientists seem to have a slew of ASCII formats, such as ICARTT, netCDF (an ASCII dump), and the IceBridge ASCII format. The 2015 Working Group is focusing on promoting extensibility and machine readability of ASCII data. Questions have been posed, including: Can we have a standardized ASCII file format? Can it be machine-readable and simultaneously human-readable? We will present a summary of the currently used ASCII formats in terms of advantages and shortcomings, as well as potential improvements.
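To make the "machine-readable and simultaneously human-readable" goal concrete, here is a hypothetical minimal self-describing ASCII header in the spirit of formats like ICARTT; the field names and layout are invented for illustration and are not a recommendation from the working group:

    # file_format: example-ascii-1.0        (hypothetical format identifier)
    # title: Surface ozone at station X
    # variables: time_utc, ozone
    # units: seconds_since_midnight, ppbv
    # missing_value: -9999
    0, 31.2
    60, 30.8
    120, -9999

Because the header carries variable names, units, and the missing-data sentinel, a human can read the file directly and a parser can interpret it without out-of-band documentation.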
As-built design specification for PARCLS
NASA Technical Reports Server (NTRS)
Tompkins, M. A. (Principal Investigator)
1981-01-01
The PARCLS program, part of the CLASFYG package, reads a parameter file created by the CLASFYG program and a pure-pixel ground truth file in order to create a classification file of three separate crop categories in universal format.
47 CFR 73.875 - Modification of transmission systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (2) Those that would cause the transmission system to exceed the equipment performance measurements... radiofrequency radiation guidelines. In addition, applications solely filed pursuant to paragraphs (c)(1) or (c...
Semantic Clinical Guideline Documents
Eriksson, Henrik; Tu, Samson W.; Musen, Mark
2005-01-01
Decision-support systems based on clinical practice guidelines can support physicians and other health-care personnel in following best practice consistently. A knowledge-based approach to representing guidelines makes it possible to encode computer-interpretable guidelines in a formal manner, perform consistency checks, and use the guidelines directly in decision-support systems. Decision-support authors and guideline users require guidelines in human-readable formats in addition to computer-interpretable ones (e.g., for guideline review and quality assurance). We propose a new document-oriented information architecture that combines knowledge-representation models with electronic and paper documents. The approach integrates decision-support models with standard document formats to create a combined clinical-guideline model that supports on-line viewing, printing, and decision support. PMID:16779037
Analytic Patch Configuration (APC) gateway version 1.0 user's guide
NASA Technical Reports Server (NTRS)
Bingel, Bradford D.
1990-01-01
The Analytic Patch Configuration (APC) Gateway is an interactive software tool which translates aircraft configuration geometry files from one format into another. This initial release of the APC Gateway accommodates six formats: the four accepted APC formats (89f, 89fd, 89u, and 89ud), the PATRAN 2.x phase 1 neutral file format, and the Integrated Aerodynamic Analysis System (IAAS) General Geometry (GG) format. Written in ANSI FORTRAN 77 and completely self-contained, the APC Gateway is very portable and has already been installed on CDC/NOS, VAX/VMS, SUN, SGI/IRIS, CONVEX, and CRAY hosts.
GEM at 10: a decade's experience with the Guideline Elements Model.
Hajizadeh, Negin; Kashyap, Nitu; Michel, George; Shiffman, Richard N
2011-01-01
The Guideline Elements Model (GEM) was developed in 2000 to organize the information contained in clinical practice guidelines using XML and to represent guideline content in a form that can be understood by human readers and processed by computers. In this work, we systematically reviewed the literature to better understand how GEM was being used, potential barriers to its use, and suggestions for improvement. Fifty external and twelve internally produced publications were identified and analyzed. GEM was used most commonly for modeling and ontology creation. Other investigators applied GEM for knowledge extraction and data mining, for clinical decision support, and for guideline generation. The GEM Cutter software, used to mark up guidelines for translation into XML, has been downloaded 563 times since 2000. Although many investigators found GEM to be valuable, others critiqued its failure to clarify guideline semantics, difficulties in markup, and the fact that GEM files are not usually executable.
Thilagaratnam, S; Ding, Y Y; Au Eong, K G; Chiam, P C; Chow, Y L; Khoo, G; Lim, H B; Lim, H Y L; Lim, W S; Lim, W Y; Peh, K C; Phua, K T; Sitoh, Y Y; Tan, B Y; Wong, S F; Wong, W P; Yee, R
2010-06-01
The Health Promotion Board (HPB) and the Ministry of Health (MOH) publish clinical practice guidelines to provide doctors and patients in Singapore with evidence-based guidance on managing important medical conditions. This article reproduces the introduction and executive summary (with recommendations from the guidelines) from the HPB-MOH clinical practice guidelines on Functional Screening for Older Adults in the Community, for the information of readers of the Singapore Medical Journal. Chapters and page numbers mentioned in the reproduced extract refer to the full text of the guidelines, which are available from the Health Promotion Board website (http://www.hpb.gov.sg/uploadedFiles/HPB_Online/Publications/CPGFunctionalscreening.pdf). The recommendations should be used with reference to the full text of the guidelines. Following this article are multiple choice questions based on the full text of the guidelines.
Integration of DICOM and openEHR standards
NASA Astrophysics Data System (ADS)
Wang, Ying; Yao, Zhihong; Liu, Lei
2011-03-01
The standard format for medical imaging storage and transmission is DICOM. openEHR is an open standard specification in health informatics that describes the management, storage, retrieval, and exchange of health data in electronic health records. Because integrating DICOM and openEHR is beneficial to information sharing, we developed, on the basis of the XML-based DICOM format, a method of creating a DICOM imaging archetype in openEHR to enable the integration of the two standards. Each DICOM file contains abundant imaging information; however, because reading a DICOM file involves looking up the DICOM Data Dictionary, its readability is limited. openEHR innovatively adopts a two-level modeling method, dividing clinical information into a lower level, the information model, and an upper level, archetypes and templates. One critical challenge posed to the development of openEHR is information sharing, especially imaging information sharing; for example, some important imaging information cannot be displayed in an openEHR file. In this paper, to enhance the readability of a DICOM file and the semantic interoperability of an openEHR file, we developed a method of mapping a DICOM file to an openEHR file by adopting the archetype form defined in openEHR. Because an archetype has a tree structure, after mapping a DICOM file to an openEHR file, the converted information is structured in conformance with the openEHR format. This method enables the integration of DICOM and openEHR and data exchange between the two standards without loss of imaging information.
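The readability problem described above (raw DICOM tags are numeric and must be resolved through the DICOM Data Dictionary) can be illustrated in a few lines of Python. pydicom is not the tool used in the paper, just a convenient stand-in, and the file name is hypothetical:

    # Sketch: turn numeric DICOM tags into human-readable keyword/value pairs,
    # the same dictionary-lookup step that a DICOM-to-archetype mapping relies on.
    import pydicom

    ds = pydicom.dcmread("image.dcm")   # hypothetical input file
    readable = {}
    for elem in ds:
        if elem.keyword:                # keyword is resolved via the DICOM Data Dictionary
            readable[elem.keyword] = str(elem.value)

    print(readable.get("Modality"), readable.get("Rows"), readable.get("Columns"))

Mapping such keyword/value pairs onto the tree structure of an openEHR archetype is, in essence, the structuring step the authors describe.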
78 FR 13933 - Railroad Cost of Capital-2012
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... by May 31, 2013. ADDRESSES: Comments may be submitted either via the Board's e-filing system or in the traditional paper format. Any person using e-filing should comply with the instructions at the E-FILING link on the Board's Web site, at http://www.stb.dot.gov . Any person submitting a filing in the...
76 FR 10430 - Railroad Cost of Capital-2010
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... by June 8, 2011. ADDRESSES: Comments may be submitted either via the Board's e-filing system or in the traditional paper format. Any person using e-filing should comply with the instructions at the E-FILING link on the Board's Web site, at http://www.stb.dot.gov . Any person submitting a filing in the...
75 FR 16894 - Railroad Cost of Capital-2009
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... 15, 2010. ADDRESSES: Comments may be submitted either via the Board's e-filing system or in the traditional paper format. Any person using e-filing should comply with the instructions at the E-FILING link on the Board's Web site, at http://www.stb.dot.gov . Any person submitting a filing in the...
5 CFR 1201.14 - Electronic filing procedures.
Code of Federal Regulations, 2014 CFR
2014-01-01
...-Appeal Online, in which case service is governed by paragraph (j) of this section, or by non-electronic... (PDF), and image files (files created by scanning). A list of formats allowed can be found at e-Appeal... representatives of the appeals in which they were filed. (j) Service of electronic pleadings and MSPB documents...
5 CFR 1201.14 - Electronic filing procedures.
Code of Federal Regulations, 2013 CFR
2013-01-01
...-Appeal Online, in which case service is governed by paragraph (j) of this section, or by non-electronic... (PDF), and image files (files created by scanning). A list of formats allowed can be found at e-Appeal... representatives of the appeals in which they were filed. (j) Service of electronic pleadings and MSPB documents...
5 CFR 1201.14 - Electronic filing procedures.
Code of Federal Regulations, 2011 CFR
2011-01-01
...-Appeal Online, in which case service is governed by paragraph (j) of this section, or by non-electronic... (PDF), and image files (files created by scanning). A list of formats allowed can be found at e-Appeal... representatives of the appeals in which they were filed. (j) Service of electronic pleadings and MSPB documents...
5 CFR 1201.14 - Electronic filing procedures.
Code of Federal Regulations, 2012 CFR
2012-01-01
...-Appeal Online, in which case service is governed by paragraph (j) of this section, or by non-electronic... (PDF), and image files (files created by scanning). A list of formats allowed can be found at e-Appeal... representatives of the appeals in which they were filed. (j) Service of electronic pleadings and MSPB documents...
Converting Inhouse Subject Card Files to Electronic Keyword Files.
ERIC Educational Resources Information Center
Culmer, Carita M.
The library at Phoenix College developed the Controversial Issues Files (CIF), a "home made" card file containing references pertinent to specific ongoing assignments. Although the CIF had proven itself to be an excellent resource tool for beginning researchers, it was cumbersome to maintain in the card format, and was limited to very…
Networks for Autonomous Formation Flying Satellite Systems
NASA Technical Reports Server (NTRS)
Knoblock, Eric J.; Konangi, Vijay K.; Wallett, Thomas M.; Bhasin, Kul B.
2001-01-01
The performance of three communications networks to support autonomous multi-spacecraft formation flying systems is presented. All systems are comprised of a ten-satellite formation arranged in a star topology, with one of the satellites designated as the central or "mother ship." All data is routed through the mother ship to the terrestrial network. The first system uses a TCP/IP over ATM protocol architecture within the formation; the second system uses the IEEE 802.11 protocol architecture within the formation; and the last system uses both of the previous architectures with a constellation of geosynchronous satellites serving as an intermediate point-of-contact between the formation and the terrestrial network. The simulations consist of file transfers using either the File Transfer Protocol (FTP) or the Simple Automatic File Exchange (SAFE) protocol. The results compare the IP queuing delay and IP processing delay at the mother ship, as well as application-level round-trip time, for each system. In all cases, using IEEE 802.11 within the formation yields less delay. Also, the throughput exhibited by SAFE is better than FTP.
Concurrent Formative Evaluation: Guidelines and Implications for Multimedia Designers.
ERIC Educational Resources Information Center
Northrup, Pamela Taylor
1995-01-01
Discusses formative evaluation for multimedia instruction and presents guidelines for formatively evaluating multimedia instruction concurrent with analysis, design, and development. Data collection criteria that include group involvement, data collection strategies, and information to be gathered are presented, and rapid prototypes and…
37 CFR 1.615 - Format of papers filed in a supplemental examination proceeding.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Format of papers filed in a supplemental examination proceeding. 1.615 Section 1.615 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...
75 FR 14386 - Interpretation of Transmission Planning Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... created electronically using word processing software should be filed in native applications or print-to.... FERC, 564 F.3d 1342 (DC Cir. 2009). \\6\\ Mandatory Reliability Standards for the Bulk-Power System... print-to-PDF format and not in a scanned format. Commenters filing electronically do not need to make a...
PROPOSED STANDARD TO GREATLY EXPAND PUBLIC ACCESS AND EXPLORATION OF TOXICITY DATA: EVALUATION OF STRUCTURE DATA FILE FORMAT
The ability to assess the potential toxicity of environmental, pharmaceutical, or industrial chemicals based on chemical structure in...
37 CFR 1.615 - Format of papers filed in a supplemental examination proceeding.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Format of papers filed in a supplemental examination proceeding. 1.615 Section 1.615 Patents, Trademarks, and Copyrights UNITED STATES PATENT AND TRADEMARK OFFICE, DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES...
VizieR Online Data Catalog: Metal enrichment in semi-analytical model (Cousin+, 2016)
NASA Astrophysics Data System (ADS)
Cousin, M.; Buat, V.; Boissier, S.; Bethermin, M.; Roehlly, Y.; Genois, M.
2016-04-01
The repository contains outputs from the different models:
- m1: Classical (only hot gas) isotropic accretion scenario + standard Schmidt-Kennicutt law
- m2: Bimodal accretion (cold streams) + standard Schmidt-Kennicutt law
- m3: Classical (only hot gas) isotropic accretion scenario + ad-hoc non-star-forming gas reservoir
- m4: Bimodal accretion (cold streams) + ad-hoc non-star-forming gas reservoir
For each of these models, data are saved in an eGalICS_m*.fits file. All these FITS-formatted files are compatible with the TOPCAT software available at http://www.star.bris.ac.uk/~mbt/topcat/ We also provide, for each available Initial Mass Function, a set of two FITS-formatted files associated with the chemodynamical library presented in the paper. For these two files, data are available for all metallicity bins used.
- masslossrates_IMF.fits: the instantaneous total ejecta rate associated with an SSP for the six main ISM elements.
- SNratesIMF.fits: the total SN rate (SNII+SNIa [nb/Gyr]) associated with an SSP; the individual contributions of SNII and SNIa are also given.
These files are available for four different IMFs: Salpeter+55 (1955ApJ...121..161S), Chabrier+03 (2003PASP..115..763C), Kroupa+93 (2001MNRAS.322..231K), and Scalo+98 (1998ASPC..142..201S). Both ejecta rates and SN rates are computed for the complete list of stellar ages provided in the BC03 spectra library. They are saved in FITS-formatted files and structured with different extensions corresponding to the different initial stellar metallicity bins. We finally provide the median star formation history, the median gas accretion history, and the metal enrichment histories associated with our MW-sisters sample: MWsistershistories.dat. If you use data associated with the eGalICS semi-analytic model, please cite the following paper: Cousin et al., 2015A&A...575A..33C, "Toward a new modelling of gas flows in a semi-analytical model of galaxy formation and evolution" (3 data files).
IVS Working Group 4: VLBI Data Structures
NASA Astrophysics Data System (ADS)
Gipson, J.
2012-12-01
I present an overview of the "openDB format" for storing, archiving, and processing VLBI data. In this scheme, most VLBI data are stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages, including Fortran, Fortran-90, C, C++, Perl, etc., and to the most common operating systems, including Linux, Windows, and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data. For example, it allows you to easily change subsets of the data used in the analysis, such as troposphere modeling, ionospheric calibration, editing, and ambiguity resolution. It also allows for extending the types of data used, e.g., source maps. I present a roadmap for the transition to this new format. The new format can already be used by VieVS and by the global mode of Solve. There are plans in work for other software packages to be able to use the new format.
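A toy version of this scheme, observables in NetCDF files plus an ASCII wrapper pointing at them, might look as follows in Python; the file and variable names are hypothetical, since the real openDB layout defines its own naming conventions:

    # Sketch: store one observable in a NetCDF file, then list it in an ASCII wrapper.
    from netCDF4 import Dataset

    with Dataset("GroupDelay.nc", "w") as nc:        # hypothetical data file
        nc.createDimension("obs", 3)
        delay = nc.createVariable("group_delay", "f8", ("obs",))
        delay.units = "seconds"
        delay[:] = [1.2e-8, 3.4e-8, 5.6e-8]

    # The wrapper is plain ASCII; analysis software reads it to locate the data files,
    # so switching to an alternative calibration only means editing one pointer line.
    with open("session.wrp", "w") as wrapper:        # hypothetical wrapper file
        wrapper.write("Begin Observables\n")
        wrapper.write("GroupDelay.nc\n")
        wrapper.write("End Observables\n")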
CONNJUR spectrum translator: an open source application for reformatting NMR spectral data.
Nowling, Ronald J; Vyas, Jay; Weatherby, Gerard; Fenwick, Matthew W; Ellis, Heidi J C; Gryk, Michael R
2011-05-01
NMR spectroscopists are hindered by the lack of standardization of spectral data among the file formats of various NMR data processing tools. This lack of standardization is cumbersome, as researchers must perform their own file conversion in order to switch between processing tools, and it also restricts the combination of tools employed if no conversion option is available. The CONNJUR Spectrum Translator introduces a new, extensible architecture for spectrum translation with two key algorithmic improvements. The first is translation of NMR spectral data (time and frequency domain) to a single in-memory data model, which allows a new file format to be added with two converter modules, a reader and a writer, instead of a separate converter to each existing format. The second is the use of layout descriptors, which allows a single FID data translation engine to be used for all formats. For the end user, sophisticated metadata readers allow conversion of the majority of files with minimal user configuration. The open source code is freely available at http://connjur.sourceforge.net for inspection and extension.
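The payoff of the single in-memory model is that N formats cost N readers plus N writers instead of on the order of N-squared pairwise converters. A minimal sketch of that architecture, with all class, format, and function names hypothetical rather than CONNJUR's actual API:

    # Sketch: hub-and-spoke format conversion through one in-memory model.
    # Adding a new format costs one reader and one writer, not a converter per format pair.
    class Spectrum:                           # the shared in-memory data model
        def __init__(self, points, domain):
            self.points = points              # data values
            self.domain = domain              # "time" or "frequency"

    def read_fmt_a(path):                     # hypothetical reader for format A
        return Spectrum(points=[0.0, 1.0], domain="time")

    def write_fmt_b(spectrum, path):          # hypothetical writer for format B
        with open(path, "w") as out:
            out.write("\n".join(str(p) for p in spectrum.points))

    READERS = {"fmtA": read_fmt_a}
    WRITERS = {"fmtB": write_fmt_b}

    def convert(src_fmt, src_path, dst_fmt, dst_path):
        WRITERS[dst_fmt](READERS[src_fmt](src_path), dst_path)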
Segy-change: The swiss army knife for the SEG-Y files
NASA Astrophysics Data System (ADS)
Stanghellini, Giuseppe; Carrara, Gabriela
Data collected during active and passive seismic surveys can be stored in many different, more or less standard, formats. One of the most popular is the SEG-Y format, developed since 1975 to store single-line seismic digital data on tapes, and now evolved to store them on hard disks and other media as well. Unfortunately, files that are claimed to be recorded in the SEG-Y format sometimes cannot be processed using available free or industrial packages. Aiming to solve this impasse, we present segy-change, a pre-processing software program to view, analyze, change, and fix errors present in SEG-Y data files. It is written in C, can also be used as a software library, and is compatible with most operating systems. Segy-change allows the user to display and optionally change the values inside all parts of a SEG-Y file: the file header, the trace headers, and the data blocks. In addition, it allows a quality check on the data by plotting the traces. We provide instructions and examples on how to use the software.
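For orientation, the fixed layout such tools depend on can be probed with a few lines of Python. The byte offsets below follow the standard SEG-Y layout (a 3200-byte textual header followed by a 400-byte binary header); this is a sketch for inspection only, not a substitute for segy-change's checks, and the file name is hypothetical:

    # Sketch: read three diagnostic fields from a SEG-Y binary file header.
    import struct

    with open("line1.sgy", "rb") as f:       # hypothetical input file
        f.seek(3200)                         # skip the 3200-byte textual (EBCDIC) header
        binary_header = f.read(400)

    # Big-endian two-byte integers at the standard offsets within the binary header.
    sample_interval_us = struct.unpack(">h", binary_header[16:18])[0]
    samples_per_trace  = struct.unpack(">h", binary_header[20:22])[0]
    format_code        = struct.unpack(">h", binary_header[24:26])[0]  # 1 = IBM float, 5 = IEEE float

    print(sample_interval_us, samples_per_trace, format_code)

A mismatch between these header values and the actual trace data is exactly the kind of error a tool like segy-change is meant to expose and fix.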
Flow diagram or prose: does the format of practice guidelines matter?
Hardern, R D; Hodgson, L C; Hamer, D W
1998-06-01
The aim of this study was to compare the performance of two formats (prose and flow diagram) of the guidelines for management of paracetamol poisoning, and to assess the likely performance without access to the guidelines. A prospective questionnaire study of the management of seven hypothetical cases of paracetamol ingestion was carried out among accident and emergency senior house officers at a regional induction course. No differences were found between the two formats. The proportion of correct answers was 37% in the flow diagram group and 31% in the prose group (95% confidence interval for the difference -8% to 20%). In the group with neither format of the guideline, the proportion of correct answers was lower: 19% (95% confidence interval for the difference between this group and the group with flow charts 6.9% to 30.6%; for the difference between this group and the group with the prose format 0.4% to 24.8%). The time taken to answer the questions did not vary between the groups. These data do not support the exclusive use of either format. They suggest that management of paracetamol poisoning is less likely to be correct if staff do not have access to the guidelines.
VizieR Online Data Catalog: Opacities from the Opacity Project (Seaton+, 1995)
NASA Astrophysics Data System (ADS)
Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.
1997-08-01
1 CODES

1.1 Code rop.for
This code reads opacity files written in standard OP format. Its main purpose is to provide documentation on the contents of the files. This code, like the other codes provided, prompts for the name of the file (or files) to be read. The file names read in response to the prompt may have up to 128 characters.

1.2 Code opfit.for
This code reads opacity files in standard OP format and provides for interpolation of opacities to any required values of temperature and mass density. The method used is described in OPF. The code prompts for the name of a file giving all required control parameters. As an example, the file opfit.dat is provided (users will need to change directory names and file names). The use of opfit.for is illustrated using opfit.dat. Most users will probably want to adapt opfit.for for use as a subroutine in other codes. Timings for DEC 7000 ALPHA: 0.3 sec for data read and initialisations; then 0.0007 sec for each temperature-density point. Users who like OPAL formats should note that opfit.for has a facility to produce files of OP data in OPAL-type formats.

1.3 Code ixz.for
This code provides for interpolations to any required values of X and Z. See IXZ. It prompts for the name of a file giving all required control parameters. An example of such a file is provided, ixz.dat (the user will need to change directory and file names). The output files have names s92INT.'nnn'. The user specifies the first value of nnn and the number of files to be produced.

2 DATA FILES

2.1 Data files for solar metal-mix
Data for solar metal-mix s92 as defined in SYMP. These files are from version 2 runs of December 1994 (see IXZ for details on Version 2). There are 213 files with names s92.'nnn', 'nnn'=201 to 413. Each file occupies 83762 bytes. The file s92.version2 gives values of X (hydrogen mass-fraction) and Z (metals mass-fraction) for each value of 'nnn'. The user can get s92.version2, select the values of 'nnn' required, then get the required files s92.'nnn'. The user can see the file in ftp, displayed on the screen, by typing "get s92.version2 -". The files s92.'nnn' can be used with opfit.for to obtain opacities for any required values of temperature and mass density. Files for other metal-mixtures will be added in due course. Send requests to mjs@star.ucl.ac.uk.

2.2 Files for interpolation in X and Z
The data files have names s92xz.'mmm', where 'mmm'=001 to 096. They differ from the standard OP files (such as s92.'nnn'; section 2.1 above) in that they contain information giving derivatives of opacities with respect to X and Z. Each file s92xz.'mmm' occupies 148241 bytes. The interpolations to any required values of X and Z are made using ixz.for. Timings: on DEC 7000 ALPHA, 2.16 sec for each new-mixture file. For interpolations to some specified values of X and Z, one requires just 4 files s92xz.'mmm'; most users will not require the complete set. The file s92xz.index includes a table (starting on line 3) giving values, for each 'mmm' file, of x,y,z (abundances by number-fractions) and X,Y,Z (abundances by mass-fractions). Users are advised to get the file s92xz.index, select the values of 'mmm' for the files required, then get those files. The files produced by ixz.for are in standard OP format and can be used with opfit.for to obtain opacities for any required values of temperature and mass density.

3 RECOMMENDED PROCEDURE FOR USE OF OPACITY FILES
(1) Get the file s92.version2. (2) If the values of X and Z you require are available in the files s92.'nnn', then get those files. (3) If not, get the file s92xz.index. (4) Select from s92xz.index the values of 'mmm' which cover the range of X and Z in which you are interested. Get those files and use ixz.for to generate files for your exact required values of X and Z. (5) Note that the exact abundance mixtures used are specified in each file (see rop.for). Also, each run of opfit.for produces a table of abundances. (6) If you want a metal-mix different from that of s92, contact mjs@star.ucl.ac.uk.

4 FUTURE DEVELOPMENTS
(1) Data for the calculation of radiative forces are provided as the CDS catalog
Canadian physicians' attitudes about and preferences regarding clinical practice guidelines
Hayward, R S; Guyatt, G H; Moore, K A; McKibbon, K A; Carter, A O
1997-01-01
OBJECTIVE: To assess Canadian physicians' confidence in, attitudes about and preferences regarding clinical practice guidelines. DESIGN: Cross-sectional, self-administered mailed survey. PARTICIPANTS: Stratified random sample of 3000 Canadian physicians; 1878 (62.6%) responded. SETTING: Canada. OUTCOME MEASURES: Physicians' use of various information sources; familiarity with and confidence in guidelines; attitudes about guidelines and their effect on medical care; rating of importance of guidelines and other sources of information in clinical decision-making; rating of importance of various considerations in deciding whether to adopt a set of guidelines; and rating of usefulness of different formats for presenting guidelines. MAIN RESULTS: In all, 52% of the respondents reported using guidelines at least monthly, substantially less frequently than traditional information sources. Most of the respondents expressed confidence in guidelines issued by various physician organizations, but 51% to 77% were not confident in guidelines issued by federal or provincial health ministries or by health insurance plans. The respondents were generally positive about guidelines (e.g., over 50% strongly agreed that they are a convenient source of advice and good educational tools); however, 22% to 26% had concerns about loss of autonomy, the rigidity of guidelines and decreased satisfaction with medical practice. Endorsement by respected colleagues or major organizations was identified as very important by 78% and 62% of the respondents respectively in deciding whether to adopt a set of guidelines in their practice. User friendliness of the guidelines format was thought to be very important by 62%; short pamphlets, manuals summarizing a number of guidelines, journal articles and pocket cards summarizing guidelines were the preferred formats (identified as most useful by 50% to 62% of the respondents). CONCLUSIONS: Canadian physicians, although generally positive about guidelines and confident in those developed by clinicians, have not yet integrated the use of guidelines into their practices to a large extent. Our results suggest that respected organizations and opinion leaders should be involved in the development of guidelines and that the acceptability of any proposed format and medium for guidelines presentation should be pretested. PMID:9220923
ERIC Educational Resources Information Center
American Council on Education, Washington, DC.
Guidelines for colleges concerning the privacy of employee records are presented in two policy statements. Institutional policy should minimize intrusiveness, maximize fairness, and create legitimate expectations of confidentiality. In addition to strengthening professional equity of treatment, confidentiality permits consideration of both adverse…
Gigabit Network Communications Research
1992-12-31
additional BPF channels, raw bytesync support for video codecs, and others. All source file modifications were logged with RCS. Source and object trees were ... "Requests for Comments" (RFCs). 20 RFCs were published this quarter: RFC 1366: Gerich, E., "Guidelines for Management of IP Address Space", Merit, October 1992. RFC 1367: Topolcic, C., "Schedule for IP Address Space Management Guidelines", CNRI, October 1992. RFC 1368: McMaster, D. (Synoptics Communications, Inc.), K
Web Standard: PDF - When to Use, Document Metadata, PDF Sections
PDF files provide some benefits when used appropriately. PDF files should not be used for short documents (under 5 pages) unless retaining the format for printing is important. PDFs should have internal file metadata and meet Section 508 standards.
Guide to GFS History File Change on May 1, 2007
On May 1, 2007 12Z, the GFS had a major change. The change caused the internal binary GFS history file to change formats. The file is still in spectral space, but pressure is now calculated in a different way. Sometime in the future, the GFS history file may be
FGGE/ERBZ tape specification and shipping letter description
NASA Technical Reports Server (NTRS)
Han, D.; Lo, H.
1983-01-01
The FGGE/ERBZ tape contains 5 parameters which are extracted and reformatted from the Nimbus-7 ERB Zonal Means Tape. The files on a FGGE/ERBZ tape include a tape header file and data files. Physical characteristics, gross format, and file specifications are given. A sample tape check/document printout (shipping letter) is included.
These tables may be defined within a separate ASCII text file (see Description and Format of BUFR Tables ...). At run time, the BUFR tables are usually read from an external ASCII text file (although it is also possible ...). The ASCII text file is called /nwprod/fix/bufrtab.002 on the NCEP CCS machines.
75 FR 45609 - Commission Information Collection Activities (FERC-542); Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-03
... electronically (eFiled) or in paper format, and should refer to Docket No. IC10-542-000. Documents must be.... Commenters making an eFiling should not make a paper filing. Commenters that are not able to file electronically must send an original and two (2) paper copies of their comments to: Federal Energy Regulatory...
Global Paleoclimatic Data for 6000 Yr B.P. (1985) (NDP-011)
Webb, III, T. [Department of Geological Sciences, Brown University, Providence, Rhode Island (USA)
2012-01-01
To determine regional and global climatic variations during the past 6000 years, pollen, lake-level, and marine plankton data from 797 stations were compiled to form a global data set. Radiocarbon dating and dated tephras were used to determine the ages of the specimens. For pollen, the data are site number, site name, latitude, longitude, elevation, and percentages of various taxa. For lake-level data, the data are site number, site name, latitude, longitude, and lake-level status. For marine plankton, the data are site number, site name, latitude, longitude, water depth, date, dating control code, depth of sample, interpolated age of sample, estimated winter and summer sea-surface temperatures, and percentages of various taxa. The data are in 55 files: 5 files for each of 9 geographic regions and 10 supplemental files. The files for each region include (1) a FORMAT file describing the format and contents of the data for that region, (2) an INDEX file containing descriptive information about each site and its data, (3) a DATA file containing the data and available climatic estimates, (4) a PUBINDEX file indexing the bibliographic references associated with each site, and (5) a REFERENCE file containing the bibliographic references. The files range in size from 2 to 66 kB.
PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR
NASA Technical Reports Server (NTRS)
Otte, N. E.
1994-01-01
PATSTAGS translates PATRAN finite element model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It supports translation of nodal constraints and nodal, element, force, and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file, and a STAGS pressure data file. The user provides the name of the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.
Tsuru, Satoko; Okamine, Eiko; Takada, Aya; Watanabe, Chitose; Uchiyama, Makiko; Dannoue, Hideo; Aoyagi, Hisae; Endo, Akira
2009-01-01
The Nursing Action Master and Nursing Observation Master were released from 2002 to 2008. Two file formats, Excel and CSV, are prepared for maintaining them. The following were decided as the basic maintenance rules: new additions, revisions, deletions, management numbering, and coding rules. The masters were developed based on these rules, and we perform quality assurance for the masters using them.
2006-01-01
This chapter describes data used in support of the process being applied by the U.S. Geological Survey (USGS) National Oil and Gas Assessment (NOGA) project. Digital tabular data used in this report, and archival data that permit the user to perform further analyses, are available elsewhere on the CD-ROM. Computers and software may import the data without the reader having to transcribe it from the Portable Document Format (.pdf) files of the text. Because of the number and variety of platforms and software available, graphical images are provided as .pdf files and tabular data are provided in raw form as tab-delimited text files (.tab files).
Recommendations resulting from the SPDS Community-Wide Workshop
NASA Technical Reports Server (NTRS)
1993-01-01
The Data Systems Panel identified three critical functionalities of a Space Physics Data System (SPDS): the delivery of self-documenting data, the existence of a matrix of translators between various standard formats (IDFS, CDF, netCDF, HDF, TENNIS, UCLA flat file, and FITS), and a network-based capability for browsing and examining inventory records for the system's data holdings. The recommendations resulting from the workshop include the philosophy, funding, and objectives of a SPDS. Access to quality data is seen as the most important objective by the Policy Panel, with curation and information about the data being integral parts of any accessible data set. The Data Issues Panel concluded that the SPDS can supply encouragement, guidelines, and ultimately provide a mechanism for financial support for data archiving, restoration, and curation. The Software Panel of the SPDS focused on defining the requirements and priorities for SPDS to support common data analysis and data visualization tools and packages.
[A solution for display and processing of DICOM images in web PACS].
Xue, Wei-jing; Lu, Wen; Wang, Hai-yang; Meng, Jian
2009-03-01
Java Applet technology is used to support DICOM images in an ordinary Web browser and thereby extend medical image processing functions. First, the DICOM file format is analyzed and a class that can acquire the pixels is designed; then two Applet classes are designed, one to process the DICOM image and the other to display the image processed by the first Applet. Both are embedded in the View page and communicate through an AppletContext object. The method designed in this paper lets users display and process DICOM images directly in an ordinary Web browser, giving Web PACS the advantages of both the B/S and C/S models. Java Applet is the key to extending the Web browser's function in Web PACS, and it provides a guideline for the sharing of medical images.
75 FR 71625 - System Restoration Reliability Standards
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-24
... processing software should be filed in native applications or print-to-PDF format, and not in a scanned... (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (D.C. Cir. 2009). 6. On March 16, 2007, the... electronically using word processing software should be filed in native applications or print-to-PDF format, and...
75 FR 81152 - Interpretation of Protection System Reliability Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... created electronically using word processing software should be filed in native applications or print-to... reh'g & compliance, 117 FERC ] 61,126 (2006), aff'd sub nom. Alcoa, Inc. v. FERC, 564 F.3d 1342 (DC... print-to-PDF format and not in a scanned format, at http://www.ferc.gov/docs-filing/efiling.asp . Mail...
78 FR 4766 - Adoption of Updated EDGAR Filer Manual
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
... primarily to introduce the new EDGARLink Online submission type IRANNOTICE; and support PDF as an official... Portable Document Format (PDF) as an official filing format. EDGAR will continue to accept ASCII and HTML...) and 101 (17 CFR 232.101) of Regulation S-T and the EDGAR Filer Manual relating to the use of PDF files...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
... Systems in 1993 for document exchange. PDF captures formatting information from a variety of desktop publishing applications, making it possible to send formatted documents and have them appear on the recipient... Administrative Procedure Act generally requires that an agency publish an adopted rule in the Federal Register 30...
43 CFR 32.7 - Application format, instructions, and guidelines.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 43 Public Lands: Interior 1 2014-10-01 2014-10-01 false Application format, instructions, and guidelines. 32.7 Section 32.7 Public Lands: Interior Office of the Secretary of the Interior GRANTS TO STATES FOR ESTABLISHING YOUNG ADULT CONSERVATION CORPS (YACC) PROGRAM § 32.7 Application format, instructions...
43 CFR 32.7 - Application format, instructions, and guidelines.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false Application format, instructions, and guidelines. 32.7 Section 32.7 Public Lands: Interior Office of the Secretary of the Interior GRANTS TO STATES FOR ESTABLISHING YOUNG ADULT CONSERVATION CORPS (YACC) PROGRAM § 32.7 Application format, instructions...
COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.
Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas
2014-12-14
With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data, or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but it may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
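Since the article describes the archive as a ZIP container whose manifest lists every entry, a minimal sketch in Python follows. The manifest namespace and format URIs are quoted from memory of the OMEX specification and should be checked against it, and the archive and model file names are hypothetical:

    # Sketch: build a minimal COMBINE Archive (OMEX), a ZIP whose manifest lists
    # every file in the archive, including the manifest itself.
    import zipfile

    manifest = """<?xml version="1.0" encoding="UTF-8"?>
    <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
      <content location="." format="http://identifiers.org/combine.specifications/omex"/>
      <content location="./manifest.xml" format="http://identifiers.org/combine.specifications/omex-manifest"/>
      <content location="./model.xml" format="http://identifiers.org/combine.specifications/sbml"/>
    </omexManifest>
    """

    with zipfile.ZipFile("experiment.omex", "w") as archive:   # hypothetical archive name
        archive.writestr("manifest.xml", manifest)
        archive.writestr("model.xml", "<sbml/>")               # placeholder model content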
NMReDATA, a standard to report the NMR assignment and parameters of organic compounds.
Pupier, Marion; Nuzillard, Jean-Marc; Wist, Julien; Schlörer, Nils E; Kuhn, Stefan; Erdelyi, Mate; Steinbeck, Christoph; Williams, Antony J; Butts, Craig; Claridge, Tim D W; Mikhova, Bozhana; Robien, Wolfgang; Dashti, Hesam; Eghbalnia, Hamid R; Farès, Christophe; Adam, Christian; Kessler, Pavel; Moriaud, Fabrice; Elyashberg, Mikhail; Argyropoulos, Dimitris; Pérez, Manuel; Giraudeau, Patrick; Gil, Roberto R; Trevorrow, Paul; Jeannerat, Damien
2018-04-14
Even though NMR has found countless applications in the field of small-molecule characterization, there is no standard file format available for the NMR data relevant to the structure characterization of small molecules. A new format is therefore introduced to associate the NMR parameters extracted from 1D and 2D spectra of organic compounds with the proposed chemical structure. These NMR parameters, which we shall call NMReDATA (for nuclear magnetic resonance extracted data), include chemical shift values, signal integrals, intensities, multiplicities, scalar coupling constants, lists of 2D correlations, relaxation times, and diffusion rates. The file format is an extension of the existing Structure Data Format, which is compatible with the commonly used MOL format. The association of an NMReDATA file with the raw and spectral data from which it originates constitutes an NMR record. This format is easily readable by humans and computers and provides a simple and efficient way of disseminating the results of structural chemistry investigations, allowing automatic verification of published results and assisting the constitution of highly needed open-source structural databases. Copyright © 2018 John Wiley & Sons, Ltd.
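Because the format rides on the Structure Data Format's tag mechanism, an NMReDATA record looks roughly like the fragment below. The tag names follow the NMREDATA_* naming pattern, but the tag set, line syntax, and values shown here are illustrative placeholders, not text from the specification:

    ...(MOL block of the compound)...
    M  END
    > <NMREDATA_VERSION>
    1.1

    > <NMREDATA_1D_1H>
    [one chemical shift / multiplicity / assignment entry per line]

    $$$$

Tools that already parse SDF files can skip the NMReDATA tags they do not understand, which is what keeps the extension compatible with the MOL/SDF ecosystem.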
Cyber Moat: Adaptive Virtualized Network Framework for Deception and Disinformation
2016-12-12
As one type of bot, web crawlers have been leveraged by search engines (e.g., Googlebot by Google) to popularize websites through website indexing ... However, the number of malicious bots is increasing too. To regulate the behavior of crawlers, most websites include a file called "robots.txt" that ... However, "robots.txt" only provides a guideline, and almost all malicious robots ignore it. Moreover, since this file is publicly available, malicious
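For readers unfamiliar with the convention, robots.txt is a plain-text policy file served from a site's root; the paths below are hypothetical, and, as the report notes, compliance is entirely voluntary:

    # Hypothetical robots.txt: asks well-behaved crawlers to skip two directories.
    User-agent: *
    Disallow: /private/
    Disallow: /staging/

    # A specific crawler can be addressed by name.
    User-agent: Googlebot
    Disallow: /drafts/

Nothing in the protocol enforces these rules, which is why the report treats robots.txt as a guideline that malicious crawlers simply ignore.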
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2017-01-01
Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors, etc.) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5. PMID:28649160
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-02-13
Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors, etc.) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5.
NASA Astrophysics Data System (ADS)
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-02-01
Archival of experimental data in public databases has increasingly become a requirement for most funding agencies and journals. These data-sharing policies have the potential to maximize data reuse and to enable confirmatory as well as novel studies. However, the lack of standard data formats can severely hinder data reuse. In photon-counting-based single-molecule fluorescence experiments, data is stored in a variety of vendor-specific or even setup-specific (custom) file formats, making data interchange prohibitively laborious unless the same hardware-software combination is used. Moreover, the number of available techniques and setup configurations makes it difficult to find a common standard. To address this problem, we developed Photon-HDF5 (www.photon-hdf5.org), an open data format for timestamp-based single-molecule fluorescence experiments. Building on the solid foundation of HDF5, Photon-HDF5 provides a platform- and language-independent, easy-to-use file format that is self-describing and supports rich metadata. Photon-HDF5 supports different types of measurements by separating raw data (e.g., photon timestamps, detectors, etc.) from measurement metadata. This approach allows representing several measurement types and setup configurations within the same core structure and makes it possible to extend the format in a backward-compatible way. Complementing the format specifications, we provide open source software to create and convert Photon-HDF5 files, together with code examples in multiple languages showing how to read Photon-HDF5 files. Photon-HDF5 allows sharing data in a format suitable for long-term archival, avoiding the effort to document custom binary formats and increasing interoperability with different analysis software. We encourage participation of the single-molecule community to extend interoperability and to help define future versions of Photon-HDF5.
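As a sketch of the raw-data/metadata separation described in the abstracts above, the following Python creates a minimal Photon-HDF5-style file with h5py. The group and field names follow the published layout as recalled here and should be validated against the specification at www.photon-hdf5.org before use:

    # Sketch: minimal Photon-HDF5-style file with raw photon data kept apart from metadata.
    import h5py
    import numpy as np

    with h5py.File("measurement.h5", "w") as f:      # hypothetical file name
        photon_data = f.create_group("photon_data")
        photon_data.create_dataset("timestamps", data=np.array([120, 340, 891], dtype="int64"))
        photon_data.create_dataset("detectors", data=np.array([0, 1, 0], dtype="uint8"))
        # Clock period in seconds; 12.5e-9 corresponds to an 80 MHz clock.
        photon_data.create_dataset("timestamps_specs/timestamps_unit", data=12.5e-9)

        setup = f.create_group("setup")              # measurement metadata lives apart from raw data
        setup.create_dataset("num_spots", data=1)

        identity = f.create_group("identity")
        identity.create_dataset("author", data="A. Researcher")   # hypothetical author metadata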
Petroleum system modeling of the western Canada sedimentary basin - isopach grid files
Higley, Debra K.; Henry, Mitchell E.; Roberts, Laura N.R.
2005-01-01
This publication contains zmap-format grid files of isopach intervals that represent strata associated with Devonian to Holocene petroleum systems of the Western Canada Sedimentary Basin (WCSB) of Alberta, British Columbia, and Saskatchewan, Canada. Also included is one grid file that represents elevations relative to sea level of the top of the Lower Cretaceous Mannville Group. Vertical and lateral scales are in meters. The age range represented by the stratigraphic intervals comprising the grid files is 373 million years ago (Ma) to present day. File names, age ranges, formation intervals, and primary petroleum system elements are listed in table 1. Metadata associated with this publication includes information on the study area and the zmap-format files. The digital files listed in table 1 were compiled as part of the Petroleum Processes Research Project being conducted by the Central Energy Resources Team of the U.S. Geological Survey, which focuses on modeling petroleum generation, migration, and accumulation through time for petroleum systems of the WCSB. Primary purposes of the WCSB study are to:
- Construct the 1-D/2-D/3-D petroleum system models of the WCSB. Actual boundaries of the study area are documented within the metadata; excluded are northern Alberta and eastern Saskatchewan, but fringing areas of the United States are included.
- Publish results of the research and the grid files generated for use in the 3-D model of the WCSB.
- Evaluate the use of petroleum system modeling in assessing undiscovered oil and gas resources for geologic provinces across the world.
DoD Implementation of the National Practitioner Data Bank Guidelines
1998-06-26
clearly demonstrates that Congress wanted all malpractice payments reported, including nuisance claims. The "corporate shield" concept is a loophole ... on the DoD implementation of the National Practitioner Data Bank Guidelines. The Office of the Assistant Secretary of Defense (Health Affairs) said ... "corporate shield". Under this concept, the practitioner's name is deleted from the malpractice claim and the claim is filed against the corporation
Torsion and bending properties of shape memory and superelastic nickel-titanium rotary instruments.
Ninan, Elizabeth; Berzins, David W
2013-01-01
Recently introduced into the market are shape memory nickel-titanium (NiTi) rotary files. The objective of this study was to investigate the torsion and bending properties of shape memory files (CM Wire, HyFlex CM, and Phoenix Flex) and compare them with conventional (ProFile ISO and K3) and M-Wire (GT Series X and ProFile Vortex) NiTi files. Sizes 20, 30, and 40 (n = 12/size/taper) of 0.02 taper CM Wire, Phoenix Flex, K3, and ProFile ISO and 0.04 taper HyFlex CM, ProFile ISO, GT Series X, and Vortex were tested in torsion and bending per ISO 3630-1 guidelines by using a torsiometer. All data were statistically analyzed by analysis of variance and the Tukey-Kramer test (P = .05) to determine any significant differences between the files. Significant interactions were present between the factors of size and file. Variability in maximum torque values was noted among the shape memory file brands, which sometimes exhibited the greatest or least torque depending on brand, size, and taper. In general, the shape memory files showed a high angle of rotation before fracture but were not statistically different from some of the other files. However, the shape memory files were more flexible, as evidenced by significantly lower bending moments (P < .008). Shape memory files show greater flexibility compared with several other NiTi rotary file brands. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
SEDIMENT DATA - ST. PAUL WATERWAY - TACOMA, WA - 1996 MONITORING DATA
Benthic Infauna Monitoring Data Files are Excel-format spreadsheet files which contain data presented in the St. Paul Waterway Area Remedial Action and Habitat Restoration Project, 1996 Monitoring Report. The files can be viewed directly or readily downloaded and read into most ...
Thompson, B Gregory; Brown, Robert D; Amin-Hanjani, Sepideh; Broderick, Joseph P; Cockroft, Kevin M; Connolly, E Sander; Duckwiler, Gary R; Harris, Catherine C; Howard, Virginia J; Johnston, S Claiborne Clay; Meyers, Philip M; Molyneux, Andrew; Ogilvy, Christopher S; Ringer, Andrew J; Torner, James
2015-08-01
The aim of this updated statement is to provide comprehensive and evidence-based recommendations for the management of patients with unruptured intracranial aneurysms. Writing group members used systematic literature reviews covering January 1977 to June 2014. They also reviewed contemporary published evidence-based guidelines, personal files, and published expert opinion to summarize existing evidence, indicate gaps in current knowledge, and, when appropriate, formulate recommendations using standard American Heart Association criteria. The guideline underwent extensive peer review, including review by the Stroke Council Leadership and Stroke Scientific Statement Oversight Committees, before consideration and approval by the American Heart Association Science Advisory and Coordinating Committee. Evidence-based guidelines are presented for the care of patients presenting with unruptured intracranial aneurysms. The guidelines address presentation, natural history, epidemiology, risk factors, screening, diagnosis, imaging, and outcomes from surgical and endovascular treatment. © 2015 American Heart Association, Inc.
Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations
Buscheck, Thomas A.
2012-01-01
Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure and reduce its associated risks, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model index HTML file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.
Developing a radiology-based teaching approach for gross anatomy in the digital era.
Marker, David R; Bansal, Anshuman K; Juluru, Krishna; Magid, Donna
2010-08-01
The purpose of this study was to assess the implementation of a digital anatomy lecture series based largely on annotated, radiographic images and the utility of the Radiological Society of North America-developed Medical Imaging Resource Center (MIRC) for providing an online educational resource. A series of digital teaching images were collected and organized to correspond to lecture and dissection topics. MIRC was used to provide the images in a Web-based educational format for incorporation into anatomy lectures and as a review resource. A survey assessed the impressions of the medical students regarding this educational format. MIRC teaching files were successfully used in our teaching approach. The lectures were interactive with questions to and from the medical student audience regarding the labeled images used in the presentation. Eighty-five of 120 students completed the survey. The majority of students (87%) indicated that the MIRC teaching files were "somewhat useful" to "very useful" when incorporated into the lecture. The students who used the MIRC files were most likely to access the material from home (82%) on an occasional basis (76%). With regard to areas for improvement, 63% of the students reported that they would have benefited from more teaching files, and only 9% of the students indicated that the online files were not user friendly. The combination of electronic radiology resources available in lecture format and on the Internet can provide multiple opportunities for medical students to learn and revisit first-year anatomy. MIRC provides a user-friendly format for presenting radiology education files for medical students. 2010 AUR. Published by Elsevier Inc. All rights reserved.
Parser Combinators: a Practical Application for Generating Parsers for NMR Data
Fenwick, Matthew; Weatherby, Gerard; Ellis, Heidi JC; Gryk, Michael R.
2013-01-01
Nuclear Magnetic Resonance (NMR) spectroscopy is a technique for acquiring protein data at atomic resolution and determining the three-dimensional structure of large protein molecules. A typical structure determination process results in the deposition of large data sets to the BMRB (Bio-Magnetic Resonance Data Bank). These data are stored and shared in a file format called NMR-Star. This format is syntactically and semantically complex, making it challenging to parse. Nevertheless, parsing these files is crucial to applying the vast amounts of biological information stored in NMR-Star files, allowing researchers to harness the results of previous studies to direct and validate future work. One powerful approach for parsing files is to apply a Backus-Naur Form (BNF) grammar, which is a high-level model of a file format. Translation of the grammatical model to an executable parser can be accomplished automatically. This paper shows how we applied a model BNF grammar of the NMR-Star format to create a free, open-source parser, using a method that originated in the functional programming world known as "parser combinators". This paper demonstrates the effectiveness of a principled approach to file specification and parsing. It also builds upon our previous work [1], in that (1) it applies concepts from functional programming (which are relevant even though the implementation language, Java, is more mainstream than functional programming), and (2) all work and accomplishments from this project will be made available under standard open source licenses to provide the community with the opportunity to learn from our techniques and methods. PMID:24352525
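As an illustration of the technique named in the abstract (not the authors' Java implementation), here is a minimal parser-combinator sketch in Python, in which a parser is simply a function from (input, position) to (value, new position), or None on failure; combinators build larger parsers out of smaller ones.

    # Minimal parser combinators: a parser maps (s, i) -> (value, i') or None.
    def char(c):
        def p(s, i):
            return (c, i + 1) if i < len(s) and s[i] == c else None
        return p

    def many(p):  # zero or more repetitions of p
        def q(s, i):
            out = []
            r = p(s, i)
            while r is not None:
                out.append(r[0]); i = r[1]; r = p(s, i)
            return (out, i)
        return q

    def seq(*ps):  # all of ps, in order
        def q(s, i):
            out = []
            for p in ps:
                r = p(s, i)
                if r is None:
                    return None
                out.append(r[0]); i = r[1]
            return (out, i)
        return q

    # Example: zero or more 'a' followed by 'b'.
    parser = seq(many(char("a")), char("b"))
    print(parser("aab", 0))   # ([['a', 'a'], 'b'], 3)

A grammar written this way mirrors its BNF almost clause for clause, which is what makes the approach attractive for a complex format like NMR-Star.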
VizieR Online Data Catalog: Infrared Arcturus Atlas (Hinkle+ 1995)
NASA Astrophysics Data System (ADS)
Hinkle, K.; Wallace, L.; Livingston, W.
1996-01-01
The atlas is contained in 310 spectral files and a list of line identifications, plus a file containing a list of the files and unobserved spectral regions. The spectral file names are in the form 'abnnnnn', where 'nnnnn' denotes the spectral region; e.g., file 'ab4300' contains spectra for the 4300-4325 cm-1 range. The atomic and molecular line identifications are in files 'appendix.a' and 'appendix.b', and repeated with a uniform format in file 'lines'. The file 'appendix.c' is a book-keeping device used to correlate the plot pages and spectral files with frequency. See the author-supplied description in 'readme.dat' for more information. (311 data files).
Strategies for Sharing Seismic Data Among Multiple Computer Platforms
NASA Astrophysics Data System (ADS)
Baker, L. M.; Fletcher, J. B.
2001-12-01
Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
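As a small illustration of the byte-ordering issue mentioned above, the sketch below reads binary waveform samples with an explicitly declared byte order rather than relying on the native order of the reading machine; the file name and sample layout are hypothetical.

    import numpy as np

    # Declare the byte order of the on-disk samples explicitly (here:
    # big-endian 32-bit IEEE floats), then convert to native order.
    samples = np.fromfile("waveform.bin", dtype=np.dtype(">f4"))
    samples = samples.astype("=f4")  # native order for analysis
    print(samples[:10])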
Geologic map of the Valjean Hills 7.5' quadrangle, San Bernardino County, California
Calzia, J.P.; Troxel, Bennie W.; digital database by Raumann, Christian G.
2003-01-01
FGDC-compliant metadata for the ARC/INFO coverages. The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Geologic Investigations Series (I-series) maps but has not been edited to comply with I-map standards. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an Open-File Report and includes the standard USGS Open-File disclaimer, the report closely adheres to the stratigraphic nomenclature of the U.S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3 above) or plotting the postscript file (2 above).
PDB Editor: a user-friendly Java-based Protein Data Bank file editor with a GUI.
Lee, Jonas; Kim, Sung Hou
2009-04-01
The Protein Data Bank file format is the format most widely used by protein crystallographers and biologists to disseminate and manipulate protein structures. Despite this, there are few user-friendly software packages available to efficiently edit and extract raw information from PDB files. This limitation often leads many protein crystallographers to waste significant time manually editing PDB files. PDB Editor, written in Java with a Swing GUI, allows the user to selectively search, select, extract, and edit information in parallel. Furthermore, the program is a stand-alone application written in Java, which frees users from the hassles associated with platform/operating system-dependent installation and usage. PDB Editor can be downloaded from http://sourceforge.net/projects/pdbeditorjl/.
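For context, the raw information in question sits in fixed-column ATOM/HETATM records; the generic Python sketch below (not PDB Editor's own code) pulls out atom names and coordinates using the column positions of the PDB format description. The file name is hypothetical.

    # Extract atom name, residue name, and x/y/z (Angstroms) from the
    # fixed-column ATOM/HETATM records of a PDB file.
    def read_atoms(path):
        atoms = []
        with open(path) as f:
            for line in f:
                if line.startswith(("ATOM", "HETATM")):
                    atoms.append((
                        line[12:16].strip(),   # atom name
                        line[17:20].strip(),   # residue name
                        float(line[30:38]),    # x
                        float(line[38:46]),    # y
                        float(line[46:54]),    # z
                    ))
        return atoms

    print(read_atoms("1abc.pdb")[:3])  # hypothetical file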
Cánovas, Rodrigo; Moffat, Alistair; Turpin, Andrew
2016-12-15
Next generation sequencing machines produce vast amounts of genomic data. For the data to be useful, it is essential that they can be stored and manipulated efficiently. This work responds to the combined challenge of compressing genomic data while providing fast access to regions of interest, without necessitating decompression of whole files. We describe CSAM (Compressed SAM format), a compression approach offering lossless and lossy compression for SAM files. The structures and techniques proposed are suitable for representing SAM files, as well as supporting fast access to the compressed information. They generate more compact lossless representations than BAM, which is currently the preferred lossless compressed SAM-equivalent format, and are self-contained, that is, they do not depend on any external resources to compress or decompress SAM files. An implementation is available at https://github.com/rcanovas/libCSAM. Contact: canovas-ba@lirmm.fr. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Software for Automated Reading of STEP Files by I-DEAS(trademark)
NASA Technical Reports Server (NTRS)
Pinedo, John
2003-01-01
A program called "readstep" enables the I-DEAS(tm) computer-aided-design (CAD) software to automatically read Standard for the Exchange of Product Model Data (STEP) files. (The STEP format is one of several used to transfer data between dissimilar CAD programs.) Prior to the development of "readstep," it was necessary to read STEP files into I-DEAS(tm) one at a time in a slow process that required repeated intervention by the user. In operation, "readstep" prompts the user for the location of the desired STEP files and the names of the I-DEAS(tm) project and model file, then generates an I-DEAS(tm) program file called "readstep.prg" and two Unix shell programs called "runner" and "controller." The program "runner" runs I-DEAS(tm) sessions that execute readstep.prg, while "controller" controls the execution of "runner" and edits readstep.prg if necessary. The user sets "runner" and "controller" into execution simultaneously, and then no further intervention by the user is required. When "runner" has finished, the user should see only parts from successfully read STEP files present in the model file. STEP files that could not be read successfully (e.g., because of format errors) should be regenerated before attempting to read them again.
Shuttle Data Center File-Processing Tool in Java
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Miller, Walter H.
2006-01-01
A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples according to a chosen output format, then transforms the results into that format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.
Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)
Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina
2009-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.
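A rough Python sketch of the kind of spatial/temporal subsetting and CSV export the tool performs, using the netCDF4 library; the file, variable, and dimension names (and a time-lat-lon layout) are assumptions for illustration.

    import numpy as np
    from netCDF4 import Dataset  # pip install netCDF4

    src = Dataset("everglades.nc")             # hypothetical grid-based file
    lat = src.variables["lat"][:]
    keep_lat = (lat >= 25.0) & (lat <= 26.0)   # spatial filter
    stage = src.variables["stage"][0:30, keep_lat, :]  # first 30 time steps
    np.savetxt("subset.csv", np.ma.filled(stage, np.nan).reshape(30, -1),
               delimiter=",")                  # comma-separated export
    src.close()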
HDF4 Maps: For Now and For the Future
NASA Astrophysics Data System (ADS)
Plutchak, J.; Aydt, R.; Folk, M. J.
2013-12-01
Data formats and access tools necessarily change as technology improves to address emerging requirements with new capabilities. This on-going process inevitably leaves behind significant data collections in legacy formats that are difficult to support and sustain. NASA ESDIS and The HDF Group currently face this problem with large and growing archives of data in HDF4, an older version of the HDF format. Indefinitely guaranteeing the ability to read these data with multi-platform libraries in many languages is very difficult. As an alternative, HDF and NASA worked together to create maps of the files that contain metadata and information about data types, locations, and sizes of data objects in the files. These maps are written in XML and have successfully been used to access and understand data in HDF4 files without the HDF libraries. While originally developed to support sustainable access to these data, these maps can also be used to provide access to HDF4 metadata, facilitate user understanding of files prior to download, and validate the files for compliance with particular conventions. These capabilities are now available as a service for HDF4 archives and users.
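The core idea can be sketched in a few lines: once a map records each object's byte offset, length, and type, the data can be recovered with no HDF4 library at all. In the Python sketch below, the XML element and attribute names are simplified inventions, not the actual HDF4 map schema.

    import xml.etree.ElementTree as ET
    import numpy as np

    # Hypothetical, simplified map: <dataset name="..." offset="..." nBytes="..."/>
    root = ET.parse("granule.hdf.xml").getroot()
    node = root.find(".//dataset[@name='Temperature']")
    offset, nbytes = int(node.get("offset")), int(node.get("nBytes"))

    with open("granule.hdf", "rb") as f:
        f.seek(offset)                                     # jump straight to the object
        data = np.frombuffer(f.read(nbytes), dtype=">i2")  # type recorded in the map
    print(data[:5])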
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
...). Those NITUs permitted railbanking/interim trail use negotiations under the Trails Act, 16 U.S.C. 1247(d... November 19, 2010. ADDRESSES: Comments may be submitted either via the Board's e-filing format or in the traditional paper format. Any person using e-filing should attach a document and otherwise comply with the...
Kataoka, Satoshi; Ohe, Kazuhiko; Mochizuki, Mayumi; Ueda, Shiro
2002-01-01
We have developed an adverse drug reaction (ADR) reporting system integrated with the Hospital Information System (HIS) of the University of Tokyo Hospital. Since this system is written in Java, it is portable, without recompiling, to any operating system on which a Java virtual machine runs. In this system, we implemented an automatic data-filling function using XML-based (Extensible Markup Language) files generated by the HIS. This new feature should decrease the time needed for physicians and pharmacists to fill in spontaneous ADR reports. By clicking a button, the report is sent to the text database through Simple Mail Transfer Protocol (SMTP) electronic mail. The destination of the report mail can be changed arbitrarily by administrators, which gives the system more flexibility in practical operation. Although we tried our best to use the SGML-based (Standard Generalized Markup Language) ICH M2 guideline to follow the global standard for case reports, we eventually adopted XML as the output report format, because we found some problems in handling two-byte characters with the ICH guideline and because XML has many useful features. According to our pilot survey conducted at the University of Tokyo Hospital, many physicians answered that our idea of integrating the ADR reporting system with the HIS would increase the number of ADR reports.
10 CFR 436.102 - General operations plan format and content.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 3 2010-01-01 2010-01-01 false General operations plan format and content. 436.102... PROGRAMS Guidelines for General Operations Plans § 436.102 General operations plan format and content. (a... effective date of these guidelines, a general operations 10-year plan which shall consist of two parts, an...
FastStats: Obstetrical Procedures
HepML, an XML-based format for describing simulated data in high energy physics
NASA Astrophysics Data System (ADS)
Belov, S.; Dudko, L.; Kekelidze, D.; Sherstnev, A.
2010-10-01
In this paper we describe the HepML format and a corresponding C++ library developed for keeping a complete description of parton level events in a unified and flexible form. HepML tags contain enough information to understand what kind of physics the simulated events describe and how the events have been prepared. A HepML block can be included into event files in the LHEF format. The structure of the HepML block is described by means of several XML Schemas. The Schemas define the necessary information for the HepML block and how this information should be located within the block. The library libhepml is a C++ library intended for parsing and serialization of HepML tags, and for representing the HepML block in computer memory. The library is an API for external software. For example, Matrix Element Monte Carlo event generators can use the library for preparing and writing a header of an LHEF file in the form of HepML tags. In turn, Showering and Hadronization event generators can parse the HepML header and get the information in the form of C++ classes. libhepml can be used in C++, C, and Fortran programs. All necessary parts of HepML have been prepared and we present the project to the HEP community.
Program summary
Program title: libhepml
Catalogue identifier: AEGL_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGL_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPLv3
No. of lines in distributed program, including test data, etc.: 138 866
No. of bytes in distributed program, including test data, etc.: 613 122
Distribution format: tar.gz
Programming language: C++, C
Computer: PCs and workstations
Operating system: Scientific Linux CERN 4/5, Ubuntu 9.10
RAM: 1 073 741 824 bytes (1 GB)
Classification: 6.2, 11.1, 11.2
External routines: Xerces XML library (http://xerces.apache.org/xerces-c/), Expat XML Parser (http://expat.sourceforge.net/)
Nature of problem: Monte Carlo simulation in high energy physics is divided into several stages, and various programs exist for these stages. In this article we are interested in interfacing different Monte Carlo event generators via data files, in particular, Matrix Element (ME) generators and Showering and Hadronization (SH) generators. There is a widely accepted format for data files for such interfaces - the Les Houches Event Format (LHEF). Although the information kept in an LHEF file is enough for the proper working of SH generators, it is insufficient for understanding how the events in the LHEF file have been prepared and which physical model has been applied. In this paper we propose an extension of the format for keeping additional information available in generators. We propose to add a new information block, marked up with XML tags, to the LHEF file. This block describes the events in the file in more detail. In particular, it stores information about the physical model, kinematical cuts, the generator, etc. This helps to make LHEF files self-documenting. Certainly, HepML can be applied in a more general context, not in LHEF files only.
Solution method: In order to overcome drawbacks of the original LHEF accord we propose to add a new information block of HepML tags. HepML is an XML-based markup language. We designed several XML Schemas for all tags in the language; any HepML document should follow the rules of the Schemas. The language is equipped with a library for operating on HepML tags and documents. This C++ library, called libhepml, consists of classes for HepML objects, which represent a HepML document in computer memory, parsing classes, serializing classes, and some auxiliary classes.
Restrictions: The software is adapted for solving the problems described in the article. There are no additional restrictions.
Running time: Tests have been done on a computer with an Intel(R) Core(TM)2 Solo, 1.4 GHz. Parsing of a HepML file: 6 ms (size of the HepML file is 12.5 Kb). Writing of a HepML block to file: 14 ms (file size 12.5 Kb). Merging of two HepML blocks and writing to file: 18 ms (file size 25.0 Kb).
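To illustrate the general approach (with made-up tag names, since the real elements are defined by the HepML XML Schemas), the Python sketch below shows an XML information block inside an LHEF-style header and how a downstream tool might parse it.

    import xml.etree.ElementTree as ET

    # LHEF-style file with a self-describing XML block in the header.
    # The <hepml> child tags are hypothetical stand-ins for the real schema.
    lhef = """<LesHouchesEvents version="1.0">
    <header>
      <hepml>
        <generator name="SomeMEGenerator" version="2.0"/>
        <cut variable="pT_jet" min="20.0" unit="GeV"/>
      </hepml>
    </header>
    </LesHouchesEvents>"""

    root = ET.fromstring(lhef)
    gen = root.find("./header/hepml/generator")
    print(gen.get("name"), gen.get("version"))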
BOREAS Forest Cover Data Layers over the SSA-MSA in Raster Format
NASA Technical Reports Server (NTRS)
Nickeson, Jaime; Gruszka, F; Hall, F.
2000-01-01
This data set, originally provided as vector polygons with attributes, has been processed by BORIS staff to provide raster files that can be used for modeling or for comparison purposes. The original data were received as ARC/INFO coverages or as export files from SERM. The data include information on forest parameters for the BOREAS SSA-MSA. Most of the data used for this product were acquired by BORIS in 1993; the maps were produced from aerial photography taken as recently as 1988. The data are stored in binary image-format files.
Development of Software to Model AXAF-I Image Quality
NASA Technical Reports Server (NTRS)
Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian
1997-01-01
This report describes work conducted on Delivery Order 181 from October 1996 through June 1997. During this period software was written to: compute axial PSDs from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSDs from HDOS "Big 8" axial scans; plot PSDs from FITS-format PSD files; plot band-limited RMS vs. axial and azimuthal position for multiple PSD files; combine and organize PSDs from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS-formatted PSD files; evaluate AXAF-I test results; and improve and expand the capabilities of the GT x-ray mirror analysis package. During this period work also began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.
UNICON: A Powerful and Easy-to-Use Compound Library Converter.
Sommer, Kai; Friedrich, Nils-Ole; Bietz, Stefan; Hilbig, Matthias; Inhester, Therese; Rarey, Matthias
2016-06-27
The accurate handling of different chemical file formats and the consistent conversion between them play important roles for calculations in complex cheminformatics workflows. Working with different cheminformatic tools often makes the conversion between file formats a mandatory step. Such a conversion might become a difficult task in cases where the information content substantially differs. This paper describes UNICON, an easy-to-use software tool for this task. The functionality of UNICON ranges from file conversion between standard formats SDF, MOL2, SMILES, PDB, and PDBx/mmCIF via the generation of 2D structure coordinates and 3D structures to the enumeration of tautomeric forms, protonation states, and conformer ensembles. For this purpose, UNICON bundles the key elements of the previously described NAOMI library in a single, easy-to-use command line tool.
The Open Microscopy Environment: open image informatics for the biological sciences
NASA Astrophysics Data System (ADS)
Blackburn, Colin; Allan, Chris; Besson, Sébastien; Burel, Jean-Marie; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gault, David; Gillen, Kenneth; Leigh, Roger; Leo, Simone; Li, Simon; Lindner, Dominik; Linkert, Melissa; Moore, Josh; Moore, William J.; Ramalingam, Balaji; Rozbicki, Emil; Rustici, Gabriella; Tarkowska, Aleksandra; Walczysko, Petr; Williams, Eleanor; Swedlow, Jason R.
2016-07-01
Despite significant advances in biological imaging and analysis, major informatics challenges remain unsolved: file formats are proprietary, storage and analysis facilities are lacking, as are standards for sharing image data and results. While the open FITS file format is ubiquitous in astronomy, astronomical imaging shares many challenges with biological imaging, including the need to share large image sets using secure, cross-platform APIs, and the need for scalable applications for processing and visualization. The Open Microscopy Environment (OME) is an open-source software framework developed to address these challenges. OME tools include: an open data model for multidimensional imaging (OME Data Model); an open file format (OME-TIFF) and library (Bio-Formats) enabling free access to images (5D+) written in more than 145 formats from many imaging domains, including FITS; and a data management server (OMERO). The Java-based OMERO client-server platform comprises an image metadata store, an image repository, visualization and analysis by remote access, allowing sharing and publishing of image data. OMERO provides a means to manage the data through a multi-platform API. OMERO's model-based architecture has enabled its extension into a range of imaging domains, including light and electron microscopy, high content screening, digital pathology and recently into applications using non-image data from clinical and genomic studies. This is made possible using the Bio-Formats library. The current release includes a single mechanism for accessing image data of all types, regardless of original file format, via Java, C/C++ and Python and a variety of applications and environments (e.g. ImageJ, Matlab and R).
Java Library for Input and Output of Image Data and Metadata
NASA Technical Reports Server (NTRS)
Deen, Robert; Levoe, Steven
2003-01-01
A Java-language library supports input and output (I/O) of image data and metadata (label data) in the format of the Video Image Communication and Retrieval (VICAR) image-processing software and in several similar formats, including a subset of the Planetary Data System (PDS) image file format. The library does the following: It provides a low-level direct-access layer, enabling an application subprogram to read and write specific image files, lines, or pixels, and manipulate metadata directly. Two coding/decoding subprograms ("codecs" for short) based on the Java Advanced Imaging (JAI) software provide access to VICAR and PDS images in a file-format-independent manner. The VICAR and PDS codecs enable any program that conforms to the specification of the JAI codec to use VICAR or PDS images automatically, without specific knowledge of the VICAR or PDS format. The library also includes Image I/O plug-in subprograms for VICAR and PDS formats. Application programs that conform to the Image I/O specification of Java version 1.4 can utilize any image format for which such a plug-in subprogram exists, without specific knowledge of the format itself. Like the aforementioned codecs, the VICAR and PDS Image I/O plug-in subprograms support reading and writing of metadata.
Vandvik, Per Olav; Alonso-Coello, Pablo; Akl, Elie A; Thornton, Judith; Rigau, David; Adams, Katie; O'Connor, Paul; Guyatt, Gordon; Kristiansen, Annette
2017-01-01
Objectives: To investigate practicing physicians' preferences, perceived usefulness, and understanding of a new multilayered guideline presentation format, compared with a standard format, as well as conceptual understanding of trustworthy guideline concepts.
Design: Participants attended a standardised lecture in which they were presented with a clinical scenario and randomised to view a guideline recommendation in a multilayered format or a standard format, after which they answered multiple-choice questions using clickers. Both groups were also presented with and asked about guideline concepts.
Setting: Mandatory educational lectures in 7 non-academic and academic hospitals, and 2 settings involving primary care, in Lebanon, Norway, Spain and the UK.
Participants: 181 practicing physicians in internal medicine (156) and general practice (25).
Interventions: A new digitally structured, multilayered guideline presentation format and a standard narrative presentation format currently in widespread use.
Primary and secondary outcome measures: Our primary outcome was preference for presentation format. Understanding, perceived usefulness and perception of absolute effects were secondary outcomes.
Results: 72% (95% CI 65 to 79) of participants preferred the multilayered format and 16% (95% CI 10 to 22) preferred the standard format. A majority agreed that recommendations (multilayered 86% vs standard 91%, p value=0.31) and evidence summaries (79% vs 77%, p value=0.76) were useful in the context of the clinical scenario. 72% of participants randomised to the multilayered format vs 58% for the standard format reported correct understanding of the recommendations (p value=0.06). Most participants elected an appropriate clinical action after viewing the recommendations (98% vs 92%, p value=0.10). 82% of the participants considered absolute effect estimates in evidence summaries helpful or crucial.
Conclusions: Clinicians clearly preferred a novel multilayered presentation format to the standard format. Whether the preferred format improves decision-making and has an impact on patient-important outcomes merits further investigation. PMID:28188149
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasche, George P.
2009-10-01
Cambio is an application intended to automatically read and display any spectrum file of any format in the world that the nuclear emergency response community might encounter. Cambio also provides an analysis capability suitable for HPGe spectra when detector response and scattering environment are not well known. Why is Cambio needed: (1) Cambio solves the following problem - With over 50 types of formats from instruments used in the field and new format variations appearing frequently, it is impractical for every responder to have current versions of the manufacturer's software from every instrument used in the field; (2) Cambio converts field spectra to any one of several common formats that are used for analysis, saving valuable time in an emergency situation; (3) Cambio provides basic tools for comparing spectra, calibrating spectra, and isotope identification with analysis suited especially for HPGe spectra; and (4) Cambio has a batch processing capability to automatically translate a large number of archival spectral files of any format to one of several common formats, such as the IAEA SPE or the DHS N42. Currently over 540 analysts and members of the nuclear emergency response community worldwide are on the distribution list for updates to Cambio. Cambio users come from all levels of government, university, and commercial partners around the world that support efforts to counter terrorist nuclear activities. Cambio is Unclassified Unlimited Release (UUR) and distributed by internet downloads with email notifications whenever a new build of Cambio provides for new formats, bug fixes, or new or improved capabilities. Cambio is also provided as a DLL to the Karlsruhe Institute for Transuranium Elements so that Cambio's automatic file-reading capability can be included at the Nucleonica web site.
C++ Coding Standards and Style Guide
NASA Technical Reports Server (NTRS)
Hughes, Steven; Jun, Linda; Shoan, Wendy
2005-01-01
This document is based on the "C Style Guide" (SEL-94-003). It contains recommendations for C++ implementations that build on, or in some cases replace, the style described in the C style guide. Style guidelines on any topics that are not covered in this document can be found in the "C Style Guide." An attempt has been made to indicate when these recommendations are just guidelines or suggestions versus when they are more strongly encouraged. Using coding standards makes code easier to read and maintain. General principles that maximize the readability and maintainability of C++ are: (1) Organize classes using encapsulation and information hiding techniques. (2) Enhance readability through the use of indentation and blank lines. (3) Add comments to header files to help users of classes. (4) Add comments to implementation files to help maintainers of classes. (5) Create names that are meaningful and readable.
75 FR 41093 - FM Table of Allotments, Maupin, Oregon
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
.... SUMMARY: The Audio Division grants the Petition for Reconsideration filed on behalf of Maupin Broadcasting... materials in accessible formats for people with disabilities (Braille, large print, electronic files, audio.... John A. Karousos, Assistant Chief, Audio Division, Media Bureau. [FR Doc. 2010-17226 Filed 7-14-10; 8...
A SARA Timeseries Utility supports analysis and management of time-varying environmental data including listing, graphing, computing statistics, computing meteorological data and saving in a WDM or text file. File formats supported include WDM, HSPF Binary (.hbn), USGS RDB, and T...
IDG - INTERACTIVE DIF GENERATOR
NASA Technical Reports Server (NTRS)
Preheim, L. E.
1994-01-01
The Interactive DIF Generator (IDG) utility is a tool used to generate and manipulate Directory Interchange Format files (DIF). Its purpose as a specialized text editor is to create and update DIF files which can be sent to NASA's Master Directory, also referred to as the International Global Change Directory at Goddard. Many government and university data systems use the Master Directory to advertise the availability of research data. The IDG interface consists of a set of four windows: (1) the IDG main window; (2) a text editing window; (3) a text formatting and validation window; and (4) a file viewing window. The IDG main window starts up the other windows and contains a list of valid keywords. The keywords are loaded from a user-designated file and selected keywords can be copied into any active editing window. Once activated, the editing window designates the file to be edited. Upon switching from the editing window to the formatting and validation window, the user has options for making simple changes to one or more files such as inserting tabs, aligning fields, and indenting groups. The viewing window is a scrollable read-only window that allows fast viewing of any text file. IDG is an interactive tool and requires a mouse or a trackball to operate. IDG uses the X Window System to build and manage its interactive forms, and also uses the Motif widget set and runs under Sun UNIX. IDG is written in C-language for Sun computers running SunOS. This package requires the X Window System, Version 11 Revision 4, with OSF/Motif 1.1. IDG requires 1.8Mb of hard disk space. The standard distribution medium for IDG is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. The program was developed in 1991 and is a copyrighted work with all copyright vested in NASA. SunOS is a trademark of Sun Microsystems, Inc. X Window System is a trademark of Massachusetts Institute of Technology. OSF/Motif is a trademark of the Open Software Foundation, Inc. UNIX is a trademark of Bell Laboratories.
Development of a novel, multilayered presentation format for clinical practice guidelines.
Kristiansen, Annette; Brandt, Linn; Alonso-Coello, Pablo; Agoritsas, Thomas; Akl, Elie A; Conboy, Tara; Elbarbary, Mahmoud; Ferwana, Mazen; Medani, Wedad; Murad, Mohammad Hassan; Rigau, David; Rosenbaum, Sarah; Spencer, Frederick A; Treweek, Shaun; Guyatt, Gordon; Vandvik, Per Olav
2015-03-01
Bridging the gap between clinical research and everyday health-care practice requires effective communication strategies. To address current shortcomings in conveying practice recommendations and supporting evidence, we are creating and testing presentation formats for clinical practice guidelines (CPGs). We carried out multiple cycles of brainstorming and sketching, developing a prototype. Physicians participating in the user testing viewed CPG formats linked to clinical scenarios and engaged in semistructured interviews applying a think-aloud method for exploring important aspects of user experience. We developed a multilayered presentation format that allows clinicians to successively view more in-depth information. Starting with the recommendations, clinicians can, on demand, access a rationale and a key information section containing statements on quality of the evidence, balance between desirable and undesirable consequences, values and preferences, and resource considerations. We collected feedback from 27 stakeholders and performed user testing with 47 practicing physicians from six countries. Advisory group feedback and user testing of the first version revealed problems with conceptual understanding of underlying CPG methodology, as well as difficulties with the complexity of the layout and content. Extensive revisions made before the second round of user testing resulted in most participants expressing overall satisfaction with the final presentation format. We have developed an electronic, multilayered, CPG format that enhances the usability of CPGs for frontline clinicians. We have implemented the format in electronic guideline tools that guideline organizations can now use when authoring and publishing their guidelines.
75 FR 35700 - Revisions to Forms, Statements, and Reporting Requirements for Natural Gas Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-23
... filed in native applications or print-to-PDF format and not in a scanned format. Mail/Hand Delivery... also propose to revise page 520 accordingly. \\1\\ American Gas Association v. FERC, 593 F.3d 14 (D.C....\\14\\ \\14\\ 593 F.3d at 21. 8. Following the court's remand, AGA filed a motion requesting that the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, William
2015-10-19
Cambio opens data files from common gamma radiation detectors, displays a visual representation of the data, and allows the user to edit the metadata, as well as convert the data to a different file format.
Preliminary surficial geologic map database of the Amboy 30 x 60 minute quadrangle, California
Bedford, David R.; Miller, David M.; Phelps, Geoffrey A.
2006-01-01
The surficial geologic map database of the Amboy 30x60 minute quadrangle presents characteristics of surficial materials for an area of approximately 5,000 km2 in the eastern Mojave Desert of California. This map consists of new surficial mapping conducted between 2000 and 2005, as well as compilations of previous surficial mapping. Surficial geology units are mapped and described based on depositional process and age categories that reflect the mode of deposition, pedogenic effects occurring post-deposition, and, where appropriate, the lithologic nature of the material. The physical properties recorded in the database focus on those that drive hydrologic, biologic, and physical processes, such as particle size distribution (PSD) and bulk density. This version of the database is distributed with point data representing locations of samples for both laboratory-determined physical properties and semi-quantitative field-based information. Future publications will include the field and laboratory data as well as maps of distributed physical properties across the landscape tied to physical process models where appropriate. The database is distributed in three parts: documentation, spatial map-based data, and printable map graphics of the database. Documentation includes this file, which provides a discussion of the surficial geology and describes the format and content of the map data; a database 'readme' file, which describes the database contents; and FGDC metadata for the spatial map information. Spatial data are distributed as ARC/INFO coverages in ESRI interchange (e00) format, or as tabular data in DBF3 (.DBF) format. Map graphics files are distributed as PostScript and Adobe Portable Document Format (PDF) files, and are appropriate for representing a view of the spatial database at the mapped scale.
Rosetta: Ensuring the Preservation and Usability of ASCII-based Data into the Future
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Arms, S. C.
2015-12-01
Field data obtained from dataloggers often take the form of comma-separated value (CSV) ASCII text files. While ASCII-based data formats have positive aspects, such as the ease of accessing the data from disk and the wide variety of tools available for data analysis, there are some drawbacks, especially when viewing the situation through the lens of data interoperability and stewardship. The Unidata data translation tool, Rosetta, is a web-based service that provides an easy, wizard-based interface for data collectors to transform their datalogger-generated ASCII output into Climate and Forecast (CF)-compliant netCDF files following the CF-1.6 discrete sampling geometries. These files are complete with metadata describing what data are contained in the file, the instruments used to collect the data, and other critical information that otherwise may be lost in one of many README files. The choice of the machine-readable netCDF data format and data model, coupled with the CF conventions, ensures long-term preservation and interoperability, and that future users will have enough information to responsibly use the data. However, with the understanding that the observational community appreciates the ease of use of ASCII files, methods for transforming the netCDF back into a CSV or spreadsheet format are also built in. One benefit of translating ASCII data into a machine-readable format that follows open community-driven standards is that they are instantly able to take advantage of data services provided by the many open-source data server tools, such as the THREDDS Data Server (TDS). While Rosetta is currently a stand-alone service, this talk will also highlight efforts to couple Rosetta with the TDS, thus allowing self-publishing of thoroughly documented datasets by the data producers themselves.
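A rough Python sketch of the translation Rosetta automates, for a hypothetical two-column datalogger CSV (seconds since epoch, air temperature), using the netCDF4 library:

    import csv
    from netCDF4 import Dataset  # pip install netCDF4

    rows = list(csv.reader(open("logger.csv")))[1:]   # skip the header row
    times = [float(r[0]) for r in rows]
    temps = [float(r[1]) for r in rows]

    nc = Dataset("logger.nc", "w")
    nc.Conventions = "CF-1.6"                         # declare the convention
    nc.createDimension("time", len(times))
    t = nc.createVariable("time", "f8", ("time",))
    t.units = "seconds since 1970-01-01 00:00:00"     # CF-style time units
    v = nc.createVariable("air_temperature", "f4", ("time",))
    v.units = "degC"
    v.standard_name = "air_temperature"               # CF standard name
    t[:], v[:] = times, temps
    nc.close()

The payoff is the one described above: once the metadata live inside the file, any netCDF/CF-aware tool or data server can interpret the dataset without a README.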
Römpp, Andreas; Schramm, Thorsten; Hester, Alfons; Klinkert, Ivo; Both, Jean-Pierre; Heeren, Ron M A; Stöckli, Markus; Spengler, Bernhard
2011-01-01
Imaging mass spectrometry is the method of scanning a sample of interest and generating an "image" of the intensity distribution of a specific analyte. The data sets consist of a large number of mass spectra which are usually acquired with identical settings. Existing data formats are not sufficient to describe an MS imaging experiment completely. The data format imzML was developed to allow the flexible and efficient exchange of MS imaging data between different instruments and data analysis software. For this purpose, the MS imaging data is divided into two separate files. The mass spectral data is stored in a binary file to ensure efficient storage. All metadata (e.g., instrumental parameters, sample details) are stored in an XML file which is based on the standard data format mzML developed by HUPO-PSI. The original mzML controlled vocabulary was extended to include specific parameters of imaging mass spectrometry (such as x/y position and spatial resolution). The two files (XML and binary) are connected by offset values in the XML file and are unambiguously linked by a universally unique identifier. The resulting datasets are comparable in size to the raw data, and the separate metadata file allows flexible handling of large datasets. Several imaging MS software tools already support imzML. This allows choosing from a (growing) number of processing tools. One is no longer limited to proprietary software, but is able to use the processing software that is best suited for a specific question or application. On the other hand, measurements from different instruments can be compared within one software application using identical settings for data processing. All necessary information for evaluating and implementing imzML can be found at http://www.imzML.org.
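As a usage illustration, the sketch below reads an imzML dataset in Python with the community pyimzML parser (one of the tools supporting the format); the file name is hypothetical and the API, quoted from memory, should be checked against the pyimzml documentation.

    from pyimzml.ImzMLParser import ImzMLParser  # pip install pyimzml

    p = ImzMLParser("tissue_section.imzML")    # also opens the paired binary (.ibd) file
    for idx, (x, y, z) in enumerate(p.coordinates[:5]):
        mzs, intensities = p.getspectrum(idx)  # one mass spectrum per pixel
        print(f"pixel ({x}, {y}): {len(mzs)} m/z values")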
Genotype harmonizer: automatic strand alignment and format conversion for genotype data integration.
Deelen, Patrick; Bonder, Marc Jan; van der Velde, K Joeri; Westra, Harm-Jan; Winder, Erwin; Hendriksen, Dennis; Franke, Lude; Swertz, Morris A
2014-12-11
To gain statistical power or to allow fine mapping, researchers typically want to pool data before meta-analyses or genotype imputation. However, the necessary harmonization of genetic datasets is currently error-prone because of many different file formats and lack of clarity about which genomic strand is used as reference. Genotype Harmonizer (GH) is a command-line tool to harmonize genetic datasets by automatically solving issues concerning genomic strand and file format. GH solves the unknown strand issue by aligning ambiguous A/T and G/C SNPs to a specified reference, using linkage disequilibrium patterns without prior knowledge of the used strands. GH supports many common GWAS/NGS genotype formats including PLINK, binary PLINK, VCF, SHAPEIT2 & Oxford GEN. GH is implemented in Java and a large part of the functionality can also be used as Java 'Genotype-IO' API. All software is open source under license LGPLv3 and available from http://www.molgenis.org/systemsgenetics. GH can be used to harmonize genetic datasets across different file formats and can be easily integrated as a step in routine meta-analysis and imputation pipelines.
BOREAS Elevation Contours over the NSA and SSA in ARC/INFO Generate Format
NASA Technical Reports Server (NTRS)
Knapp, David; Nickeson, Jaime; Hall, Forrest G. (Editor)
2000-01-01
This data set was prepared by BORIS staff by reformatting the original data into the ARC/INFO Generate format. The original data were received in SIF at a scale of 1:50,000. BORIS staff could not find a format document or commercial software for reading SIF; the BOREAS HYD-08 team provided some C source code that could read some of the SIF files. The data cover the BOREAS NSA and SSA. The original data were compiled from information available in the 1970s and 1980s. The data are available in ARC/INFO Generate format files.
NASA Technical Reports Server (NTRS)
Guenther, Bruce W.; Godden, Gerald D.; Xiong, Xiao-Xiong; Knight, Edward J.; Qiu, Shi-Yue; Montgomery, Harry; Hopkins, M. M.; Khayat, Mohammad G.; Hao, Zhi-Dong; Smith, David E. (Technical Monitor)
2000-01-01
The Moderate Resolution Imaging Spectroradiometer (MODIS) radiometric calibration product is described for the thermal emissive and the reflective solar bands. Specific sensor design characteristics are identified to assist in understanding how the calibration algorithm software product is designed. Both the radiance and the reflectance factor software products for the reflective solar bands are described. The product file format is summarized, and the MODIS Characterization Support Team (MCST) Homepage location for the current file format is provided.
VizieR Online Data Catalog: Sgr B2(N) and Sgr B2(M) IRAM 30m line survey (Belloche+, 2013)
NASA Astrophysics Data System (ADS)
Belloche, A.; Mueller, H. S. P.; Menten, K. M.; Schilke, P.; Comito, C.
2013-08-01
The lists of line identifications corresponding to the blue labels in Figs. 2 to 7, where the labels are often too crowded to be easily readable, are available in ASCII format. The lists are split into six files, three for Sgr B2(N) and three for Sgr B2(M). For each source, there is one file per atmospheric window (3, 2, and 1 mm). Each file is ordered by increasing frequency. The observed and synthetic spectra of Sgr B2(N) and Sgr B2(M) between 80 and 116 GHz are available in both ASCII and FITS formats. The synthetic spectra were resampled to the same frequency channels as the observed spectra. The blanking value is -1000 K for the ASCII files. There is one ASCII file per source. There are two FITS files per source, one for the observed spectrum and one for the synthetic spectrum. The intensities are on the main-beam temperature scale in K. The blanking value is 42.75234 K for the observed spectrum of Sgr B2(N) and 53.96533 K for the observed spectrum of Sgr B2(M). (9 data files).
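For illustration, a short Python sketch of inspecting one of the FITS spectrum files with astropy; the file name and HDU layout are assumptions, so list the HDUs before relying on them.

    from astropy.io import fits  # pip install astropy

    with fits.open("sgrb2n_observed.fits") as hdul:  # hypothetical file name
        hdul.info()                   # list the HDUs actually present
        spectrum = hdul[0].data       # main-beam temperature (K)
        header = hdul[0].header
        print(header.get("BUNIT"), spectrum.shape)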
ERIC Educational Resources Information Center
American School & University, 1998
1998-01-01
Provides guidelines for school administrators to aid in the selection of school-roofing systems, and information required to make specification and purchasing decisions. Low-slope roofing systems are examined, as are multiply systems such as modified bitumen, EPDM, thermoplastic, metal, and foam. (GR)
OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Greiner, Annette; Cholia, Shreyas
Mass spectrometry imaging (MSI) enables researchers to probe endogenous molecules directly within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements, and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely, to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
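A small h5py sketch of the storage-layout tuning described above: chunking and compressing a hypothetical MSI cube (x, y, m/z) so that both image-style and spectrum-style reads touch few chunks. The shapes and chunk sizes are arbitrary illustrations, not OpenMSI's actual parameters.

    import h5py
    import numpy as np

    cube = np.random.rand(100, 100, 10000).astype("f4")  # placeholder MSI cube
    with h5py.File("msi.h5", "w") as f:
        f.create_dataset("msidata", data=cube,
                         chunks=(4, 4, 2048),   # small spatial footprint, long m/z runs
                         compression="gzip", compression_opts=4)

    # Selective reads now decompress only the chunks they intersect:
    with h5py.File("msi.h5", "r") as f:
        image = f["msidata"][:, :, 5000]     # one ion image
        spectrum = f["msidata"][50, 50, :]   # one full spectrum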
Use of Schema on Read in Earth Science Data Archives
NASA Technical Reports Server (NTRS)
Hegde, Mahabaleshwara; Smit, Christine; Pilone, Paul; Petrenko, Maksym; Pham, Long
2017-01-01
Traditionally, NASA Earth Science data archives have used file-based storage with proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), we have evaluated applying the schema-on-read principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the schema-on-read approach allows customization of indexing spatially or temporally to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for schema-on-read used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.
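A minimal sketch of the schema-on-read pattern with Apache Parquet, using the pyarrow library locally rather than Athena or Spark; the file path and column names are assumptions for illustration.

    import pyarrow.parquet as pq

    # Read only the columns needed for a time series at one grid cell;
    # Parquet's columnar layout means the rest of the file is never touched.
    table = pq.read_table(
        "precipitation.parquet",
        columns=["time", "lat", "lon", "precip"],
    )
    df = table.to_pandas()
    cell = df[(df.lat == 38.5) & (df.lon == -76.5)].sort_values("time")
    print(cell.head())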
Wang, Dongwen; Peleg, Mor; Tu, Samson W; Boxwala, Aziz A; Greenes, Robert A; Patel, Vimla L; Shortliffe, Edward H
2002-12-18
Representation of clinical practice guidelines in a computer-interpretable format is a critical issue for guideline development, implementation, and evaluation. We studied 11 types of guideline representation models that can be used to encode guidelines in computer-interpretable formats. We consistently found in all reviewed models that primitives for the representation of actions and decisions are necessary components of a guideline representation model. Patient states and execution states are important concepts that closely relate to each other. Scheduling constraints on representation primitives can be modeled as sequences, concurrences, alternatives, and loops in a guideline's application process. Nesting of guidelines provides multiple views of a guideline at different granularities. Integration of guidelines with electronic medical records can be facilitated by the introduction of a formal model for patient data. Data collection, decision, patient state, and intervention constitute the four basic types of primitives in a guideline's logic flow. Decisions clarify our understanding of a patient's clinical state, while interventions lead to the change from one patient state to another.
A malware detection scheme based on mining format information.
Bai, Jinrong; Wang, Junfeng; Zou, Guozhong
2014-01-01
Malware has become one of the most serious threats to computer information systems, and current malware detection technology still has very significant limitations. In this paper, we propose a malware detection approach that mines the format information of PE (portable executable) files. Based on an in-depth analysis of the static format information of PE files, we extracted 197 features and applied feature selection methods to reduce the dimensionality of the features while maintaining acceptably high performance. When the selected features were trained using classification algorithms, our experiments indicate that the accuracy of the top classification algorithm is 99.1% and the value of the AUC is 0.998. We designed three experiments to evaluate the performance of our detection scheme and its ability to detect unknown and new malware. Although the experimental results for identifying new malware are not perfect, our method is still able to identify 97.6% of new malware with a 1.3% false positive rate.
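A hedged sketch of extracting static PE-format features of the kind the paper mines, using the pefile library; the paper's exact 197 features are not reproduced here, and the sample path is illustrative.

    import pefile

    def pe_features(path):
        """Return a few generic PE header fields as a feature dict."""
        pe = pefile.PE(path, fast_load=True)
        return {
            "num_sections": pe.FILE_HEADER.NumberOfSections,
            "timestamp": pe.FILE_HEADER.TimeDateStamp,
            "size_of_code": pe.OPTIONAL_HEADER.SizeOfCode,
            "entry_point": pe.OPTIONAL_HEADER.AddressOfEntryPoint,
            "dll_characteristics": pe.OPTIONAL_HEADER.DllCharacteristics,
        }

    print(pe_features("sample.exe"))  # such features would feed a classifier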
Flores, Romeo M.; Spear, Brianne D.; Purchase, Peter A.; Gallagher, Craig M.
2010-01-01
Described in this report is an updated subsurface stratigraphic framework of the Paleocene Fort Union Formation and Eocene Wasatch Formation in the Powder River Basin (PRB) in Wyoming and Montana. This framework is graphically presented in 17 intersecting west-east and north-south cross sections across the basin. Also included are: (1) the dataset and all associated digital files and (2) digital files for all figures and table 1 suitable for large-format printing. The purpose of this U.S. Geological Survey (USGS) Open-File Report is to provide rapid dissemination and accessibility of the stratigraphic cross sections and related digital data to USGS customers, especially the U.S. Bureau of Land Management (BLM), to facilitate their modeling of the hydrostratigraphy of the PRB. This report contains a brief summary of the coal-bed correlations and database, and is part of a larger ongoing study that will be available in the near future.
Development of an e-VLBI Data Transport Software Suite with VDIF
NASA Technical Reports Server (NTRS)
Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu
2010-01-01
We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational employment is under way.
75 FR 19339 - FM Table of Allotments, Amboy, California
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
.... SUMMARY: The Audio Division seeks comments on a petition filed by Sunnylands Broadcasting, LLC, proposing... disabilities (Braille, large print, electronic files, audio format), send an e-mail to [email protected] or call... Chief, Audio Division, Media Bureau. [FR Doc. 2010-8449 Filed 4-13-10; 8:45 am] BILLING CODE 6712-01-S ...
14 CFR 221.121 - How to prepare and file applications for Special Tariff Permission.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., DEPARTMENT OF TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS TARIFFS Special Tariff Permission To... notice shall conform to the requirements of § 221.212 if filed electronically. (b) Number of paper copies and place of filing. For paper format applications, the original and one copy of each such application...
Biological Investigations of Adaptive Networks: Neuronal Control of Conditioned Responses
1989-07-01
The program also controls A/D sampling of the voltage trace from the NMR transducer and disk files for NMR, neural spikes, and synchronization. * HSAD. Basic...format which ANALYZE (by John Desmond) can read. * FIG.HIRES. Reads C-64 HSAD files and EVENT NMR files and generates oscilloscope-like figures showing...
77 FR 6625 - Railroad Cost of Capital-2011
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-08
... railroads are due by May 9, 2012. ADDRESSES: Comments may be submitted either via the Board's e-filing system or in the traditional paper format. Any person using e-filing should comply with the instructions at the E-FILING link on the Board's Web site, at http://www.stb.dot.gov . Any person submitting a...
Students' Attitudes to and Usage of Academic Feedback Provided via Audio Files
ERIC Educational Resources Information Center
Merry, Stephen; Orsmond, Paul
2008-01-01
This study explores students' attitudes to the provision of formative feedback on academic work using audio files together with the ways in which students implement such feedback within their learning. Fifteen students received audio file feedback on written work and were subsequently interviewed regarding their utilisation of that feedback within…
Kabekkodu, Soorya N; Faber, John; Fawcett, Tim
2002-06-01
The International Centre for Diffraction Data (ICDD) is responding to the changing needs in powder diffraction and materials analysis by developing the Powder Diffraction File (PDF) in a very flexible relational database (RDB) format. The PDF now contains 136,895 powder diffraction patterns. In this paper, an attempt is made to give an overview of the PDF-4, search/match methods and the advantages of having the PDF-4 in RDB format. Some case studies have been carried out to search for crystallization trends, properties, frequencies of space groups and prototype structures. These studies give a good understanding of the basic structural aspects of classes of compounds present in the database. The present paper also reports data-mining techniques and demonstrates the power of a relational database over the traditional (flat-file) database structures.
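The PDF-4's actual schema is proprietary; the sketch below only illustrates the kind of relational query (here, space-group frequencies) that a flat-file database cannot answer directly. The table and column names are invented.

    import sqlite3

    con = sqlite3.connect("pdf4_example.sqlite")     # hypothetical local database
    query = """SELECT space_group, COUNT(*) AS n
               FROM patterns
               GROUP BY space_group
               ORDER BY n DESC
               LIMIT 10"""
    for space_group, n in con.execute(query):
        print(space_group, n)
    con.close()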
Is HDF5 a Good Format to Replace UVFITS?
NASA Astrophysics Data System (ADS)
Price, D. C.; Barsdell, B. R.; Greenhill, L. J.
2015-09-01
The FITS (Flexible Image Transport System) data format was developed in the late 1970s for the storage and exchange of astronomy-related image data. Since then, it has become a standard file format not only for images, but also for radio interferometer data (e.g., UVFITS, FITS-IDI). But is FITS the right format for next-generation telescopes to adopt? The newer Hierarchical Data Format (HDF5) file format offers considerable advantages over FITS, but has yet to gain widespread adoption within radio astronomy. One of the major holdbacks is that HDF5 is not well supported by data reduction software packages. Here, we present a comparison of FITS, HDF5, and the MeasurementSet (MS) format for storage of interferometric data. In addition, we present a tool for converting between formats. We show that the underlying data model of FITS can be ported to HDF5, a first step toward achieving wider HDF5 support.
BOREAS TE-20 Soils Data Over the NSA-MSA and Tower Sites in Raster Format
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Veldhuis, Hugo; Knapp, David
2000-01-01
The BOREAS TE-20 team collected several data sets for use in developing and testing models of forest ecosystem dynamics. This data set was gridded from vector layers of soil maps that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. The vector layers were gridded into raster files that cover the NSA-MSA and tower sites. The data are stored in binary, image format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
Covariance Data File Formats for Whisper-1.0 & Whisper-1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan
2017-01-09
Whisper is a statistical analysis package developed in 2014 to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit (USL) for the application. Whisper version 1.0 was first developed and used at LANL in 2014. During 2015-2016, Whisper was updated to version 1.1 and is to be included with the upcoming release of MCNP6.2. This report describes the file formats used for the covariance data in both Whisper-1.0 and Whisper-1.1.
Tool for Merging Proposals Into DSN Schedules
NASA Technical Reports Server (NTRS)
Khanampornpan, Teerapat; Kwok, John; Call, Jared
2008-01-01
A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
... submissions by the parties may be submitted via the Board's e-filing format or in the traditional paper format. Any person using e-filing should attach a document and otherwise comply with the instructions at the E... proceeding under 49 U.S.C. 721 and 5 U.S.C. 554(e). Petitioners request that the Board declare that specific...
Occupational Survey Report. Visual Information, AFSC 3V0X1
2000-04-01
Tasks of the career ladder include: scan artwork using flatbed scanners; convert graphic file formats; design layouts; letter certificates using laser...; produce artwork using mouse or digitizing tablets; design and produce imagery for web pages; produce... Representative DAFSC 3V031 personnel tasks: A0034 Scan artwork using flatbed scanners; C0065 Design layouts; A0004 Convert graphic file formats; A0006 Create...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meng, Da; Zhang, Qibin; Gao, Xiaoli
2014-04-30
We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in the identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to the identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. Note that LipidMiner is currently limited to singly charged ions, although this is adequate for the purposes of lipidomics, since lipids are rarely multiply charged [14], even the polyphosphoinositides. LipidMiner also only processes the file format generated by Thermo mass spectrometers, i.e., the .RAW format. In the future, we plan to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.
Transforming Dermatologic Imaging for the Digital Era: Metadata and Standards.
Caffery, Liam J; Clunie, David; Curiel-Lewandrowski, Clara; Malvehy, Josep; Soyer, H Peter; Halpern, Allan C
2018-01-17
Imaging is increasingly being used in dermatology for documentation, diagnosis, and management of cutaneous disease. The lack of standards for dermatologic imaging is an impediment to clinical uptake. Standardization can occur in image acquisition, terminology, interoperability, and metadata. This paper presents the International Skin Imaging Collaboration position on standardization of metadata for dermatologic imaging. Metadata is essential to ensure that dermatologic images are properly managed and interpreted. There are two standards-based approaches to recording and storing metadata in dermatologic imaging. The first uses standard consumer image file formats; the second is the file format and metadata model developed for the Digital Imaging and Communications in Medicine (DICOM) standard. DICOM would appear to provide an advantage over consumer image file formats for metadata, as it includes all the patient, study, and technical metadata necessary to use images clinically. Consumer image file formats, by contrast, include only technical metadata and need to be used in conjunction with another actor (for example, an electronic medical record) to supply the patient and study metadata. The use of DICOM may have some ancillary benefits in dermatologic imaging, including leveraging DICOM network and workflow services, interoperability of images and metadata, leveraging existing enterprise imaging infrastructure, greater patient safety, and better compliance with legislative requirements for image retention.
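A minimal sketch, using the pydicom library, of the point made above: a DICOM file carries patient and study metadata alongside the pixel data, whereas a consumer format would need an external system (e.g., an electronic medical record) to supply it. The file name is illustrative.

    import pydicom

    ds = pydicom.dcmread("skin_lesion.dcm")          # hypothetical dermatology image
    print(ds.PatientID, ds.StudyDate, ds.Modality)   # standard DICOM attributes
    pixels = ds.pixel_array                          # the image itself (requires numpy)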
Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C
2013-05-01
During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to the point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature, and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there is an important variety of data file formats used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open-source application to read and analyze high-resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.
ArrayInitiative - a tool that simplifies creating custom Affymetrix CDFs
2011-01-01
Background: Probes on a microarray represent a frozen view of a genome and are quickly outdated when new sequencing studies extend our knowledge, resulting in significant measurement error when analyzing any microarray experiment. There are several bioinformatics approaches to improve probe assignments, but without in-house programming expertise, standardizing these custom array specifications as a usable file (e.g. as Affymetrix CDFs) is difficult, owing mostly to the complexity of the specification file format. However, without correctly standardized files there is a significant barrier for testing competing analysis approaches since this file is one of the required inputs for many commonly used algorithms. The need to test combinations of probe assignments and analysis algorithms led us to develop ArrayInitiative, a tool for creating and managing custom array specifications. Results: ArrayInitiative is a standalone, cross-platform, rich client desktop application for creating correctly formatted, custom versions of manufacturer-provided (default) array specifications, requiring only minimal knowledge of the array specification rules and file formats. Users can import default array specifications, import probe sequences for a default array specification, design and import a custom array specification, export any array specification to multiple output formats, export the probe sequences for any array specification and browse high-level information about the microarray, such as version and number of probes. The initial release of ArrayInitiative supports the Affymetrix 3' IVT expression arrays we currently analyze, but as an open source application, we hope that others will contribute modules for other platforms. Conclusions: ArrayInitiative allows researchers to create new array specifications, in a standard format, based upon their own requirements. This makes it easier to test competing design and analysis strategies that depend on probe definitions. Since the custom array specifications are easily exported to the manufacturer's standard format, researchers can analyze these customized microarray experiments using established software tools, such as those available in Bioconductor. PMID:21548938
Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files
NASA Astrophysics Data System (ADS)
Northup, E. A.; Early, A. B.; Beach, A. L., III; Kusterer, J.; Quam, B.; Wang, D.; Chen, G.
2015-12-01
NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gas and aerosol properties. The ASDC Toolsets for Airborne Data (TAD) is designed to meet the user community's needs for manipulating aircraft data for scientific research on climate change and air quality relevant issues. TAD makes use of aircraft data stored in the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT) file format. ICARTT has been the NASA standard since 2010, and is widely used by NOAA, NSF, and international partners (DLR, FAAM). Its level of acceptance is due in part to it being generally self-describing for researchers, i.e., it provides the data descriptions necessary for proper research use. Despite this, there are a number of issues with the current ICARTT format, especially concerning machine readability. In order to overcome these issues, the TAD team has developed an "idealized" file format. This format is ASCII and is sufficiently machine readable to sustain the TAD system; however, it is not fully compatible with the current ICARTT format. The process of mapping ICARTT metadata to the idealized format, the format specifics, and the actual conversion process will be discussed. The goal of this presentation is to demonstrate an example of how to improve the machine readability of ASCII data format protocols.
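A hedged reader sketch showing why ICARTT is self-describing yet awkward for machines: the first header line gives the total header length, after which the data block is comma-separated. Real files carry more header structure (PI, variable names, fill values) than this minimal reader uses, and the file name is illustrative.

    import numpy as np

    def read_icartt(path):
        with open(path) as f:
            nlhead = int(f.readline().split(",")[0])   # total number of header lines
        # Skip the header and load the comma-separated data block.
        return np.genfromtxt(path, delimiter=",", skip_header=nlhead)

    data = read_icartt("airborne-example.ict")
    print(data.shape)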
East Asian Cinema (College Course File).
ERIC Educational Resources Information Center
Ehrlich, Linda; Ma, Ning
1990-01-01
Provides guidelines for instructors who teach entire courses on (or who include) films from Japan and China. Considers issues of concern in contemporary Asian cinema such as conflicts between tradition and modernity, indigenous definitions of cultural identity and artistic form, and internationalization. (KEH)
Code of Federal Regulations, 2011 CFR
2011-10-01
.... The Regional Administrator shall integrate the NEPA process with other planning at the earliest... guidelines available upon request. (ii) The Regional Administrator shall provide such guidance on a project... agency approval, or notification that an application will be filed, the Regional Administrator shall...
Code of Federal Regulations, 2014 CFR
2014-10-01
.... The Regional Administrator shall integrate the NEPA process with other planning at the earliest... guidelines available upon request. (ii) The Regional Administrator shall provide such guidance on a project... agency approval, or notification that an application will be filed, the Regional Administrator shall...
Imaging Systems: What, When, How.
ERIC Educational Resources Information Center
Lunin, Lois F.; And Others
1992-01-01
The three articles in this special section on document image files discuss intelligent character recognition, including comparison with optical character recognition; selection of displays for document image processing, focusing on paperlike displays; and imaging hardware, software, and vendors, including guidelines for system selection. (MES)
SW New Mexico Oil Well Formation Tops
Shari Kelley
2015-10-21
Rock formation top picks for oil wells in southwestern New Mexico, compiled from scout cards and other sources. There are differing formation-top interpretations for some wells, so for those wells duplicate formation top data are presented in this file.
Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron
2015-02-03
Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
Standard interface files and procedures for reactor physics codes, version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, B.M.
Standards and procedures for promoting the exchange of reactor physics codes are updated to Version-III status. Standards covering program structure, interface files, file handling subroutines, and card input format are included. The implementation status of the standards in codes and the extension of the standards to new code areas are summarized. (15 references) (auth)
75 FR 19338 - FM TABLE OF ALLOTMENTS, Milford, Utah
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-14
.... SUMMARY: The Audio Division seeks comments on a petition filed by Canyon Media Group, LLC, authorized..., large print, electronic files, audio format), send an e-mail to [email protected] or call the Consumer... Chief, Audio Division, Media Bureau. [FR Doc. 2010-8448 Filed 4-13-10; 8:45 am] BILLING CODE 6712-01-S ...
Snake River Plain Geothermal Play Fairway Analysis - Phase 1 KMZ files
John Shervais
2015-10-10
This dataset contains raw data files in KMZ format (Google Earth georeferenced format). These files include volcanic vent locations and ages, the distribution of fine-grained lacustrine sediments (which act as both a seal and an insulating layer for hydrothermal fluids), and post-Miocene faults compiled from the Idaho Geological Survey, the USGS Quaternary Fault database, and unpublished mapping. It also contains the Composite Common Risk Segment Map created during Phase 1 studies, as well as a file with locations of select deep wells used to interrogate the subsurface.
1998-07-01
all the MS Word files into FrameMaker + SGML format and use the FrameMaker application to SGML-tag all of the data in accordance with the Army TM...Document Type Definitions (DTDs) in MIL-STD-2361. The edited SGML-tagged files are saved as PDF files for delivery to the field. The FrameMaker ...as TIFF files and being imported into FrameMaker prior to saving the TMs as PDF files. Since the hardware to be used by the AN/PPS-5 technician is...
Cytoscape file of chemical networks
The maximum connectivity scores of pairwise chemical conditions, summarized from Cmap results, in a file in Cytoscape format (http://www.cytoscape.org/). The figures in the publication were generated from this file. The Cytoscape file is formed by importing the eight text files therein. This dataset is associated with the following publication: Wang, R., A. Biales, N. Garcia-Reyero, E. Perkins, D. Villeneuve, G. Ankley, and D. Bencic. Fish Connectivity Mapping: Linking Chemical Stressors by Their MOA-Driven Transcriptomic Profiles. BMC Genomics. BioMed Central Ltd, London, UK, 17(84): 1-20, (2016).
Index files for Belle II - very small skim containers
NASA Astrophysics Data System (ADS)
Sevior, Martin; Bloomfield, Tristan; Kuhr, Thomas; Ueda, I.; Miyake, H.; Hara, T.
2017-10-01
The Belle II experiment [1] employs the ROOT file format [2] for recording data and is investigating the use of “index-files” to reduce the size of data skims. These files contain pointers to the location of interesting events within the total Belle II data set and reduce the size of data skims by two orders of magnitude. We implement this scheme on the Belle II grid by recording the parent file metadata and the event location within the parent file. While the scheme works, it is substantially slower than a normal sequential read of standard skim files using default ROOT file parameters. We investigate the performance of the scheme by adjusting the “splitLevel” and “autoflushsize” parameters of the ROOT files in the parent data files.
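A conceptual PyROOT sketch of the index-file idea (not the Belle II implementation): store only entry numbers into the parent file and read those entries on demand. The file name, tree name, and entry numbers are invented.

    import ROOT

    index = [12, 487, 10021]                    # entry numbers from an index file
    f = ROOT.TFile.Open("parent_data_file.root")
    tree = f.Get("events")

    for entry in index:
        tree.GetEntry(entry)                    # random access within the parent file
        # ... process the event; such scattered reads are slower than a
        # sequential scan of a dedicated skim, the performance issue noted above.
    f.Close()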
18 CFR 270.304 - Tight formation gas.
Code of Federal Regulations, 2011 CFR
2011-04-01
... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...
Proposal for a Standard Format for Neurophysiology Data Recording and Exchange.
Stead, Matt; Halford, Jonathan J
2016-10-01
The lack of interoperability between information networks is a significant source of cost in health care. Standardized data formats decrease health care cost, improve quality of care, and facilitate biomedical research. There is no common standard digital format for storing clinical neurophysiologic data. This review proposes a new standard file format for neurophysiology data (the bulk of which is video-electroencephalographic data), entitled the Multiscale Electrophysiology Format, version 3 (MEF3), which is designed to address many of the shortcomings of existing formats. MEF3 provides functionality that addresses many of the limitations of current formats. The proposed improvements include (1) hierarchical file structure with improved organization; (2) greater extensibility for big data applications requiring a large number of channels, signal types, and parallel processing; (3) efficient and flexible lossy or lossless data compression; (4) industry standard multilayered data encryption and time obfuscation that permits sharing of human data without the need for deidentification procedures; (5) resistance to file corruption; (6) facilitation of online and offline review and analysis; and (7) provision of full open source documentation. At this time, there is no other neurophysiology format that supports all of these features. MEF3 is currently gaining industry and academic community support. The authors propose the use of the MEF3 as a standard format for neurophysiology recording and data exchange. Collaboration between industry, professional organizations, research communities, and independent standards organizations is needed to move the project forward.
HDFITS: Porting the FITS data model to HDF5
NASA Astrophysics Data System (ADS)
Price, D. C.; Barsdell, B. R.; Greenhill, L. J.
2015-09-01
The FITS (Flexible Image Transport System) data format has been the de facto data format for astronomy-related data products since its inception in the late 1970s. While the FITS file format is widely supported, it lacks many of the features of more modern data serialization formats, such as the Hierarchical Data Format (HDF5). The HDF5 file format offers considerable advantages over FITS, such as improved I/O speed and compression, but has yet to gain widespread adoption within astronomy. One of the major holdbacks is that HDF5 is not well supported by data reduction software packages and image viewers. Here, we present a comparison of FITS and HDF5 as formats for the storage of astronomy datasets. We show that the underlying data model of FITS can be ported to HDF5 in a straightforward manner, and that by doing so the advantages of the HDF5 file format can be leveraged immediately. In addition, we present a software tool, fits2hdf, for converting between FITS and a new 'HDFITS' format, where data are stored in HDF5 in a FITS-like manner. We show that HDFITS allows faster reading of data (up to 100 times faster than FITS in some use cases) and improved compression (higher compression ratios and higher throughput). Finally, we show that by only changing the import lines in Python-based FITS utilities, HDFITS-formatted data can be presented transparently as an in-memory FITS equivalent.
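A minimal sketch of porting FITS content to HDF5 in a FITS-like manner, in the spirit of fits2hdf (the real tool also handles tables, multiple HDU types, and full header fidelity). File names are illustrative.

    import h5py
    from astropy.io import fits

    with fits.open("image.fits") as hdul, h5py.File("image.h5", "w") as h5:
        for i, hdu in enumerate(hdul):
            grp = h5.create_group(f"HDU{i}")
            if hdu.data is not None:
                grp.create_dataset("DATA", data=hdu.data, compression="gzip")
            for key, value in hdu.header.items():
                if key and isinstance(value, (str, int, float, bool)):
                    grp.attrs[key] = value       # header cards become HDF5 attributes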
What is meant by Format Version? Product Version? Collection?
Atmospheric Science Data Center
2017-10-12
The Format Version is used to distinguish between software deliveries to ASDC that result in a product format change. The Format Version is given in the MISR data file name using the designator _Fnn_, where nn is the version number. ...
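A trivial sketch of pulling the Format Version out of a MISR-style file name via the _Fnn_ designator; the example file name is invented.

    import re

    name = "MISR_AM1_AGP_P037_F01_24.hdf"           # hypothetical file name
    m = re.search(r"_F(\d+)_", name)
    if m:
        print("format version:", int(m.group(1)))   # -> 1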
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
Vector Topographic Map Data over the BOREAS NSA and SSA in SIF Format
NASA Technical Reports Server (NTRS)
Knapp, David; Nickeson, Jaime; Hall, Forrest G. (Editor)
2000-01-01
This data set contains vector contours and other features of individual topographic map sheets from the National Topographic Series (NTS). The map sheet files were received in Standard Interchange Format (SIF) and cover the Boreal Ecosystem-Atmosphere Study (BOREAS) Northern Study Area (NSA) and Southern Study Area (SSA) at scales of 1:50,000 and 1:250,000. The individual files are stored in compressed Unix tar archives.
Workflow opportunities using JPEG 2000
NASA Astrophysics Data System (ADS)
Foshee, Scott
2002-11-01
JPEG 2000 is a new image compression standard from ISO/IEC JTC1 SC29 WG1, the Joint Photographic Experts Group (JPEG) committee. Better thought of as a sibling to JPEG rather than a descendant, the JPEG 2000 standard offers wavelet-based compression as well as companion file formats and related standardized technology. This paper examines the JPEG 2000 standard for features in four specific areas (compression, file formats, client-server, and conformance/compliance) that enable image workflows.
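As a small illustration of JPEG 2000 compression from Python, Pillow can write .jp2 files when built against the OpenJPEG library; codec availability, the file names, and the rate setting are assumptions here.

    from PIL import Image

    img = Image.open("photo.png")
    # quality_layers=[20] requests roughly 20:1 compression;
    # irreversible=True selects the lossy 9/7 wavelet.
    img.save("photo.jp2", format="JPEG2000",
             quality_mode="rates", quality_layers=[20], irreversible=True)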
GIF Animation of Mode Shapes and Other Data on the Internet
NASA Technical Reports Server (NTRS)
Pappa, Richard S.
1998-01-01
The World Wide Web abounds with animated cartoons and advertisements competing for our attention. Most of these figures are animated Graphics Interchange Format (GIF) files. These files contain a series of ordinary GIF images plus control information, and they provide an exceptionally simple, effective way to animate on the Internet. To date, however, this format has rarely been used for technical data, although there is no inherent reason not to do so. This paper describes a procedure for creating high-resolution animated GIFs of mode shapes and other types of structural dynamics data with readily available software. The paper shows three example applications using recent modal test data and video footage of a high-speed sled run. A fairly detailed summary of the GIF file format is provided in the appendix. All of the animations discussed in the paper are posted on the Internet and available through the following address: http://sdb-www.larc.nasa.gov/.
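A present-day sketch of the paper's procedure using Pillow: assemble a series of rendered frames into an animated GIF. The frame file names are illustrative; in the paper the frames were mode-shape plots.

    from PIL import Image

    frames = [Image.open(f"mode_shape_{i:02d}.png") for i in range(16)]

    frames[0].save(
        "mode_shape.gif",
        save_all=True,                # write an animated, multi-frame GIF
        append_images=frames[1:],
        duration=80,                  # milliseconds per frame
        loop=0,                       # loop forever
    )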
Preparing PNNL Reports with LaTeX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waichler, Scott R.
2005-06-01
LaTeX is a mature document preparation system that is the standard in many scientific and academic workplaces. It has been used extensively by scattered individuals and research groups within PNNL for years, but until now there have been no centralized or lab-focused resources to help authors and editors. PNNL authors and editors can produce correctly formatted PNNL or PNWD reports using the LaTeX document preparation system and the available template files. Please visit the PNNL-LaTeX Project (http://stidev.pnl.gov/resources/latex/, inside the PNNL firewall) for additional information and files. In LaTeX, document content is maintained separately from document structure for the most part. This means that the author can easily produce the same content in different formats and, more importantly, can focus on the content and write it in a plain text file that doesn't go awry, is easily transferable, and won't become obsolete due to software changes. LaTeX produces the finest print quality output; its typesetting is noticeably better than that of MS Word. This is particularly true for mathematics, tables, and other types of special text. Other benefits of LaTeX: easy handling of large numbers of figures and tables; automatic and error-free captioning, citation, cross-referencing, hyperlinking, and indexing; excellent published and online documentation; free or low-cost distributions for Windows/Linux/Unix/Mac OS X. This document serves two purposes: (1) it provides instructions to produce reports formatted to PNNL requirements using LaTeX, and (2) the document itself is in the form of a PNNL report, providing examples of many solved formatting challenges. Authors can use this document or its skeleton version (with formatting examples removed) as the starting point for their own reports. The pnnreport.cls class file and pnnl.bst bibliography style file contain the required formatting specifications for reports to the Department of Energy. Options are also provided for formatting PNWD (non-1830) reports. This documentation and the referenced files are meant to provide a complete package of PNNL particulars for authors and editors who wish to prepare technical reports using LaTeX. The example material in this document was borrowed from real reports and edited for demonstration purposes. The subject matter content of the example material is not relevant here and generally does not make literal sense in the context of this document. Brackets ''[]'' are used to denote large blocks of example text. The PDF file for this report contains hyperlinks to facilitate navigation. Hyperlinks are provided for all cross-referenced material, including section headings, figures, tables, and references. Not all hyperlinks are colored but will be obvious when you move your mouse over them.
Publications - PIR 2002-3 | Alaska Division of Geological & Geophysical Surveys
Quadrangle(s): Philip Smith Mountains. Bibliographic reference: Stevens, D.S.P., 2014, Engineering-geologic map of the Philip Smith Mountains. Digital geospatial data: engineering-geologic map (data file format and size information available for download).
NASA Astrophysics Data System (ADS)
Mosca, Pietro; Mounier, Claude
2016-03-01
The automatic construction of evolution chains recently implemented in the GALILEE system is based on the analysis of several ENDF files: the multigroup production cross sections present in the GENDF files processed by NJOY from the ENDF evaluation, the decay file, and the fission product yields (FPY) file. In this context, this paper highlights the importance of nucleus identification to properly interconnect the data mentioned above. The first part of the paper describes the present status of nucleus identification among the several ENDF files, focusing in particular on the use of the excited state number and of the isomeric state number. The second part reviews the problems encountered during the automatic construction of the depletion chains using recent ENDF data. The processing of the JEFF-3.1.1 and ENDF/B-VII.0 (decay and FPY) data and the JEFF-3.2 (production cross section) data points out problems with the compliance of the nucleus identifiers with the ENDF-6 format and sometimes inconsistencies among the various ENDF files. In addition, the analysis of EAF-2003 and EAF-2010 shows some inconsistencies between the ZA product identifier and the reaction identifier MT for the reactions (n, pα) and (n, 2np). As a main result of this work, our suggestion is to change the ENDF format to use the isomeric state number systematically to identify the nuclei. This proposal is already compliant with a large amount of ENDF data that is not in agreement with the present ENDF format. This choice is the most convenient because, ultimately, it allows one to give human-readable names to the nuclei of the depletion chains.
BOREAS RSS-14 Level-1a GOES-8 Visible, IR and Water Vapor Images
NASA Technical Reports Server (NTRS)
Hall, Forrest G. (Editor); Newcomer, Jeffrey A.; Faysash, David; Cooper, Harry J.; Smith, Eric A.
2000-01-01
The BOREAS RSS-14 team collected and processed several GOES-7 and GOES-8 image data sets that covered the BOREAS study region. The level-1a GOES-8 images were created by BORIS personnel from the level-1 images delivered by FSU personnel. The data cover 14-Jul-1995 to 21-Sep-1995 and 12-Feb-1996 to 03-Oct-1996. The data start out as three bands with 8-bit pixel values and end up as five bands with 10-bit pixel values. No major problems with the data have been identified. The differences between the level-1 and level-1a GOES-8 data are the formatting and packaging of the data. The images missing from the temporal series of level-1 GOES-8 images were zero-filled by BORIS staff to create files consistent in size and format. In addition, BORIS staff packaged all the images of a given type from a given day into a single file, removed the header information from the individual level-1 files, and placed it into a single descriptive ASCII header file. The data are contained in binary image format files. Due to the large size of the images, the level-1a GOES-8 data are not contained on the BOREAS CD-ROM set. An inventory listing file is supplied on the CD-ROM to inform users of what data were collected. The level-1a GOES-8 image data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). See sections 15 and 16 for more information. The data files are available on a CD-ROM (see document number 20010000884).
Main image file tape description
Warriner, Howard W.
1980-01-01
This Main Image File Tape document defines the data content and file structure of the Main Image File Tape (MIFT) produced by the EROS Data Center (EDC). This document also defines an INQUIRY tape, which is just a subset of the MIFT. The format of the INQUIRY tape is identical to the MIFT except for two records; therefore, with the exception of these two records (described elsewhere in this document), every remark made about the MIFT is true for the INQUIRY tape.
GuideLine Interchange Format (GLIF): Extensions and Practical Applications
2000-09-01
Others have also studied and used GLIF as well [6-9]. We chose this formalism because it can be easily (and exactly) represented in a flowchart, a...benefit from guidelines in different formats (algorithms, flowcharts, written text). Guidelines in diagrammatic form such as algorithms are useful for...as long as a task is performed, the GL suggests the next steps. This can be a suitable modality for beginners. On the contrary, it can be "silent...
lcps: Light curve pre-selection
NASA Astrophysics Data System (ADS)
Schlecker, Martin
2018-05-01
lcps searches for transit-like features (i.e., dips) in photometric data. Its main purpose is to restrict large sets of light curves to a number of files that show interesting behavior, such as drops in flux. While lcps is adaptable to any format of time series, its I/O module is designed specifically for photometry of the Kepler spacecraft. It extracts the pre-conditioned PDCSAP data from light curves files created by the standard Kepler pipeline. It can also handle csv-formatted ascii files. lcps uses a sliding window technique to compare a section of flux time series with its surroundings. A dip is detected if the flux within the window is lower than a threshold fraction of the surrounding fluxes.
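A hedged sketch of the sliding-window idea described above (not the lcps source): flag windows whose median flux drops below a threshold fraction of the surrounding flux. Parameter values are illustrative.

    import numpy as np

    def find_dips(flux, winsize=10, threshold=0.99):
        """Return start indices of windows whose median flux is below
        threshold times the median of the two neighbouring windows."""
        hits = []
        for i in range(winsize, len(flux) - 2 * winsize):
            window = np.median(flux[i:i + winsize])
            surround = np.median(np.r_[flux[i - winsize:i],
                                       flux[i + winsize:i + 2 * winsize]])
            if window < threshold * surround:
                hits.append(i)
        return hits

    flux = np.ones(1000)
    flux[500:510] = 0.985            # synthetic 1.5% dip
    print(find_dips(flux)[:5])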
17 CFR 16.06 - Errors or omissions.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., reporting markets shall file corrections to errors or omissions in data previously filed with the Commission pursuant to §§ 16.00 and 16.01 in the format and using the coding structure and electronic data submission...
Publications - PR 121 | Alaska Division of Geological & Geophysical Surveys
Download below or see our publication sales page for more information. Quadrangle(s): Philip Smith Mountains. Digital geospatial data: Philip Smith Mountains surficial geology (download: psm-surficial-geo...).
Publications - RI 2001-1C | Alaska Division of Geological & Geophysical Surveys
Map of the Chulitna region, southcentral Alaska, scale 1:63,360 (7.5 M). Digital geospatial data: Chulitna region surficial geology (data file format and size information available for download).
Publications - RDF 2015-17 | Alaska Division of Geological & Geophysical Surveys
DOI: 10.14509/29519. Report: rdf2015_017.pdf (347.0 K). Digital geospatial data: Tonsina geochemistry, DGGS samples (data file format and size information available for download).
VizieR Online Data Catalog: Horizontal temperature at Venus upper atmosphere (Peralta+, 2016)
NASA Astrophysics Data System (ADS)
Peralta, J.; Lopez-Valverde, M. A.; Gilli, G.; Piccialli, A.
2015-11-01
The dayside atmospheric temperatures in the UMLT of Venus (displayed in Figure 7A of this article) are listed as a CSV data file. These values consist of averages in bins of 5° in latitude and 0.25 hours in local time from dayside temperatures covering five years of data (from 2006/05/14 to 2011/06/05). These temperatures were inferred from the CO2 NLTE nadir spectra measured by the instrument VIRTIS-H onboard Venus Express (see the article for a full description of the procedure), and are representative of the atmospheric region between 10^-2 and 10^-5 mb. Along with the temperatures, we also provide the corresponding error and the number of temperatures averaged in each bin. The format of the CSV file reasonably agrees with the expected format of the data files to be provided in the future version of the Venus International Reference Atmosphere (VIRA). (1 data file).
Profex: a graphical user interface for the Rietveld refinement program BGMN.
Doebelin, Nicola; Kleeberg, Reinhard
2015-10-01
Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.
Desktop document delivery using portable document format (PDF) files and the Web.
Shipman, J P; Gembala, W L; Reeder, J M; Zick, B A; Rainwater, M J
1998-01-01
Desktop access to electronic full-text literature was rated one of the most desirable services in a client survey conducted by the University of Washington Libraries. The University of Washington Health Sciences Libraries (UW HSL) conducted a ten-month pilot test from August 1996 to May 1997 to determine the feasibility of delivering electronic journal articles via the Internet to remote faculty. Articles were scanned into Adobe Acrobat Portable Document Format (PDF) files and delivered to individuals using Multipurpose Internet Mail Extensions (MIME) standard e-mail attachments and the Web. Participants retrieved scanned articles and used the Adobe Acrobat Reader software to view and print files. The pilot test required a special programming effort to automate the client notification and file deletion processes. Test participants were satisfied with the pilot test despite some technical difficulties. Desktop delivery is now offered as a routine delivery method from the UW HSL. PMID:9681165
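A present-day sketch of the delivery mechanism described above, sending a scanned PDF as a standard MIME e-mail attachment with Python's standard library; the addresses, host, and file name are placeholders.

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["Subject"] = "Requested article"
    msg["From"] = "docdelivery@example.edu"
    msg["To"] = "faculty@example.edu"
    msg.set_content("The article you requested is attached as a PDF.")

    with open("article.pdf", "rb") as f:
        msg.add_attachment(f.read(), maintype="application",
                           subtype="pdf", filename="article.pdf")

    with smtplib.SMTP("mail.example.edu") as s:
        s.send_message(msg)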
"AFacet": a geometry based format and visualizer to support SAR and multisensor signature generation
NASA Astrophysics Data System (ADS)
Rosencrantz, Stephen; Nehrbass, John; Zelnio, Ed; Sudkamp, Beth
2018-04-01
When simulating multisensor signature data (including SAR, LIDAR, EO, IR, etc.), geometry data are required that accurately represent the target. Most vehicular targets can, in real life, exist in many possible configurations. Examples of these configurations might include a rotated turret, an open door, a missing roof rack, or a seat made of metal or wood. Previously we have used the Modelman (.mmp) format and tool to represent and manipulate our articulable models. Unfortunately, Modelman is now an unsupported tool and an undocumented binary format. Some work has been done to reverse engineer a reader in Matlab so that the format could continue to be useful. This work was tedious and resulted in an incomplete conversion. In addition, the resulting articulable models could not be altered and re-saved in the Modelman format. The AFacet (.afacet) articulable facet file format is a replacement for the binary Modelman (.mmp) file format. There is a one-time straightforward path for conversion from Modelman to the AFacet format. It is a simple ASCII, comma-separated, self-documenting format that is easily readable (and in many cases usefully editable) by a human with any text editor, preventing future obsolescence. In addition, because the format is simple, it is relatively easy for even the most novice programmer to create a program to read and write AFacet files in any language without any special libraries. This paper presents the AFacet format, as well as a suite of tools for creating, articulating, manipulating, viewing, and converting the 370+ models (at the time this paper was written) that have been converted to the AFacet format.
Bouyssié, David; Dubois, Marc; Nasso, Sara; Gonzalez de Peredo, Anne; Burlet-Schiltz, Odile; Aebersold, Ruedi; Monsarrat, Bernard
2015-01-01
The analysis and management of MS data, especially those generated by data-independent MS acquisition, exemplified by SWATH-MS, pose significant challenges for proteomics bioinformatics. The large size and vast amount of information inherent to these data sets need to be properly structured to enable an efficient and straightforward extraction of the signals used to identify specific target peptides. Standard XML-based formats are not well suited to large MS data files, for example, those generated by SWATH-MS, and compromise high-throughput data processing and storing. We developed mzDB, an efficient file format for large MS data sets. It relies on the SQLite software library and consists of a standardized and portable server-less single-file database. An optimized 3D indexing approach is adopted, where the LC-MS coordinates (retention time and m/z), along with the precursor m/z for SWATH-MS data, are used to query the database for data extraction. In comparison with XML formats, mzDB saves ∼25% of storage space and improves access times by factors ranging from 2-fold up to 2000-fold, depending on the data access pattern. mzDB also shows slightly to significantly lower access times in comparison with other formats such as mz5. Both C++ and Java implementations, converting raw or XML formats to mzDB and providing access methods, will be released under a permissive license. mzDB can be easily accessed by the SQLite C library and its drivers for all major languages, and browsed with existing dedicated GUIs. The mzDB format described here can boost existing mass spectrometry data analysis pipelines, offering unprecedented performance in terms of efficiency, portability, compactness, and flexibility. PMID:25505153
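Because mzDB is a server-less single-file SQLite database, it can be queried with any SQLite driver. The sketch below uses Python's sqlite3 module to illustrate the kind of coordinate-based query the abstract describes; the table and column names are hypothetical placeholders, not the real mzDB schema.

    # Sketch of querying an mzDB-style SQLite file with Python's sqlite3.
    # The table and column names below are hypothetical placeholders; the
    # actual mzDB schema is defined by its specification.
    import sqlite3

    con = sqlite3.connect("run.mzdb")
    cur = con.execute(
        "SELECT id FROM bounding_box "
        "WHERE rt_min <= ? AND rt_max >= ? "
        "AND mz_min <= ? AND mz_max >= ?",
        (25.0, 25.0, 622.0, 622.0))   # retention time (min) and m/z of interest
    for (box_id,) in cur:
        print("candidate bounding box:", box_id)
    con.close()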
Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data
Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.
2006-01-01
Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.
ArrayBridge: Interweaving declarative array processing with high-performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xing, Haoyuan; Floratos, Sofoklis; Blanas, Spyros
Scientists are increasingly turning to datacenter-scale computers to produce and analyze massive arrays. Despite decades of database research that extols the virtues of declarative query processing, scientists still write, debug and parallelize imperative HPC kernels even for the most mundane queries. This impedance mismatch has been partly attributed to the cumbersome data loading process; in response, the database community has proposed in situ mechanisms to access data in scientific file formats. Scientists, however, desire more than a passive access method that reads arrays from files. This paper describes ArrayBridge, a bi-directional array view mechanism for scientific file formats, that aims to make declarative array manipulations interoperable with imperative file-centric analyses. Our prototype implementation of ArrayBridge uses HDF5 as the underlying array storage library and seamlessly integrates into the SciDB open-source array database system. In addition to fast querying over external array objects, ArrayBridge produces arrays in the HDF5 file format just as easily as it can read from it. ArrayBridge also supports time travel queries from imperative kernels through the unmodified HDF5 API, and automatically deduplicates between array versions for space efficiency. Our extensive performance evaluation in NERSC, a large-scale scientific computing facility, shows that ArrayBridge exhibits statistically indistinguishable performance and I/O scalability to the native SciDB storage engine.
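The file-centric HDF5 access that ArrayBridge interoperates with can be sketched with the h5py library, as below; the dataset name and shape are illustrative only, and this is not the ArrayBridge API itself.

    # File-centric HDF5 array access of the kind ArrayBridge bridges to,
    # sketched with h5py; dataset name and shape are illustrative (this
    # is not the ArrayBridge API).
    import numpy as np
    import h5py

    with h5py.File("arrays.h5", "w") as f:
        f.create_dataset("temperature", data=np.random.rand(1000, 1000))

    with h5py.File("arrays.h5", "r") as f:
        tile = f["temperature"][0:100, 0:100]   # read only a sub-array
    print(tile.mean())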
User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes
Klein, Fred W.
2002-01-01
Hypoinverse is a computer program that processes files of seismic station data for an earthquake (such as P-wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). The current report documents the expanded Y2000 version and supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of Hypoinverse includes all of the previous capabilities, but adds Y2000 formats to those defined earlier. In most cases, the new formats add 2 digits to the year field to accommodate the century. Other fields are sometimes rearranged or expanded to accommodate a better field order. The Y2000 formats are invoked with the “200” command. When the Y2000 flag is turned on, all files are read and written in the new format and there is no mixing of format types in a single run. Some formats without a date field, like station files, have not changed. A separate program called 2000CONV has been written to convert old formats to new. Other new features, like expanded station names, calculating amplitude magnitudes from a variety of digital seismometers, station history files, interactive earthquake processing, and locations from CUSP (Caltech USGS Seismic Processing) binary files have been added. General features. Hypoinverse will locate any number of events in an input file, which can be in one of several different formats. Any or all of printout, summary or archive output may be produced. Hypoinverse is driven by user commands. The various commands define input and output files, set adjustable parameters, and solve for locations of a file of earthquake data using the parameters and files currently set. It is both interactive and "batch" in that commands may be executed either from the keyboard or from a file. You execute the commands in a file by typing @filename at the Hypoinverse prompt.
Users may either supply parameters on the command line or omit them and be prompted for them interactively. The current parameter values are displayed and may be taken as defaults by pressing just the RETURN key after the prompt. This makes the program very easy to use, provided you can remember the names of the commands. Combining commands with and without their required parameters into a command file permits a variety of customized procedures, such as automatic input of crustal model and station data, but prompting for a different phase file each time. All commands are 3 letters long and most require one or more parameters or file names. If they appear on a line with a command, character strings such as filenames must be enclosed in apostrophes (single quotes). Appendix 1 gives this and other free-format rules for supplying parameters, which are parsed in Fortran. When several parameters are required following a command, any of them may be omitted by replacing them with null fields (see appendix 1). A null field leaves that parameter unchanged from its current or default value. When you start Hypoinverse, default values are in effect for all parameters except file names. Hypoinverse is a complicated program with many features and options. Many of these "advanced" or seldom-used features are documented here, but in more detail than a typical user needs when first starting with the program. I have put some of this material in smaller type so that a first-time user can concentrate on the more important information.
2011-05-01
iTunes illustrates the difference between the centralized approach of digital library systems and the distributed approach of container file formats ... metadata in a container file format. Apple's iTunes uses a centralized metadata approach and allows users to maintain song metadata in a single ... one iTunes library to another, the metadata must be copied separately or reentered in the new library. This demonstrates the utility of storing metadata ...
Enhanced Historical Land-Use and Land-Cover Data Sets of the U.S. Geological Survey
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.; Clawges, Rick M.
2007-01-01
Historical land-use and land-cover data, available from the U.S. Geological Survey (USGS) for the conterminous United States and Hawaii, have been enhanced for use in geographic information systems (GIS) applications. The original digital data sets were created by the USGS in the late 1970s and early 1980s and were later converted by USGS and the U.S. Environmental Protection Agency (USEPA) to a geographic information system (GIS) format in the early 1990s. These data have been available on USEPA's Web site since the early 1990s and have been used for many national applications, despite minor coding and topological errors. During the 1990s, a group of USGS researchers made modifications to the data set for use in the National Water-Quality Assessment Program. These edited files have been further modified to create a more accurate, topologically clean, and seamless national data set. Several different methods, including custom editing software and several batch processes, were applied to create this enhanced version of the national data set. The data sets are included in this report in the commonly used shapefile and Tagged Image File Format (TIFF) formats. In addition, this report includes two polygon data sets (in shapefile format) representing (1) land-use and land-cover source documentation extracted from the previously published USGS data files, and (2) the extent of each polygon data file.
Belgian guidelines for economic evaluations: second edition.
Thiry, Nancy; Neyt, Mattias; Van De Sande, Stefaan; Cleemput, Irina
2014-12-01
The aim of this study was to present the updated methodological guidelines for economic evaluations of healthcare interventions (drugs, medical devices, and other interventions) in Belgium. The update of the guidelines was performed by three Belgian health economists following feedback from users of the former guidelines and personal experience. The updated guidelines were discussed with a multidisciplinary team consisting of other health economists, assessors of reimbursement request files, representatives of Belgian databases and representatives of the drugs and medical devices industry. The final document was validated by three external validators who were not involved in the previous discussions. The guidelines give methodological guidance for the following components of an economic evaluation: literature review, perspective of the evaluation, definition of the target population, choice of the comparator, analytic technique and study design, calculation of costs, valuation of outcomes, definition of the time horizon, modeling, handling uncertainty and discounting. We present a reference case that can be considered as the minimal requirement for Belgian economic evaluations of health interventions. These guidelines will improve the methodological quality, transparency and uniformity of the economic evaluations performed in Belgium. The guidelines will also provide support to the researchers and assessors performing or evaluating economic evaluations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D.
2014-03-31
In November 2012, the Working Party on Evaluation Cooperation Subgroup 38 (WPEC-SG38) began the task of developing a nuclear data format and supporting infrastructure to replace the now nearly 50-year-old ENDF format. The first step in this process is to develop requirements for the new format and infrastructure. In this talk, I will review the status of ENDF's Thermal Scattering Law (TSL) formats as well as support for this data in the GND format (from which the new format is expected to evolve). Finally, I hope to begin a dialog with members of the thermal neutron scattering community so that their data needs can be accurately and easily accommodated by the new format and tools, as captured by the requirements document. During this discussion, we must keep in mind that the new tools and format must: support what is in existing data files; support new things we want to put in data files; and be flexible enough for us to adapt to future unanticipated challenges.
Ma, Lina; Sherrod, David R.; Scott, William E.
2014-01-01
This geodatabase contains information derived from legacy mapping that was published in 1995 as U.S. Geological Survey Open-File Report 95-219. The main component of this publication is a geologic map database prepared using geographic information system (GIS) applications. Included are pdf files to view or print the map sheet, the accompanying pamphlet from Open-File Report 95-219, and links to the original publication, which is available as scanned files in pdf format.
Brown, Janie; McCrorie, Pamela
2015-03-01
This research explored the impact of tablet technology, in the form of Apple iPads, on undergraduate nursing and midwifery students' learning outcomes. In simulated clinical learning environments, first-year nursing students (n = 30) accessed apps and reference materials on iPads. Third-year nursing students (n = 88) referred to clinical guidelines to aid their decision making when problem solving. First-year midwifery students (n = 25) filmed themselves undertaking a skill and then immediately played back the video file. A total of 45 students completed an online questionnaire that allowed for qualitative comments. Students reported finding the use of iPads easy and that iPads provided point-of-care access to resources, ensuring an evidence-based approach to clinical decision making. iPads reportedly improved student efficiency and time management, while improving their ability to provide patient education. Students who used iPads for the purpose of formative self-assessment appreciated the immediate feedback and opportunity to develop clinical skills.
cljam: a library for handling DNA sequence alignment/map (SAM) with parallel processing.
Takeuchi, Toshiki; Yamada, Atsuo; Aoki, Takashi; Nishimura, Kunihiro
2016-01-01
Next-generation sequencing can determine DNA bases, and the results of sequence alignments are generally stored in files in the Sequence Alignment/Map (SAM) format and its compressed binary version (BAM). SAMtools is a typical tool for dealing with files in the SAM/BAM format. SAMtools has various functions, including detection of variants, visualization of alignments, indexing, extraction of parts of the data and loci, and conversion of file formats. It is written in C and executes quickly. However, using SAMtools in parallel requires an additional implementation with, for example, OpenMP (Open Multi-Processing) libraries. Given the accumulation of next-generation sequencing data, a simple parallelization program that can support cloud and PC cluster environments is required. We have developed cljam using the Clojure programming language, which simplifies parallel programming, to handle SAM/BAM data. Cljam can run in a Java runtime environment (e.g., Windows, Linux, Mac OS X) with Clojure. Cljam can process and analyze SAM/BAM files in parallel and at high speed. The execution time with cljam is almost the same as with SAMtools. The cljam code is written in Clojure and has fewer lines than other similar tools.
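For readers unfamiliar with the format cljam handles: SAM is a tab-delimited text format with eleven mandatory fields per alignment line. The sketch below parses one line in plain Python, independently of cljam or SAMtools; the example line is made up.

    # Minimal parse of one SAM alignment line (tab-delimited text);
    # the example line is invented for illustration.
    line = "read1\t0\tchr1\t100\t60\t8M\t*\t0\t0\tACGTACGT\tFFFFFFFF"
    fields = line.rstrip("\n").split("\t")
    qname, flag, rname, pos, mapq, cigar = fields[0], int(fields[1]), \
        fields[2], int(fields[3]), int(fields[4]), fields[5]
    rnext, pnext, tlen, seq, qual = fields[6], int(fields[7]), \
        int(fields[8]), fields[9], fields[10]
    print(qname, rname, pos, cigar)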
WhopGenome: high-speed access to whole-genome variation and sequence data in R.
Wittelsbürger, Ulrich; Pfeifer, Bastian; Lercher, Martin J
2015-02-01
The statistical programming language R has become a de facto standard for the analysis of many types of biological data, and is well suited for the rapid development of new algorithms. However, variant call data from population-scale resequencing projects are typically too large to be read and processed efficiently with R's built-in I/O capabilities. WhopGenome can efficiently read whole-genome variation data stored in the widely used variant call format (VCF) file format into several R data types. VCF files can be accessed either on local hard drives or on remote servers. WhopGenome can associate variants with annotations such as those available from the UCSC genome browser, and can accelerate the reading process by filtering loci according to user-defined criteria. WhopGenome can also read other Tabix-indexed files and create indices to allow fast selective access to FASTA-formatted sequence files. The WhopGenome R package is available on CRAN at http://cran.r-project.org/web/packages/WhopGenome/. A Bioconductor package has been submitted. Contact: lercher@cs.uni-duesseldorf.de.
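WhopGenome itself is an R package, but the layout of the VCF files it reads is easy to show. The sketch below iterates over the eight fixed VCF columns in plain Python; the file name is a placeholder.

    # Minimal reader for the fixed VCF columns (Python); this only
    # illustrates the file layout WhopGenome parses, not its R API.
    def read_vcf(path):
        with open(path) as f:
            for line in f:
                if line.startswith("#"):        # meta-information and header
                    continue
                chrom, pos, vid, ref, alt, qual, flt, info = \
                    line.rstrip("\n").split("\t")[:8]
                yield chrom, int(pos), ref, alt

    for chrom, pos, ref, alt in read_vcf("variants.vcf"):
        print(chrom, pos, ref, alt)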
Barnes, David G.; Vidiassov, Michail; Ruthensteiner, Bernhard; Fluke, Christopher J.; Quayle, Michelle R.; McHenry, Colin R.
2013-01-01
With the latest release of the S2PLOT graphics library, embedding interactive, 3-dimensional (3-d) scientific figures in Adobe Portable Document Format (PDF) files is simple, and can be accomplished without commercial software. In this paper, we motivate the need for embedding 3-d figures in scholarly articles. We explain how 3-d figures can be created using the S2PLOT graphics library, exported to Product Representation Compact (PRC) format, and included as fully interactive, 3-d figures in PDF files using the movie15 LaTeX package. We present new examples of 3-d PDF figures, explain how they have been made, validate them, and comment on their advantages over traditional, static 2-dimensional (2-d) figures. With the judicious use of 3-d rather than 2-d figures, scientists can now publish, share and archive more useful, flexible and faithful representations of their study outcomes. The article you are reading does not have embedded 3-d figures. The full paper, with embedded 3-d figures, is recommended and is available as a supplementary download from PLoS ONE (File S2). PMID:24086243
Cannon, William F.; Woodruff, Laurel G.
2003-01-01
This data set consists of nine files of geochemical information on various types of surficial deposits in northwestern Wisconsin and immediately adjacent parts of Michigan and Minnesota. The files are presented in two formats: dBASE IV and Microsoft Excel. The data present multi-element chemical analyses of soils, stream sediments, and lake sediments. Latitude and longitude values are provided in each file so that the dbf files can be readily imported into GIS applications. Metadata files are provided in outline form, question-and-answer form, and text form. The metadata include information on procedures for sample collection, sample preparation, and chemical analyses, including sensitivity and precision.
ProMC: Input-output data format for HEP applications using varint encoding
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.
2014-10-01
A new data format for Monte Carlo (MC) events, or any structured data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the ProMC library, which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description, and random access. Data stored in ProMC files can be written, read and manipulated in a number of programming languages, such as C++, Java, Fortran and Python.
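The core of the compaction is the "varint" encoding used by Protocol Buffers (and hence ProMC): each byte carries 7 payload bits, with the high bit set on every byte except the last, so small values occupy fewer bytes. A self-contained Python sketch:

    # Varint encoding as used by Protocol Buffers: 7 payload bits per
    # byte, most-significant bit set on all bytes except the last.
    def encode_varint(n: int) -> bytes:
        out = bytearray()
        while True:
            byte = n & 0x7F
            n >>= 7
            if n:
                out.append(byte | 0x80)   # continuation bit
            else:
                out.append(byte)
                return bytes(out)

    def decode_varint(data: bytes) -> int:
        result, shift = 0, 0
        for b in data:
            result |= (b & 0x7F) << shift
            if not b & 0x80:
                break
            shift += 7
        return result

    assert decode_varint(encode_varint(300)) == 300   # 300 encodes to 2 bytes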
ERIC Educational Resources Information Center
Falk, Howard
1998-01-01
Discussion of CD (compact disc) recorders describes recording applications, including storing large graphic files, creating audio CDs, and storing material downloaded from the Internet; backing up files; lifespan; CD recording formats; continuous recording; recording software; recorder media; vulnerability of CDs; basic computer requirements; and…
14 CFR 221.31 - Rules and regulations governing passenger fares and services.
Code of Federal Regulations, 2010 CFR
2010-01-01
... TRANSPORTATION (AVIATION PROCEEDINGS) ECONOMIC REGULATIONS TARIFFS Manner of Filing Tariffs § 221.31 Rules and... (b) of this section may be filed in a paper format, subject to the requirements of this part and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...
Prostate and Urologic Cancer | Division of Cancer Prevention
[[{"fid":"183","view_mode":"default","fields":{"format":"default","field_file_image_alt_text[und][0][value]":"Prostate and Urologic Cancer Research Group Homepage Logo","field_file_image_title_text[und][0][value]":"Prostate and Urologic Cancer Research Group Homepage
Publications - RDF 2007-1 | Alaska Division of Geological & Geophysical Surveys
https://doi.org/10.14509/15759. Report: rdf2007_001.pdf (443.0 K). Digital geospatial data: Fairbanks Mining District geochemical data.
Publications - RDF 2011-4 v. 2 | Alaska Division of Geological & Geophysical Surveys
https://doi.org/10.14509/23002. Report: rdf2011_004.pdf (519.0 K). Digital geospatial data: Moran geochemistry data.
Publications - RI 2001-1D | Alaska Division of Geological & Geophysical Surveys
Engineering-geologic map of the Chulitna region, southcentral Alaska, scale 1:63,360 (16.0 M). Digital geospatial data: Chulitna region engineering geology.
Griss, Johannes; Jones, Andrew R; Sachsenberg, Timo; Walzer, Mathias; Gatto, Laurent; Hartler, Jürgen; Thallinger, Gerhard G; Salek, Reza M; Steinbeck, Christoph; Neuhauser, Nadin; Cox, Jürgen; Neumann, Steffen; Fan, Jun; Reisinger, Florian; Xu, Qing-Wei; Del Toro, Noemi; Pérez-Riverol, Yasset; Ghali, Fawaz; Bandeira, Nuno; Xenarios, Ioannis; Kohlbacher, Oliver; Vizcaíno, Juan Antonio; Hermjakob, Henning
2014-10-01
The HUPO Proteomics Standards Initiative has developed several standardized data formats to facilitate data sharing in mass spectrometry (MS)-based proteomics. These allow researchers to report their complete results in a unified way. However, at present, there is no format to describe the final qualitative and quantitative results for proteomics and metabolomics experiments in a simple tabular format. Many downstream analysis use cases are only concerned with the final results of an experiment and require an easily accessible format, compatible with tools such as Microsoft Excel or R. We developed the mzTab file format for MS-based proteomics and metabolomics results to meet this need. mzTab is intended as a lightweight supplement to the existing standard XML-based file formats (mzML, mzIdentML, mzQuantML), providing a comprehensive summary, similar in concept to the supplemental material of a scientific publication. mzTab files can contain protein, peptide, and small molecule identifications together with experimental metadata and basic quantitative information. The format is not intended to store the complete experimental evidence but provides mechanisms to report results at different levels of detail. These range from a simple summary of the final results to a representation of the results including the experimental design. This format is ideally suited to make MS-based proteomics and metabolomics results available to a wider biological community outside the field of MS. Several software tools for proteomics and metabolomics have already adapted the format as an output format. The comprehensive mzTab specification document and extensive additional documentation can be found online. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
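Because mzTab is a line-prefixed, tab-separated text format, it can be split into sections with a few lines of code. The sketch below groups lines by prefix in plain Python; the MTD (metadata) and PRT (protein row) prefixes follow the mzTab specification as I understand it, so check them against the specification document before relying on them.

    # Sketch of splitting an mzTab file into sections by line prefix.
    # The MTD/PRT prefixes are my reading of the mzTab specification.
    from collections import defaultdict

    sections = defaultdict(list)
    with open("results.mztab") as f:
        for line in f:
            if not line.strip():
                continue
            prefix, _, rest = line.rstrip("\n").partition("\t")
            sections[prefix].append(rest.split("\t"))

    print(len(sections["MTD"]), "metadata lines")
    print(len(sections["PRT"]), "protein rows")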
FEQinput—An editor for the full equations (FEQ) hydraulic modeling system
Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.
2017-10-30
Introduction: The Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow. Input files for FEQ are composed of text files that contain large amounts of parameters, data, and instructions that are written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in order to understand the specific format and language of the files. FEQinput provides a set of tools to help a new user overcome the steep learning curve associated with creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).
NASA Technical Reports Server (NTRS)
Ryan, J. W.; Ma, C.; Schupler, B. R.
1980-01-01
A data base handler which would act to tie Mark 3 system programs together is discussed. The data base handler is written in FORTRAN and is implemented on the Hewlett-Packard 21MX and the IBM 360/91. The system design objectives were to (1) provide for an easily specified method of data interchange among programs, (2) provide for a high level of data integrity, (3) accommodate changing requirements, (4) promote program accountability, (5) provide a single source of program constants, and (6) provide a central point for data archiving. The system consists of two distinct parts: a set of files existing on disk packs and tapes; and a set of utility subroutines which allow users to access the information in these files. Users never directly read or write the files and need not know the details of how the data are formatted in the files. To the users, the storage medium is format free. A user does need to know something about the sequencing of his data in the files but nothing about data in which he has no interest.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
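Since SDA is an HDF5 layout, a labeled record can be sketched with the h5py library, as below; the group and attribute names here are illustrative only and do not reproduce the normative SDA 1.0 layout, which is defined in the report itself.

    # Illustrative labeled record in an HDF5 file, in the spirit of SDA;
    # the group/attribute names are invented, not the normative layout.
    import numpy as np
    import h5py

    with h5py.File("archive.sda", "w") as f:
        rec = f.create_group("velocity trace")      # record label
        rec.attrs["RecordType"] = "primitive"
        rec.create_dataset("data", data=np.linspace(0.0, 1.0, 50))

    with h5py.File("archive.sda", "r") as f:
        print(list(f.keys()), f["velocity trace"].attrs["RecordType"])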
Navigation Guidelines for Orbital Formation Flying Missions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2003-01-01
Some simple guidelines based on the accuracy in determining a satellite formation's semi-major axis differences are useful in making preliminary assessments of the navigation accuracy needed to support such missions. These guidelines are valid for any elliptical orbit, regardless of eccentricity. Although maneuvers required for formation establishment, reconfiguration, and station-keeping require accurate prediction of the state estimate to the maneuver time, and hence are directly affected by errors in all the orbital elements, experience has shown that determination of orbit plane orientation and orbit shape to acceptable levels is less challenging than the determination of orbital period or semi-major axis. Furthermore, any differences among the members' semi-major axes are undesirable for a satellite formation, since they will lead to differential along-track drift due to period differences. Since inevitable navigation errors prevent these differences from ever being zero, one may use the guidelines this paper presents to determine how much drift will result from a given relative navigation accuracy, or vice versa. Since the guidelines do not account for non-two-body perturbations, they may be viewed as useful preliminary design tools, rather than as the basis for mission navigation requirements, which should be based on detailed analysis of the mission configuration, including all relevant sources of uncertainty.
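The scale of the drift described above can be checked with a standard two-body relation: a semi-major axis difference of delta_a produces along-track drift of roughly 3*pi*delta_a per orbit. The worked example below applies that textbook relation; it is not a reproduction of the paper's own guideline formulas.

    # Worked example of the standard two-body drift relation:
    # along-track drift per orbit is approximately 3*pi*delta_a.
    # (Not the paper's exact guideline expressions.)
    import math

    delta_a = 10.0                          # semi-major axis error, meters
    drift_per_orbit = 3.0 * math.pi * delta_a
    print(f"{drift_per_orbit:.1f} m of along-track drift per orbit")  # ~94.2 m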
Do you also have problems with the file format syndrome?
De Cuyper, B; Nyssen, E; Christophe, Y; Cornelis, J
1991-11-01
In a biomedical data processing environment, an essential requirement is the ability to integrate a large class of standard modules for the acquisition, processing and display of the (image) data. Our approach to the management and manipulation of the different data formats is based on the specification of a common standard for the representation of data formats, called 'data nature descriptions' to emphasise that this representation specifies not only the structure but also the contents of data objects (files). The idea behind this concept is to associate each hardware and software component that produces or uses medical data with a description of the data objects manipulated by that component. In our approach, a special software module (a format convertor generator) takes care of the appropriate data format conversions required when two or more components of the system exchange data.
Use of Schema on Read in Earth Science Data Archives
NASA Astrophysics Data System (ADS)
Petrenko, M.; Hegde, M.; Smit, C.; Pilone, P.; Pham, L.
2017-12-01
Traditionally, NASA Earth Science data archives have file-based storage using proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), we have evaluated applying the "schema-on-read" principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the "schema-on-read" approach allows customization of indexing—spatial or temporal—to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for "schema-on-read" used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.
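The Parquet-based, schema-on-read access pattern described above can be sketched with the pyarrow library, as below; the file name and column names are illustrative, not the GES DISC production layout.

    # Schema-on-read style access to gridded samples stored as Apache
    # Parquet, sketched with pyarrow; names are illustrative only.
    import pyarrow as pa
    import pyarrow.parquet as pq

    table = pa.table({
        "lat":   [10.5, 10.5, 40.0],
        "lon":   [-75.0, -74.0, -75.0],
        "value": [291.3, 292.1, 278.9],
    })
    pq.write_table(table, "grid.parquet")

    # Read back only the columns needed for a spatial or time-series query.
    subset = pq.read_table("grid.parquet", columns=["lat", "value"])
    print(subset.to_pydict())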
17 CFR 240.13d-2 - Filing of amendments to Schedules 13D or 13G.
Code of Federal Regulations, 2013 CFR
2013-04-01
...) The first electronic amendment to a paper format Schedule 13D (§ 240.13d-101 of this chapter) or... 17 Commodity and Securities Exchanges 3 2013-04-01 2013-04-01 false Filing of amendments to... Under the Securities Exchange Act of 1934 Regulation 13d-G § 240.13d-2 Filing of amendments to Schedules...
17 CFR 240.13d-2 - Filing of amendments to Schedules 13D or 13G.
Code of Federal Regulations, 2014 CFR
2014-04-01
...) The first electronic amendment to a paper format Schedule 13D (§ 240.13d-101 of this chapter) or... 17 Commodity and Securities Exchanges 4 2014-04-01 2014-04-01 false Filing of amendments to... Under the Securities Exchange Act of 1934 Regulation 13d-G § 240.13d-2 Filing of amendments to Schedules...
DOT National Transportation Integrated Search
2017-07-26
This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...
Family Day Homes: Get Organized with Information Systems.
ERIC Educational Resources Information Center
Dague, Mindy
1999-01-01
Notes that record keeping and management are critical aspects of home day care centers. Highlights options for tools, including calendars, loose-leaf notebooks, ledgers, computer spreadsheet software, and file boxes. Provides guidelines for organizing information as well as particular information necessary regarding provider, parent, and child.…
Smieszek, Tomas W.; Granato, Gregory E.
2000-01-01
Spatial data are important for interpretation of water-quality information on a regional or national scale. Geographic information systems (GIS) facilitate interpretation and integration of spatial data. The geographic information and data compiled for the conterminous United States during the National Highway Runoff Water-Quality Data and Methodology Synthesis project is described in this document, which also includes information on the structure, file types, and the geographic information in the data files. This 'geodata' directory contains two subdirectories, labeled 'gisdata' and 'gisimage.' The 'gisdata' directory contains ArcInfo coverages, ArcInfo export files, shapefiles (used in ArcView), Spatial Data Transfer Standard Topological Vector Profile format files, and meta files in subdirectories organized by file type. The 'gisimage' directory contains the GIS data in common image-file formats. The spatial geodata includes two rain-zone region maps and a map of national ecosystems originally published by the U.S. Environmental Protection Agency; regional estimates of mean annual streamflow, and water hardness published by the Federal Highway Administration; and mean monthly temperature, mean annual precipitation, and mean monthly snowfall modified from data published by the National Climatic Data Center and made available to the public by the Oregon Climate Service at Oregon State University. These GIS files were compiled for qualitative spatial analysis of available data on a national and(or) regional scale and therefore should be considered as qualitative representations, not precise geographic location information.
Electronic hand-drafting and picture management system.
Yang, Tsung-Han; Ku, Cheng-Yuan; Yen, David C; Hsieh, Wen-Huai
2012-08-01
The Department of Health of Executive Yuan in Taiwan (R.O.C.) is implementing a five-stage project entitled Electronic Medical Record (EMR) converting all health records from written to electronic form. Traditionally, physicians record patients' symptoms, related examinations, and suggested treatments on paper medical records. Currently when implementing the EMR, all text files and image files in the Hospital Information System (HIS) and Picture Archiving and Communication Systems (PACS) are kept separate. The current medical system environment is unable to combine text files, hand-drafted files, and photographs in the same system, so it is difficult to support physicians with the recording of medical data. Furthermore, in surgical and other related departments, physicians need immediate access to medical records in order to understand the details of a patient's condition. In order to address these problems, the Department of Health has implemented an EMR project, with the primary goal of building an electronic hand-drafting and picture management system (HDP system) that can be used by medical personnel to record medical information in a convenient way. This system can simultaneously edit text files, hand-drafted files, and image files and then integrate these data into Portable Document Format (PDF) files. In addition, the output is designed to fit a variety of formats in order to meet various laws and regulations. By combining the HDP system with HIS and PACS, the applicability can be enhanced to fit various scenarios and can assist the medical industry in moving into the final phase of EMR.
Organic geochemistry data of Alaska
compiled by Threlkeld, Charles N.; Obuch, Raymond C.; Gunther, G.L.
2000-01-01
In order to archive the results of various petroleum geochemical analyses of the Alaska resource assessment, the USGS developed an Alaskan Organic Geochemical Data Base (AOGDB) in 1978 to house the data generated from USGS and subcontracted laboratories. Prior to the AOGDB, the accumulated data resided in a flat data file entitled 'PGS' that was maintained by Petroleum Information Corporation with technical input from the USGS. The information herein is a breakout of the master flat file format into a relational data base table format (akdata).
As-built design specification for the CLASFYG program
NASA Technical Reports Server (NTRS)
Horton, C. L. (Principal Investigator)
1981-01-01
This program produces a file with a Universal-formatted header and data records in a nonstandard format. Trajectory coefficients are calculated from 5 to 8 acquisitions of radiance values in the training field corresponding to an agricultural product. These coefficients are then used to calculate a time of emergence and corresponding trajectory coefficients for each pixel in the test field. The time of emergence, two of the coefficients, and the sigma value for each pixel are written to the file.
Atmospheric Science Data Center
2018-04-12
SSE Global Data: text files of monthly averaged data for the entire globe. Version: V6. Location: Global. Spatial coverage: (90N, 90S)(180W, 180E). File format: ASCII.
Easy Online Access to Helpful Internet Guides.
ERIC Educational Resources Information Center
Tuss, Joan
1993-01-01
Lists recommended guides to the Internet that are available electronically. Basic commands needed to use anonymous ftp (file transfer protocol) are explained. An annotation and command formats to access, scan, retrieve, and exit each file are included for 11 titles. (EAM)
Publications - RI 94-25 | Alaska Division of Geological & Geophysical Surveys
Derivative-materials map of the Anchorage C-7 NW Quadrangle, Alaska, scale 1:25,000 (1.4 M). Digital geospatial data: Anchorage C-7 NW derivative materials.
Publications - RI 94-26 | Alaska Division of Geological & Geophysical Surveys
Derivative-materials map of the Anchorage C-8 NE Quadrangle, Alaska, scale 1:25,000 (3.8 M). Digital geospatial data: Anchorage C-8 NE derivative materials.
Publications - RI 94-27 | Alaska Division of Geological & Geophysical Surveys
Derivative-materials map of the Anchorage C-8 NW Quadrangle, Alaska, scale 1:25,000 (676.0 M). Digital geospatial data: Anchorage C-8 NW derivative materials.
Publications - RI 94-24 | Alaska Division of Geological & Geophysical Surveys
Derivative-materials map of the Anchorage C-7 NE Quadrangle, Alaska, scale 1:25,000 (2.4 M). Digital geospatial data: Anchorage C-7 NE derivative materials.
Astronomical Instrumentation System Markup Language
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse M.
2016-05-01
The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML) based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next, it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments as well as one for a sample AIS are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.
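Because AISML is XML, files can be processed with any standard XML parser. The sketch below uses Python's ElementTree on a hypothetical fragment; the element and attribute names are invented for illustration and are not taken from the AISML specification.

    # Parsing a hypothetical AISML-like fragment with ElementTree; the
    # element/attribute names are invented, not from the AISML spec.
    import xml.etree.ElementTree as ET

    doc = """<instrument name="ExampleCam">
      <component type="detector" model="CCD-1234"/>
      <component type="filterwheel" positions="5"/>
    </instrument>"""

    root = ET.fromstring(doc)
    print(root.get("name"))
    for comp in root.findall("component"):
        print(comp.get("type"), comp.attrib)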
An extended BET format for LaRC shuttle experiments: Definition and development
NASA Technical Reports Server (NTRS)
Findlay, J. T.; Kelly, G. M.; Henry, M. W.
1981-01-01
A program for shuttle post-flight data reduction is discussed. An extended Best Estimate Trajectory (BET) file was developed. The extended format results in some subtle changes to the header record. The major change is the addition of twenty-six words to each data record. These words include atmospheric related parameters, body axis rate and acceleration data, computed aerodynamic coefficients, and angular accelerations. These parameters were added to facilitate post-flight aerodynamic coefficient determinations as well as shuttle entry air data sensor analyses. Software (NEWBET) was developed to generate the extended BET file utilizing the previously defined ENTREE BET, a dynamic data file which may be either derived inertial measurement unit data or aerodynamic coefficient instrument package data, and some atmospheric information.
Rea, A.H.; Becker, C.J.
1997-01-01
This compact disc contains 25 digital map data sets covering the State of Oklahoma that may be of interest to the general public, private industry, schools, and government agencies. Fourteen data sets are statewide. These data sets include: administrative boundaries; 104th U.S. Congressional district boundaries; county boundaries; latitudinal lines; longitudinal lines; geographic names; indexes of U.S. Geological Survey 1:100,000, and 1:250,000-scale topographic quadrangles; a shaded-relief image; Oklahoma State House of Representatives district boundaries; Oklahoma State Senate district boundaries; locations of U.S. Geological Survey stream gages; watershed boundaries and hydrologic cataloging unit numbers; and locations of weather stations. Eleven data sets are divided by county and are located in 77 county subdirectories. These data sets include: census block group boundaries with selected demographic data; city and major highways text; geographic names; land surface elevation contours; elevation points; an index of U.S. Geological Survey 1:24,000-scale topographic quadrangles; roads, streets and address ranges; highway text; school district boundaries; streams, river and lakes; and the public land survey system. All data sets are provided in a readily accessible format. Most data sets are provided in Digital Line Graph (DLG) format. The attributes for many of the DLG files are stored in related dBASE(R)-format files and may be joined to the data set polygon attribute or arc attribute tables using dBASE(R)-compatible software. (Any use of trade names in this publication is for descriptive purposes only and does not imply endorsement by the U.S. Government.) Point attribute tables are provided in dBASE(R) format only, and include the X and Y map coordinates of each point. Annotation (text plotted in map coordinates) are provided in AutoCAD Drawing Exchange format (DXF) files. The shaded-relief image is provided in TIFF format. All data sets except the shaded-relief image also are provided in ARC/INFO export-file format.
Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool
Conzelmann, Craig; Romañach, Stephanie S.
2010-01-01
Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
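The filter-and-export workflow the tool provides can be sketched with the netCDF4 and csv modules, as below; the file, variable, and dimension names are illustrative, not those of an actual Everglades dataset.

    # Sketch of the subset-and-export-to-CSV workflow for a grid-based
    # NetCDF file; file/variable names are illustrative only.
    import csv
    from netCDF4 import Dataset

    with Dataset("stage.nc") as nc:
        depth = nc.variables["water_depth"][0, :10, :10]   # first time step,
                                                           # 10x10 spatial tile
    with open("subset.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["row", "col", "water_depth"])
        for i in range(depth.shape[0]):
            for j in range(depth.shape[1]):
                writer.writerow([i, j, float(depth[i, j])])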
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodall, John; Iannacone, Mike; Athalye, Anish
2013-08-01
Morph is a framework and domain-specific language (DSL) that helps parse and transform structured documents. It currently supports several file formats including XML, JSON, and CSV, and custom formats are usable as well.
ISMRM Raw Data Format: A Proposed Standard for MRI Raw Datasets
Inati, Souheil J.; Naegele, Joseph D.; Zwart, Nicholas R.; Roopchansingh, Vinai; Lizak, Martin J.; Hansen, David C.; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E.; Sørensen, Thomas S.; Hansen, Michael S.
2015-01-01
Purpose: This work proposes the ISMRM Raw Data (ISMRMRD) format as a common MR raw data format, which promotes algorithm and data sharing. Methods: A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using MRI scanners from four vendors, converted to ISMRMRD format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Results: Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. Conclusion: The proposed raw data format solves a practical problem for the MRI community. It may serve as a foundation for reproducible research and collaborations. The ISMRMRD format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. PMID:26822475
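ISMRMRD files use HDF5 as their container, so their structure can be inspected with generic tools. The sketch below peeks inside one with h5py; the internal paths used ("dataset/xml" for the flexible header, "dataset/data" for the tagged k-space frames) are my recollection of the layout and should be checked against the ISMRMRD specification.

    # A peek inside an ISMRMRD file with h5py; the internal paths are
    # assumptions to verify against the ISMRMRD specification.
    import h5py

    with h5py.File("scan.h5", "r") as f:
        header_xml = f["dataset/xml"][0]          # flexible XML header
        acquisitions = f["dataset/data"]          # tagged frames of k-space
        print(len(acquisitions), "acquisitions")
        print(header_xml[:80], "...")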
MINC 2.0: A Flexible Format for Multi-Modal Images.
Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C
2016-01-01
It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000s the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.
Newe, Axel; Ganslandt, Thomas
2013-01-01
The usefulness of the 3D Portable Document Format (PDF) for clinical, educational, and research purposes has recently been shown. However, the lack of a simple tool for converting biomedical data into the model data in the necessary Universal 3D (U3D) file format is a drawback for the broad acceptance of this new technology. A new module for the image processing and rapid prototyping framework MeVisLab not only provides a platform-independent way to create surface meshes out of biomedical/DICOM and other data and to export them into U3D--it also lets the user add metadata to these meshes to predefine colors and names that can be processed by a PDF authoring software while generating 3D PDF files. Furthermore, the source code of the respective module is available and well documented, so that it can easily be modified for one's own purposes.
Caryoscope: An Open Source Java application for viewing microarray data in a genomic context
Awad, Ihab AB; Rees, Christian A; Hernandez-Boussard, Tina; Ball, Catherine A; Sherlock, Gavin
2004-01-01
Background: Microarray-based comparative genome hybridization experiments generate data that can be mapped onto the genome. These data are interpreted more easily when represented graphically in a genomic context. Results: We have developed Caryoscope, which is an open source Java application for visualizing microarray data from array comparative genome hybridization experiments in a genomic context. Caryoscope can read General Feature Format (GFF) files, as well as comma- and tab-delimited files, that define the genomic positions of the microarray reporters for which data are obtained. The microarray data can be browsed using an interactive, zoomable interface, which helps users identify regions of chromosomal deletion or amplification. The graphical representation of the data can be exported in a number of graphic formats, including publication-quality formats such as PostScript. Conclusion: Caryoscope is a useful tool that can aid in the visualization, exploration and interpretation of microarray data in a genomic context. PMID:15488149
Kamauu, Aaron W C; DuVall, Scott L; Robison, Reid J; Liimatta, Andrew P; Wiggins, Richard H; Avrin, David E
2006-01-01
Although digital teaching files are important to radiology education, there is currently no satisfactory solution for exporting Digital Imaging and Communications in Medicine (DICOM) images from picture archiving and communication systems (PACS) in a desktop publishing format. A vendor-neutral digital teaching file, the Radiology Interesting Case Server (RadICS), offers an efficient tool for harvesting interesting cases from PACS without requiring modification of the PACS configuration. Radiologists push imaging studies from PACS to RadICS via the standard DICOM Send process, and the RadICS server automatically converts the DICOM images into the Joint Photographic Experts Group (JPEG) format, a common desktop publishing format. Radiologists can then select key images and create an interesting case series at the PACS workstation. RadICS was tested successfully against multiple unmodified commercial PACS. Using RadICS, radiologists are able to harvest and author interesting cases at the point of clinical interpretation with minimal disruption of clinical workflow. RSNA, 2006
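The central conversion step, from DICOM pixel data to a desktop-publishing JPEG, can be sketched with pydicom and Pillow. This is a generic illustration rather than the RadICS implementation; it uses a simple min-max rescale in place of proper DICOM window/level handling.

```python
# Minimal sketch: convert a DICOM image to JPEG (generic
# illustration, not the RadICS code). 'case001.dcm' is an
# illustrative file name.
import numpy as np
import pydicom
from PIL import Image

ds = pydicom.dcmread('case001.dcm')
pixels = ds.pixel_array.astype(np.float32)

# Simple min-max rescale to 8-bit grayscale for desktop publishing
# (real converters would apply the DICOM window/level instead).
lo, hi = pixels.min(), pixels.max()
pixels8 = ((pixels - lo) / max(hi - lo, 1e-6) * 255.0).astype(np.uint8)

Image.fromarray(pixels8).save('case001.jpg', quality=90)
```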
ROSAT implementation of a proposed multi-mission x ray data format
NASA Technical Reports Server (NTRS)
Corcoran, M.; Pence, W.; White, R.; Conroy, M.
1992-01-01
Until recently, little effort had been made to ensure that data from X-ray telescopes are delivered in a format that reflects the common characteristics most X-ray datasets share. Instrument-specific data-product design hampers the comparison of X-ray measurements made by different detectors and should be avoided whenever possible. The ROSAT project and the High Energy Astrophysics Science Archive Research Center (HEASARC) have defined a set of X-ray data products ('rationalized files') for ROSAT data that can also be used for distribution and archiving of data from other X-ray missions. These 'rationalized files' are defined so as to isolate instrument-independent from instrument-specific quantities, using standard FITS constructs to ensure portability. We discuss the use of the 'rationalized files' by ROSAT for data distribution and archiving, with particular emphasis on the separation of instrument-independent and instrument-specific quantities, and discuss the application of this format to data from other X-ray missions.
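One way to make the instrument-independent/instrument-specific split concrete is to partition FITS header keywords. The sketch below uses astropy; the keyword grouping and file name are illustrative assumptions, not the actual rationalized-file specification.

```python
# Minimal sketch: inspect a FITS file and split header keywords
# into mission-independent vs. instrument-specific/other sets.
# The GENERIC list and 'rosat_events.fits' are illustrative; the
# real 'rationalized files' spec defines the actual groupings.
from astropy.io import fits

GENERIC = {'TELESCOP', 'INSTRUME', 'OBJECT', 'RA', 'DEC',
           'DATE-OBS', 'EXPOSURE'}

with fits.open('rosat_events.fits') as hdul:
    for card in hdul[0].header.cards:
        kind = 'generic' if card.keyword in GENERIC else 'instrument/other'
        print(f'{kind:17s} {card.keyword:8s} = {card.value!r}')
```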
ORPC RivGen controller performance raw data - Igiugig 2015
McEntee, Jarlath
2015-12-18
Contains raw data from operation of the Ocean Renewable Power Company (ORPC) RivGen Power System at Igiugig in 2015, in MATLAB data file format. Two data files capture the measurements and their timestamps, including power in, voltage, rotation rate, and velocity.
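Files in MATLAB data format can be read in Python with scipy.io.loadmat; the file and variable names below are illustrative guesses, since the record does not list them.

```python
# Minimal sketch: load a MATLAB data file of the kind described.
# 'rivgen_igiugig_2015.mat' and the variable names mentioned in
# the comments are illustrative guesses; inspect the keys to find
# the actual names.
from scipy.io import loadmat

data = loadmat('rivgen_igiugig_2015.mat')
print([k for k in data if not k.startswith('__')])  # actual variable names

# Once the real key names are known, e.g.:
# t = data['timestamp'].ravel()
# rpm = data['rotation_rate'].ravel()
```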
Putting "Reference" in the Publications Reference File.
ERIC Educational Resources Information Center
Zink, Steven D.
1980-01-01
Argues for more widespread utilization of the U.S. Government Printing Office's Publications Reference File, a reference tool in microfiche format used to answer questions about current U.S. government documents and their availability. Ways to accomplish this task are suggested. (Author/JD)
MedlinePlus produces XML data sets that you are welcome to download and use. If you have questions about the MedlinePlus XML files, please contact us. For additional sources of MedlinePlus data in XML format, visit our Web service page.
Merged analog and photon counting profiles used as input for other RLPROF VAPs
Newsom, Rob
2014-10-03
The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that closely resembles the original raw data files produced by the Raman lidar.
Merged analog and photon counting profiles used as input for other RLPROF VAPs
Newsom, Rob
1998-03-01
The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that closely resembles the original raw data files produced by the Raman lidar.
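These records do not describe the merge rule itself. As a schematic illustration only, one common "gluing" approach calibrates the analog signal against photon counts where counting is unsaturated and substitutes scaled analog data in the saturated range:

```python
# Schematic analog/photon-counting "glue" -- an illustrative
# assumption, NOT the actual rlprof_merge algorithm, which the
# records above do not specify.
import numpy as np

def merge_channels(analog, photon, pc_max=30.0):
    """analog, photon: 1-D profiles for one channel; pc_max: count
    rate above which photon counting is treated as saturated (the
    threshold value here is purely illustrative)."""
    usable = photon < pc_max
    # Linear regression photon ~ a * analog + b over the usable range.
    a, b = np.polyfit(analog[usable], photon[usable], 1)
    # Keep photon counts where valid; substitute scaled analog elsewhere.
    return np.where(usable, photon, a * analog + b)

# merged = merge_channels(analog_profile, photon_profile)
```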
MISR Level 3 Radiance Versioning
Atmospheric Science Data Center
2016-11-04
... ESDT Product File Name Prefix Current Quality Designations MIL3DRD, MIL3MRD, MIL3QRD, and MIL3YRD ... Data Product Specification Rev K (PDF). Update to work with new format of the input PGE 1 files. F02_0007 ...
The importance of Good Clinical Practice guidelines and its role in clinical trials
Vijayananthan, A; Nawawi, O
2008-01-01
Good Clinical Practice (GCP) is an international ethical and scientific quality standard for the design, conduct, performance, monitoring, auditing, recording, analyses and reporting of clinical trials. It also serves to protect the rights, integrity and confidentiality of trial subjects. It is very important to understand the background of the formation of the ICH-GCP guidelines as this, in itself, explains the reasons for, and the need for, such guidelines. In this paper, we address the historical background and the events that led up to the formation of these guidelines. Today, the ICH-GCP guidelines are used in clinical trials throughout the globe with the main aim of protecting and preserving human rights. PMID:21614316
78 FR 66916 - Alaskan Seafood Processing Effluent Limitations Guidelines
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-07
... discharges which EPA is considering whether to promulgate in final form. DATES: Comments on this Notice, as... consider your comment. Electronic files should avoid the use of special characters, any form of encryption... data took the form of a questionnaire that included the following topics: general information about the...
Vocabulary Development and Maintenance--Identifiers. ERIC Processing Manual, Section VIII (Part 2).
ERIC Educational Resources Information Center
Weller, Carolyn R., Ed.; Houston, Jim, Ed.
Comprehensive rules, guidelines, and examples are provided for use by ERIC indexers and lexicographers in creating and using Identifiers, and in developing and maintaining the ERIC Identifier file via the "Identifier Authority List (IAL)." Identifiers and the IAL are defined/described: Identifiers are highly specific entities, including…
41 CFR 101-25.107 - Guidelines for requisitioning and proper use of consumable or low cost items.
Code of Federal Regulations, 2011 CFR
2011-07-01
... needs. Attache cases, Ball point pens and refills, Brief cases, Binders, Carbon paper, Dictionaries, Felt tip markers, Felt tip pens and refills, File folders, Letterex, Letter openers, Pads (paper), Paper clips, Pencils, Pencil sharpeners, Portfolios (leather, plastic, and writing pads), Rubber bands...
41 CFR 101-25.107 - Guidelines for requisitioning and proper use of consumable or low cost items.
Code of Federal Regulations, 2012 CFR
2012-07-01
... needs. Attache cases, Ball point pens and refills, Brief cases, Binders, Carbon paper, Dictionaries, Felt tip markers, Felt tip pens and refills, File folders, Letterex, Letter openers, Pads (paper), Paper clips, Pencils, Pencil sharpeners, Portfolios (leather, plastic, and writing pads), Rubber bands...
41 CFR 101-25.107 - Guidelines for requisitioning and proper use of consumable or low cost items.
Code of Federal Regulations, 2014 CFR
2014-07-01
... needs. Attache cases, Ball point pens and refills, Brief cases, Binders, Carbon paper, Dictionaries, Felt tip markers, Felt tip pens and refills, File folders, Letterex, Letter openers, Pads (paper), Paper clips, Pencils, Pencil sharpeners, Portfolios (leather, plastic, and writing pads), Rubber bands...
41 CFR 101-25.107 - Guidelines for requisitioning and proper use of consumable or low cost items.
Code of Federal Regulations, 2013 CFR
2013-07-01
... needs. Attache cases, Ball point pens and refills, Brief cases, Binders, Carbon paper, Dictionaries, Felt tip markers, Felt tip pens and refills, File folders, Letterex, Letter openers, Pads (paper), Paper clips, Pencils, Pencil sharpeners, Portfolios (leather, plastic, and writing pads), Rubber bands...
49 CFR Appendix A to Part 355 - Guidelines for the Regulatory Review
Code of Federal Regulations, 2012 CFR
2012-10-01
... COMPATIBILITY OF STATE LAWS AND REGULATIONS AFFECTING INTERSTATE MOTOR CARRIER OPERATIONS Pt. 355, App. A... drug and alcohol abuse; require a motor carrier to maintain a driver qualification file for each driver... Vehicles Prohibit possession, use, or driving under the influence of alcohol or other controlled substances...
49 CFR Appendix A to Part 355 - Guidelines for the Regulatory Review
Code of Federal Regulations, 2014 CFR
2014-10-01
... COMPATIBILITY OF STATE LAWS AND REGULATIONS AFFECTING INTERSTATE MOTOR CARRIER OPERATIONS Pt. 355, App. A... drug and alcohol abuse; require a motor carrier to maintain a driver qualification file for each driver... Vehicles Prohibit possession, use, or driving under the influence of alcohol or other controlled substances...
49 CFR Appendix A to Part 355 - Guidelines for the Regulatory Review
Code of Federal Regulations, 2013 CFR
2013-10-01
... COMPATIBILITY OF STATE LAWS AND REGULATIONS AFFECTING INTERSTATE MOTOR CARRIER OPERATIONS Pt. 355, App. A... drug and alcohol abuse; require a motor carrier to maintain a driver qualification file for each driver... Vehicles Prohibit possession, use, or driving under the influence of alcohol or other controlled substances...
49 CFR Appendix A to Part 355 - Guidelines for the Regulatory Review
Code of Federal Regulations, 2011 CFR
2011-10-01
... COMPATIBILITY OF STATE LAWS AND REGULATIONS AFFECTING INTERSTATE MOTOR CARRIER OPERATIONS Pt. 355, App. A... drug and alcohol abuse; require a motor carrier to maintain a driver qualification file for each driver... Vehicles Prohibit possession, use, or driving under the influence of alcohol or other controlled substances...
75 FR 11134 - Pacific Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-10
... limits (ACL) and accountability measures (AM). This meeting of the SAC is open to the public. DATES: The... with new ACL and AM requirements. The NS1 guidelines include recommendations for establishing several.... [FR Doc. 2010-5086 Filed 3-9-10; 8:45 am] BILLING CODE 3510-22-S ...
75 FR 971 - Pacific Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-07
...) and accountability measures (AM). This meeting of the SAC is open to the public. DATES: The meeting... with new ACL and AM requirements. The NS1 guidelines include recommendations for establishing several.... [FR Doc. E9-31427 Filed 1-6-10; 8:45 am] BILLING CODE 3510-22-S ...
76 FR 22676 - Pacific Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
...) and accountability measures (AM). This meeting of the SAC is open to the public. DATES: The meeting... MSA to provide guidance on how to comply with new ACL and AM requirements. The NS1 guidelines include.... 2011-9803 Filed 4-21-11; 8:45 am] BILLING CODE 3510-22-P ...
33 CFR 230.25 - Environmental review and consultation requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Principles and Guidelines for Water and Related Land Resources Implementation Studies. Reviews and... of the ongoing studies are not expected to materially affect the decision on the proposed action, the filing of the final EIS need not be delayed. (b) Executive Order 12114, Environmental Effects Abroad of...
33 CFR 230.25 - Environmental review and consultation requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Principles and Guidelines for Water and Related Land Resources Implementation Studies. Reviews and... of the ongoing studies are not expected to materially affect the decision on the proposed action, the filing of the final EIS need not be delayed. (b) Executive Order 12114, Environmental Effects Abroad of...
33 CFR 230.25 - Environmental review and consultation requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Principles and Guidelines for Water and Related Land Resources Implementation Studies. Reviews and... of the ongoing studies are not expected to materially affect the decision on the proposed action, the filing of the final EIS need not be delayed. (b) Executive Order 12114, Environmental Effects Abroad of...
33 CFR 230.25 - Environmental review and consultation requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Principles and Guidelines for Water and Related Land Resources Implementation Studies. Reviews and... of the ongoing studies are not expected to materially affect the decision on the proposed action, the filing of the final EIS need not be delayed. (b) Executive Order 12114, Environmental Effects Abroad of...
33 CFR 230.25 - Environmental review and consultation requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Principles and Guidelines for Water and Related Land Resources Implementation Studies. Reviews and... of the ongoing studies are not expected to materially affect the decision on the proposed action, the filing of the final EIS need not be delayed. (b) Executive Order 12114, Environmental Effects Abroad of...
38 CFR 74.10 - Where must an application be filed?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) VETERANS SMALL BUSINESS REGULATIONS Application Guidelines § 74.10 Where must an application be... Information Pages database located in the Center for Veterans Enterprise's Web portal, http://www.VetBiz.gov... dispatched to: Director, Center for Veterans Enterprise (00VE), U.S. Department of Veterans Affairs, 810...
An Overview of ARL’s Multimodal Signatures Database and Web Interface
2007-12-01
… ActiveX components, which hindered distribution due to license agreements and run-time license software to use such components. … The database consists of multimodal signature data files in the HDF5 format. Generally, each signature file contains all the ancillary … only contains information in the database, Web interface, and signature files that is releasable to the public. The Web interface consists of static …
Web servlet-assisted, dial-in flow cytometry data analysis.
Battye, F
2001-02-01
The obvious benefits of centralized data storage notwithstanding, the size of modern flow cytometry data files discourages their transmission over commonly used telephone modem connections. The proposed solution is to install at the central location a web servlet that can extract compact data arrays, of a form dependent on the requested display type, from the stored files and transmit them to a remote client computer program for display. A client program and a web servlet, both written in the Java programming language, were designed to communicate over standard network connections. The client program creates familiar numerical and graphical display types and allows the creation of gates from combinations of user-defined regions. Data compression techniques further reduce transmission times for data arrays that are already much smaller than the data file itself. For typical data files, network transmission times were reduced more than 700-fold for extraction of one-dimensional (1-D) histograms, between 18 and 120-fold for 2-D histograms, and 6-fold for color-coded dot plots. Numerous display formats are possible without further access to the data file. This scheme enables telephone modem access to centrally stored data without restricting flexibility of display format or preventing comparisons with locally stored files. Copyright 2001 Wiley-Liss, Inc.
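The reported size reductions follow from transmitting display-ready arrays instead of raw event lists; for example, a 1-D histogram of one parameter replaces every individual event value. A minimal sketch of that server-side idea (the actual servlet was written in Java; the data and bin count here are synthetic):

```python
# Minimal sketch of the server-side reduction: collapse a
# list-mode event array to a compact 1-D histogram before
# transmission. Event data and bin count are synthetic.
import numpy as np

events = np.random.lognormal(mean=5.0, sigma=1.0, size=500_000)  # fake values

counts, edges = np.histogram(events, bins=256)

# 500,000 float64 events (~4 MB) shrink to 256 integer bins,
# consistent in spirit with the >700-fold reduction reported
# for 1-D histogram extraction.
print(events.nbytes, '->', counts.nbytes, 'bytes')
```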
proBAMconvert: A Conversion Tool for proBAM/proBed.
Olexiouk, Volodimir; Menschaert, Gerben
2017-07-07
The introduction of the new standard formats proBAM and proBed improves the integration of genomics and proteomics information, thus aiding proteogenomics applications. These novel formats enable peptide-spectrum matches (PSMs) to be stored, inspected, and analyzed within the context of the genome. However, an easy-to-use and transparent tool to convert mass spectrometry identification files to these new formats is indispensable. proBAMconvert enables the conversion of common identification file formats (mzIdentML, mzTab, and pepXML) to proBAM/proBed using an intuitive interface. Furthermore, proBAMconvert can output information at both the PSM and peptide levels, and it offers a command-line interface in addition to the graphical user interface. Detailed documentation and a completely worked-out tutorial are available at http://probam.biobix.be .
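At its core, proBed places a peptide-spectrum match on genome coordinates in a BED-style, tab-delimited record. The sketch below writes a generic BED-like line; it does not reproduce the full proBed column specification, and every field value is illustrative.

```python
# Schematic only: emit a PSM as a BED-style tab-delimited line.
# This does NOT reproduce the full proBed column specification;
# all field names and values are illustrative.
psm = {
    'chrom': 'chr1', 'start': 155205563, 'end': 155205608,
    'name': 'PEPTIDER/2', 'score': 1000, 'strand': '+',
}

bed_line = '\t'.join(str(psm[k]) for k in
                     ('chrom', 'start', 'end', 'name', 'score', 'strand'))
with open('psms.bed', 'w') as out:
    out.write(bed_line + '\n')
print(bed_line)
```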
Cardio-PACs: a new opportunity
NASA Astrophysics Data System (ADS)
Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary
2000-05-01
It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are as follows. Image display: (1) image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based display with color monitor, and physician-friendly imaging software; (2) performance: acquire at 30 frames/sec, replay at 15 frames/sec, access the file server within 5 seconds and the archive within 5 minutes; (3) compatibility of image file, transmission, and processing formats; (4) image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) user-friendly control of image review. Image distribution: (1) standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) non-proprietary formats; (3) bidirectional distribution. Image storage: (1) CD-ROM vs. disk vs. tape; (2) verification of data integrity; (3) user-designated storage capacity for the catheterization laboratory, file server, and long-term archive. Costs: (1) image acquisition equipment, file server, long-term archive; (2) network infrastructure; (3) review stations and software; (4) maintenance and administration; (5) future upgrades and expansion; (6) personnel.
Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.
2000-01-01
This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and PostScript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database; instead, a reference is given to the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which describes only the content and character of the digital map database, is included as PostScript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The PostScript map image can be used for viewing or plotting on computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part with Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.
A qualitative examination of perceptions of physical activity guidelines and preferences for format.
Berry, Tanya R; Witcher, Chad; Holt, Nicholas L; Plotnikoff, Ronald C
2010-11-01
A descriptive exploratory study was conducted to gain an understanding of public perceptions of physical activity guidelines and to discover which formats appeal to participants. Canada's Physical Activity Guide (CPAG) was used as an example of such guidelines. Data were collected from 22 participants in five focus groups (composed of female undergraduate students, female office workers, male office workers, participants in a Type II diabetes rehabilitation program, and participants in a cardiovascular rehabilitation program). Cross-case qualitative analyses were conducted. Six themes emerged under the general categories of familiarity with, and preferences for, physical activity promotional materials. In terms of familiarity, participants lacked awareness of CPAG and criticized its format. In terms of preferences, participants encouraged the use of messaging stylistically similar to that used by commercial advertisers, and wanted personal stories, Internet-based media, and celebrities' success stories. Overall, there was little awareness of CPAG, and its current format was unappealing.
CAPRICE positively regulates stomatal formation in the Arabidopsis hypocotyl
2008-01-01
In the Arabidopsis hypocotyl, stomata develop only from a particular set of epidermal cell files. Previous studies have identified several negative regulators of stomata formation; such regulators also trigger non-hair cell fate in the root. Here, it is shown that TOO MANY MOUTHS (TMM) positively regulates CAPRICE (CPC) expression in differentiating stoma-less cell files, and that the CPC protein may move to the nucleus of neighbouring stoma-forming cells, where it promotes stomata formation redundantly with TRIPTYCHON (TRY). Unexpectedly, the CPC protein was also localized in the nucleus and peripheral cytoplasm of fully differentiated hypocotyl epidermal cells, suggesting that CPC plays an additional role beyond those related to stomata formation. These results identify CPC and TRY as positive regulators of stomata formation in the embryonic stem, which increases the similarity between the genetic control of root hair and stoma cell fate determination. PMID:19513241