Sample records for binary output file

  1. Extracting the Data From the LCM vk4 Formatted Output File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, James G.

    These are slides about extracting the data from the LCM vk4 formatted output file. The following topics are covered: the vk4 file produced by Keyence VK software; custom analysis, since there is no off-the-shelf way to read the file; reading the binary data in a vk4 file; various offsets in decimal lines; finding the height image data directly in MATLAB; binary output at the beginning of the height image data; color image information; color image binary data; color image decimal and binary data; MATLAB code to read a vk4 file (choose a file, read the file, compute offsets, read the optical image and the laser optical image, read and compute the laser intensity image, read the height image, timing, display the height image, display the laser intensity image, display the RGB laser optical images, display the RGB optical images, display beginning data and save images to the workspace, gamma correction subroutine); reading intensity from the vk4 file; linear in the low range; linear in the high range; gamma correction for vk4 files; computing the gamma intensity correction; and observations.
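
    The slides present MATLAB code; as a rough Python analogue of the offset-based reading they describe, here is a minimal sketch in which the offset-table position and block layout are illustrative assumptions, not the actual vk4 specification:

        import struct
        import numpy as np

        def read_vk4_height(path, offset_table_pos=12):
            """Hypothetical vk4-style reader: follow an offset into the file,
            then pull a width x height block of raw pixel values."""
            with open(path, "rb") as f:
                data = f.read()
            # Read one (assumed) little-endian uint32 entry of the offset table.
            (block_off,) = struct.unpack_from("<I", data, offset_table_pos)
            # Assume the block starts with width and height as uint32 values.
            width, height = struct.unpack_from("<II", data, block_off)
            pixels = np.frombuffer(data, dtype="<u4", count=width * height,
                                   offset=block_off + 8)
            return pixels.reshape(height, width)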

  2. User Guide and Documentation for Five MODFLOW Ground-Water Modeling Utility Programs

    USGS Publications Warehouse

    Banta, Edward R.; Paschke, Suzanne S.; Litke, David W.

    2008-01-01

    This report documents five utility programs designed for use in conjunction with ground-water flow models developed with the U.S. Geological Survey's MODFLOW ground-water modeling program. One program extracts calculated flow values from one model for use as input to another model. The other four programs extract model input or output arrays from one model and make them available in a form that can be used to generate an ArcGIS raster data set. The resulting raster data sets may be useful for visual display of the data or for further geographic data processing. The utility program GRID2GRIDFLOW reads a MODFLOW binary output file of cell-by-cell flow terms for one (source) model grid and converts the flow values to input flow values for a different (target) model grid. The spatial and temporal discretization of the two models may differ. The four other utilities extract selected 2-dimensional data arrays in MODFLOW input and output files and write them to text files that can be imported into an ArcGIS geographic information system raster format. These four utilities require that the model cells be square and aligned with the projected coordinate system in which the model grid is defined. The four raster-conversion utilities are:

    * CBC2RASTER, which extracts selected stress-package flow data from a MODFLOW binary output file of cell-by-cell flows;
    * DIS2RASTER, which extracts cell-elevation data from a MODFLOW Discretization file;
    * MFBIN2RASTER, which extracts array data from a MODFLOW binary output file of head or drawdown; and
    * MULT2RASTER, which extracts array data from a MODFLOW Multiplier file.
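
    In the spirit of MFBIN2RASTER, a hedged Python sketch that reads a bare rectangular array of 32-bit reals from a binary file and writes it as an ESRI ASCII grid importable into ArcGIS; real MODFLOW binary head files carry additional header words and record structure that this sketch glosses over:

        import numpy as np

        def array_to_ascii_raster(bin_path, out_path, nrows, ncols,
                                  xll, yll, cellsize, nodata=-9999.0):
            # Read a bare nrows x ncols block of little-endian float32 values.
            arr = np.fromfile(bin_path, dtype="<f4",
                              count=nrows * ncols).reshape(nrows, ncols)
            with open(out_path, "w") as f:
                # Standard ESRI ASCII grid header (square, axis-aligned cells).
                f.write(f"ncols {ncols}\nnrows {nrows}\n")
                f.write(f"xllcorner {xll}\nyllcorner {yll}\n")
                f.write(f"cellsize {cellsize}\nNODATA_value {nodata}\n")
                np.savetxt(f, arr, fmt="%.6g")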

  3. HDF-EOS Dump Tools

    NASA Astrophysics Data System (ADS)

    Prasad, U.; Rahabi, A.

    2001-05-01

    The following utilities, developed for dumping HDF-EOS format data, are of special use for Earth science data from NASA's Earth Observing System (EOS). This poster demonstrates their use and application. The first four tools take HDF-EOS data files as input. HDF-EOS Metadata Dumper - metadmp: extracts metadata from EOS data granules. It operates by simply copying blocks of metadata from the file to the standard output and does not process the metadata in any way. Since all metadata in EOS granules are encoded in the Object Description Language (ODL), the output of metadmp takes the form of complete ODL statements. EOS data granules may contain up to three different sets of metadata (Core, Archive, and Structural Metadata). HDF-EOS Contents Dumper - heosls: displays the contents of HDF-EOS files. This utility provides detailed information on the POINT, SWATH, and GRID data sets in the files; for example, it will list the geolocation fields, data fields, and objects. HDF-EOS ASCII Dumper - asciidmp: extracts fields from EOS data granules into plain ASCII text. The output from asciidmp should be easily human-readable, and with minor editing it can be made ingestible by any application with ASCII import capabilities. HDF-EOS Binary Dumper - bindmp: dumps HDF-EOS objects in binary format. This is useful for feeding its output into existing programs that do not understand HDF, such as custom software and COTS products. HDF-EOS User Friendly Metadata - UFM: useful for viewing ECS metadata. UFM takes an EOSDIS ODL metadata file and produces an HTML report of the metadata for display in a web browser. HDF-EOS METCHECK - METCHECK: can be invoked from either a Unix or DOS environment with a set of command-line options that direct the tool's inputs and output. METCHECK validates the inventory metadata in a .met file using the descriptor file (.desc) as the reference. The tool takes a .desc file and a .met ODL file as inputs and generates a simple output file containing the results of the checking process.
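
    Since metadmp simply copies blocks of ODL metadata without processing them, its effect can be approximated in Python by scanning a granule's bytes for ODL text; the marker strings below are assumptions for illustration:

        def dump_odl_blocks(path, start=b"GROUP", end=b"\nEND\n"):
            # Crude metadmp-style scan: copy embedded ASCII ODL statements
            # to standard output without interpreting them.
            data = open(path, "rb").read()
            i = data.find(start)
            while i != -1:
                j = data.find(end, i)
                if j == -1:
                    break
                print(data[i:j + len(end)].decode("ascii", errors="ignore"))
                i = data.find(start, j + len(end))

        dump_odl_blocks("granule.hdf")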

  4. Computer program documentation: CYBER to Univac binary conversion user's guide

    NASA Technical Reports Server (NTRS)

    Martin, E. W.

    1980-01-01

    A user's guide for a computer program which will convert SINDA temperature history data from CDC (Cyber) binary format to UNIVAC 1100 binary format is presented. The various options available, the required input, the optional output, file assignments, and the restrictions of the program are discussed.

  5. Documentation of model input and output values for simulation of pumping effects in Paradise Valley, a basin tributary to the Humboldt River, Humboldt County, Nevada

    USGS Publications Warehouse

    Carey, A.E.; Prudic, David E.

    1996-01-01

    Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machines (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) version 5.0 or greater.

  6. C2x: A tool for visualisation and input preparation for CASTEP and other electronic structure codes

    NASA Astrophysics Data System (ADS)

    Rutter, M. J.

    2018-04-01

    The c2x code fills two distinct roles. Its first role is in acting as a converter between the binary-format .check files from the widely-used CASTEP [1] electronic structure code and various visualisation programs. Its second role is to manipulate and analyse the input and output files from a variety of electronic structure codes, including CASTEP, ONETEP and VASP, as well as the widely-used 'Gaussian cube' file format. Analysis includes symmetry analysis; manipulation includes arbitrary cell transformations. It continues to be under development, with growing functionality, and is written in a form which would make it easy to extend it to work directly with files from other electronic structure codes. Data which c2x is capable of extracting from CASTEP's binary checkpoint files include charge densities, spin densities, wavefunctions, relaxed atomic positions, forces, the Fermi level, the total energy, and symmetry operations. It can recreate .cell input files from checkpoint files. Volumetric data can be output in formats usable by many common visualisation programs, and c2x will itself calculate integrals, expand data into supercells, and interpolate data via combinations of Fourier and trilinear interpolation. It can extract data along arbitrary lines (such as lines between atoms) as 1D output. C2x is able to convert between several common formats for describing molecules and crystals, including the .cell format of CASTEP. It can construct supercells, reduce cells to their primitive form, and add specified k-point meshes. It uses the spglib library [2] to report symmetry information, which it can add to .cell files. C2x is a command-line utility, so is readily included in scripts. It is available under the GPL and can be obtained from http://www.c2x.org.uk. It is believed to be the only open-source code which can read CASTEP's .check files, so it will have utility in other projects.

  7. Mass spectrometer output file format mzML.

    PubMed

    Deutsch, Eric W

    2010-01-01

    Mass spectrometry is an important technique for analyzing proteins and other biomolecular compounds in biological samples. Each of the vendors of these mass spectrometers uses a different proprietary binary output file format, which has hindered data sharing and the development of open source software for downstream analysis. The solution has been to develop, with the full participation of academic researchers as well as software and hardware vendors, an open XML-based format for encoding mass spectrometer output files, and then to write software to use this format for archiving, sharing, and processing. This chapter presents the various components and information available for this format, mzML. In addition to the XML schema that defines the file structure, a controlled vocabulary provides clear terms and definitions for the spectral metadata, and a semantic validation rules mapping file allows the mzML semantic validator to ensure that an mzML document complies with one of several levels of requirements. Complete documentation and example files ensure that the format may be uniformly implemented. At the time of release, there already existed several implementations of the format and vendors have committed to supporting the format in their products.
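
    mzML carries peak data as base64-encoded binary arrays inside the XML; below is a minimal decoding sketch, assuming an uncompressed little-endian 64-bit float array (when the controlled-vocabulary terms declare zlib compression, a zlib.decompress step is needed first):

        import base64
        import struct

        def decode_mzml_array(b64_text, precision=64):
            # Decode the text of one <binary> element into a list of floats.
            raw = base64.b64decode(b64_text)
            width = precision // 8
            fmt = "<%d%s" % (len(raw) // width, "d" if precision == 64 else "f")
            return list(struct.unpack(fmt, raw))

        # Round-trip check with two m/z values packed as doubles.
        packed = base64.b64encode(struct.pack("<2d", 100.0, 200.5)).decode()
        print(decode_mzml_array(packed))  # [100.0, 200.5]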

  8. Wrapping Python around MODFLOW/MT3DMS based groundwater models

    NASA Astrophysics Data System (ADS)

    Post, V.

    2008-12-01

    Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data, such as heads, fluxes and concentrations. Typically all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
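
    A sketch of the kind of plot described, assuming a concentration field and flow vectors have already been read into NumPy arrays (the library's own file-reading routines are not reproduced here):

        import numpy as np
        import matplotlib.pyplot as plt

        # Placeholder model output standing in for parsed MODFLOW/MT3DMS data.
        y, x = np.mgrid[0:50, 0:50]
        conc = np.exp(-((x - 25) ** 2 + (y - 25) ** 2) / 200.0)
        dqy, dqx = np.gradient(-conc)  # crude stand-in for flow vectors

        fig, ax = plt.subplots()
        cs = ax.contourf(x, y, conc, levels=10)      # concentration pattern
        ax.quiver(x[::5, ::5], y[::5, ::5],          # superimposed vectors
                  dqx[::5, ::5], dqy[::5, ::5])
        fig.colorbar(cs, label="concentration")
        fig.savefig("conc_0001.png")  # a numbered series yields an animation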

  9. Extraction of Vertical Profiles of Atmospheric Variables from Gridded Binary, Edition 2 (GRIB2) Model Output Files

    DTIC Science & Technology

    2018-01-18

    Specifically, the method described herein uses wgrib2 commands along with a Python script or program to produce tabular text files. It makes use of software that is readily available and can be implemented on many computer systems with relatively modest additional effort. The method reads GRIB2 model output files, extracts appropriate information, and lists the extracted information in a readable tabular form. The Python script used here is described in the report.
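
    A sketch of that pipeline, driving wgrib2 from Python via subprocess; the -match and -csv options are standard wgrib2 features, but the inventory pattern below is an assumption that depends on the particular model output:

        import csv
        import subprocess

        def extract_profile(grib2_file, var="TMP", out_csv="profile.csv"):
            # Select all records of one variable and write them as CSV rows.
            subprocess.run(
                ["wgrib2", grib2_file, "-match", f":{var}:", "-csv", out_csv],
                check=True)
            with open(out_csv, newline="") as f:
                for row in csv.reader(f):
                    print(row)  # readable, tabular listing per record/level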

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Benjamin S.

    The Futility package contains the following: 1) definition of the size of integers and real numbers; 2) a generic unit test harness; 3) definitions for some basic extensions to the Fortran language: arbitrary-length strings, a parameter list construct, exception handlers, a command line processor, timers; 4) geometry definitions: point, line, plane, box, cylinder, polyhedron; 5) file wrapper functions: standard Fortran input/output files, Fortran binary files, HDF5 files; 6) parallel wrapper functions: MPI and OpenMP abstraction layers, partitioning algorithms; 7) math utilities: BLAS, matrix and vector definitions, linear solver methods and wrappers for other TPLs (PETSc, MKL, etc.), preconditioner classes; 8) miscellaneous: a random number generator, water saturation properties, sorting algorithms.

  11. ProMC: Input-output data format for HEP applications using varint encoding

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; May, E.; Strand, K.; Van Gemmeren, P.

    2014-10-01

    A new data format for Monte Carlo (MC) events, or any structural data, including experimental data, is discussed. The format is designed to store data in a compact binary form using variable-size integer encoding as implemented in Google's Protocol Buffers package. This approach is implemented in the ProMC library, which produces smaller file sizes for MC records compared to the existing input-output libraries used in high-energy physics (HEP). Other important features of the proposed format are a separation of abstract data layouts from concrete programming implementations, self-description, and random access. Data stored in ProMC files can be written, read, and manipulated in a number of programming languages, such as C++, Java, FORTRAN, and Python.
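
    The compactness comes from variable-size ("varint") integer encoding, in which small values occupy fewer bytes; a minimal sketch of the scheme as used by Protocol Buffers:

        def encode_varint(n):
            # Emit 7 payload bits per byte; the high bit flags continuation.
            out = bytearray()
            while True:
                byte = n & 0x7F
                n >>= 7
                out.append(byte | (0x80 if n else 0))
                if not n:
                    return bytes(out)

        def decode_varint(data):
            n, shift = 0, 0
            for byte in data:
                n |= (byte & 0x7F) << shift
                if not byte & 0x80:
                    return n
                shift += 7

        print(encode_varint(1))    # b'\x01' -- one byte instead of four
        print(encode_varint(300))  # b'\xac\x02'
        assert decode_varint(encode_varint(300)) == 300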

  12. Thermo-msf-parser: an open source Java library to parse and visualize Thermo Proteome Discoverer msf files.

    PubMed

    Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart

    2011-08-05

    The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented in a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.
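
    Proteome Discoverer msf files are SQLite-based containers, which is what makes a standalone parser practical; a hedged Python sketch for a first inspection (nothing here reflects the Java library's actual API, and real table names should be taken from the file itself):

        import sqlite3

        def list_tables(msf_path):
            # An msf file can be opened like any SQLite database.
            con = sqlite3.connect(msf_path)
            try:
                cur = con.execute(
                    "SELECT name FROM sqlite_master WHERE type = 'table'")
                return [row[0] for row in cur]
            finally:
                con.close()

        print(list_tables("experiment.msf"))  # survey tables before querying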

  13. Software for Preprocessing Data from Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
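
    Conceptually, the EUGEN step is a per-channel calibration polynomial applied to raw voltages; a hedged Python sketch (the SSC file format and the layout of the coefficient file are not reproduced):

        import numpy as np

        def to_engineering_units(volts, coeffs):
            # coeffs: one channel's calibration polynomial, highest order
            # first, as accepted by numpy.polyval.
            return np.polyval(coeffs, volts)

        raw = np.array([0.0, 1.2, 2.4])        # digitized sensor output, volts
        cal = [250.0, 3.0]                     # hypothetical linear calibration
        print(to_engineering_units(raw, cal))  # e.g. pressure in psi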

  14. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: (1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. (2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot. (3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PVWAVE based plotting software.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, T.

    SPI/U3.1 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Inspector Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, Tony

    SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.

  17. SPI/U3.2. Security Profile Inspector for UNIX Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartoletti, A.

    1994-08-01

    SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.

  18. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC "E" test-stand complex and utilize the SSC file format. The programs are the following: 1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel; 2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris); and 3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PVWAVE based plotting software.

  19. AGS, a set of UNIX commands for neutron data reduction

    NASA Astrophysics Data System (ADS)

    Bastian, C.

    1997-02-01

    The output of a detector system recording neutron-induced nuclear reactions consists of a set of multichannel spectra and of scaler/counter values. These data must be reduced - i.e., corrected and combined - to produce a clean energy spectrum of the reaction cross-section with a covariance estimate suitable for evaluation. The reduction process may be broken down into a sequence of operations. We present a set of reduction operations implemented as commands on a UNIX system. Every operation reads spectra from a file and appends results as new spectra to the same file. The binary file format AGS used thereby records the spectra as named entities including a set of neutron energy values and a corresponding set of values with their correlated and uncorrelated uncertainties.

  20. Program to convert SUDS2ASC files to a single binary SEGY file

    USGS Publications Warehouse

    Goldman, Mark

    2000-01-01

    This program, SUDS2SEGY, converts and combines ASCII files created using SUDS2ASC Version 2.60 into a single SEGY file. SUDS2ASC has been used previously to create an ASCII file of three-component seismic data for an individual recording station. However, many seismic processing packages have difficulty reading in ASCII data. In addition, it may be cumbersome to process a separate file for each recording station, particularly if traces from different recording stations contain a different number of data samples and/or a different start time. This new program - SUDS2SEGY - combines these recording station files into a single SEGY file. In addition, SUDS2SEGY normalizes the trace times so that each trace starts at a given time and consists of a fixed number of samples. This normalization allows seismic data from many different stations to be read in as a single "data gather". SUDS2SEGY also produces a report summarizing the offset and maximum absolute amplitude for each component in a station file. These data are output separately to an ASCII file and can be subsequently input to a plotting package.
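
    The normalization SUDS2SEGY performs can be sketched as follows in Python: shift each trace to a common start time, then force a fixed number of samples (sample-rate handling and SEGY headers are omitted):

        import numpy as np

        def normalize_trace(samples, t0, common_t0, dt, nfix):
            # Positive shift: trace starts late, prepend zeros;
            # negative shift: trace starts early, drop leading samples.
            shift = int(round((t0 - common_t0) / dt))
            trace = np.concatenate([
                np.zeros(max(shift, 0)),
                np.asarray(samples, dtype=float)[max(-shift, 0):]])
            trace = trace[:nfix]                          # truncate long traces
            return np.pad(trace, (0, nfix - len(trace)))  # zero-pad short ones

        gather = [normalize_trace(tr, t0, 0.0, 0.01, 1000)
                  for tr, t0 in [([1.0, 2.0, 3.0], 0.02), ([4.0, 5.0], -0.01)]]
        print(np.vstack(gather).shape)  # (2, 1000): one uniform "data gather"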

  1. Gro2mat: a package to efficiently read gromacs output in MATLAB.

    PubMed

    Dien, Hung; Deane, Charlotte M; Knapp, Bernhard

    2014-07-30

    Molecular dynamics (MD) simulations are a state-of-the-art computational method used to investigate molecular interactions at atomic scale. Interaction processes out of experimental reach can be monitored using MD software, such as Gromacs. Here, we present the gro2mat package that allows fast and easy access to Gromacs output files from Matlab. Gro2mat enables direct parsing of the most common Gromacs output formats including the binary xtc-format. No openly available Matlab parser currently exists for this format. The xtc reader is orders of magnitude faster than other available pdb/ascii workarounds. Gro2mat is especially useful for scientists with an interest in quick prototyping of new mathematical and statistical approaches for Gromacs trajectory analyses. Copyright © 2014 Wiley Periodicals, Inc.

  2. HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis

    USGS Publications Warehouse

    Sloto, Ronald A.; Crouse, Michele Y.

    1996-01-01

    HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
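
    A hedged Python sketch of the fixed-interval method as commonly described in the hydrograph-separation literature; the interval width (derived in HYSEP from drainage area) is taken here as a given parameter, and the report should be consulted for the exact definitions:

        import numpy as np

        def fixed_interval_baseflow(discharge, interval):
            # Assign each interval's minimum daily discharge as base flow
            # throughout that interval; surface runoff is the remainder.
            q = np.asarray(discharge, dtype=float)
            base = np.empty_like(q)
            for start in range(0, len(q), interval):
                base[start:start + interval] = q[start:start + interval].min()
            return base

        q = np.array([10, 12, 30, 22, 15, 11, 10, 9, 25, 14], dtype=float)
        base = fixed_interval_baseflow(q, 5)
        print(base, q - base)  # base-flow and surface-runoff components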

  3. A software to digital image processing to be used in the voxel phantom development.

    PubMed

    Vieira, J W; Lima, F R A

    2009-11-15

    Anthropomorphic models used in computational dosimetry, also denominated phantoms, are based on digital images recorded from scanning of real people by Computed Tomography (CT) or Magnetic Resonance Imaging (MRI). Voxel phantom construction requires computational processing for transformations of image formats, compaction of two-dimensional (2-D) images to form three-dimensional (3-D) matrices, image sampling and quantization, image enhancement, restoration and segmentation, among others. A researcher in computational dosimetry will hardly find all of these capabilities in a single piece of software, and this difficulty almost always slows the pace of research or forces the use, sometimes inadequate, of alternative tools. The need to integrate the several tasks mentioned above to obtain an image that can be used in a computational exposure model motivated the development of the Digital Image Processing (DIP) software, mainly to solve particular problems in dissertations and theses developed by members of the Grupo de Pesquisa em Dosimetria Numérica (GDN/CNPq). Because of this particular objective, the software uses the Portuguese language in its implementations and interfaces. This paper presents the second version of DIP, whose main changes are a more formal organization of menus and menu items and a menu for digital image segmentation. Currently, DIP contains the menus Fundamentos, Visualizações, Domínio Espacial, Domínio de Frequências, Segmentações and Estudos. Each menu contains items and sub-items with functionalities that usually take an image as input and produce an image or an attribute as output. DIP reads, edits, and writes binary files containing the 3-D matrix corresponding to a stack of axial images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computational image and make conversions. When a task involves only an output image, it is saved as a JPEG file (the Windows default); when it involves an image stack, the output binary file is denominated SGI (Simulações Gráficas Interativas, or Interactive Graphic Simulations), an acronym already used in other publications of the GDN/CNPq.

  4. MBSSAS: A code for the computation of Margules parameters and equilibrium relations in binary solid-solution aqueous-solution systems

    USGS Publications Warehouse

    Glynn, P.D.

    1991-01-01

    The computer code MBSSAS uses two-parameter Margules-type excess-free-energy of mixing equations to calculate thermodynamic equilibrium, pure-phase saturation, and stoichiometric saturation states in binary solid-solution aqueous-solution (SSAS) systems. Lippmann phase diagrams, Roozeboom diagrams, and distribution-coefficient diagrams can be constructed from the output data files, and also can be displayed by MBSSAS (on IBM-PC compatible computers). MBSSAS also will calculate accessory information, such as the location of miscibility gaps, spinodal gaps, critical-mixing points, alyotropic extrema, Henry's law solid-phase activity coefficients, and limiting distribution coefficients. Alternatively, MBSSAS can use such information (instead of the Margules, Guggenheim, or Thompson and Waldbaum excess-free-energy parameters) to calculate the appropriate excess-free-energy of mixing equation for any given SSAS system. © 1991.
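
    For orientation, a sketch of the standard two-parameter (Guggenheim/Redlich-Kister) activity-coefficient expressions for a binary solid solution; whether MBSSAS uses exactly this sign and parameter convention should be checked against the report:

        import math

        def activity_coeffs(x1, a0, a1):
            # Dimensionless Guggenheim parameters a0, a1:
            #   G_excess / RT = x1 * x2 * (a0 + a1 * (x1 - x2))
            x2 = 1.0 - x1
            ln_g1 = x2 ** 2 * (a0 + a1 * (3.0 * x1 - x2))
            ln_g2 = x1 ** 2 * (a0 - a1 * (3.0 * x2 - x1))
            return math.exp(ln_g1), math.exp(ln_g2)

        # In the symmetric case (a1 = 0), a miscibility gap appears for a0 > 2.
        print(activity_coeffs(0.3, 2.5, 0.1))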

  5. GWM-VI: groundwater management with parallel processing for multiple MODFLOW versions

    USGS Publications Warehouse

    Banta, Edward R.; Ahlfeld, David P.

    2013-01-01

    Groundwater Management–Version Independent (GWM–VI) is a new version of the Groundwater Management Process of MODFLOW. The Groundwater Management Process couples groundwater-flow simulation with a capability to optimize stresses on the simulated aquifer based on an objective function and constraints imposed on stresses and aquifer state. GWM–VI extends prior versions of Groundwater Management in two significant ways—(1) it can be used with any version of MODFLOW that meets certain requirements on input and output, and (2) it is structured to allow parallel processing of the repeated runs of the MODFLOW model that are required to solve the optimization problem. GWM–VI uses the same input structure for files that describe the management problem as that used by prior versions of Groundwater Management. GWM–VI requires only minor changes to the input files used by the MODFLOW model. GWM–VI uses the Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER-API) to implement both version independence and parallel processing. GWM–VI communicates with the MODFLOW model by manipulating certain input files and interpreting results from the MODFLOW listing file and binary output files. Nearly all capabilities of prior versions of Groundwater Management are available in GWM–VI. GWM–VI has been tested with MODFLOW-2005, MODFLOW-NWT (a Newton formulation for MODFLOW-2005), MF2005-FMP2 (the Farm Process for MODFLOW-2005), SEAWAT, and CFP (Conduit Flow Process for MODFLOW-2005). This report provides sample problems that demonstrate a range of applications of GWM–VI and the directory structure and input information required to use the parallel-processing capability.

  6. SSM/OOM - SSM WITH OOM MANIPULATION CODE

    NASA Technical Reports Server (NTRS)

    Goza, S. P.

    1994-01-01

    Creating, animating, and recording solid-shaded and wireframe three-dimensional geometric models can be of great assistance in the research and design phases of product development, in project planning, and in engineering analyses. SSM and OOM are application programs which together allow for interactive construction and manipulation of three-dimensional models of real-world objects as simple as boxes or as complex as Space Station Freedom. The output of SSM, in the form of binary files defining geometric three dimensional models, is used as input to OOM. Animation in OOM is done using 3D models from SSM as well as cameras and light sources. The animated results of OOM can be output to videotape recorders, film recorders, color printers and disk files. SSM and OOM are also available separately as MSC-21914 and MSC-22263, respectively. The Solid Surface Modeler (SSM) is an interactive graphics software application for solid-shaded and wireframe three-dimensional geometric modeling. The program has a versatile user interface that, in many cases, allows mouse input for intuitive operation or keyboard input when accuracy is critical. SSM can be used as a stand-alone model generation and display program and offers high-fidelity still image rendering. Models created in SSM can also be loaded into the Object Orientation Manipulator for animation or engineering simulation. The Object Orientation Manipulator (OOM) is an application program for creating, rendering, and recording three-dimensional computer-generated still and animated images. This is done using geometrically defined 3D models, cameras, and light sources, referred to collectively as animation elements. OOM does not provide the tools necessary to construct 3D models; instead, it imports binary format model files generated by the Solid Surface Modeler (SSM). Model files stored in other formats must be converted to the SSM binary format before they can be used in OOM. SSM is available as MSC-21914 or as part of the SSM/OOM bundle, COS-10047. Among OOM's features are collision detection (with visual and audio feedback), the capability to define and manipulate hierarchical relationships between animation elements, stereographic display, and ray-traced rendering. OOM uses Euler angle transformations for calculating the results of translation and rotation operations. OOM and SSM are written in C-language for implementation on SGI IRIS 4D series workstations running the IRIX operating system. A minimum of 8Mb of RAM is recommended for each program. The standard distribution medium for this program package is a .25 inch streaming magnetic IRIX tape cartridge in UNIX tar format. These versions of OOM and SSM were released in 1993.

  7. Expected Number of Passes in a Binary Search Scheme

    ERIC Educational Resources Information Center

    Tenenbein, Aaron

    1974-01-01

    The binary search scheme is a method of finding a particular file from a set of ordered files stored in a computer. In this article an exact expression for the expected number of passes required to find a file is derived. (Author)
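
    With n = 2^k - 1 equally likely files, 2^(j-1) of them sit at depth j of the implicit search tree and require j passes, so the expectation is a short sum; a quick numerical check in Python (the article's closed form may use slightly different conventions):

        def expected_passes(k):
            # n = 2**k - 1 ordered files, each equally likely to be sought.
            n = 2 ** k - 1
            return sum(j * 2 ** (j - 1) for j in range(1, k + 1)) / n

        for k in (3, 10, 20):
            n = 2 ** k - 1
            # Tends toward log2(n + 1) - 1 as the file count grows.
            print(n, round(expected_passes(k), 4))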

  8. Performance regression manager for large scale systems

    DOEpatents

    Faraj, Daniel A.

    2017-10-17

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.
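
    In outline, the claimed comparison is simple once both outputs are in the predefined format; a hedged Python sketch assuming a one-metric-per-line "name value" layout (the record does not specify the format, so this is illustrative):

        def read_metrics(path):
            # Assumed predefined format: one "<metric> <value>" pair per line.
            with open(path) as f:
                return {name: float(val) for name, val in
                        (line.split() for line in f if line.strip())}

        def compare(first_file, second_file, metric, higher_is_better=True):
            a = read_metrics(first_file)[metric]
            b = read_metrics(second_file)[metric]
            regressed = a < b if higher_is_better else a > b
            # Output for display an indication of the comparison result.
            print(f"{metric}: {a} vs {b} ->",
                  "REGRESSION" if regressed else "ok")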

  9. Performance regression manager for large scale systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faraj, Daniel A.

    Methods comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.

  10. Task-parallel message passing interface implementation of Autodock4 for docking of very large databases of compounds using high-performance super-computers.

    PubMed

    Collignon, Barbara; Schulz, Roland; Smith, Jeremy C; Baudry, Jerome

    2011-04-30

    A message passing interface (MPI)-based implementation (Autodock4.lga.MPI) of the grid-based docking program Autodock4 has been developed to allow simultaneous and independent docking of multiple compounds on up to thousands of central processing units (CPUs) using the Lamarckian genetic algorithm. The MPI version reads a single binary file containing precalculated grids that represent the protein-ligand interactions, i.e., van der Waals, electrostatic, and desolvation potentials, and needs only two input parameter files for the entire docking run. In comparison, the serial version of Autodock4 reads ASCII grid files and requires one parameter file per compound. The modifications performed result in significantly reduced input/output activity compared with the serial version. Autodock4.lga.MPI scales up to 8192 CPUs with a maximal overhead of 16.3%, of which two thirds is due to input/output operations and one third originates from MPI operations. The optimal docking strategy, which minimizes docking CPU time without lowering the quality of the database enrichments, comprises the docking of ligands preordered from the most to the least flexible and the assignment of the number of energy evaluations as a function of the number of rotatable bonds. In 24 h, on 8192 high-performance computing CPUs, the present MPI version would allow docking to a rigid protein of about 300K small flexible compounds or 11 million rigid compounds.

  11. University of Massachusetts Marine Renewable Energy Center Waverider Buoy Data

    DOE Data Explorer

    Lohrenz, Steven

    2015-10-07

    The compressed (.zip) file contains Datawell MK-III Directional Waverider binary and unpacked data files as well as a description of the data and manuals for the instrumentation. The data files are contained in the two directories within the zip file, ''Apr_July_2012'' and ''Jun_Sept_2013''. Time series and summary data were recorded in the buoy to binary files with extensions '.RDT' and '.SDT', respectively. These are located in the subdirectories 'Data_Raw' in each of the top-level deployment directories. '.RDT' files contain 3 days of time series (at 1.28 Hz) in 30 minute "bursts". Each '.SDT' file contains summary statistics for the month indicated computed at half-hour intervals for each burst. Each deployment directory also contains a description (in 'File.list') of the Datawell binary data files, and a figure ('Hs_vs_yearday') showing the significant wave height associated with each .RDT file (decoded from the filename). The corresponding unpacked Matlab .mat files are contained in the subdirectories 'Data_Mat'. These files have the extension '.mat' but use the root filename of the source .RDT and .SDT files.

  12. Public-domain-software solution to data-access problems for numerical modelers

    USGS Publications Warehouse

    Jenter, Harry; Signell, Richard

    1992-01-01

    Unidata's network Common Data Form, netCDF, provides users with an efficient set of software for scientific-data storage, retrieval, and manipulation. The netCDF file format is machine-independent, direct-access, self-describing, and in the public domain, thereby alleviating many problems associated with accessing output from large hydrodynamic models. NetCDF has programming interfaces in both the Fortran and C computer languages, with an interface to C++ planned for future release. NetCDF also has an abstract data type that relieves users from understanding details of the binary file structure; data are written and retrieved by an intuitive, user-supplied name rather than by file position. Users are aided further by Unidata's inclusion of the Common Data Language, CDL, a printable text equivalent of the contents of a netCDF file. Unidata provides numerous operators and utilities for processing netCDF files. In addition, a number of public-domain and proprietary netCDF utilities from other sources are available at this time or will be available later this year. The U.S. Geological Survey has produced and is producing a number of public-domain netCDF utilities.
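
    A minimal example with the later netCDF4 Python interface (the article predates it, mentioning Fortran, C, and planned C++ bindings), showing the name-based rather than file-position access it describes:

        import numpy as np
        from netCDF4 import Dataset

        with Dataset("model_output.nc", "w") as ds:   # machine-independent file
            ds.createDimension("time", None)          # unlimited dimension
            var = ds.createVariable("sea_level", "f4", ("time",))
            var.units = "meters"                      # self-describing metadata
            var[:] = np.array([0.1, 0.3, 0.2], dtype="f4")

        with Dataset("model_output.nc") as ds:
            print(ds["sea_level"].units, ds["sea_level"][:])  # access by name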

  13. Performance regression manager for large scale systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faraj, Daniel A.

    System and computer program product to perform an operation comprising generating, based on a first output generated by a first execution instance of a command, a first output file specifying a value of at least one performance metric, wherein the first output file is formatted according to a predefined format, comparing the value of the at least one performance metric in the first output file to a value of the performance metric in a second output file, the second output file having been generated based on a second output generated by a second execution instance of the command, and outputting for display an indication of a result of the comparison of the value of the at least one performance metric of the first output file to the value of the at least one performance metric of the second output file.

  14. An object oriented fully 3D tomography visual toolkit.

    PubMed

    Agostinelli, S; Paoli, G

    2001-04-01

    In this paper we present a modern object-oriented Component Object Model (COM) C++ toolkit dedicated to fully 3D cone-beam tomography. The toolkit allows the display and visual manipulation of analytical phantoms, projection sets, and volumetric data through a standard Windows graphical user interface. Data input/output is performed using proprietary file formats, but import/export of industry-standard file formats, including raw binary, Windows bitmap and AVI, ACR/NEMA DICOM 3, and NCSA HDF, is available. At the time of writing, built-in data manipulators include a basic phantom ray-tracer and a Matrox Genesis frame-grabbing facility. A COM plug-in interface is provided for user-defined custom backprojector algorithms: a simple Feldkamp ActiveX control, including source code, is provided as an example; our fast Feldkamp plug-in is also available.

  15. Sandbox for Mac Malware v 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walkup, Elizabeth

    This software is an analyzer for automated sandbox analysis of malware on the OS X operating system. It runs inside an OS X virtual machine to collect data about what happens when a given file is opened or run. As of August 2014, there was no sandbox software for Mac OS X malware, as it requires different methods from those used on the Windows OS (which most sandboxes are written for). This software adds OS X analysis capabilities to an existing open-source sandbox, Cuckoo Sandbox (http://cuckoosandbox.org/), which previously only worked for Windows. The analyzer itself can take many different types of files as input: the traditional Mach-O and FAT executables, .app files, zip files, Python scripts, Java archives, and web pages, as well as PDFs and other documents. While the file is running, the analyzer also simulates rudimentary human interaction with clicks and mouse movements in order to bypass the tests some malware use to see if they are being analyzed. The analyzer outputs several different kinds of data: function call traces, network captures, screenshots, and all created and modified files. This work also includes a static analysis Cuckoo module for Mach-O binary files. It extracts file structures, code library imports and exports, and signatures. This data can be used along with the analyzer results to create signatures for malware.

  16. The Use of Binary Search Trees in External Distribution Sorting.

    ERIC Educational Resources Information Center

    Cooper, David; Lynch, Michael F.

    1984-01-01

    Suggests a new method of external distribution called tree partitioning that involves use of a binary tree to split an incoming file into successively smaller partitions for internal sorting. The number of disc accesses during a tree-partitioning sort was calculated in a simulation using files extracted from British National Bibliography catalog files. (19…

  17. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether it be in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers and modelers the data need to be stored in a format that is easily identifiable, accessible and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry standard relational database is comprised of spatial and temporal data tables, shape files and supporting metadata accessible over the network, through a menu driven web-based portal or spatially accessible through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  18. The MODIS reprojection tool

    USGS Publications Warehouse

    Dwyer, John L.; Schmidt, Gail L.; Qu, J.J.; Gao, W.; Kafatos, M.; Murphy , R.E.; Salomonson, V.V.

    2006-01-01

    The MODIS Reprojection Tool (MRT) is designed to help individuals work with MODIS Level-2G, Level-3, and Level-4 land data products. These products are referenced to a global tiling scheme in which each tile is approximately 10° latitude by 10° longitude and non-overlapping (Fig. 9.1). If desired, the user may reproject only selected portions of the product (spatial or parameter subsetting). The software may also be used to convert MODIS products to file formats (generic binary and GeoTIFF) that are more readily compatible with existing software packages. The MODIS land products distributed by the Land Processes Distributed Active Archive Center (LP DAAC) are in the Hierarchical Data Format - Earth Observing System (HDF-EOS), developed by the National Center for Supercomputing Applications at the University of Illinois at Urbana Champaign for the NASA EOS Program. Each HDF-EOS file is comprised of one or more science data sets (SDSs) corresponding to geophysical or biophysical parameters. Metadata are embedded in the HDF file as well as contained in a .met file that is associated with each HDF-EOS file. The MRT supports 8-bit, 16-bit, and 32-bit integer data (both signed and unsigned), as well as 32-bit float data. The data type of the output is the same as the data type of each corresponding input SDS.

  19. NASA LaRC FIB Multi-Channel Anemometry Recording System-User's Manual. [conducted at the Langley Low-Turbulence Pressure Tunnel]

    NASA Technical Reports Server (NTRS)

    Johnson, Sherylene (Compiler); Bertelrud, Arild (Compiler); Anders, J. B. (Technical Monitor)

    2002-01-01

    This report is part of a series of reports describing a flow physics high-lift experiment conducted in NASA Langley Research Center's Low-Turbulence Pressure Tunnel (LTPT) in 1996. The anemometry system used in the experiment was originally designed for and used in flight tests with NASA's Boeing 737 airplane. Information that may be useful in the evaluation or use of the experimental data has been compiled. The report also contains details regarding record structure, how to read the embedded time code, as well as the output file formats used in the code reading the binary data.

  20. Regulated dc-to-dc converter for voltage step-up or step-down with input-output isolation

    NASA Technical Reports Server (NTRS)

    Feng, S. Y.; Wilson, T. G. (Inventor)

    1973-01-01

    A closed loop regulated dc-to-dc converter employing an unregulated two winding inductive energy storage converter is provided by using a magnetically coupled multivibrator acting as duty cycle generator to drive the converter. The multivibrator is comprised of two transistor switches and a saturable transformer. The output of the converter is compared with a reference in a comparator which transmits a binary zero until the output exceeds the reference. When the output exceeds the reference, the binary output of the comparator drives transistor switches to turn the multivibrator off. The multivibrator is unbalanced so that a predetermined transistor will always turn on first when the binary feedback signal becomes zero.

  1. Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine

    DOE Data Explorer

    Liu, Xiaobing

    2016-09-21

    This calculation engine estimates the technoeconomic feasibility of transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files are in the same folder as geotool.exe. To use the tool, the input file containing adequate information for the case should be prepared in the format explained below, and the input file should be put into the same folder as geotool.exe. Then geotool.exe can be executed, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.

  2. Bin-Carver: Automatic Recovery of Binary Executable Files

    DTIC Science & Technology

    2012-05-01

    Texas A&M University, Department of Computer Science and Engineering, College Station, TX, 77840. Excerpt: "... least 23 4K data blocks) and observed how this binary file gets organized in a brand new disk. We found that this simple ls file actually gets ..."

  3. Production and Injection data for NV Binary facilities

    DOE Data Explorer

    Mines, Greg

    2013-12-24

    Excel files are provided with well production and injection data for binary facilities in Nevada. The files contain the data that are reported monthly to the Nevada Bureau of Mines and Geology (NBMG) by the facility operators. This data has been compiled into Excel spreadsheets for each of the facilities given on the NBMG web site.

  4. Binary power multiplier for electromagnetic energy

    DOEpatents

    Farkas, Zoltan D.

    1988-01-01

    A technique for converting electromagnetic pulses to higher power amplitude and shorter duration, in binary multiples, splits an input pulse into two channels, and subjects the pulses in the two channels to a number of binary pulse compression operations. Each pulse compression operation entails combining the pulses in both input channels and selectively steering the combined power to one output channel during the leading half of the pulses and to the other output channel during the trailing half of the pulses, and then delaying the pulse in the first output channel by an amount equal to half the initial pulse duration. Apparatus for carrying out each of the binary multiplication operations preferably includes a four-port coupler (such as a 3 dB hybrid), which operates on power inputs at a pair of input ports by directing the combined power to either of a pair of output ports, depending on the relative phase of the inputs. Therefore, by appropriately phase coding the pulses prior to any of the pulse compression stages, the entire pulse compression (with associated binary power multiplication) can be carried out solely with passive elements.

  5. Binary to Octal and Octal to Binary Code Converter Using Mach-Zehnder Interferometer for High Speed Communication

    NASA Astrophysics Data System (ADS)

    Pal, Amrindra; Kumar, Santosh; Sharma, Sandeep

    2017-05-01

    A binary-to-octal and octal-to-binary code converter is a device that allows placing digital information from many inputs to many outputs. Any combinational logic circuit application can be implemented by using external gates. In this paper, a binary-to-octal and octal-to-binary code converter is proposed using the electro-optic effect inside lithium-niobate-based Mach-Zehnder interferometers (MZIs). The MZI structures have a powerful capability for switching an optical input signal to a desired output port. The paper presents a mathematical description of the proposed device, followed by simulation using MATLAB. The study is verified using the beam propagation method (BPM).
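
    Independent of the optical implementation, the logical mapping such a converter realizes is ordinary radix conversion, which is easy to state in a few lines of Python:

        def binary_to_octal(bits):
            # Each group of three bits (from the right) is one octal digit.
            return oct(int(bits, 2))[2:]

        def octal_to_binary(digits):
            return bin(int(digits, 8))[2:]

        assert binary_to_octal("101110") == "56"
        assert octal_to_binary("56") == "101110"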

  6. A user-friendly application for the extraction of Kubios HRV output to an optimal format for statistical analysis - biomed 2011.

    PubMed

    Johnsen Lind, Andreas; Helge Johnsen, Bjorn; Hill, Labarron K; Sollers Iii, John J; Thayer, Julian F

    2011-01-01

    The aim of the present manuscript is to present a user-friendly and flexible platform for transforming Kubios HRV output files to the .xls file format used by MS Excel. The program utilizes either native or bundled Java and is platform-independent and mobile, meaning that it can run without being installed on a computer. It also has an option for continuous transfer of data, so that it can run in the background while Kubios produces output files. The program checks for changes in the file structure and automatically updates the .xls output file.

  7. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  8. Handling Input and Output for COAMPS

    NASA Technical Reports Server (NTRS)

    Fitzpatrick, Patrick; Tran, Nam; Li, Yongzuo; Anantharaj, Valentine

    2007-01-01

    Two suites of software have been developed to handle the input and output of the Coupled Ocean Atmosphere Prediction System (COAMPS), which is a regional atmospheric model developed by the Navy for simulating and predicting weather. Typically, the initial and boundary conditions for COAMPS are provided by a flat-file representation of the Navy's global model. Additional algorithms are needed for running the COAMPS software using global models. One of the present suites satisfies this need for running COAMPS using the Global Forecast System (GFS) model of the National Oceanic and Atmospheric Administration. The first step in running COAMPS, downloading GFS data from an Internet file-transfer-protocol (FTP) server computer of the National Centers for Environmental Prediction (NCEP), is performed by one of the programs (SSC-00273) in this suite. The GFS data, which are in gridded binary (GRIB) format, are then changed to a COAMPS-compatible format by another program in the suite (SSC-00278). Once a forecast is complete, still another program in the suite (SSC-00274) sends the output data to a different server computer. The second suite of software (SSC-00275) addresses the need to ingest up-to-date land-use-and-land-cover (LULC) data into COAMPS for use in specifying typical climatological values of such surface parameters as albedo, aerodynamic roughness, and ground wetness. This suite includes (1) a program to process LULC data derived from observations by the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Terra and Aqua satellites, (2) programs to derive new climatological parameters for the 17-land-use-category MODIS data; and (3) a modified version of a FORTRAN subroutine to be used by COAMPS. The MODIS data files are processed to reformat them into a compressed American Standard Code for Information Interchange (ASCII) format used by COAMPS for efficient processing.

  9. A national study of efficiency for dialysis centers: an examination of market competition and facility characteristics for production of multiple dialysis outputs.

    PubMed

    Ozgen, Hacer; Ozcan, Yasar A

    2002-06-01

    To examine market competition and facility characteristics that can be related to technical efficiency in the production of multiple dialysis outputs from the perspective of the industrial organization model. Freestanding dialysis facilities that operated in 1997 submitted cost report forms to the Health Care Financing Administration (HCFA), and offered all three outputs--outpatient dialysis, dialysis training, and home program dialysis. The Independent Renal Facility Cost Report Data file (IRFCRD) from HCFA was utilized to obtain information on output and input variables and market and facility features for 791 multiple-output facilities. Information regarding population characteristics was obtained from the Area Resources File. Cross-sectional data for the year 1997 were utilized to obtain facility-specific technical efficiency scores estimated through Data Envelopment Analysis (DEA). A binary variable of efficiency status was then regressed against its market and facility characteristics and control factors in a multivariate logistic regression analysis. The majority of the facilities in the sample are functioning technically inefficiently. Neither the intensity of market competition nor a policy of dialyzer reuse has a significant effect on the facilities' efficiency. Technical efficiency is significantly associated, however, with type of ownership, with the interaction between the market concentration of for-profits and ownership type, and with affiliations with chains of different sizes. Nonprofit and government-owned facilities are more likely than their for-profit counterparts to become inefficient producers of renal dialysis outputs. On the other hand, that relationship between ownership form and efficiency is reversed as the market concentration of for-profits in a given market increases. Facilities that are members of large chains are more likely to be technically inefficient. Facilities do not appear to benefit from joint production of a variety of dialysis outputs, which may explain the ongoing tendency toward single-output production. Ownership form does make a positive difference in production efficiency, but only in local markets where competition exists between nonprofit and for-profit facilities. The increasing inefficiency associated with membership in large chains suggests that the growing consolidation in the dialysis industry may not, in fact, be the strategy for attaining more technical efficiency in the production of multiple dialysis outputs.
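
    The second-stage analysis described here, a binary efficiency flag regressed on facility and market characteristics, is a standard logistic regression. A minimal sketch with statsmodels, using invented column names, might look like:

    ```python
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical facility-level data; column names are illustrative only.
    df = pd.read_csv("dea_results.csv")
    X = sm.add_constant(df[["for_profit", "chain_size", "market_concentration"]])
    model = sm.Logit(df["efficient"], X).fit()
    print(model.summary())
    ```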

  10. NoSQL: collection document and cloud by using a dynamic web query form

    NASA Astrophysics Data System (ADS)

    Abdalla, Hemn B.; Lin, Jinzhao; Li, Guoquan

    2015-07-01

    MongoDB (from "humongous") is an open-source document database and the leading NoSQL database. NoSQL ("Not Only SQL") refers to a class of next-generation databases that are non-relational, open-source, and horizontally scalable, providing a mechanism for the storage and retrieval of documents. Previously, we stored and retrieved data using SQL queries; here we use MongoDB, meaning we do not rely on MySQL or SQL queries. Documents are imported directly into a drive (folder) and retrieved from it without SQL queries, using Java's BufferedReader and BufferedWriter classes: a BufferedReader to import document files into the folder, and a BufferedWriter to retrieve them from it. We also provide security for the stored files: if documents were stored in a plain local folder, anyone could view or modify them. To prevent this, the original document files are converted to another format; in this paper, a binary format is used. After the documents are converted to binary and stored in a folder, the storage system issues a private key for accessing each file. If any user tries to open a document file, its contents appear only in binary format; the file's owner can view the original format by using the personal key received as a secret key from the cloud.
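
    A minimal Python sketch of the core idea, storing a file's raw bytes as a binary field in MongoDB (using pymongo; the database and collection names are placeholders, and the key-based access control described above is omitted):

    ```python
    from bson.binary import Binary
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    docs = client["demo_db"]["documents"]          # placeholder names

    def store_file(path: str) -> None:
        """Store a file's raw bytes as a binary MongoDB field."""
        with open(path, "rb") as f:
            docs.insert_one({"name": path, "data": Binary(f.read())})

    def load_file(name: str, out_path: str) -> None:
        """Retrieve the stored bytes and write them back to disk."""
        doc = docs.find_one({"name": name})
        with open(out_path, "wb") as f:
            f.write(doc["data"])
    ```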

  11. ListingAnalyst: A program for analyzing the main output file from MODFLOW

    USGS Publications Warehouse

    Winston, Richard B.; Paulinski, Scott

    2014-01-01

    ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
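
    The error-gathering feature amounts to scanning a large text file for flagged lines; a simplified Python equivalent (not ListingAnalyst's actual implementation) is:

    ```python
    def collect_messages(listing_path: str):
        """Collect ERROR/WARNING lines from a MODFLOW listing file."""
        errors, warnings = [], []
        with open(listing_path, errors="replace") as f:
            for lineno, line in enumerate(f, start=1):
                text = line.strip()
                if "ERROR" in text.upper():
                    errors.append((lineno, text))
                elif "WARNING" in text.upper():
                    warnings.append((lineno, text))
        return errors, warnings

    errs, warns = collect_messages("model.lst")
    print(f"{len(errs)} errors, {len(warns)} warnings")
    ```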

  12. Software system for data management and distributed processing of multichannel biomedical signals.

    PubMed

    Franaszczuk, P J; Jouny, C C

    2004-01-01

    The presented software is designed for efficient utilization of a cluster of PC computers for signal analysis of multichannel physiological data. The system consists of three main components: 1) a library of input and output procedures, 2) a database storing additional information about data location in the storage system, and 3) a user interface for selecting data for analysis, choosing programs for analysis, and distributing computing and output data on cluster nodes. The system allows for processing multichannel time series data in multiple binary formats. Descriptions of the data format, channels, and time of recording are included in separate text files. Definition and selection of multiple channel montages is possible. Epochs for analysis can be selected both manually and automatically. Implementation of new signal processing procedures is possible with minimal programming overhead for the input/output processing and user interface. The number of nodes in the cluster used for computations and the amount of storage can be changed with no major modification to the software. Current implementations include time-frequency analysis of multiday, multichannel recordings of intracranial EEG of epileptic patients as well as evoked response analyses of repeated cognitive tasks.

  13. Binary CMOS image sensor with a gate/body-tied MOSFET-type photodetector for high-speed operation

    NASA Astrophysics Data System (ADS)

    Choi, Byoung-Soo; Jo, Sung-Hyun; Bae, Myunghan; Kim, Sang-Hwan; Shin, Jang-Kyoo

    2016-05-01

    In this paper, a binary complementary metal oxide semiconductor (CMOS) image sensor with a gate/body-tied (GBT) metal oxide semiconductor field effect transistor (MOSFET)-type photodetector is presented. The sensitivity of the GBT MOSFET-type photodetector, which was fabricated using the standard CMOS 0.35-μm process, is higher than the sensitivity of the p-n junction photodiode, because the output signal of the photodetector is amplified by the MOSFET. A binary image sensor becomes more efficient when using this photodetector. Lower power consumption and higher operating speeds are possible, compared to conventional image sensors using multi-bit analog-to-digital converters (ADCs). The frame rate of the proposed image sensor is over 2000 frames per second, which is higher than those of conventional CMOS image sensors. The output signal of an active pixel sensor is applied to a comparator and compared with a reference level; this comparison determines the 1-bit output data of the binary process. To obtain a video signal, the 1-bit output data is stored in memory and read out by horizontal scanning. The proposed chip is composed of a GBT pixel array (144 × 100), a binary-process circuit, a vertical scanner, a horizontal scanner, and a readout circuit. The operation mode can be selected between binary mode and multi-bit mode.
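
    The binary readout stage is, in effect, a per-pixel comparator. In software terms (a numpy illustration of the thresholding concept, not the chip's circuitry):

    ```python
    import numpy as np

    def binarize_frame(frame: np.ndarray, reference: float) -> np.ndarray:
        """Compare each pixel against a reference level, yielding 1-bit output."""
        return (frame > reference).astype(np.uint8)

    frame = np.random.default_rng(0).random((100, 144))  # stand-in for a 144 x 100 pixel array
    bits = binarize_frame(frame, reference=0.5)
    print(bits.mean())  # fraction of pixels above the reference level
    ```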

  14. libprofit: Image creation from luminosity profiles

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D.; Tobar, R.

    2016-12-01

    libprofit is a C++ library for image creation based on different luminosity profiles. It offers fast and accurate two-dimensional integration for a useful number of profiles, including Sersic, Core-Sersic, broken-exponential, Ferrer, Moffat, empirical King, point-source and sky, with a simple mechanism for adding new profiles. libprofit provides a utility to read the model and profile parameters from the command-line and generate the corresponding image. It can output the resulting image as text values, a binary stream, or as a simple FITS file. It also provides a shared library exposing an API that can be used by any third-party application. R and Python interfaces are available: ProFit (ascl:1612.004) and PyProfit (ascl:1612.005).
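
    As a flavor of what such a library computes, here is a crude numpy evaluation of a Sersic surface-brightness profile on a pixel grid (an illustration only; libprofit's actual two-dimensional integration is far more careful, particularly near the profile center):

    ```python
    import numpy as np

    def sersic_image(shape, xc, yc, re, n, ie=1.0):
        """Evaluate I(r) = Ie * exp(-b * ((r/Re)**(1/n) - 1)) on a pixel grid."""
        b = 2.0 * n - 1.0 / 3.0                      # common approximation for b_n
        y, x = np.indices(shape)
        r = np.hypot(x - xc, y - yc)
        return ie * np.exp(-b * ((r / re) ** (1.0 / n) - 1.0))

    img = sersic_image((128, 128), xc=64, yc=64, re=10.0, n=4.0)
    ```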

  15. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  16. A National Study of Efficiency for Dialysis Centers: An Examination of Market Competition and Facility Characteristics for Production of Multiple Dialysis Outputs

    PubMed Central

    Ozgen, Hacer; A. Ozcan, Yasar

    2002-01-01

    Objective To examine market competition and facility characteristics that can be related to technical efficiency in the production of multiple dialysis outputs from the perspective of the industrial organization model. Study Setting Freestanding dialysis facilities that operated in 1997 submitted cost report forms to the Health Care Financing Administration (HCFA), and offered all three outputs—outpatient dialysis, dialysis training, and home program dialysis. Data Sources The Independent Renal Facility Cost Report Data file (IRFCRD) from HCFA was utilized to obtain information on output and input variables and market and facility features for 791 multiple-output facilities. Information regarding population characteristics was obtained from the Area Resources File. Study Design Cross-sectional data for the year 1997 were utilized to obtain facility-specific technical efficiency scores estimated through Data Envelopment Analysis (DEA). A binary variable of efficiency status was then regressed against its market and facility characteristics and control factors in a multivariate logistic regression analysis. Principal Findings The majority of the facilities in the sample are functioning technically inefficiently. Neither the intensity of market competition nor a policy of dialyzer reuse has a significant effect on the facilities' efficiency. Technical efficiency is significantly associated, however, with type of ownership, with the interaction between the market concentration of for-profits and ownership type, and with affiliations with chains of different sizes. Nonprofit and government-owned facilities are more likely than their for-profit counterparts to become inefficient producers of renal dialysis outputs. On the other hand, that relationship between ownership form and efficiency is reversed as the market concentration of for-profits in a given market increases. Facilities that are members of large chains are more likely to be technically inefficient. Conclusions Facilities do not appear to benefit from joint production of a variety of dialysis outputs, which may explain the ongoing tendency toward single-output production. Ownership form does make a positive difference in production efficiency, but only in local markets where competition exists between nonprofit and for-profit facilities. The increasing inefficiency associated with membership in large chains suggests that the growing consolidation in the dialysis industry may not, in fact, be the strategy for attaining more technical efficiency in the production of multiple dialysis outputs. PMID:12132602

  17. Obscuration Code with Space Station Applications (Manual)

    DTIC Science & Technology

    1985-12-01

    used to perform this DCL-style command parsing, readers are referred to the VMS documentation concerning the Command Definition Utility (CDU). ... FOR007.DAT; Input echo file: USERI:[RJM.NASJAN5S1].LIS;3 The above examples show the operation of the SET OUTPUT command. Note that the printer file is ... opened using the SET OUTPUT command. The output files can be opened and closed using the SET OUTPUT /ECHOING, /PRINTABLE, /PLOTTABLE commands

  18. Modulator for tone and binary signals. [phase of modulation of tone and binary signals on carrier waves in communication systems

    NASA Technical Reports Server (NTRS)

    Mcchesney, J. R.; Lerner, T.; Fitch, E. J. (Inventor)

    1975-01-01

    Tones and binary information are transmitted as phase variations on a carrier wave of constant amplitude and frequency. The carrier and tones are applied to a balanced modulator for deriving an output signal including a pair of sidebands relative to the carrier. The carrier is phase modulated by a digital signal so that it is + or - 90 deg out of phase with the predetermined phase of the carrier. The carrier is combined in an algebraic summing device with the phase modulated signal and the balanced modulator output signal. The output of the algebraic summing device is hard limited to derive a constant amplitude and frequency signal having very narrow bandwidth requirements. At a receiver, the tones and binary data are detected with a phase locked loop having a voltage controlled oscillator driving a pair of orthogonal detection channels.

  19. Radar - 449MHz - Forks, WA (FKS) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  20. Radar - 449MHz - North Bend, OR (OTH) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  1. Radar - 449MHz - North Bend, OR (OTH) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  2. Radar - 449MHz - Forks, WA (FKS) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  3. Radar - 449MHz - Astoria, OR (AST) - Reviewed Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  4. Radar - 449MHz - Astoria, OR (AST) - Raw Data

    DOE Data Explorer

    Gottas, Daniel

    2018-06-25

    **Winds.** A radar wind profiler measures the Doppler shift of electromagnetic energy scattered back from atmospheric turbulence and hydrometeors along 3-5 vertical and off-vertical point beam directions. Back-scattered signal strength and radial-component velocities are remotely sensed along all beam directions and are combined to derive the horizontal wind field over the radar. These data typically are sampled and averaged hourly and usually have 6-m and/or 100-m vertical resolutions up to 4 km for the 915 MHz and 8 km for the 449 MHz systems. **Temperature.** To measure atmospheric temperature, a radio acoustic sounding system (RASS) is used in conjunction with the wind profiler. These data typically are sampled and averaged for five minutes each hour and have a 60-m vertical resolution up to 1.5 km for the 915 MHz and 60 m up to 3.5 km for the 449 MHz. **Moments and Spectra.** The raw spectra and moments data are available for all dwells along each beam and are stored in daily files. For each day, there are files labeled "header" and "data." These files are generated by the radar data acquisition system (LAP-XM) and are encoded in a proprietary binary format. Values of spectral density at each Doppler velocity (FFT point), as well as the radial velocity, signal-to-noise ratio, and spectral width for the selected signal peak, are included in these files. Attached zip files, *449mhz-spectra-data-extraction.zip* and *449mhz-moment-data-extraction.zip*, include executables to unpack the spectra (GetSpectra32.exe) and moments (GetMomSp32.exe), respectively. Documentation on usage and output file formats is also included in the zip files.

  5. VizieR Online Data Catalog: Parameters of 529 Kepler eclipsing binaries (Kjurkchieva+, 2017)

    NASA Astrophysics Data System (ADS)

    Kjurkchieva, D.; Vasileva, D.; Atanasova, T.

    2017-11-01

    We reviewed the Kepler eclipsing binary catalog (Prsa et al. 2011, Cat. J/AJ/141/83; Slawson et al. 2011, Cat. J/AJ/142/160; Matijevic et al. 2012) to search for detached eclipsing binaries with eccentric orbits. (5 data files).

  6. Adaptive detection of missed text areas in OCR outputs: application to the automatic assessment of OCR quality in mass digitization projects

    NASA Astrophysics Data System (ADS)

    Ben Salah, Ahmed; Ragot, Nicolas; Paquet, Thierry

    2013-01-01

    The French National Library (BnF*) has launched many mass digitization projects in order to give access to its collection. The indexation of digital documents on Gallica (the digital library of the BnF) is done through their textual content, obtained thanks to service providers that use Optical Character Recognition (OCR) software. OCR software has become an increasingly complex system composed of several subsystems dedicated to the analysis and the recognition of the elements in a page. However, the reliability of these systems is always an issue at stake. Indeed, in some cases, errors appear in OCR outputs because of an accumulation of several errors at different levels in the OCR process. One of the frequent errors in OCR outputs is missed text components. The presence of such errors may lead to severe defects in digital libraries. In this paper, we investigate the detection of missed text components to control the OCR results from the collections of the French National Library. Our verification approach uses local information inside the pages, based on Radon transform descriptors and Local Binary Pattern (LBP) descriptors coupled with OCR results to control their consistency. The experimental results show that our method detects 84.15% of the missed textual components, by comparing the OCR ALTO output files (produced by the service providers) to the images of the documents.
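
    Local Binary Patterns are straightforward to compute with standard tooling. A small sketch with scikit-image follows, illustrating the descriptor itself rather than the paper's full verification pipeline:

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray: np.ndarray, points: int = 8, radius: float = 1.0) -> np.ndarray:
        """Compute a normalized histogram of uniform LBP codes for one image patch."""
        codes = local_binary_pattern(gray, points, radius, method="uniform")
        hist, _ = np.histogram(codes, bins=points + 2, range=(0, points + 2), density=True)
        return hist

    patch = np.random.default_rng(1).random((64, 64))  # stand-in for a page region
    print(lbp_histogram(patch))
    ```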

  7. Chapter 21: Programmatic Interfaces - STILTS

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, M. J.

    STILTS is the Starlink Tables Infrastructure Library Tool Set developed by Mark Taylor of the former Starlink Project. STILTS is a command-line tool (see the NVOSS_HOME/bin/stilts command) providing access to the same functionality driving the TOPCAT application and can be run using either the STILTS-specific jar file, or the more general TOPCAT jar file (both are available in the NVOSS_HOME/java/lib directory and are included in the default software environment classpath). The heart of both STILTS and TOPCAT is the STIL Java library. STIL is designed to efficiently handle the input, output and processing of very large tabular datasets and the STILTS task interface makes it an ideal tool for the scripting environment. Multiple formats are supported (including FITS Binary Tables, VOTable, CSV, SQL databases and ASCII, amongst others) and while some tools will generically handle all supported formats, others are specific to the VOTable format. Converting a VOTable to a more script-friendly format is the first thing most users will encounter, but there are many other useful tools as well.
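
    For readers who prefer scripting outside Java, the VOTable-to-CSV conversion mentioned here can also be done in a few lines of Python with astropy (an alternative to STILTS, not part of it):

    ```python
    from astropy.table import Table

    # Read a VOTable and write a script-friendly CSV copy.
    table = Table.read("catalog.vot", format="votable")
    table.write("catalog.csv", format="csv", overwrite=True)
    ```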

  8. VizieR Online Data Catalog: Algorithm for correcting CoRoT raw light curves (Mislis+, 2010)

    NASA Astrophysics Data System (ADS)

    Mislis, D.; Schmitt, J. H. M. M.; Carone, L.; Guenther, E. W.; Patzold, M.

    2010-10-01

    Requirements: gfortran (or g77, ifort) compiler. Input files: the input files should be raw CoRoT txt files (http://idoc-corot.ias.u-psud.fr/index.jsp) with names CoRoT*.txt. Run the CDA by typing C>: ./cda.csh (code and data should be in the same directory). Output files: CDA creates one ASCII output file, named CoRoT*.R.cor, for the R filter (2 data files).

  9. VizieR Online Data Catalog: ND2 rotational spectrum (Melosso+,

    NASA Astrophysics Data System (ADS)

    Melosso, M.; Degli Esposti, C.; Dore, L.

    2018-01-01

    Files used with the SPFIT/SPCAT program suite. There are 8 files of supplementary material, including a ReadMe, which was created by the AAS data editors. The text files are as follows: 1_Explan.txt = information on the content of the other files. 2ND2.fit = the output file of the fit of spectroscopic data used in the present study. 3ND2.lin = the corresponding line file. 4ND2.par = the corresponding parameter file. 5ND2.cat = the output file of the prediction made with the parameters determined in this study. 6ND2.var = the corresponding parameter file. 7ND2.int = the corresponding intensity file (1 data file).

  10. Computer Aided Wirewrap Interconnect.

    DTIC Science & Technology

    1980-11-01

    EMITTER-COUPLED LOGIC (ECL) (180 MHz system clock generated via ring oscillator): clock waveform, synchronous phase 0 output, binary counter, power plane noise (loaded), LSB ... (185 MHz system clock generated via ring oscillator): clock waveform, synchronous phase 0 output, binary counter, power plane noise (loaded) ... high-speed clock signals into logic panels in a multiboard system; on-board clock distribution via fanout

  11. Web-based Toolkit for Dynamic Generation of Data Processors

    NASA Astrophysics Data System (ADS)

    Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.

    2011-12-01

    All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets, and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures; in the next step, they can select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data structures and mappings defined by a user (an author), and allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
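
    A toy version of the core idea, a mapping-driven processor that converts one flat file format to another, might look like the following Python sketch (the field names and unit conversion are invented for illustration):

    ```python
    import csv
    import json

    # User-defined mapping: output field -> function of one input record.
    MAPPING = {
        "site_id":   lambda rec: rec["station"],
        "flow_cms":  lambda rec: float(rec["flow_cfs"]) * 0.0283168,  # cfs -> m3/s
        "timestamp": lambda rec: rec["date"] + "T" + rec["time"],
    }

    def process(csv_in: str, json_out: str) -> None:
        """Convert a CSV file to JSON records under the mapping above."""
        with open(csv_in, newline="") as f:
            records = [{k: fn(row) for k, fn in MAPPING.items()}
                       for row in csv.DictReader(f)]
        with open(json_out, "w") as f:
            json.dump(records, f, indent=2)
    ```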

  12. High-Speed Binary-Output Image Sensor

    NASA Technical Reports Server (NTRS)

    Fossum, Eric; Panicacci, Roger A.; Kemeny, Sabrina E.; Jones, Peter D.

    1996-01-01

    Photodetector outputs digitized by circuitry on same integrated-circuit chip. Developmental special-purpose binary-output image sensor designed to capture up to 1,000 images per second, with resolution greater than 10 to the 6th power pixels per image. Lower-resolution but higher-frame-rate prototype of sensor contains 128 x 128 array of photodiodes on complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. In application for which it is being developed, sensor used to examine helicopter oil to determine whether amount of metal and sand in oil is sufficient to warrant replacement.

  13. PCF File Format.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoreson, Gregory G

    PCF files are binary files designed to contain gamma spectra and neutron count rates from radiation sensors. It is the native format for the GAmma Detector Response and Analysis Software (GADRAS) package [1]. It can contain multiple spectra and information about each spectrum such as energy calibration. This document outlines the format of the file that would allow one to write a computer program to parse and write such files.
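
    The document itself defines the byte-level layout. As a flavor of what a parser looks like, here is a generic Python sketch for a binary file holding counts followed by float32 spectra (the layout shown is hypothetical and is not the actual PCF structure):

    ```python
    import struct
    import numpy as np

    def read_spectra(path: str) -> np.ndarray:
        """Read a toy binary layout: uint32 spectrum count, uint32 channel count,
        then count x channels float32 values. Not the real PCF layout."""
        with open(path, "rb") as f:
            n_spec, n_chan = struct.unpack("<II", f.read(8))
            data = np.fromfile(f, dtype="<f4", count=n_spec * n_chan)
        return data.reshape(n_spec, n_chan)
    ```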

  14. Manual for Getdata Version 3.1: a FORTRAN Utility Program for Time History Data

    NASA Technical Reports Server (NTRS)

    Maine, Richard E.

    1987-01-01

    This report documents version 3.1 of the GetData computer program. GetData is a utility program for manipulating files of time history data, i.e., data giving the values of parameters as functions of time. The most fundamental capability of GetData is extracting selected signals and time segments from an input file and writing the selected data to an output file. Other capabilities include converting file formats, merging data from several input files, time skewing, interpolating to common output times, and generating calculated output signals as functions of the input signals. This report also documents the interface standards for the subroutines used by GetData to read and write the time history files. All interface to the data files is through these subroutines, keeping the main body of GetData independent of the precise details of the file formats. Different file formats can be supported by changes restricted to these subroutines. Other computer programs conforming to the interface standards can call the same subroutines to read and write files in compatible formats.
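
    GetData's most basic operation, pulling selected signals over a time window, looks like this in array terms (a numpy sketch of the concept, unrelated to GetData's FORTRAN internals; the signal names are invented):

    ```python
    import numpy as np

    def extract(time: np.ndarray, signals: dict, names, t0: float, t1: float) -> dict:
        """Return the selected signals restricted to the time window [t0, t1]."""
        mask = (time >= t0) & (time <= t1)
        out = {"time": time[mask]}
        out.update({name: signals[name][mask] for name in names})
        return out

    t = np.linspace(0.0, 100.0, 1001)
    data = {"altitude": np.sin(t), "airspeed": np.cos(t)}
    segment = extract(t, data, ["altitude"], t0=10.0, t1=20.0)
    ```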

  15. VizieR Online Data Catalog: Orbital parameters of 341 new binaries (Murphy+, 2018)

    NASA Astrophysics Data System (ADS)

    Murphy, S. J.; Moe, M.; Kurtz, D. W.; Bedding, T.; Shibahashi, H.; Boffin, H. M. J.

    2018-01-01

    Kepler targets with effective temperatures between 6600 and 10000K have been investigated for pulsational phase modulation that can be attributed to binary orbital motion. For each target, we provide a binary status, which also reflects whether or not the target pulsates. For the binary systems, we provide the Kepler Input Catalogue (KIC) number, as well as the binary orbital elements: the period, semi-major axis, eccentricity, longitude of periastron, time of periastron passage, binary mass function and a calculated radial velocity semi-amplitude. (3 data files).

  16. User guide for MODPATH version 6 - A particle-tracking model for MODFLOW

    USGS Publications Warehouse

    Pollock, David W.

    2012-01-01

    MODPATH is a particle-tracking post-processing model that computes three-dimensional flow paths using output from groundwater flow simulations based on MODFLOW, the U.S. Geological Survey (USGS) finite-difference groundwater flow model. This report documents MODPATH version 6. Previous versions were documented in USGS Open-File Reports 89-381 and 94-464. The program uses a semianalytical particle-tracking scheme that allows an analytical expression of a particle's flow path to be obtained within each finite-difference grid cell. A particle's path is computed by tracking the particle from one cell to the next until it reaches a boundary, an internal sink/source, or satisfies another termination criterion. Data input to MODPATH consists of a combination of MODFLOW input data files, MODFLOW head and flow output files, and other input files specific to MODPATH. Output from MODPATH consists of several output files, including a number of particle coordinate output files intended to serve as input data for other programs that process, analyze, and display the results in various ways. MODPATH is written in FORTRAN and can be compiled by any FORTRAN compiler that fully supports FORTRAN-2003 or by most commercially available FORTRAN-95 compilers that support the major FORTRAN-2003 language extensions.

  17. Risk Assessment Update: Russian Segment

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin

    2012-01-01

    BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstrating and benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. Bumper output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production." The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.

  18. Prototype Focal-Plane-Array Optoelectronic Image Processor

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi; Shaw, Timothy; Yu, Jeffrey

    1995-01-01

    Prototype very-large-scale integrated (VLSI) planar array of optoelectronic processing elements combines speed of optical input and output with flexibility of reconfiguration (programmability) of electronic processing medium. Basic concept of processor described in "Optical-Input, Optical-Output Morphological Processor" (NPO-18174). Performs binary operations on binary (black-and-white) images. Each processing element corresponds to one picture element of image and is located at that picture element. Includes input-plane photodetector in form of parasitic phototransistor that is part of processing circuit. Output of each processing circuit used to modulate one picture element in output-plane liquid-crystal display device. Intended to implement morphological processing algorithms that transform image into set of features suitable for high-level processing; e.g., recognition.

  19. Guide to GFS History File Change on May 1, 2007

    Science.gov Websites

    On May 1, 2007 at 12Z, the GFS had a major change that caused the internal binary GFS history file to change formats. The file is still in spectral space, but pressure is now calculated in a different way. Sometime in the future, the GFS history file may be

  20. Quantitative Microbial Risk Assessment Tutorial: Publishing a Microbial Density Time Series as a Txt File

    EPA Science Inventory

    A SARA Timeseries Utility supports analysis and management of time-varying environmental data, including listing, graphing, computing statistics, computing meteorological data, and saving in a WDM or text file. File formats supported include WDM, HSPF Binary (.hbn), USGS RDB, and T...

  1. MISR HDF-to-Binary Converter and Radiance/BRF Calculation Tools

    Atmospheric Science Data Center

    2013-04-01

    ... to have the HDF and HDF-EOS libraries for the target computer. The HDF libraries are available from The HDF Group (THG). The ... and the HDF-EOS include and library files on the target computer. The following files are included in the distribution tar file for ...

  2. Life and dynamic capacity modeling for aircraft transmissions

    NASA Technical Reports Server (NTRS)

    Savage, Michael

    1991-01-01

    A computer program to simulate the dynamic capacity and life of parallel-shaft aircraft transmissions is presented. Five basic configurations can be analyzed: single mesh, compound, parallel, reverted, and single-plane reductions. In execution, the program prompts the user for the data file prefix name, takes input from an ASCII file, and writes its output to a second ASCII file with the same prefix name. The input data file includes the transmission configuration, the input shaft torque and speed, and descriptions of the transmission geometry and the component gears and bearings. The program output file describes the transmission, its components, their capabilities, locations, and loads. It also lists the dynamic capability, ninety-percent reliability, and mean life of each component and of the transmission as a system. The program, its input and output files, and the theory behind its operation are described.

  3. Integrated Geothermal-CO2 Storage Reservoirs: FY1 Final Report

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  4. Fabrication and Testing of Binary-Phase Fourier Gratings for Nonuniform Array Generation

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Crow, Robert W.; Ashley, Paul R.; Nelson, Tom R., Jr.; Parker, Jack H.; Beecher, Elizabeth A.

    2004-01-01

    This effort describes the fabrication and testing of binary-phase Fourier gratings designed to generate an incoherent array of output source points with nonuniform, user-defined intensities, symmetric about the zeroth order. Like Dammann fanout gratings, these binary-phase Fourier gratings employ only two phase levels to generate a defined output array. Unlike Dammann fanout gratings, they generate an array of nonuniform, user-defined intensities when projected into the far-field regime. The paper describes the process of design, fabrication, and testing for two different versions of the binary-phase grating: one designed for a 12-micron wavelength, referred to as the long-wavelength infrared (LWIR) grating, and one designed for a 5-micron wavelength, referred to as the mid-wavelength infrared (MWIR) grating.

  5. Fortran Program for X-Ray Photoelectron Spectroscopy Data Reformatting

    NASA Technical Reports Server (NTRS)

    Abel, Phillip B.

    1989-01-01

    A FORTRAN program has been written for use on an IBM PC/XT or AT or compatible microcomputer (personal computer, PC) that converts a column of ASCII-format numbers into a binary-format file suitable for interactive analysis on a Digital Equipment Corporation (DEC) computer running the VGS-5000 Enhanced Data Processing (EDP) software package. The incompatible floating-point number representations of the two computers were compared, and a subroutine was created to store floating-point numbers on the IBM PC in a form that can be directly read by the DEC computer. Any file transfer protocol having provision for binary data can be used to transmit the resulting file from the PC to the DEC machine. The data file header required by the EDP programs for an x-ray photoelectron spectrum is also written to the file. The user is prompted for the relevant experimental parameters, which are then properly coded into the format used internally by all of the VGS-5000-series EDP packages.
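
    The tricky part of such a transfer is the float encoding itself. Here is a minimal Python sketch of the IEEE-754-single to DEC VAX F_floating byte conversion, assuming the usual word-swap-plus-exponent-offset relationship between the two formats and ignoring zeros, NaNs, and out-of-range values (this is an illustration of the concept, not the program's actual FORTRAN subroutine):

    ```python
    import struct

    def ieee_to_vax_f(x: float) -> bytes:
        """Encode x as 4 VAX F_floating bytes.

        VAX F uses an exponent bias of 128 and a 0.1m hidden-bit convention, so
        the bit pattern of VAX(x) equals the IEEE-754 single-precision pattern
        of 4*x with the two 16-bit words swapped. Zero, NaN, and overflow
        handling are omitted.
        """
        b = struct.pack("<f", 4.0 * x)          # IEEE single of 4x, little-endian
        return bytes([b[2], b[3], b[0], b[1]])  # swap the 16-bit words

    print(ieee_to_vax_f(1.0).hex())  # expected: 80400000
    ```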

  6. VizieR Online Data Catalog: Seven sdB eclipsing binaries data (Pulley+, 2018)

    NASA Astrophysics Data System (ADS)

    Pulley, D.; Faillace, G.; Smith, D.; Watkins, A.; von Harrach, S.

    2017-11-01

    File tablea1 contains a list of the observatories and equipment used by this group to gather the data. File tablea2 contains all the observations made by this group; it contains timing, filter, and observatory data. File tablea3 contains a list of the systems studied in this paper with the reference stars used in the analysis. (4 data files).

  7. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides a text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts, utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.

  8. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy.

    PubMed

    Knijnenburg, Theo A; Klau, Gunnar W; Iorio, Francesco; Garnett, Mathew J; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F A

    2016-11-23

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present 'Logic Optimization for Binary Input to Continuous Output' (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models.
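
    The kind of model LOBICO infers is tiny by design, and evaluating one takes only a few lines. The following numpy illustration uses invented feature names and is not LOBICO's inference code:

    ```python
    import numpy as np

    # Binary mutation features for a small panel of cell lines (invented example).
    mut_A = np.array([1, 0, 1, 1, 0], dtype=bool)
    mut_B = np.array([0, 0, 1, 1, 1], dtype=bool)
    mut_C = np.array([1, 1, 0, 1, 0], dtype=bool)

    # An inferred logic model might read: sensitive if (A AND B) OR (NOT C).
    predicted_sensitive = (mut_A & mut_B) | ~mut_C

    print(predicted_sensitive.astype(int))  # [0 0 1 1 1]
    ```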

  9. Improving the Taiwan Military’s Disaster Relief Response to Typhoons

    DTIC Science & Technology

    2015-06-01

    circulation, are mostly westbound. When they reach the vicinity of Taiwan or the Philippines, which are always at the edge of the Pacific subtropical high ... files from the POM base case model, one set for each design point. To automate the process of running all the GAMS files, a Windows batch file (BAT) ... is used to call on GAMS to solve each version of the model. The BAT file creates a new directory for each run to hold output, and one of the outputs

  10. The Lagrangian particle dispersion model FLEXPART version 10

    NASA Astrophysics Data System (ADS)

    Pisso, Ignacio; Sollum, Espen; Grythe, Henrik; Kristiansen, Nina; Cassiani, Massimo; Eckhardt, Sabine; Thompson, Rona; Groot Zwaaftnik, Christine; Evangeliou, Nikolaos; Hamburger, Thomas; Sodemann, Harald; Haimberger, Leopold; Henne, Stephan; Brunner, Dominik; Burkhart, John; Fouilloux, Anne; Fang, Xuekun; Phillip, Anne; Seibert, Petra; Stohl, Andreas

    2017-04-01

    The Lagrangian particle dispersion model FLEXPART was, in its first original release in 1998, designed for calculating the long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. The model has now evolved into a comprehensive tool for atmospheric transport modelling and analysis. Its application fields have been extended to a range of atmospheric transport processes for both atmospheric gases and aerosols, e.g. greenhouse gases, short-lived climate forcers like black carbon, volcanic ash and gases, as well as studies of the water cycle. We present the newest release, FLEXPART version 10. Since the last publication fully describing FLEXPART (version 6.2), the model code has been parallelised in order to allow for the possibility to speed up computation. A new, more detailed gravitational settling parametrisation for aerosols was implemented, and the wet deposition scheme for aerosols has been heavily modified and updated to provide a more accurate representation of this physical process. In addition, an optional new turbulence scheme for the convective boundary layer is available that considers the skewness in the vertical velocity distribution. Also, temporal variation and temperature dependence of the OH reaction are included. Finally, user input files have been updated to a more convenient and user-friendly namelist format, and the option to produce the output files in netCDF format instead of binary format has been implemented. We present these new developments and show recent model applications. Moreover, we also introduce some tools for the preparation of the meteorological input data, as well as for the processing of FLEXPART output data.

  11. VizieR Online Data Catalog: Statistical test on binary stars non-coevality (Valle+, 2016)

    NASA Astrophysics Data System (ADS)

    Valle, G.; Dell'Omodarme, M.; Valle, G.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2016-01-01

    The table contains the W0.95 critical values for the 1087 binary systems considered in the paper. The table also lists the parameters of the beta distributions approximating the empirical W distributions. (1 data file).

  12. Optimizing Input/Output Using Adaptive File System Policies

    NASA Technical Reports Server (NTRS)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.

  13. Role Discovery in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

    RolX takes the features from Re-FeX or any other feature matrix as input and outputs role assignments (clusters). The output of RolX is a csv file containing the node-role memberships and a csv file containing the role-feature definitions.

  14. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2012-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  15. Active Management of Integrated Geothermal-CO2 Storage Reservoirs in Sedimentary Formations

    DOE Data Explorer

    Buscheck, Thomas A.

    2000-01-01

    Active Management of Integrated Geothermal–CO2 Storage Reservoirs in Sedimentary Formations: An Approach to Improve Energy Recovery and Mitigate Risk: FY1 Final Report. The purpose of phase 1 is to determine the feasibility of integrating geologic CO2 storage (GCS) with geothermal energy production. Phase 1 includes reservoir analyses to determine injector/producer well schemes that balance the generation of economically useful flow rates at the producers with the need to manage reservoir overpressure to reduce the risks associated with overpressure, such as induced seismicity and CO2 leakage to overlying aquifers. This submittal contains input and output files of the reservoir model analyses. A reservoir-model "index-html" file was sent in a previous submittal to organize the reservoir-model input and output files according to the sections of the FY1 Final Report to which they pertain. The recipient should save the file Reservoir-models-inputs-outputs-index.html in the same directory in which the Section2.1.*.tar.gz files are saved.

  16. A Multiple-star Combined Solution Program - Application to the Population II Binary μ Cas

    NASA Astrophysics Data System (ADS)

    Gudehus, D. H.

    2001-05-01

    A multiple-star combined-solution computer program which can simultaneously fit astrometric, speckle, and spectroscopic data, and solve for the orbital parameters, parallax, proper motion, and masses has been written and is now publicly available. Some features of the program are the ability to scale the weights at run time, hold selected parameters constant, handle up to five spectroscopic subcomponents for the primary and the secondary each, account for the light travel time across the system, account for apsidal motion, plot the results, and write the residuals in position to a standard file for further analysis. The spectroscopic subcomponent data can be represented by reflex velocities and/or by independent measurements. A companion editing program which can manage the data files is included in the package. The program has been applied to the Population II binary μ Cas to derive improved masses and an estimate of the primordial helium abundance. The source code, executables, sample data files, and documentation for OpenVMS and Unix, including Linux, are available at http://www.chara.gsu.edu/~gudehus/binary.html.

  17. A New Compression Method for FITS Tables

    NASA Technical Reports Server (NTRS)

    Pence, William; Seaman, Rob; White, Richard L.

    2010-01-01

    As the size and number of FITS binary tables generated by astronomical observatories increases, so does the need for a more efficient compression method to reduce the amount of disk space and network bandwidth required to archive and download the data tables. We have developed a new compression method for FITS binary tables that is modeled after the FITS tiled-image compression convention that has been in use for the past decade. Tests of this new method on a sample of FITS binary tables from a variety of current missions show that on average this new compression technique saves about 50% more disk space than when simply compressing the whole FITS file with gzip. Other advantages of this method are (1) the compressed FITS table is itself a valid FITS table, (2) the FITS headers remain uncompressed, thus allowing rapid read and write access to the keyword values, and (3) in the common case where the FITS file contains multiple tables, each table is compressed separately and may be accessed without having to uncompress the whole file.
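
    The toy sketch below illustrates the intuition behind column-oriented table compression rather than the convention itself: compressing each column of a FITS binary table separately (so that homogeneous values sit together) typically beats gzipping the row-interleaved table as a whole. The column names and sizes are invented.

        import gzip, zlib
        import numpy as np
        from astropy.io import fits

        # Toy table: a smooth float column and a low-cardinality integer column.
        n = 100_000
        col_a = fits.Column(name="TIME", format="D", array=np.linspace(0.0, 1.0, n))
        col_b = fits.Column(name="FLAG", format="J", array=np.random.randint(0, 4, n))
        hdu = fits.BinTableHDU.from_columns([col_a, col_b])

        # Baseline: gzip the row-major table bytes as one stream.
        whole = len(gzip.compress(hdu.data.tobytes()))

        # Column-wise: compress each column's bytes separately.
        per_col = sum(len(zlib.compress(np.ascontiguousarray(hdu.data[name]).tobytes()))
                      for name in ("TIME", "FLAG"))
        print(f"whole-table gzip: {whole} bytes, column-wise: {per_col} bytes")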

  18. VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)

    NASA Astrophysics Data System (ADS)

    Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.

    2000-02-01

    (1) Primary data files, stages.zz. These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz=06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI) where T is temperature, Ne is electron density and CHI is an abundance multiplier. The files include data for ionisation fractions, for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f.
    (2) Code add.f. This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for names of input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which could be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper. The user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET, averaged over ionisation stages.
    (3) Files acc.zz. Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if it is required to change the subroutines WEIGHTS or ZETA. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f.
    (4) Code accfit.f. This code gives radiative accelerations, and some related data, for a stellar model. Methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance-multiplier CHI on the output file will generally be finer than that used in the input files acc.zz. The mesh to be used is specified in a file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff=10000 K, LOG10(g)=4.2). The output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f.
    (5) Code diff.f. This code reads the output file (e.g. acc100004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of radiative accelerations, the quantity ZETBAR required for calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data. It creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations.
    (1 data file).
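
    As a heavily hedged sketch of what the equal-weight averaging in add.f might amount to (one plausible reading of the description above; the actual formula is defined in the paper ACC), a stage-averaged acceleration can be formed by weighting each stage's GRAD by its ionisation fraction. All numbers below are made up.

        import numpy as np

        # Made-up values at one (T, Ne) point for an element with four relevant stages.
        grad = np.array([1.2e2, 3.4e2, 2.1e2, 0.8e2])   # radiative acceleration per stage
        frac = np.array([0.05, 0.60, 0.30, 0.05])       # ionisation fractions (sum to 1)
        weights = np.ones_like(grad)                    # equal weights, as in the default add.f

        gbar = np.sum(weights * frac * grad) / np.sum(weights * frac)
        print(f"stage-averaged acceleration: {gbar:.1f}")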

  19. Second catalog of interferometric measurements of binary stars (McAlister and Hartkopf 1988): Documentation for the machine-readable version

    NASA Technical Reports Server (NTRS)

    Warren, Wayne H., Jr.

    1989-01-01

    The machine-readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The catalog is a compilation of measurements of binary- and multiple-star systems obtained by speckle interferometric techniques; this version supersedes a previous edition of the catalog published in 1985. Stars that have been examined for multiplicity with negative results are included, in which case upper limits for the separation are given. The second version is expanded from the first in that a file of newly resolved systems and six cross-index files of alternate designations are included. The data file contains alternate identifications for the observed systems, epochs of observation, reported errors in position angles and separation, and bibliographical references.

  20. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284 ● JAN 2018. US Army Research Laboratory. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  1. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  2. Perl-speaks-NONMEM (PsN)--a Perl module for NONMEM related programming.

    PubMed

    Lindbom, Lars; Ribbing, Jakob; Jonsson, E Niclas

    2004-08-01

    The NONMEM program is the most widely used nonlinear regression software in population pharmacokinetic/pharmacodynamic (PK/PD) analyses. In this article we describe a programming library, Perl-speaks-NONMEM (PsN), intended for programmers who aim to use the computational capability of NONMEM in external applications. The library is object oriented and written in the programming language Perl. The classes of the library are built around NONMEM's data, model and output files. The specification of the NONMEM model is easily set or changed through the model and data file classes, while the output from a model fit is accessed through the output file class. The classes have methods that help the programmer perform common repetitive tasks, e.g. summarising the output from a NONMEM run, setting the initial estimates of a model based on a previous run or truncating values over a certain threshold in the data file. PsN creates a basis for the development of high-level software using NONMEM as the regression tool.

  3. Natural Resource Information System. Volume 2: System operating procedures and instructions

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A total computer software system description is provided for the prototype Natural Resource Information System designed to store, process, and display data of maximum usefulness to land management decision making. Program modules are described, as are the computer file design, file updating methods, digitizing process, and paper tape conversion to magnetic tape. Operating instructions for the system, data output, printed output, and graphic output are also discussed.

  4. Attitude profile design program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Attitude Profile Design (APD) Program was designed to be used as a stand-alone addition to the Simplex Computation of Optimum Orbital Trajectories (SCOOT). The program uses information from a SCOOT output file and the user defined attitude profile to produce time histories of attitude, angular body rates, and accelerations. The APD program is written in standard FORTRAN77 and should be portable to any machine that has an appropriate compiler. The input and output are through formatted files. The program reads the basic flight data, such as the states of the vehicles, acceleration profiles, and burn information, from the SCOOT output file. The user inputs information about the desired attitude profile during coasts in a high level manner. The program then takes these high level commands and executes the maneuvers, outputting the desired information.

  5. Orion Script Generator

    NASA Technical Reports Server (NTRS)

    Dooling, Robert J.

    2012-01-01

    NASA Engineering's Orion Script Generator (OSG) is a program designed to run on Exploration Flight Test One Software. The script generator creates a SuperScript file that, when run, accepts the filename for a listing of Compact Unique Identifiers (CUIs). These CUIs will correspond to different variables on the Orion spacecraft, such as the temperature of a component X, the active or inactive status of another component Y, and so on. OSG will use a linked database to retrieve the value for each CUI, such as "100 05," "True," and so on. Finally, OSG writes SuperScript code to display each of these variables before outputting the ssi file that allows recipients to view a graphical representation of Orion Flight Test One's status through these variables. This project's main challenge was creating flexible software that accepts and transfers many types of data, from Boolean (true or false) values to "Unsigned Long Long" values (any number from 0 to 18,446,744,073,709,551,615). We also needed to allow bit manipulation for each variable, requiring us to program functions that could convert any of the multiple types of data into binary code. Throughout the project, we explored different methods to optimize the speed of working with the CUI database and long binary numbers. For example, the program handled extended binary numbers much more efficiently when we stored them as collections of Boolean values (true or false representing 1 or 0) instead of as collections of character strings or numbers. We also strove to make OSG as user-friendly and accommodating of different needs as possible: its default behavior is to display a current CUI's maximum value and minimum value with three to five intermediate values in between, all in descending order. Fortunately, users can also add other input on the same lines as each CUI name to request different high values, low values, display options (ascending, sine, and so on), and interval sizes for generating intermediate values. Developing input validation took up quite a bit of time, but OSG's flexibility in the end was worth it.
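
    A small sketch of the Boolean-list representation described above; the function names are ours, not OSG's.

        def to_bool_bits(value, width=64):
            """Unpack an unsigned integer into a list of booleans, MSB first."""
            return [bool((value >> i) & 1) for i in reversed(range(width))]

        def from_bool_bits(bits):
            """Repack a list of booleans into the integer they represent."""
            out = 0
            for b in bits:
                out = (out << 1) | int(b)
            return out

        bits = to_bool_bits(18_446_744_073_709_551_615)   # max unsigned long long
        assert from_bool_bits(bits) == 18_446_744_073_709_551_615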

  6. VizieR Online Data Catalog: FADO code (Gomes+, 2017)

    NASA Astrophysics Data System (ADS)

    Gomes, J. M.; Papaderos, P.

    2017-03-01

    FADO comes from the Latin word "fatum" that means fate or destiny. It is also a well known genre of Portuguese music, and by choosing this acronym for this spectral synthesis tool we would like to pay tribute to Portugal. The main goal of FADO is to explore the star-formation and chemical enrichment history (the "Fado") of galaxies based on two hitherto unique elements in spectral fitting models: a) self-consistency between the best-fitting star formation history (SFH) and the nebular characteristics of a galaxy (e.g., hydrogen Balmer-line luminosities and equivalent widths; shape of the nebular continuum, including the Balmer and Paschen discontinuity) and b) genetic optimization and artificial intelligence algorithms. This document is part of the FADO v.1 distribution package, which contains two different ascii files, ReadMe and Read_F, and one tarball archive FADOv1.tar.gz. FADOv1.tar.gz contains the binary (executable) compiled in both OpenSuSE 13.2 64bit LINUX (FADO) and MAC OS X (FADO_MACOSX). The former is compatible with most LINUX distributions, while the latter was only tested for Yosemite 10.10.3. It contains the configuration files for running FADO: FADO.config and PLOT.config, as well as the "Simple Stellar Population" (SSP) base library with the base file list Base.BC03.L, the FADO v.1 short manual Read_F and this file (in the ReadMe directory) and, for testing purposes, three characteristic de-redshifted spectra from SDSS-DR7 in ascii format, corresponding to a star-forming (spec1.txt), composite (spec2.txt) and LINER (spec3.txt) galaxy. Auxiliary files needed for execution of FADO (.HIfboundem.ascii, .HeIIfbound.ascii, .HeIfboundem.ascii, grfont.dat and grfont.txt) are also included in the tarball. By decompressing the tarball the following six directories are created: input, output, plots, ReadMe, SSPs and tables (see below for a brief explanation). (2 data files).

  7. SnopViz, an interactive snow profile visualization tool

    NASA Astrophysics Data System (ADS)

    Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank

    2016-04-01

    SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML based standard for vector graphics, was chosen because of its easy interaction with JS and a good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time series input and output. The actual output format for SNOWPACK timelines was retained while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as CAAML-file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard to exchange snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).

  8. Hadamard multimode optical imaging transceiver

    DOEpatents

    Cooke, Bradly J; Guenther, David C; Tiee, Joe J; Kellum, Mervyn J; Olivas, Nicholas L; Weisse-Bernstein, Nina R; Judd, Stephen L; Braun, Thomas R

    2012-10-30

    Disclosed is a method and system for simultaneously acquiring and producing results for multiple image modes using a common sensor without optical filtering, scanning, or other moving parts. The system and method utilize the Walsh-Hadamard correlation detection process (e.g., functions/matrix) to provide an all-binary structure that permits seamless bridging between analog and digital domains. An embodiment may capture an incoming optical signal at an optical aperture, convert the optical signal to an electrical signal, pass the electrical signal through a Low-Noise Amplifier (LNA) to create an LNA signal, pass the LNA signal through one or more correlators where each correlator has a corresponding Walsh-Hadamard (WH) binary basis function, calculate a correlation output coefficient for each correlator as a function of the corresponding WH binary basis function in accordance with Walsh-Hadamard mathematical principles, digitize each of the correlation output coefficients by passing each through an Analog-to-Digital Converter (ADC), and perform image mode processing on the digitized correlation output coefficients as desired to produce one or more image modes. Some, but not all, potential image modes include: multi-channel access, temporal, range, three-dimensional, and synthetic aperture.
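
    A minimal numerical sketch of Walsh-Hadamard correlation detection (the digital analogue of the correlators described above, not the patented analog hardware), using SciPy's Hadamard matrix:

        import numpy as np
        from scipy.linalg import hadamard

        # Toy detector signal sampled at 8 points.
        signal = np.array([1.0, 0.5, -0.2, 0.3, 0.9, -0.1, 0.4, 0.0])

        H = hadamard(8)                  # rows are +/-1 Walsh-Hadamard basis functions
        coefficients = H @ signal / 8    # one correlation output per "correlator"

        # The transform is its own inverse (up to scale), so the signal is recoverable.
        assert np.allclose(H.T @ coefficients, signal)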

  9. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.

  10. BigWig and BigBed: enabling browsing of large distributed datasets.

    PubMed

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols and Linux and UNIX operating system files, R trees, and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
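
    For illustration, BigWig files can also be read from Python with the third-party pyBigWig package (a community binding, not part of the UCSC distribution itself); the file name here is hypothetical.

        import pyBigWig

        # Open a BigWig file; only the blocks needed for each query are fetched.
        bw = pyBigWig.open("example.bw")           # hypothetical file name
        print(bw.chroms())                         # available chromosomes and sizes
        print(bw.stats("chr1", 0, 1_000_000))      # mean signal over a region
        print(bw.values("chr1", 0, 10))            # per-base values for a small window
        bw.close()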

  11. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  12. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  13. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  14. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  15. 41 CFR 101-30.401-2 - Automated catalog data output.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...

  16. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.

  17. Searching for New Variable Stars: an Educational Project to Mine Archival Data

    NASA Astrophysics Data System (ADS)

    Walls, B. D.; Redmond, C. E.; Murdick, L. J.; Caton, D. B.

    1998-12-01

    As a Senior Seminar project, three students were each assigned a night of images of a field containing a variable star observed under our eclipsing binary photometry program. Each field was eight arc-minutes square, with the images coming from the DFM 32-inch telescope at our Dark Sky Observatory. The exposures used a Photometrics CH250 camera with a 1024×1024 Tektronix CCD and V-filter. Darks were obtained throughout the night, as well as sky flats at dusk or dawn. The fields were around the systems V442 Cas, WW Cyg, and V345 Lac. The students used Axiom Research's MIRA AP software for doing the aperture photometry, using one initial coordinates file for all of the reasonably bright stars in the field. This number varied from about 60 to almost 200 stars. The MIRA software is easy to use, with auto-centroiding and calibration built in, so it was just a matter of loading images and applying the calibration. One of the student/authors (BDW) wrote an application in Microsoft Visual BASIC to scan the output data files and produce new files, per star. These data sets were examined using PSI-Plot, to look for variability. Errors due to occasional drift led to centroiding problems, a lesson in itself! There were still some residual variations in a few stars that may be real. Follow-up observations will be made to verify these suspicions.

  18. ANLPS. Graphics Driver for PostScript Output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.

    1987-09-01

    ANLPS is a PostScript graphics device driver for use with the proprietary CA TELLAGRAF, CUECHART, and DISSPLA products. The driver allows the user to create and send text and graphics output in the Adobe Systems' PostScript page description language, which is accepted by many print devices. The PostScript output can be generated by TELLAGRAF 6.0 and DISSPLA 10.0. The files containing the PostScript output are sent to PostScript laser printers, such as the Apple LaserWriter. It is not necessary to initialize the printer, as the output for each plot is self-contained. All CA fonts are mapped to PostScript fonts, e.g., Swiss-Medium is mapped to Helvetica, and the mapping is easily changed. Hardware shading and hardware characters, area fill, and color are included. Auxiliary routines are provided which allow graphics files containing figures, logos, and diagrams to be merged with text files. The user can then position, scale, and rotate the figures on the output page in the reserved area specified.

  19. Anisoft - Advanced Treatment of Magnetic Anisotropy Data

    NASA Astrophysics Data System (ADS)

    Chadima, M.

    2017-12-01

    Since its first release, Anisoft (Anisotropy Data Browser) has gained a wide popularity in magnetic fabric community mainly due to its simple and user-friendly interface enabling very fast visualization of magnetic anisotropy tensors. Here, a major Anisoft update is presented transforming a rather simple data viewer into a platform offering an advanced treatment of magnetic anisotropy data. The updated software introduces new enlarged binary data format which stores both in-phase and out-of-phase (if measured) susceptibility tensors (AMS) or tensors of anisotropy of magnetic remanence (AMR) together with their respective confidence ellipses and values of F-tests for anisotropy. In addition to the tensor data, a whole array of specimen orientation angles, orientation of mesoscopic foliation(s) and lineation(s) is stored for each record enabling later editing or corrections. The input data may be directly acquired by AGICO Kappabridges (AMS) or Spinner Magnetometers (AMR); imported from various data formats, including the long-time standard binary ran-format; or manually created. Multiple anisotropy files can be combined together or split into several files by manual data selection or data filtering according to their values. Anisotropy tensors are conventionally visualized as principal directions (eigenvectors) in equal-area projection (stereoplot) together with a wide array of quantitative anisotropy parameters presented in histograms or in color-coded scatter plots showing mutual relationship of up to three quantitative parameters. When dealing with AMS in variable low fields, field-independent and field-dependent components of anisotropy can be determined (Hrouda 2009). For a group of specimens, individual principal directions can be contoured, or a mean tensor and respective confidence ellipses of its principal directions can be calculated using either the Hext-Jelinek (Jelinek 1978) statistics or the Bootstrap method (Constable & Tauxe 1990). Each graphical output can be exported into several vector or raster graphical formats or, via clipboard, pasted directly into a presentation or publication manuscript. Calculated principal directions or anisotropy parameters can be exported into various types of text files ready to be visualized or processed by any software of user's choice.

  20. glopara files

    Science.gov Websites

    prepbufr                ...                                        ...                      BUFR
    biascr.$CDUMP.$CDATE    Time dependent sat bias correction file    abias                    text
    satang.$CDUMP.$CDATE    Angle dependent sat bias correction        satang                   text
    sfcanl.$CDUMP.$CDATE    surface analysis                           sfcanl                   binary
    tcvitl.$CDUMP.$CDATE    Tropical Storm Vitals                      syndata.tcvitals.tm00    text
    adpsfc.$CDUMP.$CDATE    Surface land ...

  1. INSPECTION MEANS FOR INDUCTION MOTORS

    DOEpatents

    Williams, A.W.

    1959-03-10

    An apparatus is described for inspecting electric motors, and more especially an apparatus for detecting faulty end rings in squirrel-cage induction motors while the motor is running. An electronic circuit for conversion of excess-3 binary coded serial decimal numbers to straight binary coded serial decimal numbers is also reported. The converter of the invention in its basic form accepts excess-3 coded pulse words of a type having an algebraic sign digit followed serially by a plurality of decimal digits in order of decreasing significance. A switching matrix is coupled to said input circuit and is internally connected to produce serial straight binary coded pulse groups indicative of the excess-3 coded input. A stepping circuit is coupled to the switching matrix and to a synchronous counter having a plurality of x decimal digit and a plurality of y decimal digit indicator terminals. The stepping circuit steps the counter in synchronism with the serial binary pulse group output from the switching matrix to successively produce pulses at corresponding ones of the x and y decimal digit indicator terminals. The combinations of straight binary coded pulse groups and corresponding decimal digit indicator signals so produced comprise a basic output suitable for application to a variety of output apparatus.
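
    For reference, excess-3 coding stores each decimal digit as its value plus three; a minimal sketch of the digit-serial conversion (function name ours):

        def excess3_word_to_int(digits):
            """Convert a sequence of excess-3 coded 4-bit digits (most significant
            first) to a plain integer; each digit stores its decimal value plus 3."""
            value = 0
            for nibble in digits:
                value = value * 10 + (nibble - 3)
            return value

        # 0b0100, 0b0101, 0b1100 encode the digits 1, 2, 9 in excess-3.
        assert excess3_word_to_int([0b0100, 0b0101, 0b1100]) == 129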

  2. GeoDataspaces: Simplifying Data Management Tasks with Globus

    NASA Astrophysics Data System (ADS)

    Malik, T.; Chard, K.; Tchoua, R. B.; Foster, I.

    2014-12-01

    Data and its management are central to modern scientific enterprise. Typically, geoscientists rely on observations and model output data from several disparate sources (file systems, RDBMS, spreadsheets, remote data sources). Integrated data management solutions that provide intuitive semantics and uniform interfaces, irrespective of the kind of data source are, however, lacking. Consequently, geoscientists are left to conduct low-level and time-consuming data management tasks, individually, and repeatedly for discovering each data source, often resulting in errors in handling. In this talk we will describe how the EarthCube GeoDataspace project is improving this situation for seismologists, hydrologists, and space scientists by simplifying some of the existing data management tasks that arise when developing computational models. We will demonstrate a GeoDataspace, bootstrapped with "geounits", which are self-contained metadata packages that provide complete description of all data elements associated with a model run, including input/output and parameter files, model executable and any associated libraries. Geounits link raw and derived data as well as associating provenance information describing how data was derived. We will discuss challenges in establishing geounits and describe machine learning and human annotation approaches that can be used for extracting and associating ad hoc and unstructured scientific metadata hidden in binary formats with data resources and models. We will show how geounits can improve search and discoverability of data associated with model runs. To support this model, we will describe efforts related towards creating a scalable metadata catalog that helps to maintain, search and discover geounits within the Globus network of accessible endpoints. This talk will focus on the issue of creating comprehensive personal inventories of data assets for computational geoscientists, and describe a publishing mechanism, which can be used to feed into national, international, or thematic discovery portals.

  3. VizieR Online Data Catalog: Binary systems among nearby dwarfs searching (Khovritchev+, 2018)

    NASA Astrophysics Data System (ADS)

    Khovritchev, M. Yu.; Apetyan, A. A.; Roshchina, E. A.; Izmailov, I. S.; Bikulova, D. A.; Ershova, A. P.; Balyaev, I. A.; Kulikova, A. M.; Petjur, V. V.; Shumilov, A. A.; Oskina, K. I.; Maksimova, L. A.

    2018-03-01

    All results are collected in three tables: saturn1m-bc.dat, saturn1m-sdss-bc.dat and sdss-bc.dat. They have the same byte-by-byte description. The tables contain the estimates of spatial parameters of binaries (rho and d_m), relative ellipticity and asymmetry index. In addition, the positions, proper motions, photometric magnitudes, parallaxes and metallicities are presented. All stars listed in these tables are binary candidates. (3 data files).

  4. Federal Logistics Information System (FLIS) Procedures Manual. General and Administrative Information. Volume 1.

    DTIC Science & Technology

    1996-04-01

    Logistics Transfer 3 Data KFA Match Through Association 1 KFC File Data Minus Security Classi- 1 Note 1: Output DICs other than Search and Inter- fled...vols 8/9 KEC Output Exceeds AUTODIN Limitations 4,5 vols 8/9 KFA Match through Association 4 vols 8/9 KFC File Data Minus Security Classified...Activities 2 Nuclear Ordnance 4 Reference Numbers 2 SECURITY CLASSIFIED DATA, FILE DATA MINUS 4 vols 8/9, DIC KFC SECURITY CLASSIFIED CHARACTERISTICS 4 vols

  5. Transferable Output ASCII Data (TOAD) gateway: Version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) Gateway, release 1.0, is described. This is a software tool for converting tabular data from one format into another via the TOAD format. This initial release of the Gateway allows free data interchange among the following file formats: TOAD; Standard Interface File (SIF); Program to Optimize Simulated Trajectories (POST) input; Comma Separated Value (CSV); and a general free-form file format. As required, additional formats can be accommodated quickly and easily.

  6. Digital plus analog output encoder

    NASA Technical Reports Server (NTRS)

    Hafle, R. S. (Inventor)

    1976-01-01

    The disclosed encoder is adapted to produce both digital and analog output signals corresponding to the angular position of a rotary shaft, or the position of any other movable member. The digital signals comprise a series of binary signals constituting a multidigit code word which defines the angular position of the shaft with a degree of resolution which depends upon the number of digits in the code word. The basic binary signals are produced by photocells actuated by a series of binary tracks on a code disc or member. The analog signals are in the form of a series of ramp signals which are related in length to the least significant bit of the digital code word. The analog signals are derived from sine and cosine tracks on the code disc.

  7. Visualizing NetCDF Files by Using the EverVIEW Data Viewer

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
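
    A minimal sketch of reading such a self-describing NetCDF file from Python with the netCDF4 package; the file and variable names are hypothetical.

        from netCDF4 import Dataset

        # Open a NetCDF file and inspect its self-describing structure.
        ds = Dataset("model_output.nc")            # hypothetical file name
        print(ds.dimensions.keys(), ds.variables.keys())

        stage = ds.variables["stage"]              # hypothetical hydrologic variable
        print(stage.dimensions, stage.shape)
        print(stage[0, :5, :5])                    # first time step, small spatial window
        ds.close()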

  8. Aircraft signal definition for flight safety system monitoring system

    NASA Technical Reports Server (NTRS)

    Gibbs, Michael (Inventor); Omen, Debi Van (Inventor)

    2003-01-01

    A system and method compares combinations of vehicle variable values against known combinations of potentially dangerous vehicle input signal values. Alarms and error messages are selectively generated based on such comparisons. An aircraft signal definition is provided to enable definition and monitoring of sets of aircraft input signals to customize such signals for different aircraft. The input signals are compared against known combinations of potentially dangerous values by operational software and hardware of a monitoring function. The aircraft signal definition is created using a text editor or custom application. A compiler receives the aircraft signal definition to generate a binary file that comprises the definition of all the input signals used by the monitoring function. The binary file also contains logic that specifies how the inputs are to be interpreted. The file is then loaded into the monitor function, where it is validated and used to continuously monitor the condition of the aircraft.

  9. CARTAM. The Cartesian Access Method for Data Structures with n-dimensional Keys.

    DTIC Science & Technology

    1979-01-01

    become apparent later, I have chosen to store structural information in an explicit binary tree, with modifications. Instead of the left and right links of ... the usual binary tree, I use the child and twin pointers of a ring structure or circular list. This ring structure as illustrated in figure 3-1 also ... Since the file is being stored as an explicit binary tree, note that additional records are being generated, and the concept of an Ni-th record for
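
    A sketch of the child/twin idea in Python, simplified to a null-terminated sibling list rather than CARTAM's ring structure:

        class Node:
            """Tree node using child/twin (first-child, next-sibling) pointers,
            so an n-ary tree is stored as an explicit binary tree."""
            def __init__(self, key):
                self.key = key
                self.child = None   # first child (the "left" link)
                self.twin = None    # next sibling (the "right" link)

        def children(node):
            """Iterate a node's children by walking the twin list."""
            c = node.child
            while c is not None:
                yield c
                c = c.twin

        root = Node("R")
        root.child = Node("A")
        root.child.twin = Node("B")
        print([c.key for c in children(root)])    # ['A', 'B']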

  10. VizieR Online Data Catalog: Habitable zones around main-sequence stars (Kopparapu+, 2014)

    NASA Astrophysics Data System (ADS)

    Kopparapu, R. K.; Ramirez, R. M.; Schottelkotte, J.; Kasting, J. F.; Domagal-Goldman, S.; Eymet, V.

    2017-08-01

    Language: Fortran 90
    Code tested under the following compilers/operating systems: ifort/CentOS linux
    Description of input data: No input necessary.
    Description of output data: Output files: HZs.dat, HZ_coefficients.dat
    System requirements: No major system requirement. Fortran compiler necessary.
    Calls to external routines: None.
    Additional comments: None
    (1 data file).

  11. Modifications to the accuracy assessment analysis routine SPATL to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    SPATL is an analysis program in the Accuracy Assessment Software System which makes comparisons between ground truth information and dot labeling for an individual segment. In order to facilitate the aggregation of this information, SPATL was modified to produce a disk output file containing the necessary information about each segment.

  12. Design and optimization of geothermal power generation, heating, and cooling

    NASA Astrophysics Data System (ADS)

    Kanoglu, Mehmet

    Most of the world's geothermal power plants were built in the 1970s and 1980s following the 1973 oil crisis. The urgency to generate electricity from alternative energy sources, and the fact that geothermal energy was essentially free, worked against careful plant designs that would maximize performance for a given geothermal resource. There is, however, tremendous potential to improve the performance of many existing geothermal power plants by retrofitting, optimizing the operating conditions, re-selecting the most appropriate binary fluid in binary plants, and considering cogeneration such as a district heating and/or cooling system or a system to preheat water entering boilers in industrial facilities. In this dissertation, some representative geothermal resources and existing geothermal power plants in Nevada are investigated to demonstrate this potential. Economic analysis of a typical geothermal resource shows that geothermal heating and cooling may generate up to 3 times as much revenue as power generation alone. A district heating/cooling system is designed for its incorporation into an existing 27 MW air-cooled binary geothermal power plant. The system as designed has the capability to meet the entire heating needs of an industrial park as well as 40% of its cooling needs, generating potential revenues of $14,040,000 per year. A study of the power plant shows that evaporative cooling can increase the power output by up to 29% in summer by decreasing the condenser temperature. The power output of the plant can be increased by 2.8 percent by optimizing the maximum pressure in the cycle. Also, replacing the existing working fluid isobutane by butane, R-114, isopentane, and pentane can increase the power output by up to 2.5 percent. Investigation of some well-known geothermal power generation technologies as alternatives to an existing 12.8 MW single-flash geothermal power plant shows that double-flash, binary, and combined flash/binary designs can increase the net power output by up to 31 percent, 35 percent, and 54 percent, respectively, at optimum operating conditions. An economic comparison of these designs appears to favor the combined flash/binary design, followed by the double-flash design.

  13. Simulations of Brady's-Type Fault Undergoing CO2 Push-Pull: Pressure-Transient and Sensitivity Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Yoojin; Doughty, Christine

    Input and output files used for fault characterization through numerical simulation using iTOUGH2. The synthetic data for the push period are generated by running a forward simulation (input parameters are provided in iTOUGH2 Brady GF6 Input Parameters.txt [InvExt6i.txt]). In general, the permeability of the fault gouge, damage zone, and matrix are assumed to be unknown. The input and output files are for the inversion scenario where only pressure transients are available at the monitoring well located 200 m above the injection well and only the fault gouge permeability is estimated. The input files are named InvExt6i, INPUT.tpl, FOFT.ins, CO2TAB, and the output files are InvExt6i.out, pest.fof, and pest.sav (names below are display names). The table graphic in the data files below summarizes the inversion results, and indicates the fault gouge permeability can be estimated even if imperfect guesses are used for matrix and damage zone permeabilities, and permeability anisotropy is not taken into account.

  14. Utilizing HDF4 File Content Maps for the Cloud

    NASA Technical Reports Server (NTRS)

    Lee, Hyokyung Joe

    2016-01-01

    We demonstrate a prototype study showing that an HDF4 file content map can be used to organize data efficiently in a cloud object storage system to facilitate cloud computing. This approach can be extended to any binary data format and to any existing big data analytics solution powered by cloud computing, because the HDF4 file content map originated in a long-term preservation effort for NASA data and does not require the HDF4 APIs to access the data.
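
    A sketch of the idea under stated assumptions: given a hypothetical content-map entry recording the byte offset and length of one dataset inside an HDF4 granule stored as a cloud object, the data can be fetched with a plain HTTP Range request, with no HDF4 library involved. The URL, map fields, and dtype are invented.

        import numpy as np
        import requests

        # Invented content-map entry: byte offset/length of one dataset inside an
        # HDF4 granule stored as a cloud object, plus its element type.
        entry = {"offset": 2048, "length": 4096, "dtype": "<i2"}
        url = "https://example-bucket.s3.amazonaws.com/granule.hdf"   # hypothetical

        # Fetch only the mapped byte range; no HDF4 library is involved.
        byte_range = f"bytes={entry['offset']}-{entry['offset'] + entry['length'] - 1}"
        raw = requests.get(url, headers={"Range": byte_range}).content
        data = np.frombuffer(raw, dtype=entry["dtype"])
        print(data.shape)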

  15. Sodar - PNNL Scintec MFAS, Oregon Raceway Park - Raw Data

    DOE Data Explorer

    Pekour, Mikhail

    2017-10-23

    Provide measurements of wind speed and direction up to 400 m AGL (max). The data are stored in 2 forms: ASCII and raw (binary). ASCII files contain averaged data (currently -- 15 min time step and 10 m range gate); raw files could be reprocessed with the sodar software (APRun by Scintec) to produce ASCII files with different time and/or height averaging settings (highest resolution is approx. 90 sec and 10 m).

  16. Fallon, Nevada FORGE Thermal-Hydrological-Mechanical Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blankenship, Doug; Sonnenthal, Eric

    Archive contains thermal-mechanical simulation input/output files. Included are files which fall into the following categories: (1) spreadsheets with various input parameter calculations; (2) final simulation inputs; (3) native-state thermal-hydrological model input file folders; (4) native-state thermal-hydrological-mechanical model input files; (5) THM model stimulation cases. See the 'File Descriptions.xlsx' resource below for additional information on individual files.

  17. SAGE III/ISS L2 Solar Event Species Profiles (Native) V5 (g3bsspb)

    Atmospheric Science Data Center

    2017-12-21

    Temporal Resolution: 1 file per event. File Format: BINARY.

  18. SAGE III/ISS L2 Lunar Event Species Profiles (Native) V5 (g3blspb)

    Atmospheric Science Data Center

    2018-01-08

    Temporal Resolution: 1 file per event. File Format: BINARY.

  19. Sesame IO Library User Manual Version 8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abhold, Hilary; Young, Ginger Ann

    This document is a user manual for SES_IO, a low-level library for reading and writing sesame files. The purpose of the SES_IO library is to provide a simple user interface for accessing and creating sesame files that does not change across sesame format type (such as binary, ascii, and xml).

  20. Device-independent tests of quantum channels

    NASA Astrophysics Data System (ADS)

    Dall'Arno, Michele; Brandsen, Sarah; Buscemi, Francesco

    2017-03-01

    We develop a device-independent framework for testing quantum channels. That is, we falsify a hypothesis about a quantum channel based only on an observed set of input-output correlations. Formally, the problem consists of characterizing the set of input-output correlations compatible with any arbitrary given quantum channel. For binary (i.e. two input symbols, two output symbols) correlations, we show that extremal correlations are always achieved by orthogonal encodings and measurements, irrespective of whether or not the channel preserves commutativity. We further provide a full, closed-form characterization of the sets of binary correlations in the case of: (i) any dihedrally covariant qubit channel (such as any Pauli and amplitude-damping channels) and (ii) any universally-covariant commutativity-preserving channel in an arbitrary dimension (such as any erasure, depolarizing, universal cloning and universal transposition channels).

  1. Device-independent tests of quantum channels.

    PubMed

    Dall'Arno, Michele; Brandsen, Sarah; Buscemi, Francesco

    2017-03-01

    We develop a device-independent framework for testing quantum channels. That is, we falsify a hypothesis about a quantum channel based only on an observed set of input-output correlations. Formally, the problem consists of characterizing the set of input-output correlations compatible with any arbitrary given quantum channel. For binary (i.e. two input symbols, two output symbols) correlations, we show that extremal correlations are always achieved by orthogonal encodings and measurements, irrespective of whether or not the channel preserves commutativity. We further provide a full, closed-form characterization of the sets of binary correlations in the case of: (i) any dihedrally covariant qubit channel (such as any Pauli and amplitude-damping channels) and (ii) any universally-covariant commutativity-preserving channel in an arbitrary dimension (such as any erasure, depolarizing, universal cloning and universal transposition channels).

  2. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Prior to having data acceptable for archival functions, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.

  3. A memory-mapped output interface: Omega navigation output data from the JOLT (TM) microcomputer

    NASA Technical Reports Server (NTRS)

    Lilley, R. W.

    1976-01-01

    A hardware interface which allows both digital and analog data output from the JOLT microcomputer is described in the context of a software-based Omega Navigation receiver. The interface hardware described is designed for output of six (or eight with simple extensions) bits of binary output in response to a memory store command from the microcomputer. The interface was produced in breadboard form and is operational as an evaluation aid for the software Omega receiver.

  4. Mutual information against correlations in binary communication channels.

    PubMed

    Pregowska, Agnieszka; Szczepanski, Janusz; Wajnryb, Eligiusz

    2015-05-19

    Explaining how brain processing can be so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission (Shannon CE, Weaver W., 1963) processes basically focuses on searching for effective encoding and decoding schemes. According to the Shannon fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is defined as the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach taken in studying neuronal processes of the brain. We present binary communication channels for which mutual information and correlation coefficients behave differently both quantitatively and qualitatively. Despite this difference in behavior, we show that the noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals. Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature, which cannot be captured by straightforward correlations between input and output signals once the mutual information takes into account the structure and patterns of the signals.
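
    A small worked example of the two quantities for a 2x2 joint distribution (a binary symmetric channel with fair input and 10% flip probability; the formulas are the standard ones, not the paper's specific channels):

        import numpy as np

        def mutual_information(p):
            """Mutual information (bits) of a 2x2 joint distribution over (X, Y)."""
            px, py = p.sum(axis=1), p.sum(axis=0)
            return sum(p[i, j] * np.log2(p[i, j] / (px[i] * py[j]))
                       for i in range(2) for j in range(2) if p[i, j] > 0)

        def correlation(p):
            """Pearson correlation of the two binary ({0, 1}) marginals."""
            px, py = p.sum(axis=1), p.sum(axis=0)
            cov = p[1, 1] - px[1] * py[1]
            return cov / np.sqrt(px[1] * (1 - px[1]) * py[1] * (1 - py[1]))

        # Fair input, output flips with probability 0.1.
        joint = np.array([[0.45, 0.05],
                          [0.05, 0.45]])
        print(mutual_information(joint))   # ~0.53 bits
        print(correlation(joint))          # 0.8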

  5. A Columnar Storage Strategy with Spatiotemporal Index for Big Climate Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Bowen, M. K.; Li, Z.; Schnase, J. L.; Duffy, D.; Lee, T. J.; Yang, C. P.

    2015-12-01

    Large collections of observational, reanalysis, and climate model output data may grow to as large as 100 PB in the coming years, so climate datasets are firmly in the Big Data domain, and various distributed computing frameworks have been utilized to address the challenges posed by big climate data analysis. However, due to the binary data formats (NetCDF, HDF) with high spatial and temporal dimensionality, the computing frameworks in the Apache Hadoop ecosystem are not originally suited to big climate data. In order to make the computing frameworks in the Hadoop ecosystem directly support big climate data, we propose a columnar storage format with a spatiotemporal index to store climate data, which will support any project in the Apache Hadoop ecosystem (e.g. MapReduce, Spark, Hive, Impala). With this approach, the climate data are transferred into the binary Parquet data format, a columnar storage format, and a spatial and temporal index is built and appended to the end of the Parquet files to enable real-time data query. Such climate data in Parquet format are then available to any computing framework in the Hadoop ecosystem. The proposed approach is evaluated using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. Experimental results show that this approach can efficiently bridge the gap between big climate data and the distributed computing frameworks, and that the spatiotemporal index significantly accelerates data querying and processing.
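
    A minimal sketch of the columnar conversion step using pandas with the pyarrow Parquet engine; the miniature grid, file name, and schema are invented, and no spatiotemporal index is built here.

        import numpy as np
        import pandas as pd

        # Invented miniature climate grid: one row per (time, lat, lon) cell.
        times = pd.date_range("2015-01-01", periods=24, freq="h").repeat(4)
        df = pd.DataFrame({
            "time": times,
            "lat":  np.tile([10.0, 10.0, 10.5, 10.5], 24),
            "lon":  np.tile([20.0, 20.5, 20.0, 20.5], 24),
            "temperature": np.random.randn(96),
        })

        # Columnar storage: each column is encoded and compressed independently.
        # (Requires the pyarrow package as the Parquet engine.)
        df.to_parquet("merra_subset.parquet", index=False)

        # Read back and filter on space; a real index would prune row groups instead.
        back = pd.read_parquet("merra_subset.parquet")
        print(back[(back.lat == 10.0) & (back.lon == 20.0)].head())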

  6. TOPPE: A framework for rapid prototyping of MR pulse sequences.

    PubMed

    Nielsen, Jon-Fredrik; Noll, Douglas C

    2018-06-01

    To introduce a framework for rapid prototyping of MR pulse sequences. We propose a simple file format, called "TOPPE", for specifying all details of an MR imaging experiment, such as gradient and radiofrequency waveforms and the complete scan loop. In addition, we provide a TOPPE file "interpreter" for GE scanners, which is a binary executable that loads TOPPE files and executes the sequence on the scanner. We also provide MATLAB scripts for reading and writing TOPPE files and previewing the sequence prior to hardware execution. With this setup, the task of the pulse sequence programmer is reduced to creating TOPPE files, eliminating the need for hardware-specific programming. No sequence-specific compilation is necessary; the interpreter only needs to be compiled once (for every scanner software upgrade). We demonstrate TOPPE in three different applications: k-space mapping, non-Cartesian PRESTO whole-brain dynamic imaging, and myelin mapping in the brain using inhomogeneous magnetization transfer. We successfully implemented and executed the three example sequences. By simply changing the various TOPPE sequence files, a single binary executable (interpreter) was used to execute several different sequences. The TOPPE file format is a complete specification of an MR imaging experiment, based on arbitrary sequences of a (typically small) number of unique modules. Along with the GE interpreter, TOPPE comprises a modular and flexible platform for rapid prototyping of new pulse sequences. Magn Reson Med 79:3128-3134, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  7. Input data requirements for special processors in the computation system containing the VENTURE neutronics code. [LMFBR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.

    1979-07-01

    User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.

  8. Fast large-scale object retrieval with binary quantization

    NASA Astrophysics Data System (ADS)

    Zhou, Shifu; Zeng, Dan; Shen, Wei; Zhang, Zhijiang; Tian, Qi

    2015-11-01

    The objective of large-scale object retrieval systems is to search for images that contain the target object in an image database. Where state-of-the-art approaches rely on global image representations to conduct searches, we consider many boxes per image as candidates to search locally in a picture. In this paper, a feature quantization algorithm called binary quantization is proposed. In binary quantization, a scale-invariant feature transform (SIFT) feature is quantized into a descriptive and discriminative bit-vector, which adapts naturally to the classic inverted file structure for box indexing. The inverted file, which stores each bit-vector and the ID of the box containing the SIFT feature, is compact and can be loaded into main memory for efficient box indexing. We evaluate our approach on available object retrieval datasets. Experimental results demonstrate that the proposed approach is fast and achieves excellent search quality. Therefore, the proposed approach is an improvement over state-of-the-art approaches for object retrieval.
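
    A minimal sketch of the general idea, assuming a simple per-dimension median threshold (the paper's actual quantizer and inverted-file layout may differ):

      # Sketch: quantize float descriptors to packed bit-vectors, then match
      # by Hamming distance. The database is synthetic stand-in SIFT data.
      import numpy as np

      def binary_quantize(desc, thresholds):
          """Map a float descriptor to a packed bit-vector."""
          bits = (desc > thresholds).astype(np.uint8)
          return np.packbits(bits)

      def hamming(a, b):
          return int(np.unpackbits(a ^ b).sum())

      rng = np.random.default_rng(0)
      db = rng.random((1000, 128)).astype(np.float32)   # stand-in SIFT features
      thr = np.median(db, axis=0)
      codes = [binary_quantize(d, thr) for d in db]

      query = binary_quantize(db[42] + 0.01 * rng.standard_normal(128), thr)
      best = min(range(len(codes)), key=lambda i: hamming(query, codes[i]))
      print(best)   # expected to recover index 42 for this synthetic example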

  9. SAGE III/ISS L1B Solar Event Transmission Data (Native) V5 (g3btb)

    Atmospheric Science Data Center

    2017-12-21

    SAGE III/ISS L1B Solar Event Transmission Data (Native) V5 (g3btb). Temporal Resolution: 1 file per event; File Format: BINARY; Tools: Earthdata; Parameters: Longwave Radiation, Shortwave Radiation, Event Tag, Event Type, Obs Beta Angle.

  10. Framework for Integrating Science Data Processing Algorithms Into Process Control Systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Chang, Albert Y.; Foster, Brian M.; Freeborn, Dana J.; Woollard, David M.; Ramirez, Paul M.

    2011-01-01

    A software framework called PCS Task Wrapper is responsible for standardizing the setup, process initiation, execution, and file management tasks surrounding the execution of science data algorithms, which are referred to by NASA as Product Generation Executives (PGEs). A PGE codifies a scientific algorithm: some step in the overall scientific process involved in a mission science workflow. The PCS Task Wrapper provides a stable operating environment to the underlying PGE during its execution lifecycle. If the PGE requires a file, or metadata regarding the file, the PCS Task Wrapper is responsible for delivering that information to the PGE in a manner that meets its requirements. If the PGE requires knowledge of upstream or downstream PGEs in a sequence of executions, that information is also made available. Finally, if information regarding disk space, or node status such as CPU availability, is required, the PCS Task Wrapper provides this information to the underlying PGE. After this information is collected, the PGE is executed, and the generation of its output Product files and Metadata is managed via the PCS Task Wrapper framework. The innovation is responsible for marshalling output Products and Metadata back to a PCS File Management component for use in downstream data processing and pedigree. In support of this, the PCS Task Wrapper leverages the PCS Crawler Framework to ingest (during pipeline processing) the output Product files and Metadata produced by the PGE. The architectural components of the PCS Task Wrapper framework include PGE Task Instance, PGE Config File Builder, Config File Property Adder, Science PGE Config File Writer, and PCS Met file Writer. This framework is the unifying bridge between the execution of a step in the overall processing pipeline and the available PCS component services, as well as the information that they collectively manage.

  11. Recognition of Computer Viruses by Detecting Their Gene of Self Replication

    DTIC Science & Technology

    2006-03-01

    ...a group of instructions acting together in the right order have to be identified for the gene of self-replication to be obvious in a... its first system call NtCreateFile, while the outputs of NtWriteFile become its output arguments. These four blocks form the final structure - The Gene...

  12. AIDE - Advanced Intrusion Detection Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Cathy L.

    2013-04-28

    Would you like to know when someone has dropped an undesirable executable binary on your system? What about something less malicious, such as a software installation by a user? What about the user who decides to install a newer version of mod_perl or PHP on your web server without letting you know beforehand? Or even something as simple as an undocumented config file change made by another member of the admin group? Do you even want to know about all the changes that happen on a daily basis on your server? The purpose of an intrusion detection system (IDS) is to detect unauthorized, possibly malicious activity. The purpose of a host-based IDS, or file integrity checker, is to check for unauthorized changes to key system files, binaries, libraries, and directories on the system. AIDE is an Open Source file and directory integrity checker. AIDE will let you know when a file or directory has been added, deleted, or modified. It is included with Red Hat Enterprise Linux 6 and is available for other Linux distributions. This is a case study describing the process of configuring AIDE on an out-of-the-box RHEL6 installation. Its goal is to illustrate the thinking and the process by which a useful AIDE configuration is built.
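
    As a minimal illustration of the kind of configuration such a case study builds, an /etc/aide.conf fragment might look as follows; the rule macro name is invented for this example, and attribute groups and database paths vary between AIDE versions and distributions.

      # /etc/aide.conf fragment (illustrative only)
      # Binlib: check permissions, inode, links, owner, group, size, mtime,
      # ctime, and a SHA-256 digest of each file.
      Binlib = p+i+n+u+g+s+m+c+sha256
      /usr/sbin Binlib
      /usr/bin  Binlib
      !/var/log            # exclude files that legitimately change

      # Initialize the baseline database, then verify later:
      #   aide --init
      #   mv /var/lib/aide/aide.db.new.gz /var/lib/aide/aide.db.gz
      #   aide --check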

  13. MULTI-CHANNEL ELECTRIC PULSE HEIGHT ANALYZER

    DOEpatents

    Gallagher, J.D. et al.

    1960-11-22

    An apparatus is given for converting binary information into coded decimal form comprising means, in combination with a binary adder, a live memory and a source of bigit pulses, for synchronizing the bigit pulses and the adder output pulses; a source of digit pulses synchronized with every fourth bigit pulse; means for generating a conversion pulse in response to the time coincidence of the adder output pulse and a digit pulse; means having a delay equal to two bigit pulse periods coupling the adder output with the memory; means for promptly impressing said conversion pulse on the input of said memory; and means having a delay equal to one bigit pulse period for again impressing the conversion pulse on the input of the memory whereby a fourth bigit adder pulse results in the insertion into the memory of second, third and fourth bigits.

  14. A computer program to predict rotor rotational noise of a stationary rotor from blade loading coefficient

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, R.; Randall, D.; Hosier, R. N.

    1976-01-01

    The programming language used is FORTRAN IV. A description of all main and subprograms is provided so that any user possessing a FORTRAN compiler and random access capability can adapt the program to his facility. Rotor blade surface-pressure spectra can be used by the program to calculate: (1) blade station loading spectra, (2) chordwise and/or spanwise integrated blade-loading spectra, and (3) far-field rotational noise spectra. Any of five standard inline functions describing the chordwise distribution of the blade loading can be chosen in order to study parametrically the acoustic predictions. The program output consists of both printed and graphic descriptions of the blade-loading coefficient spectra and the far-field acoustic spectrum. The results may also be written to a binary file for future processing. Examples of the application of the program, along with a description of the rotational noise prediction theory on which the program is based, are also provided.

  15. Nemo: an evolutionary and population genetics programming framework.

    PubMed

    Guillaume, Frédéric; Rougemont, Jacques

    2006-10-15

    Nemo is an individual-based, genetically explicit and stochastic population computer program for the simulation of population genetics and life-history trait evolution in a metapopulation context. It comes as both a C++ programming framework and an executable program file. Its object-oriented programming design gives it the flexibility and extensibility needed to implement a large variety of forward-time evolutionary models. It provides developers with abstract models allowing them to implement their own life-history traits and life-cycle events. Nemo offers a large panel of population models, from the Island model to lattice models with demographic or environmental stochasticity and a variety of already implemented traits (deleterious mutations, neutral markers and more), life-cycle events (mating, dispersal, aging, selection, etc.) and output operators for saving data and statistics. It runs on all major computer platforms including parallel computing environments. The source code, binaries and documentation are available under the GNU General Public License at http://nemo2.sourceforge.net.

  16. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy

    PubMed Central

    Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.

    2016-01-01

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models. PMID:27876821
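
    A toy illustration in the spirit of LOBICO, with invented gene names and an invented threshold, showing how a small logic formula over binary mutation calls is compared against a dichotomized continuous response:

      # Toy logic model: an AND/NOT formula over binary mutation calls
      # predicts whether a continuous drug response (IC50) falls below a
      # sensitivity threshold. Genes, values, and threshold are illustrative,
      # not taken from the paper.
      samples = {
          "cell_line_1": {"BRAF": 1, "TP53": 0, "ic50": 0.2},
          "cell_line_2": {"BRAF": 1, "TP53": 1, "ic50": 3.1},
          "cell_line_3": {"BRAF": 0, "TP53": 0, "ic50": 2.7},
      }

      def logic_model(x):
          # "sensitive if BRAF mutant AND TP53 wild-type"
          return x["BRAF"] == 1 and x["TP53"] == 0

      threshold = 1.0   # continuous output dichotomized at an operating point
      for name, x in samples.items():
          predicted = logic_model(x)
          observed = x["ic50"] < threshold
          print(name, predicted, observed)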

  17. ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog

    NASA Technical Reports Server (NTRS)

    Gray, F. P., Jr. (Editor)

    1979-01-01

    A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written to these files is contained herein.

  18. External-Compression Supersonic Inlet Design Code

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2011-01-01

    A computer code named SUPIN has been developed to perform aerodynamic design and analysis of external-compression, supersonic inlets. The baseline set of inlets includes axisymmetric pitot, two-dimensional single-duct, axisymmetric outward-turning, and two-dimensional bifurcated-duct inlets. The aerodynamic methods are based on low-fidelity analytical and numerical procedures. The geometric methods are based on planar geometry elements. SUPIN has three modes of operation: 1) generate the inlet geometry from an explicit set of geometry information, 2) size and design the inlet geometry and analyze the aerodynamic performance, and 3) compute the aerodynamic performance of a specified inlet geometry. The aerodynamic performance quantities include inlet flow rates, total pressure recovery, and drag. The geometry output from SUPIN includes inlet dimensions, cross-sectional areas, coordinates of planar profiles, and surface grids suitable for input to grid generators for analysis by computational fluid dynamics (CFD) methods. The input data file for SUPIN and the output file from SUPIN are text (ASCII) files. The surface grid files are output as formatted Plot3D or stereolithography (STL) files. SUPIN executes in batch mode and is available as a Microsoft Windows executable and Fortran95 source code with a makefile for Linux.

  19. Sodar - PNNL Scintec MFAS, Oregon Raceway Park - Reviewed Data

    DOE Data Explorer

    Pekour, Mikhail

    2018-01-26

    These data provide measurements of wind speed and direction up to a maximum of 400 m above ground level (AGL). The data are stored in two forms: ASCII and raw (binary). ASCII files contain averaged data (currently: 15-min time step and 10-m range gate). Raw files can be reprocessed with sodar software (APRun by Scintec) to produce ASCII files with different time and/or height averaging settings (the highest resolution is approximately 90 s and 10 m). NOTE: Wind direction is reported with respect to magnetic North.

  20. Brady's Geothermal Field - March 2016 Vibroseis SEG-Y Files and UTM Locations

    DOE Data Explorer

    Kurt Feigl

    2016-03-31

    PoroTomo March 2016 (Task 6.4) updated vibroseis source locations with UTM locations. Supersedes gdr.openei.org/submissions/824. Updated vibroseis source location data for Stages 1-4, PoroTomo March 2016. This revision includes source point locations in UTM format (meters) for all four Stages of active source acquisition. Vibroseis sweep data were collected on a Signature Recorder unit (mfr Seismic Source) mounted in the vibroseis cab during the March 2016 PoroTomo active seismic survey Stages 1 to 4. Each sweep generated a GPS-timed SEG-Y file with 4 input channels and a 20 second record length. Ch1 = pilot sweep, Ch2 = accelerometer output from the vibe's mass, Ch3 = accelerometer output from the baseplate, and Ch4 = weighted sum of the accelerometer outputs. SEG-Y files are available via the links below.

  1. VizieR Online Data Catalog: ASAS, NSVS, and LINEAR detached eclipsing binaries (Lee, 2015)

    NASA Astrophysics Data System (ADS)

    Lee, C.-H.

    2016-04-01

    We follow the approach of Devor et al. (2008AJ....135..850D, Cat. J/AJ/135/850) to analyse the LCs from ASAS (Pojmanski et al., Cat. II/264), NSVS (Wozniak et al., 2004AJ....127.2436W), and LINEAR (Palaversa et al., Cat. J/AJ/146/101), and extract the physical properties of the eclipsing binaries. (3 data files).

  2. Automatic computer subprogram selection from application program libraries

    NASA Technical Reports Server (NTRS)

    Drozdowski, J. M.

    1972-01-01

    The program ALTLIB (ALTernate LIBrary), which allows a user access to an alternate subprogram library with minimum effort, is discussed. The ALTLIB program selects subprograms from an alternate library file and merges them with the user's program load file. Only subprograms that are called for (directly or indirectly) by the user's programs and that are available on the alternate library file will be selected. ALTLIB eliminates the need for elaborate control-card manipulations to add subprograms from a subprogram file. ALTLIB returns to the user his binary file and the selected subprograms in correct order for a call to the loader. The user supplies the alternate library file. Subprogram requests which are not satisfied from the alternate library file will be satisfied at load time from the system library.

  3. Construct User Guide

    DTIC Science & Technology

    2012-11-01

    ...validation using calibrated grounding. In 2007 BRIMS Conference Proceedings, Norfolk, VA. Simon, H. A. (1957). Administrative Behavior: A study of... Construct will write the output to the directory specified by the path name. Users should ensure that if they have opened any output files (e.g., in Excel to view the files), they should either close the file... If Construct is unable to open an input file, it will exit and close. There are times when an error message is not presented to the user in this situation! Users should ensure...

  4. Program Aids In Printing FORTRAN-Coded Output

    NASA Technical Reports Server (NTRS)

    Akian, Richard A.

    1993-01-01

    The FORPRINT computer program prints FORTRAN-coded output files on most non-PostScript printers, with extra features such as font control for Epson and Hewlett Packard printers. It rewrites the data to the printer, inserting the correct printer-control codes. Alternative uses include the ability to separate a data or ASCII file during printing by using editing software to insert a "1" in the first column of any data line that starts a new page. Written in FORTRAN 77.

  5. Conjoin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Gregory

    2010-08-06

    Conjoin is a code for joining sequentially in time multiple exodusII database files. It is used to create a single results or restart file from multiple results or restart files, which typically arise as the result of multiple restarted analyses. The resulting output file will be the union of the input files, with a status variable indicating the status of each element at the various time planes. Typical uses include combining multiple exodusII files arising from a restarted analysis, or combining multiple exodusII files arising from a finite element analysis with dynamic topology changes.

  6. Shuttle Data Center File-Processing Tool in Java

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Miller, Walter H.

    2006-01-01

    A Java-language computer program has been written to facilitate mining of data in files in the Shuttle Data Center (SDC) archives. This program can be executed on a variety of workstations or via Web-browser programs. This program is partly similar to prior C-language programs used for the same purpose, while differing from those programs in that it exploits the platform-neutrality of Java in implementing several features that are important for analysis of large sets of time-series data. The program supports regular expression queries of SDC archive files, reads the files, interleaves the time-stamped samples, and transforms the results into a chosen output format. A user can choose among a variety of output file formats that are useful for diverse purposes, including plotting, Markov modeling, multivariate density estimation, and wavelet multiresolution analysis, as well as for playback of data in support of simulation and testing.

  7. Software for Managing Personal Files.

    ERIC Educational Resources Information Center

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  8. Model input and output files for the simulation of time of arrival of landfill leachate at the water table, Municipal Solid Waste Landfill Facility, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso County, Texas

    USGS Publications Warehouse

    Abeyta, Cynthia G.; Frenzel, Peter F.

    1999-01-01

    This report contains listings of model input and output files for the simulation of the time of arrival of landfill leachate at the water table from the Municipal Solid Waste Landfill Facility (MSWLF), about 10 miles northeast of downtown El Paso, Texas. This simulation was done by the U.S. Geological Survey in cooperation with the U.S. Department of the Army, U.S. Army Air Defense Artillery Center and Fort Bliss, El Paso, Texas. The U.S. Environmental Protection Agency-developed Hydrologic Evaluation of Landfill Performance (HELP) and Multimedia Exposure Assessment (MULTIMED) computer models were used to simulate the production of leachate by a landfill and transport of landfill leachate to the water table. Model input data files used with and output files generated by the HELP and MULTIMED models are provided in ASCII format on a 3.5-inch 1.44-megabyte IBM-PC compatible floppy disk.

  9. PATSTAGS - PATRAN-STAGSC-1 TRANSLATOR

    NASA Technical Reports Server (NTRS)

    Otte, N. E.

    1994-01-01

    PATSTAGS translates PATRAN finite element model data into STAGS (Structural Analysis of General Shells) input records to be used for engineering analysis. The program reads data from a PATRAN neutral file and writes STAGS input records into a STAGS input file and a UPRESS data file. It is able to support translations of nodal constraints and nodal, element, force, and pressure data. PATSTAGS uses three files: the PATRAN neutral file to be translated, a STAGS input file, and a STAGS pressure data file. The user provides the name of the neutral file and the desired names of the STAGS files to be created. The pressure data file contains the element live pressure data used in the STAGS subroutine UPRESS. PATSTAGS is written in FORTRAN 77 for DEC VAX series computers running VMS. The main memory requirement for execution is approximately 790K of virtual memory. Output blocks can be modified to output the data in any format desired, allowing the program to be used to translate model data to analysis codes other than STAGSC-1 (HQN-10967). This program is available in DEC VAX BACKUP format on a 9-track magnetic tape or TK50 tape cartridge. Documentation is included in the price of the program. PATSTAGS was developed in 1990. DEC, VAX, TK50 and VMS are trademarks of Digital Equipment Corporation.

  10. User’s Guide. To the Federal Insurance Administration’s 1978-1979 Flood Claims File for Computation of Depth-Damage Relationships.

    DTIC Science & Technology

    1981-12-01

    ...reading a file either saved in a previous session or created as a result of the internal execution save file (described later). LOAD PFN - LOADS... command is used to make new data retrievals. READ PFN - DIRECT ENTRY FROM A PREVIOUSLY SAVED FILE. This command bypasses the conventional terminal entry by... INTERNAL SAVE FILE - This command accesses a file created using the internal execution save file output option. Loading a file results in entering the...

  11. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-94, with projections to 2020; (supplement one to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.). Output files resulting from the computer simulations are included for reference.

  12. 3D Visualization of Hydrological Model Outputs For a Better Understanding of Multi-Scale Phenomena

    NASA Astrophysics Data System (ADS)

    Richard, J.; Schertzer, D. J. M.; Tchiguirinskaia, I.

    2014-12-01

    During the last decades, many hydrological models have been created to simulate extreme events or scenarios on catchments. The classical outputs of these models are 2D maps, time series, or graphs, which are easily understood by scientists but much less so by many stakeholders, e.g. mayors or local authorities, and the general public. One goal of the Blue Green Dream project is to create outputs that are adequate for them. To reach this goal, we decided to convert most of the model outputs into a unique 3D visualization interface that combines all of them. This conversion has to be performed with hydrological thinking to keep the information consistent with the context and the raw outputs. We focus our work on the conversion of the outputs of the Multi-Hydro (MH) model, which is physically based, fully distributed, and has a GIS data interface. MH splits the urban water cycle into four components: rainfall, surface runoff, infiltration, and drainage. To each of them corresponds a modeling module with specific inputs and outputs. The superimposition of all this information will highlight the model outputs and help to verify the quality of the raw input data. For example, the spatial and temporal variability of the rain generated by the rainfall module will be directly visible in 4D (3D + time) before running a full simulation. It is the same with the runoff module: because the result quality depends on the resolution of the rasterized land use, the visualization will confirm (or not) the choice of cell size. As most of the inputs and outputs are GIS files, two main conversions will be applied to display the results in 3D. First, a conversion from vector files to 3D objects: for example, buildings are defined in 2D inside a GIS vector file, and each polygon can be extruded with a height to create a volume. The principle is the same for roads, but an intrusion, instead of an extrusion, is made into the topography file. The second main conversion is the raster conversion: several files, such as the topography, the land use, the water depth, etc., are defined by geo-referenced grids, and the corresponding grids are converted into a list of triangles to be displayed inside the 3D window. For the water depth, display in pixels will no longer be the only solution; water contours will be created to more easily delineate the flood inside the catchment.
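
    A minimal sketch of the raster conversion just described, turning a 2D height grid into a triangle list (two triangles per grid cell); the toy grid and cell size are made up for illustration:

      # Sketch: convert a geo-referenced raster (2-D height grid) into a
      # triangle list for 3-D display, two triangles per grid cell.
      import numpy as np

      z = np.array([[0.0, 1.0, 1.5],
                    [0.5, 1.2, 2.0]])   # heights on a 2x3 grid
      dx = dy = 10.0                    # cell size in metres (assumed)

      triangles = []
      rows, cols = z.shape
      for i in range(rows - 1):
          for j in range(cols - 1):
              p00 = (j * dx,       i * dy,       z[i, j])
              p10 = ((j + 1) * dx, i * dy,       z[i, j + 1])
              p01 = (j * dx,       (i + 1) * dy, z[i + 1, j])
              p11 = ((j + 1) * dx, (i + 1) * dy, z[i + 1, j + 1])
              triangles.append((p00, p10, p11))   # split each cell along
              triangles.append((p00, p11, p01))   # its diagonal

      print(len(triangles))   # 2 * (rows-1) * (cols-1) triangles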

  13. Engineering description of the ascent/descent bet product

    NASA Technical Reports Server (NTRS)

    Seacord, A. W., II

    1986-01-01

    The Ascent/Descent output product is produced in the OPIP routine from three files which constitute its input. One of these, OPIP.IN, contains mission specific parameters. Meteorological data, such as atmospheric wind velocities, temperatures, and density, are obtained from the second file, the Corrected Meteorological Data File (METDATA). The third file is the TRJATTDATA file which contains the time-tagged state vectors that combine trajectory information from the Best Estimate of Trajectory (BET) filter, LBRET5, and Best Estimate of Attitude (BEA) derived from IMU telemetry. Each term in the two output data files (BETDATA and the Navigation Block, or NAVBLK) are defined. The description of the BETDATA file includes an outline of the algorithm used to calculate each term. To facilitate describing the algorithms, a nomenclature is defined. The description of the nomenclature includes a definition of the coordinate systems used. The NAVBLK file contains navigation input parameters. Each term in NAVBLK is defined and its source is listed. The production of NAVBLK requires only two computational algorithms. These two algorithms, which compute the terms DELTA and RSUBO, are described. Finally, the distribution of data in the NAVBLK records is listed.

  14. Continuous-tone applications in digital hard-copy output devices

    NASA Astrophysics Data System (ADS)

    Saunders, Jeffrey C.

    1990-11-01

    Dye diffusion technology has made a recent entry into the hardcopy printer arena, making it now possible to achieve near-photographic quality images from digital raster image data. Whereas the majority of low cost printers utilizing ink-jet, thermal wax, or dot-matrix technologies advertise high resolution printheads, the restrictions which dithering algorithms apply to these inherently binary printing systems force them to sacrifice spatial resolution capability for tone scale reproduction. Dye diffusion technology allows a fully continuous range of density at each pixel location, thus preserving the full spatial resolution capability of the printhead; spatial resolution is not sacrificed for tone scale. This results in images whose quality is far superior to the ink-jet or wax-transfer products; image quality so high, in fact, that to the unaided eye dye diffusion images are indistinguishable from their silver-halide counterparts. Eastman Kodak Co. offers a highly refined application of dye diffusion technology in the Kodak XL 7700 Digital Continuous Tone Printer and Kodak EKTATHERM media products. The XL 7700 Printer represents a serious alternative to expensive laser-based film recorders for applications which require high quality image output from digital data files. This paper presents an explanation of dye diffusion printing, what distinguishes it from other technologies, sensitometric control and image quality parameters, and applications within the industry, particularly that of Airborne Reconnaissance and Remote Sensing.

  15. EXODUS II: A finite element data model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoof, L.A.; Yarberry, V.R.

    1994-09-01

    EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).
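
    Because EXODUS II is layered on netCDF, a file can also be inspected generically from Python with the netCDF4 package when the official C/C++/Fortran API is unavailable; the file name below is hypothetical, and the 'time_whole' variable name follows common EXODUS conventions but may vary.

      # Sketch: generic inspection of an EXODUS II file via its netCDF layer.
      from netCDF4 import Dataset

      with Dataset("results.exo") as nc:                  # hypothetical file
          print("dimensions:", {k: len(v) for k, v in nc.dimensions.items()})
          print("variables: ", list(nc.variables))
          if "time_whole" in nc.variables:                # common convention
              print("time steps:", nc.variables["time_whole"][:])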

  16. Information retrieval and display system

    NASA Technical Reports Server (NTRS)

    Groover, J. L.; King, W. L.

    1977-01-01

    Versatile command-driven data management system offers users, through simplified command language, a means of storing and searching data files, sorting data files into specified orders, performing simple or complex computations, effecting file updates, and printing or displaying output data. Commands are simple to use and flexible enough to meet most data management requirements.

  17. Establishing Malware Attribution and Binary Provenance Using Multicompilation Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramshaw, M. J.

    2017-07-28

    Malware is a serious problem for computer systems and costs businesses and customers billions of dollars a year in addition to compromising their private information. Detecting malware is particularly difficult because malware source code can be compiled in many different ways and generate many different digital signatures, which causes problems for most anti-malware programs that rely on static signature detection. Our project uses a convolutional neural network to identify malware programs but these require large amounts of data to be effective. Towards that end, we gather thousands of source code files from publicly available programming contest sites and compile them with several different compilers and flags. Building upon current research, we then transform these binary files into image representations and use them to train a long-term recurrent convolutional neural network that will eventually be used to identify how a malware binary was compiled. This information will include the compiler, version of the compiler and the options used in compilation, information which can be critical in determining where a malware program came from and even who authored it.
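
    A minimal sketch of the binary-to-image transformation described above; the image width and padding scheme are illustrative assumptions, not the project's actual parameters.

      # Sketch: raw bytes of an executable reshaped into a fixed-width
      # grayscale image that a CNN can consume.
      import numpy as np

      def binary_to_image(path, width=256):
          data = np.fromfile(path, dtype=np.uint8)
          pad = (-len(data)) % width          # pad to a whole number of rows
          data = np.pad(data, (0, pad))
          return data.reshape(-1, width)      # one grey pixel per byte

      img = binary_to_image("/bin/ls")        # path is illustrative
      print(img.shape, img.dtype)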

  18. Evaluation of selected strapdown inertial instruments and pulse torque loops, volume 1

    NASA Technical Reports Server (NTRS)

    Sinkiewicz, J. S.; Feldman, J.; Lory, C. B.

    1974-01-01

    Design, operational and performance variations between ternary, binary and forced-binary pulse torque loops are presented. A fill-in binary loop which combines the constant power advantage of binary with the low sampling error of ternary is also discussed. The effects of different output-axis supports on the performance of a single-degree-of-freedom, floated gyroscope under a strapdown environment are illustrated. Three types of output-axis supports are discussed: pivot-dithered jewel, ball bearing and electromagnetic. A test evaluation on a Kearfott 2544 single-degree-of-freedom, strapdown gyroscope operating with a pulse torque loop, under constant rates and angular oscillatory inputs is described and the results presented. Contributions of the gyroscope's torque generator and the torque-to-balance electronics on scale factor variation with rate are illustrated for a SDF 18 IRIG Mod-B strapdown gyroscope operating with various pulse rebalance loops. Also discussed are methods of reducing this scale factor variation with rate by adjusting the tuning network which shunts the torque coil. A simplified analysis illustrating the principles of operation of the Teledyne two-degree-of-freedom, elastically-supported, tuned gyroscope and the results of a static and constant rate test evaluation of that instrument are presented.

  19. Bread: CDC 7600 program that processes Spent Fuel Test Climax data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hage, G.L.

    BREAD will process a family of files copied from a data tape made by Hewlett-Packard equipment employed for data acquisition on the Spent Fuel Test-Climax at NTS. Tapes are delivered to Livermore approximately monthly. The process at this stage consists of four steps: read the binary files and convert from H-P 16-bit words to CDC 7600 60-bit words; check identification and data ranges; write the data in 6-bit ASCII (BCD) format, one data point per line; then sort the file by identifier and time.
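
    A rough sketch of two of BREAD's four steps in Python, with the word order (big-endian, unsigned) and the record layout assumed purely for illustration; the actual H-P tape format is not documented here.

      # Sketch: unpack 16-bit words from a binary file, then sort decoded
      # records by identifier and time, as in BREAD's first and last steps.
      import struct

      def read_words(path):
          with open(path, "rb") as f:
              raw = f.read()
          n = len(raw) // 2
          return struct.unpack(">%dH" % n, raw[: 2 * n])   # big-endian assumed

      # Hypothetical decoded records: (identifier, time, value)
      records = [("TC07", 120, 19.7), ("TC02", 115, 20.1), ("TC02", 110, 20.3)]
      records.sort(key=lambda r: (r[0], r[1]))   # by identifier, then time
      print(records)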

  20. Data handling with SAM and art at the NO vA experiment

    DOE PAGES

    Aurisano, A.; Backhouse, C.; Davies, G. S.; ...

    2015-12-23

    During operations, NOvA produces between 5,000 and 7,000 raw files per day with peaks in excess of 12,000. These files must be processed in several stages to produce fully calibrated and reconstructed analysis files. In addition, many simulated neutrino interactions must be produced and processed through the same stages as data. To accommodate the large volume of data and Monte Carlo, production must be possible both on the Fermilab grid and on off-site farms, such as the ones accessible through the Open Science Grid. To handle the challenge of cataloging these files and to facilitate their off-line processing, we have adopted the SAM system developed at Fermilab. SAM indexes files according to metadata, keeps track of each file's physical locations, provides dataset management facilities, and facilitates data transfer to off-site grids. To integrate SAM with Fermilab's art software framework and the NOvA production workflow, we have developed methods to embed metadata into our configuration files, art files, and standalone ROOT files. A module in the art framework propagates the embedded information from configuration files into art files, and from input art files to output art files, allowing us to maintain a complete processing history within our files. Embedding metadata in configuration files also allows configuration files indexed in SAM to be used as inputs to Monte Carlo production jobs. Further, SAM keeps track of the input files used to create each output file. Parentage information enables the construction of self-draining datasets which have become the primary production paradigm used at NOvA. In this study we will present an overview of SAM at NOvA and how it has transformed the file production framework used by the experiment.

  1. Persistence of Antibiotic Resistance Plasmids in Biofilms

    DTIC Science & Technology

    2014-10-01

    ...from a particular file, e.g. here everything from "data1.xls" will have an identifier of "Rep1":

      r1 <- collate(ls(patt = "*rep1"), TRUE, lastVal = "Rep1")
      r2 <- collate(ls(patt = "*rep2"), TRUE, lastVal = "Rep2")
      allOutput <- rbind(r1, r2)   # merge the data from the two files
      # at this point allOutput can be used

  2. RCHILD - an R-package for flexible use of the landscape evolution model CHILD

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2014-05-01

    Landscape evolution models provide powerful approaches to numerically assess earth surface processes, to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development Model) is one of the most-used models of landscape change in the context of tectonic and geomorphologic process interactions. Running CHILD from the command line and working with the model output can be rather awkward tasks (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help use CHILD in a flexible, dynamic, and user-friendly way. The functions provided allow creating maps, real-time scenes, animations, and further thematic plots from model output. The model input files can be modified dynamically, and hence (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples help to illustrate the great potential of numeric modelling of geomorphologic processes.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, R.A.

    A major goal of the Analysis of Large Data Sets (ALDS) research project at Pacific Northwest Laboratory (PNL) is to provide efficient data organization, storage, and access capabilities for statistical applications involving large amounts of data. As part of the effort to achieve this goal, a self-describing binary (SDB) data file structure has been designed and implemented together with a set of basic data manipulation functions and supporting SDB data access routines. Logical and physical data descriptors are stored in SDB files preceding the data values. SDB files thus provide a common data representation for interfacing diverse software components. This paper describes the various types of data descriptors and data structures permitted by the file design. Data buffering, file segmentation, and a segment overflow handler are also discussed.

  4. A novel encoding scheme for effective biometric discretization: Linearly Separable Subcode.

    PubMed

    Lim, Meng-Hui; Teoh, Andrew Beng Jin

    2013-02-01

    Separability in a code is crucial in guaranteeing a decent Hamming-distance separation among the codewords. In multibit biometric discretization where a code is used for quantization-intervals labeling, separability is necessary for preserving distance dissimilarity when feature components are mapped from a discrete space to a Hamming space. In this paper, we examine separability of Binary Reflected Gray Code (BRGC) encoding and reveal its inadequacy in tackling interclass variation during the discrete-to-binary mapping, leading to a tradeoff between classification performance and entropy of binary output. To overcome this drawback, we put forward two encoding schemes exhibiting full-ideal and near-ideal separability capabilities, known as Linearly Separable Subcode (LSSC) and Partially Linearly Separable Subcode (PLSSC), respectively. These encoding schemes convert the conventional entropy-performance tradeoff into an entropy-redundancy tradeoff in the increase of code length. Extensive experimental results vindicate the superiority of our schemes over the existing encoding schemes in discretization performance. This opens up possibilities of achieving much greater classification performance with high output entropy.
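
    A small numerical illustration of the separability argument, comparing BRGC label distances with those of a unary (thermometer) code, shown here as one fully separable construction in the spirit of LSSC (the paper's exact codes may differ):

      # Hamming distance between interval labels: BRGC vs. unary code.
      def gray(i, bits=3):
          g = i ^ (i >> 1)                       # Binary Reflected Gray Code
          return [(g >> b) & 1 for b in range(bits)]

      def unary(i, levels=8):
          return [1] * i + [0] * (levels - 1 - i)   # thermometer codeword

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      for j in range(8):
          print(j, hamming(gray(0), gray(j)), hamming(unary(0), unary(j)))
      # Gray-code distances fluctuate between 1 and 3 and do not grow with
      # interval separation; the unary distance equals the separation j.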

  5. 77 FR 26791 - Records Schedules; Availability and Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ...-374- 09-7, 1 item, 1 temporary item). Master files of an electronic information system containing...-2012-0006, 1 item, 1 temporary item). Master files of an electronic information system used to document...-0001, 2 items, 2 temporary items). Master files and outputs of an electronic information system used to...

  6. Technical Report for the Period 1 October 1987 - 30 September 1989

    DTIC Science & Technology

    1990-03-01

    ...low pass filter results. -dt dt specifies the sampling rate in seconds. -gin specifies .w file (binary waveform data) input. -gout specifies .w file... waves arriving at moderate incidence angles; high signal-to-noise ratio (SNR). The following assumptions are made for simplicity: additive... spatially uncorrelated noise; simple signal model, free of refraction and scattering effects. This study is limited to the case of a plane incident P...

  7. Washington Play Fairway Analysis Geothermal GIS Data

    DOE Data Explorer

    Corina Forson

    2015-12-15

    This file contains file geodatabases of the Mount St. Helens seismic zone (MSHSZ), Wind River valley (WRV), and Mount Baker (MB) geothermal play-fairway sites in the Washington Cascades. The geodatabases include input data (feature classes) and output rasters (generated from modeling and interpolation) from the geothermal play-fairways in Washington State, USA. These data were gathered and modeled to provide an estimate of the heat and permeability potential within the play-fairways based on: mapped volcanic vents, hot springs and fumaroles, geothermometry, intrusive rocks, temperature-gradient wells, slip tendency, dilation tendency, displacement, displacement gradient, maximum Coulomb shear stress, sigma 3, maximum shear strain rate, and dilational strain rate at 200 m and 3 km depth. In addition, this file contains layer files for each of the output rasters. For details on the areas of interest, please see the 'WA_State_Play_Fairway_Phase_1_Technical_Report' in the download package. This submission also includes a file with the geothermal favorability of the Washington Cascade Range based on an earlier statewide assessment, as well as the maximum shear and dilational strain rate rasters for all of Washington State.

  8. VizieR Online Data Catalog: Field RR Lyrae stars (Liska+, 2016)

    NASA Astrophysics Data System (ADS)

    Liska, J.; Skarka, M.; Zejda, M.; Mikulasek, Z.; de Villiers, S. N.

    2016-05-01

    Differential photometry for VX Her is in the 'table1.dat' file. New photometric measurements for VX Her were performed at Masaryk University Observatory, Brno, Czech Republic, during 13 nights (April-August 2014) with a 0.6-m (24-inch) Newtonian telescope and a CCD G2-0402 camera, in BVRI bands. CCD images were calibrated in a standard way (dark frame and flat field corrections). The C-Munipack software (Motl 2009) was used for this processing as well as for the differential photometry. TYC 1510-269-1 and TYC 1510-149-1 were used as comparison and check stars, respectively. Differential photometry for AT Ser and SS Leo is in the 'table2.dat' file. New photometric measurements for these two stars were obtained using a 1-inch refractor (a photographic lens Sonnar 4/135mm, lens focal ratio/focal length) and an ATIK 16IC CCD camera with a green photometric filter whose throughput is similar to the Johnson V filter. Exposures were 30 s, and each set of five frames was combined into a single image to achieve a better signal-to-noise ratio. The time resolution of such a combined frame is about 170 s. The comparison stars were HD 142799 for AT Ser and HD 100763 for SS Leo. A list of candidates for binaries with an RR Lyrae component - the RRLyrBinCan database (version 2016 May 5) - is in the 'table3.dat' file. The 'table4.dat' file contains false-positive binary candidates among RR Lyrae stars. The 'table5.dat' and 'table6.dat' files contain the maxima timings used, as given in the GEOS RR Lyr database or newly determined in this study. (7 data files).

  9. VizieR Online Data Catalog: OGLE eclipsing binaries in LMC (Wyrzykowski+, 2003)

    NASA Astrophysics Data System (ADS)

    Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M.; Zebrun, K.; Soszynski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.

    2003-09-01

    We present the catalog of 2580 eclipsing binary stars detected in 4.6 square degree area of the central parts of the Large Magellanic Cloud. The photometric data were collected during the second phase of the OGLE microlensing search from 1997 to 2000. The eclipsing objects were selected with the automatic search algorithm based on an artificial neural network. Basic statistics of eclipsing stars are presented. Also, the list of 36 candidates of detached eclipsing binaries for spectroscopic study and for precise LMC distance determination is provided. The full catalog is accessible from the OGLE Internet archive. (2 data files).

  10. Flight dynamics analysis and simulation of heavy lift airships. Volume 3: User's manual

    NASA Technical Reports Server (NTRS)

    Emmen, R. D.; Tischler, M. B.

    1982-01-01

    The User's Manual provides the basic information necessary to run the programs. This includes descriptions of the various data files necessary for the program, the various outputs from the program and the options available to the user when executing the program. Additional data file information is contained in the three appendices to the manual. These appendices list all input variables and their permissible values, an example listing of these variables, and all output variables available to the user.

  11. Research Output of Australian Universities: Are the Newer Institutions Catching up?

    ERIC Educational Resources Information Center

    Williams, Ross

    2010-01-01

    Two decades on from the abolition of the binary divide in higher education in Australia, what has happened to the relative research performance of institutions that started from quite diverse positions? We use two databases, Thomson Reuters ISI and Scopus, to measure growth rates in research output. We find that there has been some convergence in…

  12. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV), operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center, has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Display occurs in a separate window for each port's input, with binary display being selectable. A number of other features are provided, including binary log files, screen capture to files, and a full range of communication parameters.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wickstrom, Gregory Lloyd; Gale, Jason Carl; Ma, Kwok Kee

    The Sandia Secure Processor (SSP) is a new native Java processor that has been specifically designed for embedded applications. The SSP's design is a system composed of a core Java processor that directly executes Java bytecodes, on-chip intelligent IO modules, and a suite of software tools for simulation and compiling executable binary files. The SSP is unique in that it provides a way to control real-time IO modules for embedded applications. The system software for the SSP is a 'class loader' that takes Java .class files (created with your favorite Java compiler), links them together, and compiles a binary. The complete SSP system provides very powerful functionality with very light hardware requirements and the potential to be used in a wide variety of small-system embedded applications. This paper gives a detailed description of the Sandia Secure Processor and its unique features.

  14. VizieR Online Data Catalog: RefleX : X-ray-tracing code (Paltani+, 2017)

    NASA Astrophysics Data System (ADS)

    Paltani, S.; Ricci, C.

    2017-11-01

    We provide here the RefleX executable, for both Linux and MacOSX, together with the User Manual and an example script file and output file. Running (for instance) reflex_linux will produce the file reflex.out. Note that the results may differ slightly depending on the OS, because of slight differences in some implementations of numerical computations. The differences are scientifically meaningless. (5 data files).

  15. Construct User Guide

    DTIC Science & Technology

    2012-11-01

    ...interactions in construct: An empirical validation using calibrated grounding. In 2007 BRIMS Conference Proceedings, Norfolk, VA. Simon, H. A.... Construct will write the output to the directory specified by the path name. Users should ensure that if they have opened any output files (e.g., in Excel to view the files), they should either close the file... stringvars to delimit string variables. Common Gotchas: If Construct is unable to open an input file, it will exit and close. There are times when an...

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosler, Peter

    Stride Search provides a flexible tool for detecting storms or other extreme climate events in high-resolution climate data sets saved on uniform latitude-longitude grids in standard NetCDF format. Users provide the software a quantitative description of a meteorological event they are interested in; the software searches a data set for locations in space and time that meet the user's description. In its first stage, Stride Search performs a spatial search of the data set at each timestep by dividing a search domain into circular sectors of constant geodesic radius. Data from a netCDF file is read into memory for each circular search sector. If the data meet or exceed a set of storm identification criteria (defined by the user), a storm is recorded to a linked list. Finally, the linked list is examined, duplicate detections of the same storm are removed, and the results are written to an output file. The first stage's output file is read by a second program that builds storm tracks. Additional identification criteria may be applied at this stage to further classify storms. Storm tracks are the software's ultimate output, and routines are provided for formatting that output for various external software libraries for plotting and tabulating data.
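
    A minimal sketch of the first-stage spatial search under simplifying assumptions (uniform grid, synthetic field, single timestep, no duplicate merging); the radius, threshold, and sector spacing are illustrative:

      # Sketch: examine circular sectors of constant geodesic radius and
      # record those whose data exceed a user-defined criterion.
      import numpy as np

      R_EARTH = 6371.0  # km

      def geodesic_km(lat1, lon1, lat2, lon2):
          """Great-circle distance; lat2/lon2 may be arrays."""
          p1, p2 = np.radians(lat1), np.radians(lat2)
          dlon = np.radians(lon2 - lon1)
          c = np.sin(p1) * np.sin(p2) + np.cos(p1) * np.cos(p2) * np.cos(dlon)
          return R_EARTH * np.arccos(np.clip(c, -1.0, 1.0))

      lats = np.arange(-80.0, 81.0, 2.0)
      lons = np.arange(0.0, 360.0, 2.0)
      LAT, LON = np.meshgrid(lats, lons, indexing="ij")
      vort = np.random.default_rng(1).random(LAT.shape)  # stand-in field

      radius_km, threshold = 500.0, 0.9995
      detections = []
      for clat in np.arange(-70.0, 71.0, 5.0):           # sector centres
          for clon in np.arange(0.0, 360.0, 5.0):
              sector = geodesic_km(clat, clon, LAT, LON) <= radius_km
              peak = vort[sector].max()
              if peak > threshold:
                  detections.append((clat, clon, float(peak)))
      print(len(detections), "candidate sectors (duplicates not yet merged)")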

  17. Computer input and output files associated with ground-water-flow simulations of the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020; (supplement three to U.S. Geological Survey Water-resources investigations report 94-4251)

    USGS Publications Warehouse

    Kernodle, J.M.

    1996-01-01

    This report presents the computer input files required to run the three-dimensional ground-water-flow model of the Albuquerque Basin, central New Mexico, documented in Kernodle and others (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) and revised by Kernodle (Kernodle, J.M., 1998, Simulation of ground-water flow in the Albuquerque Basin, 1901-95, with projections to 2020 (supplement two to U.S. Geological Survey Water-Resources Investigations Report 94-4251): U.S. Geological Survey Open-File Report 96-209, 54 p.). Output files resulting from the computer simulations are included for reference.

  18. BOREAS Forest Cover Data Layers over the SSA-MSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Nickeson, Jaime; Gruszka, F; Hall, F.

    2000-01-01

    This data set, originally provided as vector polygons with attributes, has been processed by BORIS staff to provide raster files that can be used for modeling or for comparison purposes. The original data were received as ARC/INFO coverages or as export files from SERM. The data include information on forest parameters for the BOREAS SSA-MSA. Most of the data used for this product were acquired by BORIS in 1993; the maps were produced from aerial photography taken as recently as 1988. The data are stored in binary, image format files.

  19. Deep classification hashing for person re-identification

    NASA Astrophysics Data System (ADS)

    Wang, Jiabao; Li, Yang; Zhang, Xiancai; Miao, Zhuang; Tao, Gang

    2018-04-01

    With the growth of surveillance in public spaces, person re-identification becomes more and more important. Large-scale databases call for efficient computation and storage, and hashing is one of the most important techniques. In this paper, we propose a new deep classification hashing network by introducing a new binary appropriation layer into traditional ImageNet pre-trained CNN models. It outputs binary-appropriate features, which can easily be quantized into binary hash codes for Hamming similarity comparison. Experiments show that our deep hashing method can outperform the state-of-the-art methods on the public CUHK03 and Market1501 datasets.
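
    A minimal sketch of the final hashing step, assuming 128-bit codes and random stand-in activations; the network itself is omitted, and the code length is an assumption:

      # Sketch: threshold near-binary network outputs into hash codes and
      # rank gallery entries by Hamming distance to a query.
      import numpy as np

      def to_code(activations):
          """Threshold activations at 0.5 and pack the bits into one integer."""
          bits = (np.asarray(activations) > 0.5).astype(int)
          return int("".join(map(str, bits)), 2)

      rng = np.random.default_rng(0)
      gallery = rng.random((500, 128))                 # stand-in activations
      codes = [to_code(g) for g in gallery]
      query = to_code(np.clip(gallery[7] + rng.normal(0, 0.05, 128), 0, 1))
      ranked = sorted(range(500), key=lambda i: bin(codes[i] ^ query).count("1"))
      print(ranked[:5])   # identity 7 should rank at or near the top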

  20. PHAST--a program for simulating ground-water flow, solute transport, and multicomponent geochemical reactions

    USGS Publications Warehouse

    Parkhurst, David L.; Kipp, Kenneth L.; Engesgaard, Peter; Charlton, Scott R.

    2004-01-01

    The computer program PHAST simulates multi-component, reactive solute transport in three-dimensional saturated ground-water flow systems. PHAST is a versatile ground-water flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. PHAST is applicable to the study of natural and contaminated ground-water systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock-water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, density-dependent flow, or waters with high ionic strengths. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux, and leaky conditions, as well as the special cases of rivers and wells. Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, gases, surface complexation sites, ion exchange sites, and solid solutions; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, gases, exchangers, surfaces, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and post-processing programs; or in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST. Operator splitting of the flow, transport, and geochemical equations is used to separate the three processes into three sequential calculations. No iterations between transport and reaction calculations are implemented. A three-dimensional Cartesian coordinate system and finite-difference techniques are used for the spatial and temporal discretization of the flow and transport equations. The non-linear chemical equilibrium equations are solved by a Newton-Raphson method, and the kinetic reaction equations are solved by a Runge-Kutta or an implicit method for integrating ordinary differential equations. The PHAST simulator may require large amounts of memory and long Central Processing Unit (CPU) times. To reduce the long CPU times, a parallel version of PHAST has been developed that runs on a multiprocessor computer or on a collection of computers that are networked. The parallel version requires Message Passing Interface, which is currently (2004) freely available. The parallel version is effective in reducing simulation times. This report documents the use of the PHAST simulator, including running the simulator, preparing the input files, selecting the output files, and visualizing the results. 
It also presents four examples that verify the numerical method and demonstrate the capabilities of the simulator. PHAST requires three input files. Only the flow and transport file is described in detail in this report. The other two files, the chemistry data file and the database file, are identical to PHREEQC files and the detailed description of these files is found in the PHREEQC documentation.
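
    A minimal sketch of extracting data from such an HDF output file, assuming the file is HDF5 and readable with the h5py package; the file name is hypothetical and the dataset paths inside PHAST output are not documented here, so the sketch simply walks the hierarchy and reports what it finds:

      import h5py

      # Print every dataset in the file with its shape and element type.
      def show(name, obj):
          if isinstance(obj, h5py.Dataset):
              print(name, obj.shape, obj.dtype)

      with h5py.File("example.phast.h5", "r") as f:   # hypothetical file name
          f.visititems(show)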

  1. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
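
    A sketch of the kind of remote access described above, using the netCDF4 Python package to open a CF-compliant dataset over OPeNDAP; the endpoint URL and variable name are hypothetical. Only the requested slice is transferred over the network, which is the point of the service-based approach:

      from netCDF4 import Dataset

      url = "http://example.org/thredds/dodsC/model/output.nc"  # hypothetical endpoint
      ds = Dataset(url)                       # netCDF4 opens OPeNDAP URLs transparently
      print(getattr(ds, "Conventions", "?"))  # e.g. "CF-1.6" for compliant files
      temp = ds.variables["temp"]             # variable name is an assumption
      print(temp.dimensions, temp.shape)
      surface = temp[0, 0, :, :]              # only this subset is downloaded
      ds.close()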

  2. A Database of Computer Attacks for the Evaluation of Intrusion Detection Systems

    DTIC Science & Technology

    1999-06-01

    administrator whenever a system binary file (such as the ps, login, or ls program) is modified. Normal users have no legitimate reason to alter these files...development of EMERALD [46], which combines statistical anomaly detection from NIDES with signature verification. Specification-based intrusion detection...the creation of a single host that can act as many hosts. Daemons that provide network services—including telnetd, ftpd, and login—display banners

  3. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ascii and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.

  4. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    PubMed

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness and therefore diversity of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage, both at the DNA base and translated protein residue level, which has been missing from toolsets and the literature. The open source program PuLSE is available in two formats, a C++ source code package for compilation and integration into existing bioinformatics pipelines and precompiled binaries for ease of use.
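
    An illustration of the kind of bookkeeping such a tool performs (not PuLSE's actual implementation): count unique DNA sequences in a FASTQ file and tally per-position base frequencies over the reads. The input file name is hypothetical:

      from collections import Counter

      counts = Counter()
      with open("library.fastq") as fh:        # hypothetical input file
          for i, line in enumerate(fh):
              if i % 4 == 1:                   # FASTQ records: id, sequence, +, quality
                  counts[line.strip()] += 1

      print(len(counts), "unique sequences")

      # Per-position base frequencies across all reads.
      position_bases = {}
      for seq, n in counts.items():
          for pos, base in enumerate(seq):
              position_bases.setdefault(pos, Counter())[base] += n
      for pos in sorted(position_bases)[:5]:
          print(pos, dict(position_bases[pos]))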

  5. Processing Of Binary Images

    NASA Astrophysics Data System (ADS)

    Hou, H. S.

    1985-07-01

    An overview of the recent progress in the area of digital processing of binary images in the context of document processing is presented here. The topics covered include input scan, adaptive thresholding, halftoning, scaling and resolution conversion, data compression, character recognition, electronic mail, digital typography, and output scan. Emphasis has been placed on illustrating the basic principles rather than descriptions of a particular system. Recent technology advances and research in this field are also mentioned.

  6. Interactive digital signal processor

    NASA Technical Reports Server (NTRS)

    Mish, W. H.; Wenger, R. M.; Behannon, K. W.; Byrnes, J. B.

    1982-01-01

    The Interactive Digital Signal Processor (IDSP) is examined. It consists of a set of time series analysis operators, each of which operates on an input file to produce an output file. The operators can be executed in any order that makes sense, and recursively if desired. The operators are the various algorithms used in digital time series analysis work. User-written operators can be easily interfaced to the system. The system can be operated both interactively and in batch mode. In IDSP a file can consist of up to n (currently n=8) simultaneous time series. IDSP currently includes over thirty standard operators, ranging from Fourier transform operations, design and application of digital filters, and eigenvalue analysis, to operators that provide graphical output, batch operation, editing, and information display.

  7. Spin wave based parallel logic operations for binary data coded with domain walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urazuka, Y.; Oyabu, S.; Chen, H.

    2014-05-07

    We numerically investigate the feasibility of spin wave (SW) based parallel logic operations, where the phase of the SW packet (SWP) is exploited as a state variable and the phase shift caused by the interaction with a domain wall (DW) is utilized as a logic inversion functionality. A designed functional element consists of parallel ferromagnetic nanowires (6 nm-thick, 36 nm-width, 5120 nm-length, and 200 nm separation) with perpendicular magnetization and sub-μm scale overlaid conductors. The logic outputs for binary data, coded with the existence (“1”) or absence (“0”) of the DW, are inductively read out from the interferometric aspect of the superposed SWPs, one of them propagating through the stored data area. A practical exclusive-or operation, based on 2π periodicity in the phase logic, is demonstrated for the individual nanowire, with an order-of-magnitude difference in output voltage V_out depending on the logic output for the stored data. The inductive output from the two nanowires exhibits three well-defined signal levels, corresponding to the information distance (Hamming distance) between the 2-bit data stored in the multiple nanowires.

  8. VizieR Online Data Catalog: Light curves for the eclipsing binary V1094 Tau (Maxted+, 2015)

    NASA Astrophysics Data System (ADS)

    Maxted, P. F. L.; Hutcheon, R. J.; Torres, G.; Lacy, C. H. S.; Southworth, J.; Smalley, B.; Pavlovski, K.; Marschall, L. A.; Clausen, J. V.

    2015-04-01

    Photometric light curves of the detached eclipsing binary V1094 Tau in the Stroemgren u-, v-, b-, and y-bands, and in the Johnson V-band. The curves in the Stroemgren bands were obtained with the Stroemgren Automatic Telescope (SAT) at ESO, La Silla. The curves in the V-band were obtained with the NFO telescope in New Mexico and with the URSA telescope at the University of Arkansas. (6 data files).

  9. User's guide for a large signal computer model of the helical traveling wave tube

    NASA Technical Reports Server (NTRS)

    Palmer, Raymond W.

    1992-01-01

    The use of a successful large-signal, two-dimensional (axisymmetric), deformable-disk computer model of the helical traveling wave tube amplifier, an extensively revised and operationally simplified version, is described. We also discuss program input and output and the auxiliary files necessary for operation. Included is a sample problem with its input data and output results. Interested parties may now obtain from the author the FORTRAN source code, auxiliary files, and sample input data on a standard floppy diskette, the contents of which are described herein.

  10. Compact 0-complete trees: A new method for searching large files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orlandic, R.; Pfaltz, J.L.

    1988-01-26

    In this report, a novel approach to ordered retrieval in very large files is developed. The method employs a B-tree-like search algorithm that is independent of key type or key length because all keys in index blocks are encoded by a 1-byte surrogate. The replacement of actual key sequences by the 1-byte surrogate ensures the maximal possible fan-out and greatly reduces the storage overhead of maintaining access indices. Initially, retrieval in a binary trie structure is developed. With the aid of a fairly complex recurrence relation, the rather scraggly binary trie is transformed into a compact multi-way search tree. Then the recurrence relation itself is replaced by an unusually simple search algorithm. Implementation details and empirical performance results are then presented. Reduction of index size by 50%--75% opens up the possibility of replicating system-wide indices for parallel access in distributed databases. 23 figs.

  11. SLICER Airborne Laser Altimeter Characterization of Canopy Structure and Sub-canopy Topography for the BOREAS Northern and Southern Study Regions: Instrument and Data Product Description

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Harding, D. J.; Blair, J. B.; Rabine, D. L.; Still, K. L.

    2000-01-01

    SLICER data were acquired in support of BOREAS at all of the TF sites in the SSA and NSA, and along transects between the study areas. Data were acquired on 5 days between 18-Jul and 30-Jul-1996. Each coverage of a tower site is typically 40 km in length, with a minimum of 3 and a maximum of 10 lines across each tower oriented in a variety of azimuths. The SLICER data were acquired simultaneously with ASAS hyperspectral, multiview angle images. The SLICER Level 3 products consist of binary files for each flight line with a data record for each laser shot composed of 13 parameters and a 600-byte waveform that is the raw record of the backscatter laser energy reflected from Earth's surface. The SLICER data are stored in a combination of ASCII and binary data files.
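
    A sketch of reading fixed-length binary records of the kind described above (13 per-shot parameters plus a 600-byte waveform), using a NumPy structured dtype. The field types, byte order, and file name below are assumptions for illustration only; the real record layout must be taken from the SLICER product documentation:

      import numpy as np

      shot_dtype = np.dtype([
          ("params", ">f4", 13),     # 13 per-shot parameters (assumed 4-byte floats)
          ("waveform", "u1", 600),   # 600-byte raw backscatter waveform
      ])

      shots = np.fromfile("flightline.bin", dtype=shot_dtype)  # hypothetical file
      print(shots.shape[0], "laser shots")
      print(shots["waveform"][0][:10])   # first bytes of the first waveform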

  12. imzML: Imaging Mass Spectrometry Markup Language: A common data format for mass spectrometry imaging.

    PubMed

    Römpp, Andreas; Schramm, Thorsten; Hester, Alfons; Klinkert, Ivo; Both, Jean-Pierre; Heeren, Ron M A; Stöckli, Markus; Spengler, Bernhard

    2011-01-01

    Imaging mass spectrometry is the method of scanning a sample of interest and generating an "image" of the intensity distribution of a specific analyte. The data sets consist of a large number of mass spectra which are usually acquired with identical settings. Existing data formats are not sufficient to describe an MS imaging experiment completely. The data format imzML was developed to allow the flexible and efficient exchange of MS imaging data between different instruments and data analysis software. For this purpose, the MS imaging data is divided in two separate files. The mass spectral data is stored in a binary file to ensure efficient storage. All metadata (e.g., instrumental parameters, sample details) are stored in an XML file which is based on the standard data format mzML developed by HUPO-PSI. The original mzML controlled vocabulary was extended to include specific parameters of imaging mass spectrometry (such as x/y position and spatial resolution). The two files (XML and binary) are connected by offset values in the XML file and are unambiguously linked by a universally unique identifier. The resulting datasets are comparable in size to the raw data and the separate metadata file allows flexible handling of large datasets. Several imaging MS software tools already support imzML. This allows choosing from a (growing) number of processing tools. One is no longer limited to proprietary software, but is able to use the processing software which is best suited for a specific question or application. On the other hand, measurements from different instruments can be compared within one software application using identical settings for data processing. All necessary information for evaluating and implementing imzML can be found at http://www.imzML.org.
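
    A heavily simplified sketch of the XML-to-binary linkage described above: offsets recorded in the metadata file are used to seek into the companion binary file. Real imzML uses mzML's cvParam vocabulary; the tag and attribute names below are stand-ins, not the actual schema, and the file names are hypothetical:

      import struct
      import xml.etree.ElementTree as ET

      # Hypothetical, minimal metadata fragment for one spectrum.
      meta = ET.fromstring(
          '<spectrum externalOffset="16" externalArrayLength="8"/>'
      )
      offset = int(meta.get("externalOffset"))
      n = int(meta.get("externalArrayLength"))

      with open("experiment.ibd", "rb") as ibd:   # companion binary file
          ibd.seek(offset)
          mz = struct.unpack("<" + "f" * n, ibd.read(4 * n))  # 32-bit floats assumed
      print(mz)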

  13. File concepts for parallel I/O

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1989-01-01

    The subject of input/output (I/O) was often neglected in the design of parallel computer systems, although for many problems I/O rates will limit the speedup attainable. The I/O problem is addressed by considering the role of files in parallel systems. The notion of parallel files is introduced. Parallel files provide for concurrent access by multiple processes and utilize parallelism in the I/O system to improve performance. Parallel files can also be used conventionally by sequential programs. A set of standard parallel file organizations is proposed, and implementations using multiple storage devices are suggested. Problem areas are also identified and discussed.

  14. Non-synchronous control of self-oscillating resonant converters

    DOEpatents

    Glaser, John Stanley; Zane, Regan Andrew

    2002-01-01

    A self-oscillating switching power converter has a controllable reactance including an active device connected to a reactive element, wherein the effective reactance of the reactance and the active device is controlled such that the control waveform for the active device is binary digital and is not synchronized with the switching converter output frequency. The active device is turned completely on and off at a frequency that is substantially greater than the maximum frequency imposed on the output terminals of the active device. The effect is to vary the average resistance across the active device output terminals, and thus the effective output reactance, thereby providing converter output control, while maintaining the response speed of the converter.

  15. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use.
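
    A generic sketch of the file-wrapping pattern described above (not RCOTOOLS itself): substitute a design-variable value into a text input file, run the analysis executable, and pull a response value back out of its output. The file names, placeholder token, executable, and output pattern are all hypothetical:

      import re
      import subprocess

      def run_case(rotor_radius):
          with open("case.in.template") as fh:
              template = fh.read()
          with open("case.in", "w") as fh:
              fh.write(template.replace("{RADIUS}", f"{rotor_radius:.4f}"))
          subprocess.run(["./analysis_code", "case.in"], check=True)  # hypothetical tool
          with open("case.out") as fh:
              out = fh.read()
          m = re.search(r"Gross weight\s*=\s*([-+0-9.Ee]+)", out)     # assumed format
          return float(m.group(1))

      print(run_case(7.5))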

  16. TICK: Transparent Incremental Checkpointing at Kernel Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrini, Fabrizio; Gioiosa, Roberto

    2004-10-25

    TICK is a software package implemented in Linux 2.6 that allows the save and restore of user processes, without any change to the user code or binary. With TICK a process can be suspended by the Linux kernel upon receiving an interrupt and saved in a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a kernel module for Linux version 2.6.5.

  17. Hybrid cryptosystem implementation using fast data encipherment algorithm (FEAL) and goldwasser-micali algorithm for file security

    NASA Astrophysics Data System (ADS)

    Rachmawati, D.; Budiman, M. A.; Siburian, W. S. E.

    2018-05-01

    In the process of exchanging files, security is indispensable to prevent the theft of data. Cryptography is one of the sciences used to secure data by encoding it. The Fast Data Encipherment Algorithm (FEAL) is a symmetric block cipher, and the file to be protected is encrypted and decrypted using FEAL. To strengthen security, the session key utilized by FEAL is encoded with the Goldwasser-Micali algorithm, an asymmetric cryptographic algorithm based on a probabilistic concept. In the encryption process the key is converted into binary form; because the value x is selected at random, the resulting cipher key differs for each binary value. The combination of a symmetric and an asymmetric algorithm is called a hybrid cryptosystem. The combined use of FEAL and Goldwasser-Micali restores the message to its original form, and for FEAL the time required for encryption and decryption is directly proportional to the length of the message. For the Goldwasser-Micali algorithm, however, the encryption and decryption time is not directly proportional to the message length.
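
    A toy sketch of the Goldwasser-Micali scheme named above, showing its probabilistic character: each key bit is encrypted separately, and the random x makes ciphertexts differ even for equal bits. The tiny fixed primes are for illustration only and offer no security:

      import random

      p, q = 499, 547          # toy primes; real keys use large primes
      N = p * q

      def legendre(a, pr):
          return pow(a, (pr - 1) // 2, pr)   # 1 iff a is a quadratic residue mod pr

      # Public element a: a non-residue mod both p and q (Jacobi symbol +1).
      a = next(x for x in range(2, N)
               if legendre(x, p) == p - 1 and legendre(x, q) == q - 1)

      def encrypt_bit(b):
          x = random.randrange(2, N)
          while x % p == 0 or x % q == 0:    # x must be coprime to N
              x = random.randrange(2, N)
          return (x * x * pow(a, b, N)) % N  # c = x^2 * a^b mod N

      def decrypt_bit(c):                    # private key: knowing p
          return 0 if legendre(c, p) == 1 else 1

      bits = [1, 0, 1, 1]
      cts = [encrypt_bit(b) for b in bits]
      assert [decrypt_bit(c) for c in cts] == bits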

  18. Virtual Machine Language

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher; Page, Dennis; O'Reilly, Taifun; Fteichert, Ralph; Lock, Patricia; Lin, Imin; Naviaux, Keith; Sisino, John

    2005-01-01

    Virtual Machine Language (VML) is a mission-independent, reusable software system for programming for spacecraft operations. Features of VML include a rich set of data types, named functions, parameters, IF and WHILE control structures, polymorphism, and on-the-fly creation of spacecraft commands from calculated values. Spacecraft functions can be abstracted into named blocks that reside in files aboard the spacecraft. These named blocks accept parameters and execute in a repeatable fashion. The sizes of uplink products are minimized by the ability to call blocks that implement most of the command steps. This block approach also enables some autonomous operations aboard the spacecraft, such as aerobraking, telemetry conditional monitoring, and anomaly response, without developing autonomous flight software. Operators on the ground write blocks and command sequences in a concise, high-level, human-readable programming language (also called VML). A compiler translates the human-readable blocks and command sequences into binary files (the operations products). The flight portion of VML interprets the uplinked binary files. The ground subsystem of VML also includes an interactive sequence-execution tool hosted on workstations, which runs sequences at several thousand times real-time speed, affords debugging, and generates reports. This tool enables iterative development of blocks and sequences within times of the order of seconds.

  19. 17 CFR 232.11 - Definition of terms used in part 232.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., PDF, and static graphic files. Such code may be in binary (machine language) or in script form... Act means the Trust Indenture Act of 1939. Unofficial PDF copy. The term unofficial PDF copy means an...

  20. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    2014-10-03

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  1. Merged analog and photon counting profiles used as input for other RLPROF VAPs

    DOE Data Explorer

    Newsom, Rob

    1998-03-01

    The rlprof_merge VAP "merges" the photon counting and analog signals appropriately for each channel, creating an output data file that is very similar to the original raw data file format that the Raman lidar initially had.

  2. 76 FR 63575 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  3. 76 FR 63554 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ... written in FORTRAN and used simple text files for data input and output, MOVES2010a is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables. These changes...

  4. 77 FR 11394 - Transportation Conformity Rule: MOVES Regional Grace Period Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... written in FORTRAN and used simple text files for data input and output, MOVES is written in JAVA and uses a relational database structure in MYSQL to handle input and output as data tables.\\13\\ \\13\\ Some...

  5. Binary-mask generation for diffractive optical elements using microcomputers.

    PubMed

    O'Shea, D C; Beletic, J W; Poutous, M

    1993-05-10

    A new technique for generation of binary masks for the fabrication of diffractive optical elements is investigated. This technique, which uses commercially available desktop-publishing hardware and software in conjunction with a standard photoreduction camera, is much faster and less expensive than the conventional methods. The short turnaround time and low cost should give researchers a much greater degree of flexibility in the field of binary optics and enable wider application of diffractive-optics technology. Techniques for generating optical elements by using standard software packages that produce PostScript output are described. An evaluation of the dimensional fidelity of the mask reproduction from design to its realization in photoresist is presented.

  6. BOREAS TE-20 Soils Data Over the NSA-MSA and Tower Sites in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Veldhuis, Hugo; Knapp, David

    2000-01-01

    The BOREAS TE-20 team collected several data sets for use in developing and testing models of forest ecosystem dynamics. This data set was gridded from vector layers of soil maps that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. The vector layers were gridded into raster files that cover the NSA-MSA and tower sites. The data are stored in binary image-format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  7. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  8. Binary catalogue of exoplanets

    NASA Astrophysics Data System (ADS)

    Schwarz, Richard; Bazso, Akos; Zechner, Renate; Funk, Barbara

    2016-02-01

    Since 1995 there has been a database listing most of the known exoplanets (The Extrasolar Planets Encyclopaedia at http://exoplanet.eu/). With the growing number of detected exoplanets in binary and multiple star systems, it became more important to mark them and separate them into a new database, functionality that is not available in the Extrasolar Planets Encyclopaedia. Therefore we established an online database (which can be found at: http://www.univie.ac.at/adg/schwarz/multiple.html) for all known exoplanets in binary star systems and, in addition, for multiple star systems, which will be updated regularly and linked to the Extrasolar Planets Encyclopaedia. The binary catalogue of exoplanets is available online as a data file and can be used for statistical purposes. Our database is divided into two parts: the data of the stars and the planets, given in a separate list. We also describe the different parameters of the exoplanetary systems and present some applications.

  9. DICOM to print, 35-mm slides, web, and video projector: tutorial using Adobe Photoshop.

    PubMed

    Gurney, Jud W

    2002-10-01

    Preparing images for publication has traditionally dealt with film and the photographic process. With picture archiving and communications systems, many departments will no longer produce film. This will change how images are produced for publication. DICOM, the file format for radiographic images, has to be converted and the images then prepared for traditional publication, 35-mm slides, the newest techniques of video projection, and the World Wide Web. Tagged Image File Format (TIFF) is the common format for traditional print publication, whereas Joint Photographic Experts Group (JPEG) is the current file format for the World Wide Web. Each medium has specific requirements that can be met with a common image-editing program such as Adobe Photoshop (Adobe Systems, San Jose, CA). High-resolution images are required for print, a process that requires interpolation. However, the Internet requires images with a small file size for rapid transmission. The resolution of each output differs and the image resolution must be optimized to match the output of the publishing medium.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McSpaden, Alexander Thomas

    Two Python scripts have been written that process the output files of MCNP6 into a format that mimics the list-mode output of Los Alamos National Laboratory’s MC-15 and NPOD neutron detection systems. This report details the methods implemented in these scripts and instructions on their use.

  11. Monte Carlo Particle Lists: MCPL

    NASA Astrophysics Data System (ADS)

    Kittelmann, T.; Klinkby, E.; Knudsen, E. B.; Willendrup, P.; Cai, X. X.; Kanaki, K.

    2017-09-01

    A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
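
    A sketch of reading an MCPL file with the Python module distributed alongside the C library; the file name is hypothetical, and the exact attribute names should be treated as assumptions to be checked against the MCPL documentation:

      import mcpl

      f = mcpl.MCPLFile("particles.mcpl")   # hypothetical file name
      for p in f.particles:
          # Per-particle state: type code, kinetic energy, position, etc.
          print(p.pdgcode, p.ekin, p.position)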

  12. Microwave Instrumentation for Multifrequency Attenuation Measurements Through Propellant Gases

    DTIC Science & Technology

    1952-02-13

    [Garbled OCR of a circuit schematic; the recoverable labels mention cathode resistors, binary divider grids, Thordarson transformers, an output pulse shaper, cathode followers, and divider heaters.]

  13. Unbiased All-Optical Random-Number Generator

    NASA Astrophysics Data System (ADS)

    Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja

    2017-10-01

    The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.

  14. Analysis of the possibility of using G.729 codec for steganographic transmission

    NASA Astrophysics Data System (ADS)

    Piotrowski, Zbigniew; Ciołek, Michał; Dołowski, Jerzy; Wojtuń, Jarosław

    2017-04-01

    Network steganography is dedicated in particular to those communication services for which there are no bridges or nodes carrying out unintentional attacks on the steganographic sequence. In order to set up a hidden communication channel, a method of data encoding and decoding was implemented using the codebooks of the G.729 codec. The G.729 codec is built around the CS-ACELP (Conjugate Structure Algebraic Code Excited Linear Prediction) linear-prediction vocoder, and by modifying the binary content of the codebook it is easy to change the binary output stream. The article describes the results of research on selecting those bits of the G.729 codebook whose negation has the least influence on the quality and fidelity of the output signal. The study was performed with the use of subjective and objective listening tests.

  15. A Taxonomy-Based Approach to Shed Light on the Babel of Mathematical Models for Rice Simulation

    NASA Technical Reports Server (NTRS)

    Confalonieri, Roberto; Bregaglio, Simone; Adam, Myriam; Ruget, Francoise; Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Buis, Samuel; hide

    2016-01-01

    For most biophysical domains, differences in model structures are seldom quantified. Here, we used a taxonomy-based approach to characterise thirteen rice models. Classification keys and binary attributes for each key were identified, and models were categorised into five clusters using a binary similarity measure and the unweighted pair-group method with arithmetic mean. Principal component analysis was performed on model outputs at four sites. Results indicated that (i) differences in structure often resulted in similar predictions and (ii) similar structures can lead to large differences in model outputs. User subjectivity during calibration may have hidden expected relationships between model structure and behaviour. This explanation, if confirmed, highlights the need for shared protocols to reduce the degrees of freedom during calibration, and to limit, in turn, the risk that user subjectivity influences model performance.

  16. The Cheetah Data Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kunz, P.F.; Word, G.B.

    1991-03-01

    Cheetah is a data management system based on the C programming language. The premise of Cheetah is that the 'banks' of FORTRAN-based systems should be 'structures' as defined by the C language. Cheetah is a system to manage these structures while preserving the use of the C language in its native form. For C structures managed by Cheetah, the user can use Cheetah utilities for reading and writing, in a machine-independent form, both binary and text files to disk or over a network. Files written by Cheetah also contain a dictionary describing in detail the data contained in the file. Such information is intended to be used by interactive programs for presenting the contents of the file. Cheetah has been ported to many different operating systems with no operating-system-dependent switches.

  17. Java-based Graphical User Interface for MAVERIC-II

    NASA Technical Reports Server (NTRS)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II, (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control, 2) the environment models such as atmosphere and gravity, and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store it in files and examine the output data later. Users can also view the output stored in output files by calling a plotting program such as gnuplot. A typical scenario of the use of MAVERIC consists of three steps: editing existing input data files, running MAVERIC, and plotting output results.

  18. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    USGS Publications Warehouse

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format file.

  19. As-built design specification for segment map (Sgmap) program

    NASA Technical Reports Server (NTRS)

    Tompkins, M. A. (Principal Investigator)

    1981-01-01

    The segment map program (SGMAP), which is part of the CLASFYT package, is described in detail. This program is designed to output symbolic maps or numerical dumps from LANDSAT cluster/classification files or aircraft ground truth/processed ground truth files which are in 'universal' format.

  20. Creation of a Machine File and Subsequent Computer-Assisted Production of Publishing Outputs, Including a Translation Journal and an Index.

    ERIC Educational Resources Information Center

    Buckland, Lawrence F.; Weaver, Vance

    Reported are the findings of the Uspekhi experiment in creating a labeled machine file, as well as sample products of this system - an article from a scientific journal and an index page. Production cost tables are presented for the machine file, primary journals, and journal indexes. Comparisons were made between the 1965 predicted costs and the…

  1. Image processing tool for automatic feature recognition and quantification

    DOEpatents

    Chen, Xing; Stoddard, Ryan J.

    2017-05-02

    A system for defining structures within an image is described. The system includes reading an input file, preprocessing it while preserving metadata such as scale information, and then detecting features. In one version the detection first uses an edge detector, followed by identification of features using a Hough transform. The output of the process is the set of identified elements within the image.
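
    A sketch of the detection sequence described in the record, using OpenCV's edge detector and probabilistic Hough transform to identify line features. The input file name and parameter values are illustrative, and this is not the patented implementation:

      import math
      import cv2

      img = cv2.imread("micrograph.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
      edges = cv2.Canny(img, threshold1=50, threshold2=150)     # edge detection
      lines = cv2.HoughLinesP(edges, rho=1, theta=math.pi / 180,
                              threshold=80, minLineLength=30, maxLineGap=5)
      print(0 if lines is None else len(lines), "line segments found")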

  2. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  3. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.

    PubMed

    Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.

  4. AP-IO: asynchronous pipeline I/O for hiding periodic output cost in CFD simulation.

    PubMed

    Xiaoguang, Ren; Xinhai, Xu

    2014-01-01

    Computational fluid dynamics (CFD) simulation often needs to periodically output intermediate results to files in the form of snapshots for visualization or restart, which seriously impacts the performance. In this paper, we present an asynchronous pipeline I/O (AP-IO) optimization scheme for periodic snapshot output, on the basis of asynchronous I/O and CFD application characteristics. In AP-IO, dedicated background I/O processes or threads are in charge of handling the file writes in pipeline mode, so the write overhead can be hidden behind more of the computation than with classic asynchronous I/O. We design the framework of AP-IO and implement it in OpenFOAM, providing CFD users with a user-friendly interface. Experimental results on the Tianhe-2 supercomputer demonstrate that AP-IO achieves a good optimization effect for periodic snapshot output in CFD applications, and the effect is especially pronounced for massively parallel CFD simulations, where it can reduce the total execution time by up to about 40%.

  5. AP-IO: Asynchronous Pipeline I/O for Hiding Periodic Output Cost in CFD Simulation

    PubMed Central

    Xiaoguang, Ren; Xinhai, Xu

    2014-01-01

    Computational fluid dynamics (CFD) simulation often needs to periodically output intermediate results to files in the form of snapshots for visualization or restart, which seriously impacts the performance. In this paper, we present an asynchronous pipeline I/O (AP-IO) optimization scheme for periodic snapshot output, on the basis of asynchronous I/O and CFD application characteristics. In AP-IO, dedicated background I/O processes or threads are in charge of handling the file writes in pipeline mode, so the write overhead can be hidden behind more of the computation than with classic asynchronous I/O. We design the framework of AP-IO and implement it in OpenFOAM, providing CFD users with a user-friendly interface. Experimental results on the Tianhe-2 supercomputer demonstrate that AP-IO achieves a good optimization effect for periodic snapshot output in CFD applications, and the effect is especially pronounced for massively parallel CFD simulations, where it can reduce the total execution time by up to about 40%. PMID:24955390
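
    A minimal sketch of the idea behind AP-IO (not its OpenFOAM implementation): a dedicated writer thread drains a queue of snapshots so the solver loop is never blocked on disk while it computes the next step. File names and the fake solver loop are placeholders:

      import queue
      import threading

      snapshots = queue.Queue()

      def writer():
          # Background thread: write snapshots as they arrive.
          while True:
              item = snapshots.get()
              if item is None:              # sentinel: shut down
                  break
              step, data = item
              with open(f"snapshot_{step:04d}.dat", "w") as fh:
                  fh.write(data)

      t = threading.Thread(target=writer, daemon=True)
      t.start()

      for step in range(5):                 # stand-in for the solver loop
          field = f"fake field data for step {step}\n"   # compute...
          snapshots.put((step, field))      # ...and hand the write off

      snapshots.put(None)
      t.join()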

  6. NASA GES DISC On-line Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory G.; Berrick, S.; Rui, H.; Liu, Z.; Zhu, T.; Teng, W.; Shen, S.; Qin, J.

    2005-01-01

    The ability to use data stored in the current NASA Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data, the GES-DISC Interactive Online Visualization and Analysis Infrastructure or "Giovanni." Giovanni provides interactive, online analysis tools for data users to facilitate their research. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and ocean color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates an output on screen in a matter of seconds. The currently available output options are: an area plot, averaged or accumulated over any available data period, for any rectangular area; a time plot, i.e., a time series averaged over any rectangular area; Hovmoller plots, i.e., image views of any longitude-time and latitude-time cross sections; ASCII output for all plot types; and image animation for area plots. Another analysis suite deals with parameter intercomparison: scatter plots, temporal correlation maps, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development and processing tasks that are redundantly incurred by an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), and provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. We use this approach to read pre-processed binary files and/or to read and extract the needed parts directly from HDF or HDF-EOS files. These subsets then serve as inputs into GrADS analysis scripts. The system can be used in a wide variety of Earth science applications, such as study and monitoring of climate and weather events, and modeling, and it can be easily configured for new applications.

  7. SARAH 3.2: Dirac gauginos, UFO output, and more

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2013-07-01

    SARAH is a Mathematica package optimized for the fast, efficient and precise study of supersymmetric models beyond the MSSM: a new model can be defined in a short form and all vertices are derived. This allows SARAH to create model files for FeynArts/FormCalc, CalcHep/CompHep and WHIZARD/O'Mega. The newest version of SARAH now provides the possibility to create model files in the UFO format, which is supported by MadGraph 5, MadAnalysis 5, GoSam, and soon by Herwig++. Furthermore, SARAH also calculates the mass matrices, RGEs and 1-loop corrections to the mass spectrum. This information is used to write source code for SPheno in order to create a precision spectrum generator for the given model. This spectrum-generator-generator functionality as well as the output of WHIZARD and CalcHep model files has seen further improvement in this version. Models including Dirac gauginos are also supported with the new version of SARAH, and additional checks for the consistency of the implementation of new models have been created.
    Program summary:
    Program title: SARAH
    Catalogue identifier: AEIB_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIB_v2_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 322 411
    No. of bytes in distributed program, including test data, etc.: 3 629 206
    Distribution format: tar.gz
    Programming language: Mathematica.
    Computer: All for which Mathematica is available.
    Operating system: All for which Mathematica is available.
    Classification: 11.1, 11.6.
    Catalogue identifier of previous version: AEIB_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 808
    Does the new version supersede the previous version?: Yes, the new version includes all known features of the previous version and also provides the new features mentioned below.
    Nature of problem: To use Madgraph for new models it is necessary to provide the corresponding model files, which include all information about the interactions of the model. However, deriving the vertices for a given model and putting them into model files which can be used with Madgraph is usually very time consuming. Dirac gauginos are not present in the minimal supersymmetric standard model (MSSM) or many extensions of it. Dirac mass terms for vector superfields lead to new structures in the supersymmetric (SUSY) Lagrangian (a bilinear mass term between gaugino and matter fermion as well as new D-terms) and also modify the SUSY renormalization group equations (RGEs). The Dirac character of gauginos can change the collider phenomenology. In addition, they come with an extended Higgs sector for which a precise calculation of the 1-loop masses has not happened so far.
    Solution method: SARAH calculates the complete Lagrangian for a given model whose gauge sector can be any direct product of SU(N) gauge groups. The chiral superfields can transform as any irreducible representation with respect to these gauge groups, and it is possible to handle an arbitrary number of symmetry breakings or particle rotations. The gauge fixing is also added automatically. Using this information, SARAH derives all vertices for a model. These vertices can be exported to model files in the UFO format, which is supported by Madgraph and other codes like GoSam, MadAnalysis or ALOHA. The user can also study models with Dirac gauginos. In that case SARAH includes all possible terms in the Lagrangian stemming from the new structures and can also calculate the RGEs. The entire impact of these terms is then taken into account in the output of SARAH to UFO, CalcHep, WHIZARD, FeynArts and SPheno.
    Reasons for new version: SARAH provides, with this version, the possibility of creating model files in the UFO format. The UFO format is supposed to become a standard format for model files which should be supported by many different tools in the future. Also, models with Dirac gauginos were not supported in earlier versions.
    Summary of revisions: Support of models with Dirac gauginos; output of model files in the UFO format; speed improvement in the output of WHIZARD model files; CalcHep output supports the internal diagonalization of mass matrices; output of control files for the LHPC spectrum plotter; support of the generalized PDG numbering scheme PDG.IX; improvement of the calculation of the decay widths and branching ratios with SPheno; the calculation of new low-energy observables has been added to the SPheno output; the handling of gauge-fixing terms has been significantly simplified.
    Restrictions: SARAH can only derive the Lagrangian in an automatized way for N=1 SUSY models, but not for those with more SUSY generators. Furthermore, SARAH supports only renormalizable operators in the output of model files in the UFO format and also for CalcHep, FeynArts and WHIZARD. Also, color sextets are not yet included in the model files for Monte Carlo tools. Dimension-5 operators are only supported in the calculation of the RGEs and mass matrices.
    Unusual features: SARAH does not need the Lagrangian of a model as input to calculate the vertices. The gauge structure, particle content and superpotential, as well as rotations stemming from gauge symmetry breaking, are sufficient. All further information is derived by SARAH on its own. Therefore, the model files are very short and the implementation of new models is fast and easy. In addition, the implementation of a model can be checked for physical and formal consistency. SARAH can also generate Fortran code for a full 1-loop analysis of the mass spectrum in the context of Dirac gauginos.
    Running time: Measured CPU time for the evaluation of the MSSM using a Lenovo Thinkpad X220 with an i7 processor (2.53 GHz): calculating the complete Lagrangian, 9 s; calculating all vertices, 51 s; output of the UFO model files, 49 s.

  8. Program Description: Financial Master File Processor-SWRL Financial System.

    ERIC Educational Resources Information Center

    Ideda, Masumi

    Computer routines designed to produce various management and accounting reports required by the Southwest Regional Laboratory's (SWRL) Financial System are described. Input data requirements and output report formats are presented together with a discussion of the Financial Master File updating capabilities of the system. This document should be…

  9. Chopper-stabilized phase detector

    NASA Technical Reports Server (NTRS)

    Hopkins, P. M.

    1978-01-01

    Phase-detector circuit for binary-tracking loops and other binary-data acquisition systems minimizes effects of drift, gain imbalance, and voltage offset in detector circuitry. The input signal passes simultaneously through two channels, where it is mixed with early and late codes that are alternately switched between channels. Code switching is synchronized with polarity switching of the detector output of each channel so that each channel uses each detector for half the time. The net result is that dc offset errors are canceled, and the effect of gain imbalance is simply a change in sensitivity.

  10. Programmable high-output-impedance, large-voltage compliance, microstimulator for low-voltage biomedical applications.

    PubMed

    Farahmand, Sina; Maghami, Mohammad Hossein; Sodagar, Amir M

    2012-01-01

    This paper reports on the design of a programmable, high output impedance, large voltage compliance microstimulator for low-voltage biomedical applications. A 6-bit binary-weighted digital to analog converter (DAC) is used to generate biphasic stimulus current pulses. A compact current mirror with large output voltage compliance and high output resistance conveys the current pulses to the target tissue. Designed and simulated in a standard 0.18µm CMOS process, the microstimulator circuit is capable of delivering a maximum stimulation current of 160µA to a 10-kΩ resistive load. Operated at a 1.8-V supply voltage, the output stage exhibits a voltage compliance of 1.69V and output resistance of 160MΩ at full scale stimulus current. Layout of the core microelectrode circuit measures 25.5µm×31.5µm.
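
    For orientation, the arithmetic of a 6-bit binary-weighted current DAC with the 160 µA full-scale figure quoted above works out as in this small sketch; the LSB size (full scale divided by 63) is an inference, not a value from the paper.

    ```python
    # Minimal sketch of a 6-bit binary-weighted current DAC using the
    # 160 uA full-scale figure from the abstract; the LSB size is assumed.
    FULL_SCALE_UA = 160.0
    I_LSB = FULL_SCALE_UA / (2**6 - 1)   # ~2.54 uA per code step (assumption)

    def stim_current(code: int) -> float:
        """Output current in uA for a 6-bit input code (0..63)."""
        assert 0 <= code <= 63
        # each bit k contributes 2**k binary-weighted unit currents
        return sum(((code >> k) & 1) * (2**k) * I_LSB for k in range(6))

    print(stim_current(63))  # ~160.0 uA at full scale
    print(stim_current(32))  # MSB alone: ~81.3 uA
    ```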

  11. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format

    PubMed Central

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders

    2018-01-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
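
    The layout Exdir describes is simple enough to sketch by hand. The fragment below builds a minimal Exdir-style hierarchy (directories for groups, YAML for metadata, .npy files for datasets) using only NumPy and PyYAML; the names and values are invented, and the official exdir Python package provides an HDF5-like API rather than this manual approach.

    ```python
    # Hand-rolled sketch of an Exdir-style layout: directories for the
    # hierarchy, YAML for metadata, .npy files for datasets.
    import pathlib
    import numpy as np
    import yaml  # PyYAML, assumed installed

    root = pathlib.Path("experiment.exdir/session1")
    root.mkdir(parents=True, exist_ok=True)

    # human-readable metadata stored next to the data it describes
    (root / "attributes.yaml").write_text(
        yaml.safe_dump({"subject": "mouse-01", "sampling_rate_hz": 30000})
    )

    # dataset stored as a plain binary NumPy file
    np.save(root / "lfp.npy", np.zeros((30000, 16), dtype=np.float32))

    # raw data can sit directly in a subdirectory
    (root / "raw").mkdir(exist_ok=True)
    ```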

  12. Experimental Directory Structure (Exdir): An Alternative to HDF5 Without Introducing a New File Format.

    PubMed

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders

    2018-01-01

    Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data.

  13. Adjustable repetition-rate multiplication of optical pulses using fractional temporal Talbot effect with preceded binary intensity modulation

    NASA Astrophysics Data System (ADS)

    Xie, Qijie; Zheng, Bofang; Shu, Chester

    2017-05-01

We demonstrate a simple approach for adjustable multiplication of optical pulses in a fiber using the temporal Talbot effect. Binary electrical patterns are used to control the multiplication factor in our approach. The input 10 GHz picosecond pulses are pedestal-free and are shaped directly from a CW laser. The pulses are then intensity modulated by different sets of binary patterns prior to entering a fiber of fixed dispersion. Tunable repetition-rate multiplication by factors of 2, 4, and 8 has been achieved, and pulse trains at up to 80 GHz have been experimentally generated. We also evaluate numerically the influence of the extinction ratio of the intensity modulator on the performance of the multiplied pulse train. In addition, the impact of the modulator bias on the uniformity of the output pulses has been analyzed through simulation and experiment, and good agreement is reached. Lastly, we perform numerical simulation of the RF spectral characteristics of the output pulses. The insensitivity of the signal-to-subharmonic noise ratio (SSNR) to the laser linewidth shows that our multiplication scheme is highly tolerant to the incoherence of the input optical pulses.

  14. VizieR Online Data Catalog: Cataclysmic Binaries, LMXBs, and related objects (Ritter+, 2003)

    NASA Astrophysics Data System (ADS)

    Ritter, H.; Kolb, U.

    2004-03-01

    Cataclysmic Binaries are semi-detached binaries consisting of a white dwarf or a white dwarf precursor primary and a low-mass secondary which is filling its critical Roche lobe. The secondary is not necessarily unevolved, it may even be a highly evolved star as for example in the case of the AM CVn-type stars. Low-Mass X-Ray Binaries are semi-detached binaries consisting of either a neutron star or a black hole primary, and a low-mass secondary which is filling its critical Roche lobe. Related Objects are detached binaries consisting of either a white dwarf or a white dwarf precursor primary and of a low-mass secondary. The secondary may also be a highly evolved star. The catalogue lists coordinates, apparent magnitudes, orbital parameters, stellar parameters of the components and other characteristic properties of 522 cataclysmic binaries, 75 low-mass X-ray binaries and 117 related objects with known or suspected orbital periods together with a comprehensive selection of the relevant recent literature. In addition the catalogue contains a list of references to published finding charts for 695 of the 714 objects. A cross-reference list of alias object designations concludes the catalogue. Literature published before 31 December 2003 has, as far as possible, been taken into account. This catalogue supersedes the 5th edition (catalogue ) and the updated lists by Ritter and Kolb (1995; catalogue ) (1998; catalogue ). (10 data files).

  15. VizieR Online Data Catalog: Cataclysmic Binaries, LMXBs, and related objects (Ritter+, 2003)

    NASA Astrophysics Data System (ADS)

    Ritter, H.; Kolb, U.

    2005-03-01

    Cataclysmic Binaries are semi-detached binaries consisting of a white dwarf or a white dwarf precursor primary and a low-mass secondary which is filling its critical Roche lobe. The secondary is not necessarily unevolved, it may even be a highly evolved star as for example in the case of the AM CVn-type stars. Low-Mass X-Ray Binaries are semi-detached binaries consisting of either a neutron star or a black hole primary, and a low-mass secondary which is filling its critical Roche lobe. Related Objects are detached binaries consisting of either a white dwarf or a white dwarf precursor primary and of a low-mass secondary. The secondary may also be a highly evolved star. The catalogue lists coordinates, apparent magnitudes, orbital parameters, stellar parameters of the components and other characteristic properties of 572 cataclysmic binaries, 80 low-mass X-ray binaries and 142 related objects with known or suspected orbital periods together with a comprehensive selection of the relevant recent literature. In addition the catalogue contains a list of references to published finding charts for 761 of the 794 objects. A cross-reference list of alias object designations concludes the catalogue. Literature published before 31 December 2004 has, as far as possible, been taken into account. This catalogue supersedes the 5th edition (catalogue ) and the updated lists by Ritter and Kolb (1995; catalogue ) (1998; catalogue ). (10 data files).

  16. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

Transferable Output ASCII Data (TOAD) computer program (LAR-13755), implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO features, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.

  17. SciDB versus Spark: A Preliminary Comparison Based on an Earth Science Use Case

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K. S.; Doan, K.; Oloso, A.

    2015-12-01

We compare two Big Data technologies, SciDB and Spark, for performance, usability, and extensibility when applied to a representative Earth science use case. SciDB is a new-generation parallel distributed database management system (DBMS) based on the array data model that is capable of handling multidimensional arrays efficiently but requires lengthy data ingest prior to analysis, whereas Spark is a fast and general engine for large-scale data processing that can immediately process raw data files and thereby avoid the ingest process. Once data have been ingested, SciDB is very efficient in database operations such as subsetting. Spark, on the other hand, provides greater flexibility by supporting a wide variety of high-level tools, including DBMS's. For the performance aspect of this preliminary comparison, we configure Spark to operate directly on text or binary data files and thereby limit the need for additional tools. Arguably, a more appropriate comparison would involve exploring other configurations of Spark which exploit supported high-level tools, but that is beyond our current resources. To make the comparison as "fair" as possible, we export the arrays produced by SciDB into text files (or convert them to binary files) for intake by Spark and thereby avoid any additional file-processing penalties. The Earth science use case selected for this comparison is the identification and tracking of snowstorms in the NASA Modern Era Retrospective-analysis for Research and Applications (MERRA) reanalysis data. The identification portion of the use case flags all grid cells of the MERRA high-resolution hourly data that satisfy our criteria for a snowstorm, whereas the tracking portion connects flagged cells adjacent in time and space to form a snowstorm episode. We will report the results of our comparisons in this presentation.
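
    As an illustration of the two-step identification-and-tracking logic just described (a toy sketch, not the MERRA pipeline), flagging can be a threshold test and tracking a connected-component labeling over the time and space axes:

    ```python
    # Toy sketch on a (time, lat, lon) array: flag cells meeting a threshold
    # criterion, then connect flagged cells adjacent in time and space into
    # labeled "episodes". Threshold and data are invented.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    snowfall = rng.random((24, 10, 10))      # hourly grid, toy values

    flags = snowfall > 0.95                  # step 1: identification
    episodes, n = ndimage.label(flags)       # step 2: tracking via connectivity

    print(f"{n} snowstorm episodes found")
    ```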

  18. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems are limited in their ability to integrate with modern software and hardware and to leverage parallel computing, which has left voids in optimization, pre-, and post-processing tools. Advances in technology and in our scientific understanding of environmental processes over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert the code to a modern, widely accepted, open-source, high-performance computing language; and (2) convert the model input and output files to a modern, widely accepted, open-source data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented language with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and has shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications, and improvement of execution times by incorporating an intelligent network change detection tool, is currently underway; preliminary results will be presented.
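
    The conversion pattern described, a per-timestep loop JIT-compiled with Numba and results stored in HDF5, can be sketched as below. The reservoir function and parameter values are placeholders, not HSPF code.

    ```python
    # Sketch of the Numba + HDF5 pattern: a time-stepping water-balance loop,
    # slow in pure Python but fast once JIT-compiled, with output in HDF5.
    import numpy as np
    import h5py
    from numba import njit

    @njit
    def simple_storage(precip, k):
        """Toy linear-reservoir recursion (placeholder, not HSPF)."""
        out = np.empty_like(precip)
        s = 0.0
        for i in range(precip.size):
            s += precip[i]
            out[i] = k * s        # outflow proportional to storage
            s -= out[i]
        return out

    flow = simple_storage(np.random.rand(8760), 0.05)  # one year, hourly

    with h5py.File("results.h5", "w") as f:            # HDF5 output store
        f.create_dataset("outflow", data=flow)
    ```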

  19. Multifunction audio digitizer. [producing direct delta and pulse code modulation

    NASA Technical Reports Server (NTRS)

    Monford, L. G., Jr. (Inventor)

    1974-01-01

    An illustrative embodiment of the invention includes apparatus which simultaneously produces both direct delta modulation and pulse code modulation. An input signal, after amplification, is supplied to a window comparator which supplies a polarity control signal to gate the output of a clock to the appropriate input of a binary up-down counter. The control signals provide direct delta modulation while the up-down counter output provides pulse code modulation.
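
    A software analogue of this scheme (a sketch with invented step sizes, not the patented circuit) makes the dual output concrete: the comparator's up/down decisions form the delta-modulation bit stream, while the running counter value is the pulse-code-modulation word.

    ```python
    # Toy model: a comparator gates clock pulses up or down in a counter.
    # The 1-bit decisions are the direct delta modulation; the counter
    # value is the PCM word. Step size and signal are assumptions.
    import numpy as np

    t = np.linspace(0, 1, 500)
    x = np.sin(2 * np.pi * 5 * t)            # analog input

    counter, delta_bits, pcm = 0, [], []
    step = 2.0 / 64                          # 6-bit span over [-1, 1] (assumed)
    for sample in x:
        up = sample > counter * step         # window comparator, idealized
        counter += 1 if up else -1           # binary up-down counter
        delta_bits.append(1 if up else 0)    # delta-modulation output
        pcm.append(counter)                  # pulse-code-modulation output

    print(delta_bits[:16])
    print(pcm[:16])
    ```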

  20. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks.

    PubMed

    Liu, Da; Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

Traditional forecasting models fit a function approximation from independent variables to dependent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Due to computing capability originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated on a power load forecasting dataset from the Global Energy Forecasting Competition 2012.
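
    The encoding step can be sketched as follows: each decimal value is scaled, rounded, and written as a fixed-width binary row, and the rows stack into a 2D bitmap suitable as CNN input. The bit width and scale factor here are assumptions, not the paper's settings.

    ```python
    # Sketch of decimal-to-binary-image encoding; width and scale assumed.
    import numpy as np

    def to_binary_image(values, bits=16, scale=100):
        rows = []
        for v in values:
            code = int(round(v * scale)) & ((1 << bits) - 1)
            rows.append([(code >> (bits - 1 - k)) & 1 for k in range(bits)])
        return np.array(rows, dtype=np.uint8)

    loads = [231.7, 228.4, 240.1, 252.9]      # e.g. hourly power loads
    img = to_binary_image(loads)
    print(img)                                 # 4 x 16 binary "image"
    ```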

  1. Forecast Modelling via Variations in Binary Image-Encoded Information Exploited by Deep Learning Neural Networks

    PubMed Central

Liu, Da; Xu, Ming; Niu, Dongxiao; Wang, Shoukai; Liang, Sai

    2016-01-01

Traditional forecasting models fit a function approximation from independent variables to dependent variables. However, they usually get into trouble when data are presented in various formats, such as text, voice and image. This study proposes a novel image-encoded forecasting method in which input and output binary digital two-dimensional (2D) images are transformed from decimal data. Omitting any data analysis or cleansing steps for simplicity, all raw variables were selected and converted to binary digital images as the input of a deep learning model, a convolutional neural network (CNN). Using shared weights, pooling and multiple-layer back-propagation techniques, the CNN was adopted to locate the nexus among variations in local binary digital images. Due to computing capability originally developed for binary digital bitmap manipulation, this model has significant potential for forecasting with vast volumes of data. The model was validated on a power load forecasting dataset from the Global Energy Forecasting Competition 2012. PMID:27281032

  2. ALPS - A LINEAR PROGRAM SOLVER

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    Linear programming is a widely-used engineering and management tool. Scheduling, resource allocation, and production planning are all well-known applications of linear programs (LP's). Most LP's are too large to be solved by hand, so over the decades many computer codes for solving LP's have been developed. ALPS, A Linear Program Solver, is a full-featured LP analysis program. ALPS can solve plain linear programs as well as more complicated mixed integer and pure integer programs. ALPS also contains an efficient solution technique for pure binary (0-1 integer) programs. One of the many weaknesses of LP solvers is the lack of interaction with the user. ALPS is a menu-driven program with no special commands or keywords to learn. In addition, ALPS contains a full-screen editor to enter and maintain the LP formulation. These formulations can be written to and read from plain ASCII files for portability. For those less experienced in LP formulation, ALPS contains a problem "parser" which checks the formulation for errors. ALPS creates fully formatted, readable reports that can be sent to a printer or output file. ALPS is written entirely in IBM's APL2/PC product, Version 1.01. The APL2 workspace containing all the ALPS code can be run on any APL2/PC system (AT or 386). On a 32-bit system, this configuration can take advantage of all extended memory. The user can also examine and modify the ALPS code. The APL2 workspace has also been "packed" to be run on any DOS system (without APL2) as a stand-alone "EXE" file, but has limited memory capacity on a 640K system. A numeric coprocessor (80X87) is optional but recommended. The standard distribution medium for ALPS is a 5.25 inch 360K MS-DOS format diskette. IBM, IBM PC and IBM APL2 are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
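
    To make the "pure binary (0-1 integer) program" class concrete, here is a tiny example solved by brute force in Python; this only illustrates the problem type, not ALPS's own solution technique.

    ```python
    # A pure binary (0-1 integer) program, solved exhaustively:
    # maximize 5x1 + 4x2 + 3x3  subject to  2x1 + 3x2 + x3 <= 4, x in {0,1}.
    from itertools import product

    best = None
    for x in product((0, 1), repeat=3):
        if 2 * x[0] + 3 * x[1] + x[2] <= 4:          # feasibility check
            value = 5 * x[0] + 4 * x[1] + 3 * x[2]   # objective
            if best is None or value > best[0]:
                best = (value, x)

    print(best)   # (8, (1, 0, 1))
    ```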

  3. VizieR Online Data Catalog: Hα observations of LSI+61 303 (Zamanov+, 2013)

    NASA Astrophysics Data System (ADS)

    Zamanov, R.; Stoyanov, K.; Marti, J.; Tomov, N. A.; Belcheva, G.; Luque-Escamilla, P. L.; Latev, G.

    2013-09-01

Optical spectroscopic observations of the Hα emission line (137 spectra obtained during the period September 1998 - January 2013) are presented for the radio- and gamma-ray-emitting Be/X-ray binary LSI+61 303. (2 data files).

  4. BioVEC: a program for biomolecule visualization with ellipsoidal coarse-graining.

    PubMed

    Abrahamsson, Erik; Plotkin, Steven S

    2009-09-01

Biomolecule Visualization with Ellipsoidal Coarse-graining (BioVEC) is a tool for visualizing molecular dynamics simulation data while allowing coarse-grained residues to be rendered as ellipsoids. BioVEC reads in configuration files, which may be output from molecular dynamics simulations that include orientation output in either quaternion or ANISOU format, and can render frames of the trajectory in several common image formats for subsequent concatenation into a movie file. The BioVEC program is written in C++, uses the OpenGL API for rendering, and is open source. It is lightweight, allows for user-defined settings for lighting and texture, and runs on either Windows or Linux platforms.

  5. PROP3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic (Flutter and Forced Response) Analysis of Propellers. Version 1.0

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1996-01-01

This guide describes the input data required for steady or unsteady aerodynamic and aeroelastic analysis of propellers using PROP3D, and the output files generated. The aerodynamic forces are obtained by solving the three-dimensional unsteady, compressible Euler equations. A normal-mode structural analysis is used to obtain the aeroelastic equations, which are solved using either a time-domain or a frequency-domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis of single- and counter-rotation propellers, and for aeroelastic analysis of a single-rotation propeller.

  6. GEOTHERM user guide

    USGS Publications Warehouse

    Swanson, James R.

    1977-01-01

GEOTHERM is a computerized geothermal resources file developed by the U.S. Geological Survey. The file contains data on geothermal fields, wells, and chemical analyses from the United States and international sources. The General Information Processing System (GIPSY) on the IBM 370/155 computer is used to store and retrieve data. The GIPSY retrieval program contains simple commands which can be used to search the file, select a narrowly defined subset, sort the records, and output the data in a variety of forms. Eight commands are listed and explained so that the GEOTHERM file can be accessed directly by geologists. No programming experience is necessary to retrieve data from the file.

  7. EnergyPlus™

    DOE Office of Scientific and Technical Information (OSTI.GOV)

EnergyPlus™ was originally developed in 1999; an updated version, 8.8.0, with bug fixes was released on September 30, 2017. EnergyPlus™ is a whole building energy simulation program that engineers, architects, and researchers use to model both energy consumption (for heating, cooling, ventilation, lighting, and plug and process loads) and water use in buildings. EnergyPlus is a console-based program that reads input and writes output to text files. It ships with a number of utilities, including IDF-Editor for creating input files using a simple spreadsheet-like interface, EP-Launch for managing input and output files and performing batch simulations, and EP-Compare for graphically comparing the results of two or more simulations. Several comprehensive graphical interfaces for EnergyPlus are also available. DOE does most of its work with EnergyPlus using the OpenStudio® software development kit and suite of applications. DOE releases major updates to EnergyPlus twice annually.

  8. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions like simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
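
    The generation step MAGE automates amounts to filling a Mif template once per point of a parameter sweep. A minimal hand-rolled sketch of that idea is below; the template text and parameter name are placeholders, not MAGE's actual output.

    ```python
    # Sketch of automated Mif generation for a parameter sweep: fill a
    # template per value, write one file each. Template is a placeholder.
    from pathlib import Path

    TEMPLATE = """# MIF 2.1
    # auto-generated: exchange constant A = {A} J/m (placeholder parameter)
    Parameter A {A}
    """

    for A in (1.0e-11, 1.3e-11, 1.6e-11):        # simultaneous parameter sweep
        Path(f"sim_A_{A:.1e}.mif").write_text(TEMPLATE.format(A=A))
    ```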

  9. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.

  10. DLRS: gene tree evolution in light of a species tree.

    PubMed

    Sjöstrand, Joel; Sennblad, Bengt; Arvestad, Lars; Lagergren, Jens

    2012-11-15

PrIME-DLRS (or colloquially: 'Delirious') is a phylogenetic software tool to simultaneously infer and reconcile a gene tree given a species tree. It accounts for duplication and loss events and a relaxed molecular clock, and is intended for the study of homologous gene families, for example in a comparative genomics setting involving multiple species. PrIME-DLRS uses a Bayesian MCMC framework, where the input is a known species tree with divergence times and a multiple sequence alignment, and the output is a posterior distribution over gene trees and model parameters. PrIME-DLRS is available for Java SE 6+ under the New BSD License, and JAR files and source code can be downloaded from http://code.google.com/p/jprime/. There is also a slightly older C++ version available as a binary package for Ubuntu, with download instructions at http://prime.sbc.su.se. The C++ source code is available upon request: joel.sjostrand@scilifelab.se or jens.lagergren@scilifelab.se. PrIME-DLRS is based on a sound probabilistic model (Åkerborg et al., 2009) and has been thoroughly validated on synthetic and biological datasets (Supplementary Material online).

  11. Hydratools, a MATLAB® based data processing package for Sontek Hydra data

    USGS Publications Warehouse

    Martini, M.; Lightsom, F.L.; Sherwood, C.R.; Xu, Jie; Lacy, J.R.; Ramsey, A.; Horwitz, R.

    2005-01-01

    The U.S. Geological Survey (USGS) has developed a set of MATLAB tools to process and convert data collected by Sontek Hydra instruments to netCDF, which is a format used by the USGS to process and archive oceanographic time-series data. The USGS makes high-resolution current measurements within 1.5 meters of the bottom. These data are used in combination with other instrument data from sediment transport studies to develop sediment transport models. Instrument manufacturers provide software which outputs unique binary data formats. Multiple data formats are cumbersome. The USGS solution is to translate data streams into a common data format: netCDF. The Hydratools toolbox is written to create netCDF format files following EPIC conventions, complete with embedded metadata. Data are accepted from both the ADV and the PCADP. The toolbox will detect and remove bad data, substitute other sources of heading and tilt measurements if necessary, apply ambiguity corrections, calculate statistics, return information about data quality, and organize metadata. Standardized processing and archiving makes these data more easily and routinely accessible locally and over the Internet. In addition, documentation of the techniques used in the toolbox provides a baseline reference for others utilizing the data.
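
    The end product of such a toolbox is a self-describing netCDF time-series file. A minimal sketch with the netCDF4 Python library is shown below; the variable and attribute names are generic stand-ins rather than the exact EPIC conventions.

    ```python
    # Sketch of writing a netCDF time series with embedded metadata;
    # names are stand-ins, not the EPIC conventions themselves.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("adv_burst.nc", "w") as nc:
        nc.instrument = "Sontek ADV"            # global metadata attribute
        nc.createDimension("time", None)
        t = nc.createVariable("time", "f8", ("time",))
        u = nc.createVariable("u", "f4", ("time",))
        u.units = "cm s-1"
        t[:] = np.arange(3600.0)                # 1 Hz burst, one hour
        u[:] = np.random.randn(3600).astype("f4")
    ```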

  12. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local to regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming model using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  13. Recursive Feature Extraction in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-14

ReFeX extracts recursive topological features from graph data. The input is a graph as a csv file and the output is a csv file containing feature values for each node in the graph. The features are based on topological counts in the neighborhood of each node, as well as recursive summaries of neighbors' features.
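
    A minimal sketch of the recursive idea (not the released ReFeX code): start each node with a local feature such as degree, then repeatedly append the mean of its neighbors' current features. It assumes a two-column edges.csv edge list, matching the CSV input described above.

    ```python
    # Sketch of recursive feature extraction over a CSV edge list.
    import csv

    adj = {}                                    # node -> set of neighbors
    with open("edges.csv") as f:                # assumed two-column edge list
        for a, b in csv.reader(f):
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)

    feats = {n: [len(nbrs)] for n, nbrs in adj.items()}   # base feature: degree
    for _ in range(2):                                    # two recursive rounds
        new = {}
        for n, nbrs in adj.items():
            width = len(feats[n])
            means = [sum(feats[m][k] for m in nbrs) / len(nbrs)
                     for k in range(width)]
            new[n] = feats[n] + means                     # append neighbor means
        feats = new

    print(next(iter(feats.items())))
    ```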

  14. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  15. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  16. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  17. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  18. 18 CFR 35.12 - Filing of initial rate schedules and tariffs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... integration of hydroelectric generating resources whose output cannot be predicted quantitatively due to water... explanation of how the proposed rate or charge was derived. For example, is it a standard rate of the filing public utility; is it a special rate arrived at through negotiations and, if so, were unusual customer...

  19. ARSRP Signal Processing Software

    DTIC Science & Technology

    1993-02-01

    Hardcopy of plot? Y or N : Y
    Enter name of postscript output file. : wave2.ps
    Postscript file created.
    Another plot? Y or N : N
    The two plots created and stored in wave1.ps and wave2.ps are shown in Figures 4 and 5 with the corresponding MSS real-time plots from the ARSRP Monitoring Support

  20. wsacrvpthrc.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-12-14

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  1. wsacrppivh.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  2. wsacrzrhiv.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The wsacr PCM process executed by the sacr3 binary reads in wsacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  3. kasacrvpthrc.a1

    DOE Data Explorer

    Gaustad, Krista; Hardin, Joseph

    2015-07-22

    The kasacr PCM process executed by the sacr3 binary reads in kasacr.00 data and produces CF/Radial compliant NetCDF files for each of the radar operational scanning modes. This incorporates raw data from the radar, as well as scientifically important base derived parameters that affect interpretation of the data.

  4. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
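
    The instrumentation pattern is easy to sketch: emit a timestamped key=value event line at each critical point. The fragment below imitates the ULM-style text format in plain Python; it is not the real NetLogger API, and the field names are illustrative.

    ```python
    # Sketch of ULM-style timestamped event logging at critical points;
    # not the NetLogger library itself.
    import sys
    import time

    def log_event(event, **fields):
        ts = time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime())
        kv = " ".join(f"{k}={v}" for k, v in fields.items())
        sys.stdout.write(f"DATE={ts} PROG=demo NL.EVNT={event} {kv}\n")

    log_event("transfer.start", host="nodeA", bytes=1048576)
    time.sleep(0.1)                       # ... the work being measured ...
    log_event("transfer.end", host="nodeA", bytes=1048576)
    ```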

  5. Data files from the Grays Harbor Sediment Transport Experiment Spring 2001

    USGS Publications Warehouse

    Landerman, Laura A.; Sherwood, Christopher R.; Gelfenbaum, Guy; Lacy, Jessica; Ruggiero, Peter; Wilson, Douglas; Chisholm, Tom; Kurrus, Keith

    2005-01-01

This publication consists of two DVD-ROMs, both of which are presented here. This report describes data collected during the Spring 2001 Grays Harbor Sediment Transport Experiment and provides additional information needed to interpret the data. Two DVDs accompany this report; both contain documentation in html format that assists the user in navigating through the data. DVD-ROM-1 contains a digital version of this report in .pdf format, raw Aquatec acoustic backscatter (ABS) data in .zip format, Sonar data files in .avi format, and coastal processes and morphology data in ASCII format. ASCII data files are provided in .zip format; bundled coastal processes ASCII files are separated by deployment and instrument; bundled morphology ASCII files are separated into monthly data collection efforts containing the beach profiles collected (or extracted from the surface map) at that time; weekly surface maps are also bundled together. DVD-ROM-2 contains a digital version of this report in .pdf format, the binary data files collected by the SonTek instrumentation, calibration files for the pressure sensors, and Matlab m-files for loading the ABS data into Matlab and cleaning up the optical backscatter (OBS) burst time-series data.

  6. User's Manual for Aerofcn: a FORTRAN Program to Compute Aerodynamic Parameters

    NASA Technical Reports Server (NTRS)

    Conley, Joseph L.

    1992-01-01

    The computer program AeroFcn is discussed. AeroFcn is a utility program that computes the following aerodynamic parameters: geopotential altitude, Mach number, true velocity, dynamic pressure, calibrated airspeed, equivalent airspeed, impact pressure, total pressure, total temperature, Reynolds number, speed of sound, static density, static pressure, static temperature, coefficient of dynamic viscosity, kinematic viscosity, geometric altitude, and specific energy for a standard- or a modified standard-day atmosphere using compressible flow and normal shock relations. Any two parameters that define a unique flight condition are selected, and their values are entered interactively. The remaining parameters are computed, and the solutions are stored in an output file. Multiple cases can be run, and the multiple case solutions can be stored in another output file for plotting. Parameter units, the output format, and primary constants in the atmospheric and aerodynamic equations can also be changed.
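
    As a sketch of the kind of computation AeroFcn performs (not its actual code), one can pick two defining parameters, here geopotential altitude and true velocity, and derive several of the listed quantities from standard-day atmosphere relations:

    ```python
    # Standard-day atmosphere sketch with the usual ISA constants;
    # inputs are one example flight condition.
    import math

    h = 5000.0                 # geopotential altitude, m
    v = 200.0                  # true velocity, m/s

    T0, p0, L, g, R = 288.15, 101325.0, 0.0065, 9.80665, 287.053
    T = T0 - L * h                                # static temperature, K
    p = p0 * (T / T0) ** (g / (R * L))            # static pressure, Pa
    rho = p / (R * T)                             # static density, kg/m^3
    a = math.sqrt(1.4 * R * T)                    # speed of sound, m/s

    print(f"Mach {v / a:.3f}, q {0.5 * rho * v**2:.0f} Pa")
    ```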

  7. Characterizing output bottlenecks in a supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Bing; Chase, Jeffrey; Dillow, David A

    2012-01-01

Supercomputer I/O loads are often dominated by writes. HPC (High Performance Computing) file systems are designed to absorb these bursty outputs at high bandwidth through massive parallelism. However, the delivered write bandwidth often falls well below the peak. This paper characterizes the data absorption behavior of a center-wide shared Lustre parallel file system on the Jaguar supercomputer. We use a statistical methodology to address the challenges of accurately measuring a shared machine under production load and to obtain the distribution of bandwidth across samples of compute nodes, storage targets, and time intervals. We observe and quantify limitations from competing traffic, contention on storage servers and I/O routers, concurrency limitations in the client compute node operating systems, and the impact of variance (stragglers) on coupled output such as striping. We then examine the implications of our results for application performance and the design of I/O middleware systems on shared supercomputers.

  8. VizieR Online Data Catalog: AR Sco VLA radio observations (Stanway+, 2018)

    NASA Astrophysics Data System (ADS)

    Stanway, E. R.; Marsh, T. R.; Chote, P.; Gaensicke, B. T.; Steeghs, D.; Wheatley, P. J.

    2018-02-01

    Time series VLA radio observations were undertaken of the highly variable white dwarf binary AR Scorpii. These were analysed for periodicity, spectral behaviour and other characteristics. Here we present time series data in the Stokes I parameter at three frequencies. These were centred at 1.5GHz (1GHz bandwidth), 5GHz (2GHz bandwidth) and 9GHz (2GHz bandwidth). The AR Sco binary is unresolved at these frequencies. In the case of the 1.5GHz data, fluxes have been deconvolved with those of a neighbouring object. (3 data files).

  9. VizieR Online Data Catalog: 1992-1997 binary star speckle measurements (Balega+, 1999)

    NASA Astrophysics Data System (ADS)

    Balega, I. I.; Balega, Y. Y.; Maksimov, A. F.; Pluzhnik, E. A.; Shkhagosheva, Z. U.; Vasyuk, V. A.

    2000-11-01

    We present the results of speckle interferometric measurements of binary stars made with the television photon-counting camera at the 6-m Big Azimuthal Telescope (BTA) and 1-m telescope of the Special Astrophysical Observatory (SAO) between August 1992 and May 1997. The data contain 89 observations of 62 star systems on the large telescope and 21 on the smaller one. For the 6-m aperture 18 systems remained unresolved. The measured angular separation ranged from 39 mas, two times above the BTA diffraction limit, to 1593 mas. (3 data files).

  10. Files synchronization from a large number of insertions and deletions

    NASA Astrophysics Data System (ADS)

    Ellappan, Vijayan; Kumari, Savera

    2017-11-01

Synchronization between different versions of files is becoming a major issue that most applications are facing. To make applications more efficient, an economical algorithm is developed from the previously used "File Loading Algorithm". This algorithm is extended in three ways: first, it deals with non-binary files; second, a backup is generated for uploaded files; and third, each file is synchronized across insertions and deletions. The user can reconstruct a file from the former file while minimizing error, and interactive communication is provided by eliminating the frequency without any disturbance. The drawback of the previous system is overcome by using synchronization, in which multiple copies of each file/record are created, stored in a backup database, and efficiently restored in case of any unwanted deletion or loss of data. That is, we introduce a protocol that user B may use to reconstruct file X from file Y with suitably low probability of error. Synchronization algorithms find numerous areas of use, including data storage, file sharing, source code control systems, and cloud applications. For example, cloud storage services such as Dropbox synchronize between local copies and cloud backups each time users make changes to local versions. Similarly, synchronization tools are necessary in mobile devices. Specialized synchronization algorithms are used for video and sound editing. Synchronization tools are also capable of performing data deduplication.
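
    One simple way for user B to reconstruct file X from an older copy Y, sketched below, is to compare fixed-size block hashes and transfer only the blocks that differ; this is rsync-like in spirit and far cruder than the insertion/deletion-aware protocols discussed above.

    ```python
    # Block-hash synchronization sketch: B sends hashes of Y, A replies
    # with literal data only for the blocks that changed.
    import hashlib

    BLOCK = 4096

    def block_hashes(data: bytes):
        return [hashlib.sha1(data[i:i + BLOCK]).hexdigest()
                for i in range(0, len(data), BLOCK)]

    def delta(x: bytes, y_hashes):
        """What A sends: (index, literal block) for every changed block."""
        out = []
        for i in range(0, len(x), BLOCK):
            blk, j = x[i:i + BLOCK], i // BLOCK
            if j >= len(y_hashes) or hashlib.sha1(blk).hexdigest() != y_hashes[j]:
                out.append((j, blk))
        return out

    def patch(y: bytes, changes, new_len):
        blocks = [y[i:i + BLOCK] for i in range(0, len(y), BLOCK)]
        for j, blk in changes:
            while len(blocks) <= j:       # grow if X is longer than Y
                blocks.append(b"")
            blocks[j] = blk
        return b"".join(blocks)[:new_len]

    y = b"a" * 10000
    x = y[:5000] + b"XYZ" + y[5003:]
    assert patch(y, delta(x, block_hashes(y)), len(x)) == x
    ```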

  11. VizieR Online Data Catalog: Chromospherically Active Binaries (Strassmeier+ 1993)

    NASA Astrophysics Data System (ADS)

    Strassmeier, K. G.; Hall, D. S.; Fekel, F. C.

    1996-08-01

Stars always appear in order of increasing right ascension for the epoch 2000.0. For the current version of the catalog, the literature was searched through December 31, 1991, although a few later references are included. Additionally, some entries are cited as "private communication", which makes this catalog a first-hand source as well. A number in parentheses behind an entry always corresponds to a reference given in the bibliography. See the 1988 publication for specific requirements and restrictions in compiling these catalogs, and the source reference for more details about this catalog. The following binary systems, which were listed in the first edition of the catalog, were not included in the present edition due to insufficient evidence for chromospheric activity: eta And, 26 Aql, 4 UMi, nu2 Sgr, tau Sgr. The following stars are chromospherically active but are components of a "wide" binary and were not included: HD 25893, HD 79211. Forty-three new binary systems have been included in the present edition. (12 data files).

  12. Signal processing method of the diameter measurement system based on CCD parallel light projection method

    NASA Astrophysics Data System (ADS)

    Song, Qing; Zhu, Sijia; Yan, Han; Wu, Wenqian

    2008-03-01

The parallel light projection method for diameter measurement projects the workpiece to be measured onto the photosensitive units of a CCD, but the original signal output from the CCD cannot be directly used for counting or measurement. The weak signal with high-frequency noise must first be filtered and amplified. This paper introduces an RC low-pass filter and a multiple-feedback second-order low-pass filter with infinite gain. Additionally, there is always dispersion in the light band, and the output signal has a transition between the irradiated area and the shadow, because of the instability of the light source intensity and imperfections in the adjustment of the optical system. To obtain exactly the shadow size related to the workpiece diameter, binary-value processing is necessary to obtain a square wave. The comparison method and the differential method can be adopted for binary-value processing. There are two ways to set the threshold value when using a voltage comparator: the fixed-level method and the floated-level method; the latter has higher accuracy. The differential method first outputs two spike pulses of opposite polarity from the rising and falling edges of the video signal through a differential circuit; the rising edge of the differentiated signal is then acquired by a half-wave rectifying circuit. After passing through a zero-crossing comparator and an edge trigger, the square wave that indicates the measured size is finally acquired. The square wave is then filled with standard clock pulses, which are counted by the counter. Data acquisition and information processing are accomplished by the computer and the control software. This paper will introduce in detail the design and analysis of the filter circuit, the binary-value processing circuit, and the interface circuit to the computer.
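
    The floated-level binarization step can be sketched in a few lines: derive the threshold from the scan itself, square up the filtered signal, and count shadow pixels to get the projected diameter. All signal values below are synthetic.

    ```python
    # Floated-level thresholding of a synthetic CCD line scan.
    import numpy as np

    rng = np.random.default_rng(2)
    scan = np.ones(2048)                       # illuminated CCD line
    scan[800:1250] = 0.08                      # shadow cast by the workpiece
    scan += 0.02 * rng.standard_normal(2048)   # noise before filtering

    # moving-average stand-in for the low-pass filter stage
    kernel = np.ones(9) / 9
    smooth = np.convolve(scan, kernel, mode="same")

    threshold = 0.5 * (smooth.max() + smooth.min())   # floated level
    square = smooth < threshold                        # binary-value signal
    print(int(square.sum()), "shadow pixels")          # ~450 -> diameter via pixel pitch
    ```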

  13. MS/MS Automated Selected Ion Chromatograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monroe, Matthew

    2005-12-12

This program can be used to read an LC-MS/MS data file from either a Finnigan ion trap mass spectrometer (.Raw file) or an Agilent ion trap mass spectrometer (.MGF and .CDF files) and create a selected ion chromatogram (SIC) for each of the parent ion masses chosen for fragmentation. The largest peak in each SIC is also identified, with reported statistics including peak elution time, height, area, and signal-to-noise ratio. It creates several output files, including a base peak intensity (BPI) chromatogram for the survey scan, a BPI for the fragmentation scans, an XML file containing the SIC data for each parent ion, and a "flat file" (ready for import into a database) containing summaries of the SIC data statistics.
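
    The core SIC computation can be sketched on synthetic scans: for one parent m/z, sum the intensity within a tolerance window in every scan, then report the largest peak. File parsing (.Raw, .MGF/.CDF) and the full peak statistics are omitted, and all numbers are invented.

    ```python
    # Selected ion chromatogram sketch over synthetic (mz, intensity) scans.
    import numpy as np

    rng = np.random.default_rng(3)
    n_scans, parent_mz, tol = 300, 524.3, 0.5

    # each scan: (mz array, intensity array), synthetic stand-ins
    scans = [(rng.uniform(400, 600, 50), rng.uniform(0, 10, 50))
             for _ in range(n_scans)]
    for i in range(140, 160):                  # implant an eluting peak
        mz, inten = scans[i]
        mz[0], inten[0] = parent_mz, 500 * np.exp(-((i - 150) / 4) ** 2)

    sic = np.array([inten[np.abs(mz - parent_mz) <= tol].sum()
                    for mz, inten in scans])
    apex = int(sic.argmax())
    print(f"apex scan {apex}, height {sic[apex]:.0f}, area {sic.sum():.0f}")
    ```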

  14. High resolution time interval counter

    DOEpatents

    Condreva, Kenneth J.

    1994-01-01

    A high resolution counter circuit measures the time interval between the occurrence of an initial and a subsequent electrical pulse to two nanoseconds resolution using an eight megahertz clock. The circuit includes a main counter for receiving electrical pulses and generating a binary word--a measure of the number of eight megahertz clock pulses occurring between the signals. A pair of first and second pulse stretchers receive the signal and generate a pair of output signals whose widths are approximately sixty-four times the time between the receipt of the signals by the respective pulse stretchers and the receipt by the respective pulse stretchers of a second subsequent clock pulse. Output signals are thereafter supplied to a pair of start and stop counters operable to generate a pair of binary output words representative of the measure of the width of the pulses to a resolution of two nanoseconds. Errors associated with the pulse stretchers are corrected by providing calibration data to both stretcher circuits, and recording start and stop counter values. Stretched initial and subsequent signals are combined with autocalibration data and supplied to an arithmetic logic unit to determine the time interval in nanoseconds between the pair of electrical pulses being measured.
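
    The arithmetic behind the quoted resolution is worth spelling out: an 8 MHz clock gives 125 ns ticks, and stretching the start/stop residues by roughly 64 lets the same clock resolve them to about 2 ns. The sketch below (not the patent's logic unit) combines the three counter values accordingly.

    ```python
    # Worked arithmetic for the stretched-residue scheme; counter values
    # in the example call are invented.
    CLOCK_HZ = 8e6
    TICK_NS = 1e9 / CLOCK_HZ          # 125 ns per main-counter tick
    STRETCH = 64                      # pulse-stretcher factor

    def interval_ns(main_count, start_count, stop_count):
        # start/stop counters measure the stretched residues in 125 ns
        # ticks, so dividing by the stretch factor recovers ~2 ns steps
        return (main_count * TICK_NS
                + start_count * TICK_NS / STRETCH
                - stop_count * TICK_NS / STRETCH)

    print(TICK_NS / STRETCH)              # ~1.95 ns resolution
    print(interval_ns(10, 20, 7))         # example: ~1275.4 ns
    ```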

  15. High resolution time interval counter

    DOEpatents

    Condreva, K.J.

    1994-07-26

    A high resolution counter circuit measures the time interval between the occurrence of an initial and a subsequent electrical pulse to two nanoseconds resolution using an eight megahertz clock. The circuit includes a main counter for receiving electrical pulses and generating a binary word--a measure of the number of eight megahertz clock pulses occurring between the signals. A pair of first and second pulse stretchers receive the signal and generate a pair of output signals whose widths are approximately sixty-four times the time between the receipt of the signals by the respective pulse stretchers and the receipt by the respective pulse stretchers of a second subsequent clock pulse. Output signals are thereafter supplied to a pair of start and stop counters operable to generate a pair of binary output words representative of the measure of the width of the pulses to a resolution of two nanoseconds. Errors associated with the pulse stretchers are corrected by providing calibration data to both stretcher circuits, and recording start and stop counter values. Stretched initial and subsequent signals are combined with autocalibration data and supplied to an arithmetic logic unit to determine the time interval in nanoseconds between the pair of electrical pulses being measured. 3 figs.

  16. E-beam generated holographic masks for optical vector-matrix multiplication

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Case, S. K.

    1981-01-01

An optical vector-matrix multiplication scheme that encodes the matrix elements as a holographic mask consisting of linear diffraction gratings is proposed. The binary, chrome-on-glass masks are fabricated by e-beam lithography. This approach results in a fairly simple optical system that promises both large numerical range and high accuracy. A partitioned computer-generated hologram mask was fabricated and tested. This hologram had diagonally separated outputs, compact facets, and symmetry about the axis. The resultant diffraction pattern at the output plane is shown. Since the grating fringes are written at 45 deg relative to the facet boundaries, the many on-axis sidelobes from each output are diagonally separated from the adjacent output signals.

  17. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.

  18. Tool for Merging Proposals Into DSN Schedules

    NASA Technical Reports Server (NTRS)

    Khanampornpan, Teerapat; Kwok, John; Call, Jared

    2008-01-01

    A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
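
    The heart of such a pre-submission check is per-antenna interval overlap detection. The Python sketch below is a toy illustration under invented data structures; the real 7da and C1/C2 formats are DSN-internal and are not reproduced here.

        from datetime import datetime

        def overlaps(a_start, a_end, b_start, b_end):
            # Two half-open intervals conflict if each starts before the
            # other ends.
            return a_start < b_end and b_start < a_end

        def find_conflicts(schedule, proposal):
            """schedule, proposal: lists of (antenna, start, end) tuples."""
            return [(s, p) for s in schedule for p in proposal
                    if s[0] == p[0] and overlaps(s[1], s[2], p[1], p[2])]

        sched = [("DSS-14", datetime(2008, 1, 1, 10), datetime(2008, 1, 1, 12))]
        prop = [("DSS-14", datetime(2008, 1, 1, 11), datetime(2008, 1, 1, 13))]
        print(find_conflicts(sched, prop))   # one overlapping track on DSS-14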

  19. Online data handling and storage at the CMS experiment

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources produced with an aggregate rate of ∼2GB/s. An estimated bandwidth of 7GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
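
    The file-based bookkeeping pattern described here is easy to illustrate: each data file travels with a small JSON document describing it. The Python sketch below is a loose analogy only; the field names are invented and are not the actual CMS schema.

        import json
        import os

        def write_with_metadata(path, payload, run, lumisection):
            # Write the data file, then a sidecar JSON document that a
            # merger/transfer service could pick up for bookkeeping.
            with open(path, "wb") as f:
                f.write(payload)
            meta = {"file": os.path.basename(path),
                    "run": run,
                    "lumisection": lumisection,
                    "size_bytes": len(payload)}
            with open(path + ".jsn", "w") as f:
                json.dump(meta, f)

        write_with_metadata("stream_run1_ls5.dat", b"\x00" * 1024,
                            run=1, lumisection=5)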

  20. Online Data Handling and Storage at the CMS Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J. M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources produced with an aggregate rate of ~2GB/s. An estimated bandwidth of 7GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.

  1. VizieR Online Data Catalog: Cataclysmic Binaries, LMXBs, and related objects (Ritter+, 2003)

    NASA Astrophysics Data System (ADS)

    Ritter, H.; Kolb, U.

    2003-08-01

    Cataclysmic Binaries are semi-detached binaries consisting of a white dwarf or a white dwarf precursor primary and a low-mass secondary which is filling its critical Roche lobe. The secondary is not necessarily unevolved, it may even be a highly evolved star as for example in the case of the AM CVn-type stars. Low-Mass X-Ray Binaries are semi-detached binaries consisting of either a neutron star or a black hole primary, and a low-mass secondary which is filling its critical Roche lobe. Related Objects are detached binaries consisting of either a white dwarf or a white dwarf precursor primary and of a low-mass secondary. The secondary may also be a highly evolved star. The catalogue lists coordinates, apparent magnitudes, orbital parameters, stellar parameters of the components and other characteristic properties of 501 cataclysmic binaries, 74 low-mass X-ray binaries and 114 related objects with known or suspected orbital periods together with a comprehensive selection of the relevant recent literature. In addition the catalogue contains a list of references to published finding charts for 651 of the 689 objects. A cross-reference list of alias object designations concludes the catalogue. Literature published before 30 June 2003 has, as far as possible, been taken into account. This catalogue supersedes the 5th edition (catalogue ) and the updated lists by Ritter and Kolb (1995; catalogue ) (1998; catalogue ). (10 data files).

  2. Coastal Area Tactical-mapping System (CATS)

    DTIC Science & Technology

    2007-09-30

    file (I.STD) file. A direct comparison to the timetag of the scanner index wedge times should then yield the shot number that corresponds to the...output of 4 relevant timing files. They are as follows: A.STD: The time at which the A-Scan Wedge index mark was detected. This is recorded as...a coarse time (seconds) and a fine time (microseconds). B.STD: The time at which the B-Scan Wedge index mark was detected. This is recorded as a

  3. JPLEX: Java Simplex Implementation with Branch-and-Bound Search for Automated Test Assembly

    ERIC Educational Resources Information Center

    Park, Ryoungsun; Kim, Jiseon; Dodd, Barbara G.; Chung, Hyewon

    2011-01-01

    JPLEX, short for Java simPLEX, is an automated test assembly (ATA) program. It is a mixed integer linear programming (MILP) solver written in Java. It reads in a configuration file, solves the minimization problem, and produces an output file for postprocessing. It implements the simplex algorithm to create a fully relaxed solution and…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.

    The Super Strypi SWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle that includes a subset of the Super Strypi NGC software (guidance, ACS and sequencer). Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters, guidance parameters and Monte-Carlo parameters are defined in input files. Output parameters are saved to a Matlab mat file.

  5. Program VSMOKE--Users Manual

    Treesearch

    Leonidas G. Lavdas

    1996-01-01

    This is a users manual for VSMOKE, a computer program for predicting the smoke and dry weather visibility impact of a single prescribed fire at several downwind locations. VSMOKE is a FORTRAN 77 program that depends on the input in file VSMOKE.IPT to generate output files compatible with those used by the U.S. Environmental Protection Agency. VSMOKE is uniquely...

  6. Assessing Homegrown Library Collections: Using Google Analytics to Track Use of Screencasts and Flash-Based Learning Objects

    ERIC Educational Resources Information Center

    Betty, Paul

    2009-01-01

    Increasing use of screencast and Flash authoring software within libraries is resulting in "homegrown" library collections of digital learning objects and multimedia presentations. The author explores the use of Google Analytics to track usage statistics for interactive Shockwave Flash (.swf) files, the common file output for screencast and Flash…

  7. 75 FR 33748 - Amateur Radio Use of the Allocation at 5 MHz

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-15

    ... envelope power (PEP). 3. The existing amateur radio use of the 60 meter band represents a balancing of... Comment Filing System (ECFS), (2) the Federal Government's eRulemaking Portal, or (3) by filing paper... transmitter output power in modern amateur radio transceivers is 100 W PEP, and that the present 50 W PEP...

  8. Dynamic detection-rate-based bit allocation with genuine interval concealment for binary biometric representation.

    PubMed

    Lim, Meng-Hui; Teoh, Andrew Beng Jin; Toh, Kar-Ann

    2013-06-01

    Biometric discretization is a key component in biometric cryptographic key generation. It converts an extracted biometric feature vector into a binary string via typical steps such as segmentation of each feature element into a number of labeled intervals, mapping of each interval-captured feature element onto a binary space, and concatenation of the resulting binary output of all feature elements into a binary string. Currently, the detection rate optimized bit allocation (DROBA) scheme is one of the most effective biometric discretization schemes in terms of its capability to assign binary bits dynamically to user-specific features with respect to their discriminability. However, we learn that DROBA suffers from potential discriminative feature misdetection and underdiscretization in its bit allocation process. This paper highlights such drawbacks and improves upon DROBA based on a novel two-stage algorithm: 1) a dynamic search method to efficiently recapture such misdetected features and to optimize the bit allocation of underdiscretized features and 2) a genuine interval concealment technique to alleviate crucial information leakage resulting from the dynamic search. Improvements in classification accuracy on two popular face data sets vindicate the feasibility of our approach compared with DROBA.
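
    To make detection-rate-driven allocation concrete, here is a toy greedy allocator in Python. The detection_rate() argument is a stand-in for the per-user, per-feature statistics a real DROBA-style scheme estimates; this is an illustration of the general idea, not the authors' algorithm.

        def allocate_bits(detection_rate, n_features, bit_budget, max_bits=4):
            """detection_rate(i, b): prob. feature i is re-detected with b bits."""
            bits = [0] * n_features
            for _ in range(bit_budget):
                # Grant one more bit to the feature that keeps the highest
                # detection probability when discretized more finely.
                best = max((i for i in range(n_features) if bits[i] < max_bits),
                           key=lambda i: detection_rate(i, bits[i] + 1))
                bits[best] += 1
            return bits

        # Example: features with geometrically decaying reliability.
        rate = lambda i, b: (0.9 - 0.1 * i) ** b
        print(allocate_bits(rate, n_features=4, bit_budget=6))   # [4, 1, 1, 0]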

  9. Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.

  10. User's Manual for DuctE3D: A Program for 3D Euler Unsteady Aerodynamic and Aeroelastic Analysis of Ducted Fans

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Reddy, T. S. R.

    1997-01-01

    The program DuctE3D is used for steady or unsteady aerodynamic and aeroelastic analysis of ducted fans. This guide describes the input data required and the output files generated, in using DuctE3D. The analysis solves three dimensional unsteady, compressible Euler equations to obtain the aerodynamic forces. A normal mode structural analysis is used to obtain the aeroelastic equations, which are solved using either the time domain or the frequency domain solution method. Sample input and output files are included in this guide for steady aerodynamic analysis and aeroelastic analysis of an isolated fan row.

  11. On the decoding process in ternary error-correcting output codes.

    PubMed

    Escalera, Sergio; Pujol, Oriol; Radeva, Petia

    2010-01-01

    A common way to model multiclass classification problems is to design a set of binary classifiers and to combine them. Error-Correcting Output Codes (ECOC) represent a successful framework to deal with this type of problem. Recent works in the ECOC framework showed significant performance improvements by means of new problem-dependent designs based on the ternary ECOC framework. The ternary framework contains a larger set of binary problems because of the use of a "do not care" symbol that allows us to ignore some classes by a given classifier. However, there are no proper studies that analyze the effect of the new symbol at the decoding step. In this paper, we present a taxonomy that embeds all binary and ternary ECOC decoding strategies into four groups. We show that the zero symbol introduces two kinds of biases that require redefinition of the decoding design. A new type of decoding measure is proposed, and two novel decoding strategies are defined. We evaluate the state-of-the-art coding and decoding strategies over a set of UCI Machine Learning Repository data sets and into a real traffic sign categorization problem. The experimental results show that, following the new decoding strategies, the performance of the ECOC design is significantly improved.
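
    A minimal sketch of ternary ECOC decoding, assuming a {-1, 0, +1} coding matrix and a Hamming-style distance that skips the zero ("do not care") positions; this illustrates one classical strategy the paper analyzes, not the new decoding measures it proposes.

        import numpy as np

        def decode(code_matrix, predictions):
            """code_matrix: (n_classes, n_classifiers) in {-1, 0, +1};
            predictions: (n_classifiers,) binary outputs in {-1, +1}."""
            scores = []
            for row in code_matrix:
                active = row != 0
                # Distance counted over the active (non-zero) positions only.
                scores.append(np.sum(row[active] != predictions[active]))
            return int(np.argmin(scores))

        M = np.array([[+1, +1,  0],
                      [-1,  0, +1],
                      [ 0, -1, -1]])
        print(decode(M, np.array([+1, -1, -1])))   # class 2 matches best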

  12. The Eclipsing Binary On-Line Atlas (EBOLA)

    NASA Astrophysics Data System (ADS)

    Bradstreet, D. H.; Steelman, D. P.; Sanders, S. J.; Hargis, J. R.

    2004-05-01

    In conjunction with the upcoming release of Binary Maker 3.0, an extensive on-line database of eclipsing binaries is being made available. The purposes of the atlas are to: (1) allow quick and easy access to information on published eclipsing binaries; (2) amass a consistent database of light and radial velocity curve solutions to aid in solving new systems; (3) provide invaluable querying capabilities on all of the parameters of the systems so that informative research can be quickly accomplished on a multitude of published results; (4) aid observers in establishing new observing programs based upon stars needing new light and/or radial velocity curves; (5) encourage workers to submit their published results so that others may have easy access to their work; and (6) provide a vast but easily accessible storehouse of information on eclipsing binaries to accelerate the process of understanding analysis techniques and current work in the field. The database will eventually consist of all published eclipsing binaries with light curve solutions. The following information and data will be supplied whenever available for each binary: original light curves in all bandpasses, original radial velocity observations, light curve parameters, RA and Dec, V-magnitudes, spectral types, color indices, periods, binary type, 3D representation of the system near quadrature, plots of the original light curves and synthetic models, plots of the radial velocity observations with theoretical models, and Binary Maker 3.0 data files (parameter, light curve, radial velocity). The pertinent references for each star are also given with hyperlinks directly to the papers via the NASA Abstract website for downloading, if available. In addition the Atlas has extensive searching options so that workers can specifically search for binaries with specific characteristics. The website has more than 150 systems already uploaded. The URL for the site is http://ebola.eastern.edu/.

  13. MICE data handling on the Grid

    NASA Astrophysics Data System (ADS)

    Martyniak, J.; Mice Collaboration

    2014-06-01

    The international Muon Ionisation Cooling Experiment (MICE) is designed to demonstrate the principle of muon ionisation cooling for the first time, for application to a future Neutrino factory or Muon Collider. The experiment is currently under construction at the ISIS synchrotron at the Rutherford Appleton Laboratory (RAL), UK. In this paper we present a system - the Raw Data Mover, which allows us to store and distribute MICE raw data - and a framework for offline reconstruction and data management. The aim of the Raw Data Mover is to upload raw data files onto a safe tape storage as soon as the data have been written out by the DAQ system and marked as ready to be uploaded. Internal integrity of the files is verified and they are uploaded to the RAL Tier-1 Castor Storage Element (SE) and placed on two tapes for redundancy. We also make another copy at a separate disk-based SE at this stage to make it easier for users to access data quickly. Both copies are check-summed and the replicas are registered with an instance of the LCG File Catalog (LFC). On success a record with basic file properties is added to the MICE Metadata DB. The reconstruction process is triggered by new raw data records filled in by the mover system described above. Off-line reconstruction jobs for new raw files are submitted to RAL Tier-1 and the output is stored on tape. Batch reprocessing is done at multiple MICE enabled Grid sites and output files are shipped to central tape or disk storage at RAL using a custom File Transfer Controller.
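
    The mover's essential loop (checksum, replicate, verify, record) can be sketched in a few lines of Python; storage-element uploads and LFC registration are reduced here to local copies and a dictionary record, purely for illustration.

        import hashlib
        import os
        import shutil

        def file_md5(path):
            h = hashlib.md5()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            return h.hexdigest()

        def move_raw_file(path, tape_dir, disk_dir, catalog):
            checksum = file_md5(path)
            for dest in (tape_dir, disk_dir):          # two replicas
                shutil.copy(path, dest)
                replica = os.path.join(dest, os.path.basename(path))
                assert file_md5(replica) == checksum   # verify each replica
            # Stand-in for the metadata DB record that triggers reconstruction.
            catalog[os.path.basename(path)] = {"checksum": checksum,
                                               "size": os.path.getsize(path)}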

  14. Configuration Management File Manager Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    1997-01-01

    One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.

  15. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  16. DOE-2 sample run book: Version 2.1E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkelmann, F.C.; Birdsall, B.E.; Buhl, W.F.

    1993-11-01

    The DOE-2 Sample Run Book shows inputs and outputs for a variety of building and system types. The samples start with a simple structure and continue to a high-rise office building, a medical building, three small office buildings, a bar/lounge, a single-family residence, a small office building with daylighting, a single family residence with an attached sunspace, a "parameterized" building using input macros, and a metric input/output example. All of the samples use Chicago TRY weather. The main purpose of the Sample Run Book is instructional. It shows the relationship of LOADS-SYSTEMS-PLANT-ECONOMICS inputs, displays various input styles, and illustrates many of the basic and advanced features of the program. Many of the sample runs are preceded by a sketch of the building showing its general appearance and the zoning used in the input. In some cases we also show a 3-D rendering of the building as produced by the program DrawBDL. Descriptive material has been added as comments in the input itself. We find that a number of users have loaded these samples onto their editing systems and use them as "templates" for creating new inputs. Another way of using them would be to store various portions as files that can be read into the input using the ##include command, which is part of the Input Macro feature introduced in version DOE-2.1D. Note that the energy rate structures here are the same as in the DOE-2.1D samples, but have been rewritten using the new DOE-2.1E commands and keywords for ECONOMICS. The samples contained in this report are the same as those found on the DOE-2 release files. However, the output numbers that appear here may differ slightly from those obtained from the release files. The output on the release files can be used as a check set to compare results on your computer.

  17. BOREAS TE-17 Production Efficiency Model Images

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G.; Papagno, Andrea (Editor); Goetz, Scott J.; Goward, Samual N.; Prince, Stephen D.; Czajkowski, Kevin; Dubayah, Ralph O.

    2000-01-01

    A Boreal Ecosystem-Atmospheric Study (BOREAS) version of the Global Production Efficiency Model (http://www.inform.umd.edu/glopem/) was developed by TE-17 (Terrestrial Ecology) to generate maps of gross and net primary production, autotrophic respiration, and light use efficiency for the BOREAS region. This document provides basic information on the model and how the maps were generated. The data generated by the model are stored in binary image-format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).

  18. Angle detector

    NASA Technical Reports Server (NTRS)

    Parra, G. T. (Inventor)

    1978-01-01

    An angle detector for determining a transducer's angular disposition to a capacitive pickup element is described. The transducer comprises a pendulum mounted inductive element moving past the capacitive pickup element. The capacitive pickup element divides the inductive element into two parts L sub 1 and L sub 2 which form the arms of one side of an a-c bridge. Two networks R sub 1 and R sub 2 having a plurality of binary weighted resistors and an equal number of digitally controlled switches for removing resistors from the networks form the arms of the other side of the a-c bridge. A binary counter, controlled by a phase detector, balances the bridge by adjusting the resistance of R sub 1 and R sub 2. The binary output of the counter is representative of the angle.

  19. User Guide for HUFPrint, A Tabulation and Visualization Utility for the Hydrogeologic-Unit Flow (HUF) Package of MODFLOW

    USGS Publications Warehouse

    Banta, Edward R.; Provost, Alden M.

    2008-01-01

    This report documents HUFPrint, a computer program that extracts and displays information about model structure and hydraulic properties from the input data for a model built using the Hydrogeologic-Unit Flow (HUF) Package of the U.S. Geological Survey's MODFLOW program for modeling ground-water flow. HUFPrint reads the HUF Package and other MODFLOW input files, processes the data by hydrogeologic unit and by model layer, and generates text and graphics files useful for visualizing the data or for further processing. For hydrogeologic units, HUFPrint outputs such hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, vertical hydraulic conductivity or anisotropy, specific storage, specific yield, and hydraulic-conductivity depth-dependence coefficient. For model layers, HUFPrint outputs such effective hydraulic properties as horizontal hydraulic conductivity along rows, horizontal hydraulic conductivity along columns, horizontal anisotropy, specific storage, primary direction of anisotropy, and vertical conductance. Text files tabulating hydraulic properties by hydrogeologic unit, by model layer, or in a specified vertical section may be generated. Graphics showing two-dimensional cross sections and one-dimensional vertical sections at specified locations also may be generated. HUFPrint reads input files designed for MODFLOW-2000 or MODFLOW-2005.

  20. User's guide to PHREEQC (Version 2) : a computer program for speciation, batch-reaction, one-dimensional transport, and inverse geochemical calculations

    USGS Publications Warehouse

    Parkhurst, David L.; Appelo, C.A.J.

    1999-01-01

    PHREEQC version 2 is a computer program written in the C programming language that is designed to perform a wide variety of low-temperature aqueous geochemical calculations. PHREEQC is based on an ion-association aqueous model and has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations involving reversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and irreversible reactions, which include specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters, within specified compositional uncertainty limits. New features in PHREEQC version 2 relative to version 1 include capabilities to simulate dispersion (or diffusion) and stagnant zones in 1D-transport calculations, to model kinetic reactions with user-defined rate expressions, to model the formation or dissolution of ideal, multicomponent or nonideal, binary solid solutions, to model fixed-volume gas phases in addition to fixed-pressure gas phases, to allow the number of surface or exchange sites to vary with the dissolution or precipitation of minerals or kinetic reactants, to include isotope mole balances in inverse modeling calculations, to automatically use multiple sets of convergence parameters, to print user-defined quantities to the primary output file and (or) to a file suitable for importation into a spreadsheet, and to define solution compositions in a format more compatible with spreadsheet programs. This report presents the equations that are the basis for chemical equilibrium, kinetic, transport, and inverse-modeling calculations in PHREEQC; describes the input for the program; and presents examples that demonstrate most of the program's capabilities.

  1. VizieR Online Data Catalog: Radial velocities of 8 KOI eclipsing binaries (Lillo-Box+, 2015)

    NASA Astrophysics Data System (ADS)

    Lillo-Box, J.; Barrado, D.; Mancini, L.; Henning, T.; Figueira, P.; Ciceri, S.; Santos, N.

    2015-06-01

    We have obtained multi-epoch RV data for the 13 KOIs. We used the CAFE instrument mounted on the 2.2m telescope at Calar Alto Observatory (Almeria, Spain) to obtain high-resolution spectra (R=59000-67000) in the optical range (4000-9000Å). (2 data files).

  2. Internet Wargaming with Distributed Processing Using the Client-Server Model

    DTIC Science & Technology

    1997-03-01

    in for war game development. There are tool kits for writing binary files that are interpreted by a particular plug-in. The most popular plug-in set...multi-player game development, the speed with which the environment is changing should be taken into account. For this project JavaScript was chosen

  3. Users guide: Steady-state aerodynamic-loads program for shuttle TPS tiles

    NASA Technical Reports Server (NTRS)

    Kerr, P. A.; Petley, D. H.

    1984-01-01

    A user's guide for the computer program that calculates the steady-state aerodynamic loads on the Shuttle thermal-protection tiles is presented. The main element in the program is MITAS-II, the Martin Marietta Interactive Thermal Analysis System. MITAS-II is used to calculate the mass flow in a nine-tile model designed to simulate conditions during a Shuttle flight. The procedures used to execute the program using the MITAS-II software are described. A list of the necessary software and data files along with a brief description of their functions is given. The format of the data file containing the surface pressure data is specified. The interpolation techniques used to calculate the pressure profile over the tile matrix are briefly described. In addition, the output from a sample run is explained. The actual output and the procedure file used to execute the program at NASA Langley Research Center on a CDC CYBER-175 are provided in the appendices.

  4. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    PubMed

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.

  5. Distributive On-line Processing, Visualization and Analysis System for Gridded Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Berrick, S.; Liu, Z.; Pham, L.; Rui, H.; Shen, S.; Teng, W.; Zhu, T.

    2004-01-01

    The ability to use data stored in the current Earth Observing System (EOS) archives for studying regional or global phenomena is highly dependent on having a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. This is an especially difficult challenge when science objectives require users to deal with large multi-sensor data sets that are usually of different formats, structures, and resolutions, for example, when preparing data for input into modeling systems. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has taken a major step towards meeting this challenge by developing an infrastructure with a Web interface that allows users to perform interactive analysis online without downloading any data: the GES-DISC Interactive Online Visualization and Analysis Infrastructure, or "Giovanni." Giovanni provides interactive, online analysis tools to facilitate research by data users. Several instances of this interface have been created to serve TRMM users, aerosol scientists, and Ocean Color and agriculture applications users. The first generation of these tools supports gridded data only. The user selects geophysical parameters, an area of interest, and a time period, and the system generates an output on screen in a matter of seconds. The currently available output options are: area plots averaged or accumulated over any available data period for any rectangular area; time plots of time series averaged over any rectangular area; image views of any longitude-time and latitude-time cross sections; ASCII output for all plot types; and image animation for area plots. In the future, we will add correlation plots, GIS-compatible outputs, etc. This allows users to focus on data content (i.e., science parameters) and eliminates the need for expensive learning, development, and processing tasks that are redundantly incurred by an archive's user community. The current implementation utilizes the GrADS-DODS Server (GDS), a stable, secure data server that provides subsetting and analysis services across the Internet for any GrADS-readable dataset. The subsetting capability allows users to retrieve a specified temporal and/or spatial subdomain from a large dataset, eliminating the need to download everything simply to access a small relevant portion of a dataset. The analysis capability allows users to retrieve the results of an operation applied to one or more datasets on the server. In our case, we use this approach to read pre-processed binary files and/or to read and extract the needed parts from HDF or HDF-EOS files. These subsets then serve as inputs into GrADS processing and analysis scripts. The system can be used in a wide variety of Earth science applications, such as climate and weather event study and monitoring, and modeling, and it can be easily configured for new applications.

  6. Threshold Based Stochastic Resonance for the Binary-Input Ternary-Output Discrete Memoryless Channels

    DTIC Science & Technology

    2012-05-01

    noise (AGN) [1] and [11]. We focus on threshold communication systems due to the underwater environment, noncoherent communication techniques are...the threshold level. In the context of the underwater communications, where noncoherent communication techniques are affected both by noise and

  7. Beowawe Bottoming Binary Unit - Final Technical Report for EE0002856

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, Dale Edward

    2013-02-12

    This binary plant is the first high-output refrigeration based waste heat recovery cycle in the industry. Its working fluid is environmentally friendly and as such, the permits that would be required with a butane based cycle are not necessary. The unit is modularized, meaning that the unit’s individual skids were assembled in another location and were shipped via truck to the plant site. This project proves the technical feasibility of using low temperature brine. The development of the unit led to the realization of low temperature, high output, and environmentally friendly heat recovery systems through domestic research and engineering. The project generates additional renewable energy for Nevada, resulting in cleaner air and reduced carbon dioxide emissions. Royalty and tax payments to governmental agencies will increase, resulting in reduced financial pressure on local entities. The major components of the unit were sourced from American companies, resulting in increased economic activity throughout the country.

  8. Logic gates based all-optical binary half adder using triple core photonic crystal fiber

    NASA Astrophysics Data System (ADS)

    Uthayakumar, T.; Vasantha Jayakantha Raja, R.

    2018-06-01

    This study presents the implementation of an all-optical binary logic half adder by employing a triple core photonic crystal fiber (TPCF). The noteworthy feature of the present investigation is that an identical set of TPCF schemes, which demonstrated all-optical logic functions in our previous report, has revealed the ability to demonstrate the successful half adder operation. The control signal (CS) power defining the extinction ratios of the output ports for the considered symmetric planar and triangular TPCFs is evaluated through a numerical algorithm. Through suitable CS power and input combinations, the logic outputs are generated from extinction ratios to demonstrate the half adder operation. The results obtained display the significant influence of the input conditions on the delivery of half adder operation for different TPCF schemes considered. Furthermore, chloroform filled TPCF structures demonstrated the efficient low power half adder operation with a significant figure of merit, compared to that of the silica counterpart.
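
    Stripped of the optics, the target logic is the standard half adder, SUM = A XOR B and CARRY = A AND B:

        def half_adder(a, b):
            # The truth table the optical switches realize.
            return a ^ b, a & b   # (sum, carry)

        for a in (0, 1):
            for b in (0, 1):
                s, c = half_adder(a, b)
                print(f"A={a} B={b} -> SUM={s} CARRY={c}")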

  9. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. First, the SCMF is indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
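
    The hash-keyed retrieval pattern is simple to sketch. The Python toy below stands in for the MySQL database and the Restful service; the hash scheme and key length are invented for illustration.

        import hashlib

        store = {}   # hash -> (rml_text, scmf_bytes)

        def register(rml_text, scmf_bytes):
            key = hashlib.sha1(scmf_bytes).hexdigest()[:8]
            store[key] = (rml_text, scmf_bytes)
            return key           # the hash that would appear in command EVRs

        def lookup(evr_hash):
            return store[evr_hash]

        h = register("<sequence/>", b"\x01\x02\x03")
        rml, scmf = lookup(h)    # recover both inputs from the EVR hash
        print(h, rml, len(scmf))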

  10. VizieR Online Data Catalog: Kepler Mission. VII. Eclipsing binaries in DR3 (Kirk+, 2016)

    NASA Astrophysics Data System (ADS)

    Kirk, B.; Conroy, K.; Prsa, A.; Abdul-Masih, M.; Kochoska, A.; Matijevic, G.; Hambleton, K.; Barclay, T.; Bloemen, S.; Boyajian, T.; Doyle, L. R.; Fulton, B. J.; Hoekstra, A. J.; Jek, K.; Kane, S. R.; Kostov, V.; Latham, D.; Mazeh, T.; Orosz, J. A.; Pepper, J.; Quarles, B.; Ragozzine, D.; Shporer, A.; Southworth, J.; Stassun, K.; Thompson, S. E.; Welsh, W. F.; Agol, E.; Derekas, A.; Devor, J.; Fischer, D.; Green, G.; Gropp, J.; Jacobs, T.; Johnston, C.; Lacourse, D. M.; Saetre, K.; Schwengeler, H.; Toczyski, J.; Werner, G.; Garrett, M.; Gore, J.; Martinez, A. O.; Spitzer, I.; Stevick, J.; Thomadis, P. C.; Vrijmoet, E. H.; Yenawine, M.; Batalha, N.; Borucki, W.

    2016-07-01

    The Kepler Eclipsing Binary Catalog lists the stellar parameters from the Kepler Input Catalog (KIC) augmented by: primary and secondary eclipse depth, eclipse width, separation of eclipse, ephemeris, morphological classification parameter, and principal parameters determined by geometric analysis of the phased light curve. The previous release of the Catalog (Paper II; Slawson et al. 2011, cat. J/AJ/142/160) contained 2165 objects, through the second Kepler data release (Q0-Q2). In this release, 2878 objects are identified and analyzed from the entire data set of the primary Kepler mission (Q0-Q17). The online version of the Catalog is currently maintained at http://keplerEBs.villanova.edu/. A static version of the online Catalog associated with this paper is maintained at MAST https://archive.stsci.edu/kepler/eclipsing_binaries.html. (10 data files).

  11. VizieR Online Data Catalog: Excess CaII H&K emission in active binaries (Montes+, 1996)

    NASA Astrophysics Data System (ADS)

    Montes, D.; Fernandez-Figueroa, M. J.; Cornide, M.; de Castro, E.

    1996-05-01

    In this work we analyze the behaviour of the excess CaII H & K and H_epsilon emissions in a sample of 73 chromospherically active binary systems (RS CVn and BY Dra classes), of different activity levels and luminosity classes. This sample includes the 53 stars analyzed by Fernandez-Figueroa et al. (1994) and the observations of 28 systems described by Montes et al. (1995). By using the spectral subtraction technique (subtraction of a synthesized stellar spectrum constructed from reference stars of spectral type and luminosity class similar to those of the binary star components) we obtain the active-chromosphere contribution to the CaII H & K lines in these 73 systems. We have determined the excess CaII H & K emission equivalent widths and converted them into surface fluxes. The emissions arising from each component were obtained when it was possible to deblend both contributions. (4 data files).

  12. TRACTS: a program to map oligopurine.oligopyrimidine and other binary DNA tracts

    PubMed Central

    Gal, Moshe; Katz, Tzvi; Ovadia, Amir; Yagil, Gad

    2003-01-01

    A program to map the locations and frequencies of DNA tracts composed of only two bases (‘Binary DNA’) is described. The program, TRACTS (URL http://bioportal.weizmann.ac.il/tracts/tracts.html and/or http://bip.weizmann.ac.il/miwbin/servers/tracts) is of interest because long tracts composed of only two bases are highly over-represented in most genomes. In eukaryotes, oligopurine.oligopyrimidine tracts (‘R.Y tracts’) are found in the highest excess. In prokaryotes, W tracts predominate (A,T ‘rich’). A pre-program, ANEX, parses database annotation files of GenBank and EMBL, to produce a convenient one-line list of every gene (exon, intron) in a genome. The main unit lists and analyzes tracts of the three possible binary pairs (R.Y, K.M and S.W). As an example, the results of R.Y tract mapping of the mammalian gene p53 are described. PMID:12824393
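
    Finding such tracts is a simple scan. Below is a minimal Python sketch for R.Y tracts (maximal runs drawn entirely from the purines A/G or the pyrimidines C/T); the K.M and S.W pairs would be handled the same way with different alphabets. This is an illustration, not the TRACTS code.

        import re

        def ry_tracts(seq, min_len=10):
            """Return (start, end, kind) for maximal purine/pyrimidine runs."""
            out = []
            for kind, pat in (("R", r"[AG]+"), ("Y", r"[CT]+")):
                for m in re.finditer(pat, seq.upper()):
                    if m.end() - m.start() >= min_len:
                        out.append((m.start(), m.end(), kind))
            return sorted(out)

        # A 10-base pyrimidine tract followed by an 11-base purine tract.
        print(ry_tracts("CCCCCCCCCCAAGAGGGAGAAT", min_len=10))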

  13. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
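
    A rough Python analogue of the collection step (the original is a PERL script) might walk the tree to a given depth and emit one spreadsheet-ready row per file; the column layout and names below are illustrative.

        import csv
        import os
        import time

        def collect(root, out_path, max_depth):
            base_depth = root.rstrip(os.sep).count(os.sep)
            with open(out_path, "w", newline="") as out:
                w = csv.writer(out)
                w.writerow(["path", "owner_uid", "size", "created", "accessed"])
                for dirpath, dirnames, filenames in os.walk(root):
                    if dirpath.count(os.sep) - base_depth >= max_depth:
                        dirnames[:] = []   # stop descending past max_depth
                        continue
                    for name in filenames:
                        p = os.path.join(dirpath, name)
                        st = os.lstat(p)   # do not follow symlinks
                        w.writerow([p, st.st_uid, st.st_size,
                                    time.ctime(st.st_ctime),
                                    time.ctime(st.st_atime)])

        collect(".", "tree_report.csv", max_depth=2)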

  14. Issues in ATM Support of High-Performance, Geographically Distributed Computing

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.; Dowd, Patrick W.; Srinidhi, Saragur M.; Blade, Eric D.G

    1995-01-01

    This report experimentally assesses the effect of the underlying network in a cluster-based computing environment. The assessment is quantified by application-level benchmarking, process-level communication, and network file input/output. Two testbeds were considered, one small cluster of Sun workstations and another large cluster composed of 32 high-end IBM RS/6000 platforms. The clusters had Ethernet, fiber distributed data interface (FDDI), Fibre Channel, and asynchronous transfer mode (ATM) network interface cards installed, providing the same processors and operating system for the entire suite of experiments. The primary goal of this report is to assess the suitability of an ATM-based, local-area network to support interprocess communication and remote file input/output systems for distributed computing.

  15. Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN

    NASA Technical Reports Server (NTRS)

    Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.

    1996-01-01

    A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for prediciting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, description of the code, user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.

  16. Exploiting Efficient Transpacking for One-Sided Communication and MPI-IO

    NASA Astrophysics Data System (ADS)

    Mir, Faisal Ghias; Träff, Jesper Larsson

    Based on a construction of so-called input-output datatypes that define a mapping between non-consecutive input and output buffers, we outline an efficient method for copying of structured data. We term this operation transpacking, and show how transpacking can be applied for the MPI implementation of one-sided communication and MPI-IO. For one-sided communication via shared memory, we demonstrate the expected performance improvements by up to a factor of two. For individual MPI-IO, the time to read from or write to the file dominates the overall time, but even here efficient transpacking can in some scenarios reduce file I/O time considerably. The reported results have been achieved on a single NEC SX-8 vector node.
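
    The essence of transpacking is a single copy pass between two non-contiguous layouts, with no intermediate contiguous pack/unpack buffer. A toy NumPy sketch, with explicit index maps standing in for MPI derived datatypes:

        import numpy as np

        def transpack(src, src_idx, dst, dst_idx):
            """Copy src[src_idx[k]] -> dst[dst_idx[k]] for all k, in one
            pass, without first packing into a contiguous buffer."""
            dst[dst_idx] = src[src_idx]

        src = np.arange(12.0)
        dst = np.zeros(12)
        # Gather every third element and scatter it to the front of dst.
        transpack(src, np.arange(0, 12, 3), dst, np.arange(4))
        print(dst[:4])   # [0. 3. 6. 9.]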

  17. The Comparison of Point Data Models for the Output of WRF Hydro Model in the IDV

    NASA Astrophysics Data System (ADS)

    Ho, Y.; Weber, J.

    2017-12-01

    WRF Hydro netCDF output files contain streamflow, flow depth, longitude, latitude, altitude and stream order values for each forecast point. However, the data are not CF compliant. The total number of forecast points for the US CONUS is approximately 2.7 million, which is a big challenge for any visualization and analysis tool. The IDV point cloud display shows point data as a set of points colored by parameter. This display is very efficient compared to a standard point type display for rendering a large number of points. The one problem we have is that data I/O can become a bottleneck when dealing with a large collection of point input files. In this presentation, we will experiment with different point data models and their APIs to access the same WRF Hydro model output. The results will help us construct a CF compliant netCDF point data format for the community.
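
    For reference, a CF-style point-feature file can be written with the netCDF4 Python package roughly as below; the variable names and tiny example data are invented, and a production WRF Hydro layout would carry many more attributes.

        import numpy as np
        from netCDF4 import Dataset

        nc = Dataset("points.nc", "w")
        nc.Conventions = "CF-1.6"
        nc.featureType = "point"            # CF discrete-sampling geometry
        nc.createDimension("obs", 4)
        for name, units in (("lat", "degrees_north"),
                            ("lon", "degrees_east"),
                            ("alt", "m")):
            v = nc.createVariable(name, "f4", ("obs",))
            v.units = units
        q = nc.createVariable("streamflow", "f4", ("obs",))
        q.units = "m3 s-1"
        q.coordinates = "lat lon alt"       # ties data to its coordinates
        nc["lat"][:] = np.array([40.0, 40.1, 40.2, 40.3])
        nc["lon"][:] = np.array([-105.0, -105.1, -105.2, -105.3])
        nc["alt"][:] = np.array([1600.0, 1610.0, 1620.0, 1630.0])
        q[:] = np.array([1.2, 3.4, 0.8, 2.1])
        nc.close()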

  18. All-optical 4-bit binary to binary coded decimal converter with the help of semiconductor optical amplifier-assisted Sagnac switch

    NASA Astrophysics Data System (ADS)

    Bhattachryya, Arunava; Kumar Gayen, Dilip; Chattopadhyay, Tanay

    2013-04-01

    An all-optical 4-bit binary to binary coded decimal (BCD) converter is proposed and described in this manuscript, implemented with semiconductor optical amplifier (SOA)-assisted Sagnac interferometric switches. The paper describes an all-optical conversion scheme using a set of all-optical switches. BCD is common in computer systems that display numeric values, especially in those consisting solely of digital logic with no microprocessor. In many personal computers, the basic input/output system (BIOS) keeps the date and time in BCD format. The operations of the circuit are studied theoretically and analyzed through numerical simulations. The model accounts for the SOA small signal gain, line-width enhancement factor and carrier lifetime, the switching pulse energy and width, and the Sagnac loop asymmetry. By undertaking a detailed numerical simulation the influence of these key parameters on the metrics that determine the quality of switching is thoroughly investigated.

  19. Effects of binary stellar populations on direct collapse black hole formation

    NASA Astrophysics Data System (ADS)

    Agarwal, Bhaskar; Cullen, Fergus; Khochfar, Sadegh; Klessen, Ralf S.; Glover, Simon C. O.; Johnson, Jarrett

    2017-06-01

    The critical Lyman-Werner (LW) flux required for direct collapse black hole (DCBH) formation, or Jcrit, depends on the shape of the irradiating spectral energy distribution (SED). The SEDs employed thus far have been representative of realistic single stellar populations. We study the effect of binary stellar populations on the formation of DCBH, as a result of their contribution to the LW radiation field. Although binary populations with ages > 10 Myr yield a larger LW photon output, we find that the corresponding values of Jcrit can be up to 100 times higher than single stellar populations. We attribute this to the shape of the binary SEDs as they produce a sub-critical rate of H- photodetaching 0.76 eV photons as compared to single stellar populations, reaffirming the role that H- plays in DCBH formation. This further corroborates the idea that DCBH formation is better understood in terms of a critical region in the H2-H- photodestruction rate parameter space, rather than a single value of LW flux.

  20. Navy Tethered Balloon Measurements Made During the ’Fire’ Marine Stratocu IFO (Intensive Field Operation)

    DTIC Science & Technology

    1989-04-03

    instrument described in the previous section. RH2 = V4 - VB2 (8) where V4 is the output voltage stored in the data file OUTXXYY.SHR and VB2 is the...archive comment. The V option can be followed by a C for a verbose listing with file comments. PKXARC FAST! Archive Extract Utility Version 3.3 10-23-86...Copyright (c) 1986 PKWARE, Inc. All Rights Reserved. PKXARC/h for help Extracts files from an archive to their original name, size, time, & date

  1. Interactive Visual Simulation of Communication Systems. Volume 2

    DTIC Science & Technology

    1988-04-29

    typedef struct { int non; } PARAM, *PARAMPTR; typedef struct { float x[8], y[8]; FILE *fp; int time; } STATE, *STATEPTR; psk8_dem (pparam, size, pstate, pstart) ...i*THETA); pstate->y[i]=sin(i*THETA); pstate->fp=fopen("graf.dat","w"); pstate->time=0; if (length_output_fifo(0)==maxlength_output_fifo(0)...) fprintf(pstate->fp,"\

  2. Prewarping techniques in imaging: applications in nanotechnology and biotechnology

    NASA Astrophysics Data System (ADS)

    Poonawala, Amyn; Milanfar, Peyman

    2005-03-01

    In all imaging systems, the underlying process introduces undesirable distortions that cause the output signal to be a warped version of the input. When the input to such systems can be controlled, pre-warping techniques can be employed which consist of systematically modifying the input such that it cancels out (or compensates for) the process losses. In this paper, we focus on the mask (reticle) design problem for 'optical micro-lithography', a process similar to photographic printing used for transferring binary circuit patterns onto silicon wafers. We use a pixel-based mask representation and model the above process as a cascade of convolution (aerial image formation) and thresholding (high-contrast recording) operations. The pre-distorted mask is obtained by minimizing the norm of the difference between the 'desired' output image and the 'reproduced' output image. We employ the regularization framework to ensure that the resulting masks are close-to-binary as well as simple and easy to fabricate. Finally, we provide insight into two additional applications of pre-warping techniques. First is 'e-beam lithography', used for fabricating nano-scale structures, and second is 'electronic visual prosthesis' which aims at providing limited vision to the blind by using a prosthetic retinally implanted chip capable of electrically stimulating the retinal neuron cells.
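
    A one-dimensional toy of the pixel-based formulation helps fix ideas: blur the mask with a point-spread function, pass the aerial image through a smooth threshold (a sigmoid standing in for the hard recording threshold), and descend the gradient of the squared error. All sizes and constants below are invented, and the regularization terms the paper adds are omitted.

        import numpy as np

        psf = np.array([0.25, 0.5, 0.25])                  # blur kernel
        desired = np.array([0, 0, 1, 1, 1, 0, 0], float)   # target pattern

        def forward(mask):
            aerial = np.convolve(mask, psf, mode="same")   # imaging blur
            return 1 / (1 + np.exp(-20 * (aerial - 0.5)))  # soft threshold

        mask = desired.copy()
        for _ in range(200):
            out = forward(mask)
            err = out - desired
            grad_soft = 20 * out * (1 - out) * err         # sigmoid chain rule
            grad_mask = np.convolve(grad_soft, psf[::-1], mode="same")
            mask = np.clip(mask - 0.5 * grad_mask, 0.0, 1.0)   # near-binary

        print(np.round(forward(mask)))   # should reproduce `desired`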

  3. A merged pipe organ binary-analog correlator

    NASA Astrophysics Data System (ADS)

    Miller, R. S.; Berry, M. B.

    1982-02-01

    The design of a 96-stage, programmable binary-analog correlator is described. An array of charge coupled device (CCD) delay lines of differing lengths perform the delay and sum functions. Merging of several CCD channels is employed to reduce the active area. This device architecture allows simplified output detection while maintaining good device performance at higher speeds (5-10 MHz). Experimental results indicate a 50 dB broadband dynamic range and excellent agreement with the theoretical processing gain (19.8 dB) when operated at a 6 MHz sampling frequency as a p-n sequence matched filter.
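
    In software terms the device computes a sliding correlation of an analog input against a programmable ±1 reference, with the peak marking code alignment, as in this small NumPy sketch (sequence and noise level are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        ref = np.array([1, -1, 1, 1, -1, -1, 1], float)   # PN-like +/-1 code
        signal = np.concatenate([np.zeros(5),
                                 ref + 0.1 * rng.normal(size=7),
                                 np.zeros(5)])
        out = np.correlate(signal, ref, mode="valid")
        print(int(np.argmax(out)))   # 5: the code begins 5 samples in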

  4. BINARY STORAGE ELEMENT

    DOEpatents

    Chu, J.C.

    1958-06-10

    A binary storage device is described comprising a toggle provided with associated improved driver circuits adapted to produce reliable action of the toggle during clearing of the toggle to one of its two states, or transferring information into and out of the toggle. The invention resides in the development of a self-regulating driver circuit to minimize the fluctuation of the driving voltages for the toggle. The disclosed driver circuit produces two pulses in response to an input pulse: a first or "clear" pulse beginning at substantially the same time but ending slightly sooner than the second or "transfer" output pulse.

  5. Negative base encoding in optical linear algebra processors

    NASA Technical Reports Server (NTRS)

    Perlee, C.; Casasent, D.

    1986-01-01

    In the digital multiplication by analog convolution algorithm, the bits of two encoded numbers are convolved to form the product of the two numbers in mixed binary representation; this output can be easily converted to binary. Attention is presently given to negative base encoding, treating base -2 initially, and then showing that the negative base system can be readily extended to any radix. In general, negative base encoding in optical linear algebra processors represents a more efficient technique than either sign magnitude or 2's complement encoding, when the additions of digitally encoded products are performed in parallel.
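
    Base -2 conversion itself is a short loop: repeatedly divide by -2, forcing a non-negative remainder at each step. A sketch of the standard textbook algorithm, independent of the optical processor:

        def to_base_neg2(n):
            if n == 0:
                return "0"
            digits = []
            while n != 0:
                n, r = divmod(n, -2)
                if r < 0:              # force a non-negative remainder
                    n, r = n + 1, r + 2
                digits.append(str(r))
            return "".join(reversed(digits))

        print(to_base_neg2(6))    # "11010": 16 - 8 - 2 = 6
        print(to_base_neg2(-3))   # "1101": -8 + 4 + 1 = -3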

  6. VizieR Online Data Catalog: Orbits based on SOAR speckle interferometry. II. (Tokovinin, 2017)

    NASA Astrophysics Data System (ADS)

    Tokovinin, A.

    2018-01-01

    We present new or updated orbits of 44 binary systems or subsystems. It is based on speckle interferometric measurements made at the 4.1m Southern Astrophysical Research (SOAR) telescope (Tokovinin et al. 2010, Cat. J/AJ/139/743; 2014, Cat. J/AJ/147/123; 2015, Cat. J/AJ/150/50; 2016, Cat. J/AJ/151/153; 2010PASP..122.1483T; Tokovinin 2012, Cat. J/AJ/144/56) combined with archival data collected in the Washington Double Star Catalog (WDS; Mason et al. 2001-2014, Cat. B/wds). It continues previous work on binary orbits resulting from the SOAR speckle program and follows the template of Paper I (Tokovinin 2016, Cat. J/AJ/152/138), where the motivation is discussed. Briefly, the calculation of binary orbits is part of the astronomical infrastructure, and visual orbital elements are used in many areas. The state of the art is reflected in the Sixth Catalog of Visual Binary Orbits (VB6; Hartkopf et al. 2001AJ....122.3472H; http://www.usno.navy.mil/USNO/astrometry/optical-IR-prod/wds/orb6.html). (5 data files).

  7. Super Strypi HWIL 6DOF (Hardware-In-Loop six-degree-of-freedom) Rev. 2175

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilkey, Jeff C.; Harl, Nathan R.; Kowalchuk, Scott A.

    2016-02-23

The Super Strypi HWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle. The simulation is used to test the NGC flight software, including the navigation software. Aerodynamic and propulsive forces, mass properties, and ACS (attitude control system) parameters are defined in input files. Output parameters are saved to a MATLAB .mat file.

  8. Combining multiple decisions: applications to bioinformatics

    NASA Astrophysics Data System (ADS)

    Yukinawa, N.; Takenouchi, T.; Oba, S.; Ishii, S.

    2008-01-01

    Multi-class classification is one of the fundamental tasks in bioinformatics and typically arises in cancer diagnosis studies by gene expression profiling. This article reviews two recent approaches to multi-class classification by combining multiple binary classifiers, which are formulated based on a unified framework of error-correcting output coding (ECOC). The first approach is to construct a multi-class classifier in which each binary classifier to be aggregated has a weight value to be optimally tuned based on the observed data. In the second approach, misclassification of each binary classifier is formulated as a bit inversion error with a probabilistic model by making an analogy to the context of information transmission theory. Experimental studies using various real-world datasets including cancer classification problems reveal that both of the new methods are superior or comparable to other multi-class classification methods.
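
    The ECOC decoding step that both approaches build on can be sketched in a few lines; the 4-class code matrix below is hypothetical, and real designs choose codewords with larger pairwise Hamming separation:

      import numpy as np

      # Rows are classes, columns are binary classification tasks (hypothetical matrix).
      CODE = np.array([[0, 0, 1, 1, 0],
                       [0, 1, 0, 1, 1],
                       [1, 0, 0, 0, 1],
                       [1, 1, 1, 0, 0]])

      def ecoc_decode(bits):
          # Pick the class whose codeword is nearest in Hamming distance to the bit
          # vector produced by the binary classifiers; a small number of bit
          # inversion errors is thereby corrected, as in channel decoding.
          return int(np.argmin(np.sum(CODE != np.asarray(bits), axis=1)))

      print(ecoc_decode([0, 1, 0, 1, 0]))   # 1: one bit inversion away from class 1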

  9. Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data

    NASA Technical Reports Server (NTRS)

    Schairer, Edward T.

    2001-01-01

    'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.

  10. QX MAN: Q and X file manipulation

    NASA Technical Reports Server (NTRS)

    Krein, Mark A.

    1992-01-01

    QX MAN is a grid and solution file manipulation program written primarily for the PARC code and the GRIDGEN family of grid generation codes. QX MAN combines many of the features frequently encountered in grid generation, grid refinement, the setting-up of initial conditions, and post processing. QX MAN allows the user to manipulate single block and multi-block grids (and their accompanying solution files) by splitting, concatenating, rotating, translating, re-scaling, and stripping or adding points. In addition, QX MAN can be used to generate an initial solution file for the PARC code. The code was written to provide several formats for input and output in order for it to be useful in a broad spectrum of applications.

  11. Morphometric analysis of root canal cleaning after rotary instrumentation with or without laser irradiation

    NASA Astrophysics Data System (ADS)

    Marchesan, Melissa A.; Geurisoli, Danilo M. Z.; Brugnera, Aldo, Jr.; Barbin, Eduardo L.; Pecora, Jesus D.

    2002-06-01

The present study examined root canal cleaning, using the optical microscope, after rotary instrumentation with ProFile .04 with or without laser application at different output energies. Cleaning and shaping can be accomplished manually, with ultrasonic and subsonic devices, or with rotary instruments; recently, laser radiation has shown promising results for disinfection and smear layer removal. In this study, 30 palatal maxillary molar roots were examined using an optical microscope after rotary instrumentation with ProFile .04 with or without Er:YAG laser application (KaVo KeyLaser II, Germany) at different output energies (2940 nm, 15 Hz, 300 pulses, 500 ms pulse duration, 42 J; either 140 mJ shown on the display (input) and 61 mJ at the fiberoptic tip (output), or 140 mJ shown on the display (input) and 51 mJ at the fiberoptic tip (output)). Statistical analysis showed no differences between the tested treatments (ANOVA, p>0.05). ANOVA also showed a statistically significant difference (p<0.01) between the root canal thirds, indicating that the middle third had less debris than the apical third. We conclude that: 1) none of the tested treatments led to totally clean root canals; 2) all treatments removed debris similarly; 3) the middle third had less debris than the apical third; and 4) variation in output energy did not increase cleaning.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitarka, Arben

GEN_SRF_4 is a computer program for generating kinematic earthquake rupture models for use in ground motion modeling and simulations of earthquakes. The output is an ASCII SRF-formatted file containing kinematic rupture parameters.

  13. Neural Networks for Handwritten English Alphabet Recognition

    NASA Astrophysics Data System (ADS)

    Perwej, Yusuf; Chaturvedi, Ashish

    2011-04-01

    This paper demonstrates the use of neural networks for developing a system that can recognize hand-written English alphabets. In this system, each English alphabet is represented by binary values that are used as input to a simple feature extraction system, whose output is fed to our neural network system.

  14. Combination Base64 Algorithm and EOF Technique for Steganography

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Nurdiyanto, Heri; Hidayat, Rahmat; Saleh Ahmar, Ansari; Siregar, Dodi; Putera Utama Siahaan, Andysah; Faisal, Ilham; Rahman, Sayuti; Suita, Diana; Zamsuri, Ahmad; Abdullah, Dahlan; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Sriadhi, S.

    2018-04-01

The steganography process combines mathematics and computer science. Steganography consists of a set of methods and techniques for embedding data into another medium so that the contents are unreadable to anyone who does not have the authority to read them. The main objective of using the Base64 method is to convert any file in order to achieve privacy. This paper discusses a steganography and encoding method using Base64, a set of encoding schemes that convert binary data into a series of ASCII codes. The EoF (End of File) technique is then used to embed the Base64-encoded text. As an example of the mechanism, a file is used to represent the text; using the two methods together increases the security level for protecting the data. This research aims to secure many types of files in a particular medium with good security, without damaging the stored files or the cover media used.
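
    A minimal sketch of the combined scheme under stated assumptions (the delimiter marker is illustrative, not from the paper): the secret is Base64-encoded and appended after the cover file's normal end, where most viewers ignore trailing bytes:

      import base64

      MARKER = b"--PAYLOAD--"   # hypothetical delimiter between cover data and payload

      def embed(cover_path: str, secret: bytes) -> None:
          # EoF technique: append the encoded payload after the end of the cover file.
          with open(cover_path, "ab") as f:
              f.write(MARKER + base64.b64encode(secret))

      def extract(cover_path: str) -> bytes:
          with open(cover_path, "rb") as f:
              data = f.read()
          return base64.b64decode(data.rsplit(MARKER, 1)[1])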

  15. BOREAS Regional Soils Data in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Monette, Bryan; Knapp, David; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor)

    2000-01-01

    This data set was gridded by BOREAS Information System (BORIS) Staff from a vector data set received from the Canadian Soil Information System (CanSIS). The original data came in two parts that covered Saskatchewan and Manitoba. The data were gridded and merged into one data set of 84 files covering the BOREAS region. The data were gridded into the AEAC projection. Because the mapping of the two provinces was done separately in the original vector data, there may be discontinuities in some of the soil layers because of different interpretations of certain soil properties. The data are stored in binary, image format files.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolan, Daniel H.; Ao, Tommy

The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
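
    Because SDA is plain HDF5 underneath, any HDF5 reader can walk an archive. A minimal sketch using the h5py package (the one-group-per-labeled-record layout implied here is an assumption based on the description above, not the published specification):

      import h5py

      def list_records(path):
          # Print every group and dataset in the archive along with its label/path.
          with h5py.File(path, "r") as f:
              f.visititems(lambda name, obj: print(
                  "group" if isinstance(obj, h5py.Group) else "dataset", name))

      list_records("archive.sda")   # hypothetical file name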

  17. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  18. BamTools: a C++ API and toolkit for analyzing and managing BAM files

    PubMed Central

    Barnett, Derek W.; Garrison, Erik K.; Quinlan, Aaron R.; Strömberg, Michael P.; Marth, Gabor T.

    2011-01-01

    Motivation: Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. Results: We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. Availability: BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools. Contact: barnetde@bc.edu PMID:21493652

  19. First use of LHC Run 3 Conditions Database infrastructure for auxiliary data files in ATLAS

    NASA Astrophysics Data System (ADS)

    Aperio Bella, L.; Barberis, D.; Buttinger, W.; Formica, A.; Gallas, E. J.; Rinaldi, L.; Rybkin, G.; ATLAS Collaboration

    2017-10-01

    Processing of the large amount of data produced by the ATLAS experiment requires fast and reliable access to what we call Auxiliary Data Files (ADF). These files, produced by Combined Performance, Trigger and Physics groups, contain conditions, calibrations, and other derived data used by the ATLAS software. In ATLAS this data has, thus far for historical reasons, been collected and accessed outside the ATLAS Conditions Database infrastructure and related software. For this reason, along with the fact that ADF are effectively read by the software as binary objects, this class of data appears ideal for testing the proposed Run 3 conditions data infrastructure now in development. This paper describes this implementation as well as the lessons learned in exploring and refining the new infrastructure with the potential for deployment during Run 2.

  20. BOREAS Forest Cover Data Layers of the NSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David; Tuinhoff, Manning

    2000-01-01

    This data set was processed by BORIS staff from the original vector data of species, crown closure, cutting class, and site classification/subtype into raster files. The original polygon data were received from Linnet Graphics, the distributor of data for MNR. In the case of the species layer, the percentages of species composition were removed. This reduced the amount of information contained in the species layer of the gridded product, but it was necessary in order to make the gridded product easier to use. The original maps were produced from 1:15,840-scale aerial photography collected in 1988 over an area of the BOREAS NSA MSA. The data are stored in binary, image format files and they are available from Oak Ridge National Laboratory. The data files are available on a CD-ROM (see document number 20010000884).

  1. Arlequin suite ver 3.5: a new series of programs to perform population genetics analyses under Linux and Windows.

    PubMed

    Excoffier, Laurent; Lischer, Heidi E L

    2010-05-01

    We present here a new version of the Arlequin program available under three different forms: a Windows graphical version (Winarl35), a console version of Arlequin (arlecore), and a specific console version to compute summary statistics (arlsumstat). The command-line versions run under both Linux and Windows. The main innovations of the new version include enhanced outputs in XML format, the possibility to embed graphics displaying computation results directly into output files, and the implementation of a new method to detect loci under selection from genome scans. Command-line versions are designed to handle large series of files, and arlsumstat can be used to generate summary statistics from simulated data sets within an Approximate Bayesian Computation framework. © 2010 Blackwell Publishing Ltd.

  2. Managing the computational chemistry big data problem: the ioChem-BD platform.

    PubMed

    Álvarez-Moreno, M; de Graaf, C; López, N; Maseras, F; Poblet, J M; Bo, C

    2015-01-26

We present the ioChem-BD platform ( www.iochem-bd.org ) as a multiheaded tool aimed to manage large volumes of quantum chemistry results from a diverse group of already common simulation packages. The platform has an extensible structure. The key modules manage the main tasks: (i) uploading output files from common computational chemistry packages, (ii) extracting meaningful data from the results, and (iii) generating output summaries in user-friendly formats. Heavy use of the Chemical Markup Language (CML) is made in the intermediate files used by ioChem-BD. From them, and using XSL techniques, we manipulate and transform such chemical data sets to fulfill researchers' needs in the form of HTML5 reports, supporting information, and other research media.

  3. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

MODELLER is command-line software that requires tedious formatting of inputs and the writing of Python scripts, which many users are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, the scripting process and the analysis of verbose output files through automation, making the pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the protein data bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.

  4. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe370mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  5. Broadband Heating Rate Profile Project (BBHRP) - SGP 1bbhrpripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  6. Broadband Heating Rate Profile Project (BBHRP) - SGP ripbe1mcfarlane

    DOE Data Explorer

    Riihimaki, Laura; Shippert, Timothy

    2014-11-05

    The objective of the ARM Broadband Heating Rate Profile (BBHRP) Project is to provide a structure for the comprehensive assessment of our ability to model atmospheric radiative transfer for all conditions. Required inputs to BBHRP include surface albedo and profiles of atmospheric state (temperature, humidity), gas concentrations, aerosol properties, and cloud properties. In the past year, the Radiatively Important Parameters Best Estimate (RIPBE) VAP was developed to combine all of the input properties needed for BBHRP into a single gridded input file. Additionally, an interface between the RIPBE input file and the RRTM was developed using the new ARM integrated software development environment (ISDE) and effort was put into developing quality control (qc) flags and provenance information on the BBHRP output files so that analysis of the output would be more straightforward. This new version of BBHRP, sgp1bbhrpripbeC1.c1, uses the RIPBE files as input to RRTM, and calculates broadband SW and LW fluxes and heating rates at 1-min resolution using the independent column approximation. The vertical resolution is 45 m in the lower and middle troposphere to match the input cloud properties, but is at coarser resolution in the upper atmosphere. Unlike previous versions, the vertical grid is the same for both clear-sky and cloudy-sky calculations.

  7. Simulation of Distributed PV Power Output in Oahu Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lave, Matthew Samuel

    2016-08-01

Distributed solar photovoltaic (PV) power generation in Oahu has grown rapidly since 2008. For applications such as determining the value of energy storage, it is important to have PV power output timeseries. Since these timeseries are not typically measured, here we produce simulated distributed PV power output for Oahu. Simulated power output is based on (a) satellite-derived solar irradiance, (b) PV permit data by neighborhood, and (c) population data by census block. Permit and population data were used to model locations of distributed PV, and irradiance data was then used to simulate power output. PV power output simulations are presented by sub-neighborhood polygons, neighborhoods, and for the whole island of Oahu. Summary plots of annual PV energy and a sample week timeseries of power output are shown, and the files containing the entire timeseries are described.

  8. MIMS for TRIM

    EPA Pesticide Factsheets

    MIMS supports complex computational studies that use multiple interrelated models / programs, such as the modules within TRIM. MIMS is used by TRIM to run various models in sequence, while sharing input and output files.

  9. R package to estimate intracluster correlation coefficient with confidence interval for binary data.

    PubMed

    Chakraborty, Hrishikesh; Hossain, Akhtar

    2018-03-01

The Intracluster Correlation Coefficient (ICC) is a major parameter of interest in cluster randomized trials that measures the degree to which responses within the same cluster are correlated. Several types of ICC estimators and their confidence intervals (CI) have been suggested in the literature for binary data. Studies have compared the relative weaknesses and advantages of ICC estimators and their CIs for binary data and suggested situations where one is advantageous in practical research. The commonly used statistical computing systems currently facilitate estimation of only a very few variants of the ICC and its CI. To address the limitations of current statistical packages, we developed an R package, ICCbin, to facilitate estimating the ICC and its CI for binary responses using different methods. The ICCbin package is designed to provide estimates of the ICC in 16 different ways, including analysis of variance methods, moments-based estimation, direct probabilistic methods, correlation-based estimation, and a resampling method. The CI of the ICC is estimated using 5 different methods. The package also generates clustered binary data using an exchangeable correlation structure. ICCbin provides two functions for users: rcbin() generates clustered binary data, and iccbin() estimates the ICC and its CI. Users can choose the appropriate ICC and CI estimates from the wide selection in the outputs. The R package ICCbin presents very flexible and easy-to-use ways to generate clustered binary data and to estimate the ICC and its CI for binary responses using different methods. The package is freely available for use with R from the CRAN repository (https://cran.r-project.org/package=ICCbin). We believe that this package can be a very useful tool for researchers designing cluster randomized trials with binary outcomes. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Predicators of Success for Undergraduate Students Reinstated after an Academic Dismissal at a Small Midwest Private University Campus

    ERIC Educational Resources Information Center

    Meador, Ryan E.

    2012-01-01

    This study examined students who successfully applied for reinstatement after being academically dismissed for the first time in order to discover indicators of future success. This study examined 666 students' appeals filed at the DeVry University Kansas City campus between 2004 and 2009. Binary logistic regression was used to discover if a…

  11. HEPDOOP: High-Energy Physics Analysis using Hadoop

    NASA Astrophysics Data System (ADS)

    Bhimji, W.; Bristow, T.; Washbrook, A.

    2014-06-01

We perform an LHC data analysis workflow using tools and data formats that are commonly used in the "Big Data" community outside High Energy Physics (HEP). These include Apache Avro for serialisation to binary files, Pig and Hadoop for mass data processing, and Python Scikit-Learn for multivariate analysis. A comparison is made with the same analysis performed with current HEP tools in ROOT.
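
    As a flavor of the Avro step, the sketch below writes and reads a binary Avro file with the fastavro package; the event schema and field names are illustrative, not those used in the paper's analysis:

      from fastavro import parse_schema, reader, writer

      schema = parse_schema({
          "type": "record", "name": "Event",
          "fields": [{"name": "pt", "type": "float"},
                     {"name": "njets", "type": "int"}],
      })

      with open("events.avro", "wb") as out:      # serialize records to binary
          writer(out, schema, [{"pt": 42.5, "njets": 3}])

      with open("events.avro", "rb") as f:        # stream them back
          for record in reader(f):
              print(record)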

  12. Eclipsing Binaries with Possible Tertiary Components

    NASA Astrophysics Data System (ADS)

    Snyder, LeRoy F.

    2013-05-01

Many eclipsing binary star systems (EBS) show long-term variations in their orbital periods which are evident in their O-C (observed minus calculated period) diagrams. This research carried out an analysis of 324 eclipsing binary systems taken from the systems analyzed in Bob Nelson's O-C Files database. Of these, 18 systems displayed evidence of periodic variations in the arrival times of the eclipses; these period changes take the form of sinusoidal variations. The sinusoidal character of these variations is suggestive of Keplerian motion caused by an orbiting companion. The reason for these changes is unknown, but mass loss, apsidal motion, magnetic activity, and the presence of a third body have been proposed. This paper has assumed the light-time effect, caused by the gravitational pull of a tertiary companion orbiting around the eclipsing binary system, as the cause of the sinusoidal variations. An observed minus calculated (O-C) diagram of the 324 systems was plotted using a quadratic ephemeris to determine if the system displayed a sinusoidal trend in the O-C residuals. After analysis of the 18 systems, seven systems, AW UMa, BB Peg, OO Aql, V508 Oph, VW Cep, W Crv and YY Eri, met the benchmark criteria of a possible orbiting companion. The other 11 systems displayed a sinusoidal variation in the O-C residuals of the primary eclipses, but for these systems Bob Nelson's O-C Files did not contain times of minimum (Tmin) of the secondary eclipses, and they were therefore not conclusive in determining the presence of the effects of a tertiary companion. An analysis of the residuals of the seven systems yields a light-time semi-amplitude, orbital period, eccentricity, and mass of the tertiary companion, as the amplitude of the variation is proportional to the mass, period, and inclination of the third orbiting body. Given the low mass of the tertiary body in the seven cases, the possibility of five of these tertiary companions being brown dwarfs is discussed.

  13. ThermalTracker Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The software processes recorded thermal video and detects the flight tracks of birds and bats that passed through the camera's field of view. The output is a set of images that show complete flight tracks for any detections, with the direction of travel indicated and the thermal image of the animal delineated. A report of the descriptive features of each detected track is also output in the form of a comma-separated value text file.

  14. Effect of Spatial Locality Prefetching on Structural Locality

    DTIC Science & Technology

    1991-12-01

The Pollution module calculates the SLC and CAM cache pollution percentages, and the Generate Reference Frequency List module produces the output reference frequency list. Each program module in the structure chart is mapped into an Ada package. The accompanying Cache Simulator pseudocode calls a routine to generate the reference frequency list and closes the input, output, and reference files (Figure 3.5).

  15. Coupled binary embedding for large-scale image retrieval.

    PubMed

    Zheng, Liang; Wang, Shengjin; Tian, Qi

    2014-08-01

Visual matching is a crucial step in image retrieval based on the bag-of-words (BoW) model. In the baseline method, two keypoints are considered a matching pair if their SIFT descriptors are quantized to the same visual word. However, the SIFT visual word has two limitations. First, it loses most of its discriminative power during quantization. Second, SIFT only describes the local texture feature. Both drawbacks impair the discriminative power of the BoW model and lead to false positive matches. To tackle this problem, this paper proposes to embed multiple binary features at the indexing level. To model correlation between features, a multi-IDF scheme is introduced, through which different binary features are coupled into the inverted file. We show that matching verification methods based on binary features, such as Hamming embedding, can be effectively incorporated in our framework. As an extension, we explore the fusion of a binary color feature into image retrieval. The joint integration of the SIFT visual word and binary features greatly enhances the precision of visual matching, reducing the impact of false positive matches. Our method is evaluated through extensive experiments on four benchmark datasets (Ukbench, Holidays, DupImage, and MIR Flickr 1M). We show that our method significantly improves the baseline approach. In addition, large-scale experiments indicate that the proposed method requires acceptable memory usage and query time compared with other approaches. Further, when the global color feature is integrated, our method yields competitive performance with the state of the art.
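
    The binary-feature verification embedded in the inverted file reduces to a Hamming-distance test, which is cheap to compute: XOR the signatures and count the set bits. A minimal sketch with illustrative 8-bit signatures:

      def hamming(a: int, b: int) -> int:
          # XOR leaves a 1 wherever the two signatures disagree; popcount the result.
          return bin(a ^ b).count("1")

      sig_query, sig_candidate = 0b10110100, 0b10010101
      print(hamming(sig_query, sig_candidate) <= 2)   # True: only 2 bits disagree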

  16. Method and apparatus for detecting a desired behavior in digital image data

    DOEpatents

    Kegelmeyer, Jr., W. Philip

    1997-01-01

    A method for detecting stellate lesions in digitized mammographic image data includes the steps of prestoring a plurality of reference images, calculating a plurality of features for each of the pixels of the reference images, and creating a binary decision tree from features of randomly sampled pixels from each of the reference images. Once the binary decision tree has been created, a plurality of features, preferably including an ALOE feature (analysis of local oriented edges), are calculated for each of the pixels of the digitized mammographic data. Each of these plurality of features of each pixel are input into the binary decision tree and a probability is determined, for each of the pixels, corresponding to the likelihood of the presence of a stellate lesion, to create a probability image. Finally, the probability image is spatially filtered to enforce local consensus among neighboring pixels and the spatially filtered image is output.

  17. Method and apparatus for detecting a desired behavior in digital image data

    DOEpatents

    Kegelmeyer, Jr., W. Philip

    1997-01-01

A method for detecting stellate lesions in digitized mammographic image data includes the steps of prestoring a plurality of reference images, calculating a plurality of features for each of the pixels of the reference images, and creating a binary decision tree from features of randomly sampled pixels from each of the reference images. Once the binary decision tree has been created, a plurality of features, preferably including an ALOE feature (analysis of local oriented edges), are calculated for each of the pixels of the digitized mammographic data. Each of these plurality of features of each pixel are input into the binary decision tree and a probability is determined, for each of the pixels, corresponding to the likelihood of the presence of a stellate lesion, to create a probability image. Finally, the probability image is spatially filtered to enforce local consensus among neighboring pixels and the spatially filtered image is output.

  18. File-based data flow in the CMS Filter Farm

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Bawej, T.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gomez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; Morovic, S.; Nunez-Barranco-Fernandez, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 4: HARP Output (HARPO) graphics display user's guide

    NASA Technical Reports Server (NTRS)

    Sproles, Darrell W.; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical postprocessor program HARPO (HARP Output). HARPO reads ASCII files generated by HARP. It provides an interactive plotting capability that can be used to display alternate model data for trade-off analyses. File data can also be imported to other commercial software programs.

  20. Work production using the two-way shape memory effect in NiTi and a Ni-rich NiTiHf high-temperature shape memory alloy

    NASA Astrophysics Data System (ADS)

    Atli, K. C.; Karaman, I.; Noebe, R. D.; Bigelow, G.; Gaydosh, D.

    2015-12-01

    The work output capacity of the two-way shape memory effect (TWSME) in a Ni50.3Ti29.7Hf20 (at%) high-temperature shape memory alloy (HTSMA) was investigated and compared to that of binary Ni49.9Ti50.1 (at%). TWSME was induced through a training procedure of 100 thermomechanical cycles under different tensile stresses. It was observed that TWSME in as-extruded and trained Ni50.3Ti29.7Hf20 could produce 0.7% strain against a compressive stress of 100 MPa, corresponding to a maximum work output of 0.08 J g-1, compared to a maximum value of 0.06 J g-1 for binary NiTi. A peak aging heat treatment of 3 h at 550 °C, which previously has been shown to result in near-perfect functional stability in Ni50.3Ti29.7Hf20 during isobaric thermal cycling, did not improve the TWSME and actually resulted in a decrease in the magnitude and stability of the TWSME and its work output capacity. Nevertheless, the magnitude of TWSM behavior of Ni50.3Ti29.7Hf20, in the absence of an aging heat treatment, renders it an attractive candidate for high-temperature TWSM actuation.

  1. A taxonomy-based approach to shed light on the babel of mathematical analogies for rice simulation

    USDA-ARS?s Scientific Manuscript database

    For most biophysical domains, different models are available and the extent to which their structures differ with respect to differences in outputs was never quantified. We use a taxonomy-based approach to address the question with thirteen rice models. Classification keys and binary attributes for ...

  2. Rounding Technique for High-Speed Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Wechsler, E. R.

    1983-01-01

Arithmetic technique facilitates high-speed rounding of 2's complement binary data. Conventional rounding of 2's complement numbers presents problems in high-speed digital circuits. The proposed technique consists of truncating K + 1 bits and then attaching a bit in the least significant position. The mean output error is zero, eliminating the need to introduce a voltage offset at the input.
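
    A quick numerical check of the idea (a minimal sketch, with K chosen arbitrarily): truncating K + 1 bits and appending a constant 1 maps every input to the midpoint of its truncation interval, so no carry-propagating adder is needed and the error on a uniformly distributed signal averages to zero:

      import numpy as np

      K = 3                        # extra bits dropped beyond the kept word
      STEP = 2 ** (K + 1)          # effective output resolution of the trick

      def round_trick(x):
          # Truncate K + 1 bits, then attach a 1 in the new least significant position,
          # landing on the midpoint of each truncation interval.
          return (np.floor(x / STEP) * 2 + 1) * 2 ** K

      x = np.random.uniform(-1000, 1000, 100_000)
      err = round_trick(x) - x
      print(abs(np.mean(err)) < 0.1)           # True: unbiased on average
      print(np.max(np.abs(err)) <= STEP / 2)   # True: bounded by half the resolution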

  3. Unitary Response Regression Models

    ERIC Educational Resources Information Center

    Lipovetsky, S.

    2007-01-01

    The dependent variable in a regular linear regression is a numerical variable, and in a logistic regression it is a binary or categorical variable. In these models the dependent variable has varying values. However, there are problems yielding an identity output of a constant value which can also be modelled in a linear or logistic regression with…

  4. Datacomputer Project

    DTIC Science & Technology

    1975-06-30

Files are assigned small integers called Job File Numbers, or JFNs, with which future references are made. Since the name of the device on which a file... what could reasonably be called the "Datacomputer proper", and are the primary output of the Datacomputer project. They are conceptually and func... sections, each of which is broken into 512-word blocks called pages. When the Request Handler...

  5. Fault detection and multiclassifier fusion for unmanned aerial vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Yan, Weizhong

    2001-03-01

UAVs demand more accurate fault accommodation for their mission manager and vehicle control system in order to achieve a reliability level that is comparable to that of a piloted aircraft. This paper attempts to apply multi-classifier fusion techniques to achieve the necessary performance of the fault detection function for the Lockheed Martin Skunk Works (LMSW) UAV Mission Manager. Three different classifiers that meet the design requirements of the fault detection function of the UAV are employed. The binary decision outputs from the classifiers are then aggregated using three different classifier fusion schemes, namely, majority vote, weighted majority vote, and Naive Bayes combination. All three schemes are simple and need no retraining. The three fusion schemes (except the majority vote, which gives an average performance of the three classifiers) show classification performance that is better than or equal to that of the best individual classifier. The unavoidable correlation between classifiers with binary outputs is observed in this study. We conclude that it is the correlation between the classifiers that limits the fusion schemes from achieving an even better performance.
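
    The two voting rules are one-liners on binary decisions, as in the minimal sketch below (the weights are illustrative; the Naive Bayes combiner would instead multiply each classifier's class-conditional likelihoods):

      import numpy as np

      def majority_vote(decisions):
          # Declare a fault when more than half the classifiers do.
          return int(np.sum(decisions) > len(decisions) / 2)

      def weighted_majority(decisions, weights):
          # Same rule, but each vote counts in proportion to its classifier's weight.
          return int(np.dot(weights, decisions) > np.sum(weights) / 2)

      print(majority_vote([1, 0, 1]))                         # 1: two of three agree
      print(weighted_majority([1, 0, 0], [0.6, 0.2, 0.2]))    # 1: the trusted classifier wins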

  6. The Effect of Latent Binary Variables on the Uncertainty of the Prediction of a Dichotomous Outcome Using Logistic Regression Based Propensity Score Matching.

    PubMed

    Szekér, Szabolcs; Vathy-Fogarassy, Ágnes

    2018-01-01

Logistic regression based propensity score matching is a widely used method in case-control studies to select the individuals of the control group. This method creates a suitable control group if all factors affecting the output variable are known. However, if relevant latent variables exist as well, which are not taken into account during the calculations, the quality of the control group is uncertain. In this paper, we present a statistics-based study in which we try to determine the relationship between the accuracy of the logistic regression model and the uncertainty of the dependent variable of the control group defined by propensity score matching. Our analyses show that there is a linear correlation between the fit of the logistic regression model and the uncertainty of the output variable. In certain cases, a latent binary explanatory variable can result in a relative error of up to 70% in the prediction of the outcome variable. The observed phenomenon calls the attention of analysts to an important point, which must be taken into account when drawing conclusions.

  7. Displaying Composite and Archived Soundings in the Advanced Weather Interactive Processing System

    NASA Technical Reports Server (NTRS)

    Barrett, Joe H., III; Volkmer, Matthew R.; Blottman, Peter F.; Sharp, David W.

    2008-01-01

In a previous task, the Applied Meteorology Unit (AMU) developed spatial and temporal climatologies of lightning occurrence based on eight atmospheric flow regimes. The AMU created climatological, or composite, soundings of wind speed and direction, temperature, and dew point temperature at four rawinsonde observation stations at Jacksonville, Tampa, Miami, and Cape Canaveral Air Force Station, for each of the eight flow regimes. The composite soundings were delivered to the National Weather Service (NWS) Melbourne (MLB) office for display using the National version of the Skew-T Hodograph Analysis and Research Program (NSHARP) software program. The NWS MLB requested the AMU make the composite soundings available for display in the Advanced Weather Interactive Processing System (AWIPS), so they could be overlaid on current observed soundings. This will allow the forecasters to compare the current state of the atmosphere with climatology. This presentation describes how the AMU converted the composite soundings from NSHARP Archive format to Network Common Data Form (NetCDF) format, so that the soundings could be displayed in AWIPS. NetCDF is a set of data formats, programming interfaces, and software libraries used to read and write scientific data files. In AWIPS, each meteorological data type, such as soundings or surface observations, has a unique NetCDF format. Each format is described by a NetCDF template file. Although NetCDF files are in binary format, they can be converted to a text format called network Common data form Description Language (CDL). A software utility called ncgen is used to create a NetCDF file from a CDL file, while the ncdump utility is used to create a CDL file from a NetCDF file. AWIPS receives soundings in Binary Universal Form for the Representation of Meteorological data (BUFR) format (http://dss.ucar.edu/docs/formats/bufr/), and then decodes them into NetCDF format. Only two sounding files are generated in AWIPS per day. One file contains all of the soundings received worldwide between 0000 UTC and 1200 UTC, and the other includes all soundings between 1200 UTC and 0000 UTC. In order to add the composite soundings into AWIPS, a procedure was created to configure, or localize, AWIPS. This involved modifying and creating several configuration text files. A unique four-character site identifier was created for each of the 32 soundings so each could be viewed separately. The first three characters were based on the site identifier of the observed sounding, while the last character was based on the flow regime. While researching the localization process for soundings, the AMU discovered a method of archiving soundings so old soundings would not get purged automatically by AWIPS. This method could provide an alternative way of localizing AWIPS for composite soundings. In addition, this would allow forecasters to use archived soundings in AWIPS for case studies. A test sounding file in NetCDF format was written in order to verify the correct format for soundings in AWIPS. After the file was viewed successfully in AWIPS, the AMU wrote a software program in the Tool Command Language/Tool Kit (Tcl/Tk) language to convert the 32 composite soundings from NSHARP Archive to CDL format. The ncgen utility was then used to convert the CDL file to a NetCDF file. The NetCDF file could then be read and displayed in AWIPS.
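
    For readers unfamiliar with the NetCDF step, the sketch below builds a small sounding-like file with the netCDF4 Python package; the dimension and variable names are illustrative, not the AWIPS template (which ncdump can reveal from an existing file):

      import numpy as np
      from netCDF4 import Dataset

      nc = Dataset("composite_sounding.nc", "w")    # hypothetical output file
      nc.createDimension("level", 5)
      for name, units in [("pressure", "hPa"), ("temperature", "K"), ("dewpoint", "K")]:
          var = nc.createVariable(name, "f4", ("level",))
          var.units = units
      nc.variables["pressure"][:] = [1000.0, 850.0, 700.0, 500.0, 300.0]
      nc.variables["temperature"][:] = [298.0, 290.0, 283.0, 267.0, 243.0]
      nc.variables["dewpoint"][:] = [294.0, 285.0, 275.0, 255.0, 230.0]
      nc.close()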

  8. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF, SODA, WOCE-Satellite, TMI, GPI and GSSTF products through the CAS. The APDRC is also running an EPIC server developed by PMEL/NOAA. EPIC is a web-based, data search and display system suited for in situ (station versus gridded) data. The process of locating and selecting individual station data from large collections (millions of profiles or time series, etc.) of in situ data is a major challenge. Serving in situ data on the Internet faces two problems: the irregularity of data formats; and the large quantity of data files. To solve the first problem, we have converted the in situ data into netCDF data format. The second problem was solved by using the EPIC server, which allows users to easily subset the files using a friendly graphical interface. Furthermore, we enhanced the capability of EPIC and configured OPeNDAP into EPIC to serve the numerous in situ data files and to export them to users through two different options: 1) an OPeNDAP pointer file of user-selected data files; and 2) a data package that includes meta-information (e.g., location, time, cruise no, etc.), a local pointer file, and the data files that the user selected. Option 1) is for those who do not want to download the selected data but want to use their own application software (such as GrADS, Matlab and Ferret) for access and analysis; option 2) is for users who want to store the data on their own system (e.g. laptops before going for a cruise) for subsequent analysis. 
Currently, WOCE CTD and bottle data, the WOCE current meter data, and some Argo float data are being served on the EPIC server.

  9. Reporting Differences Between Spacecraft Sequence Files

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.

    2010-01-01

    A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file [DKF (wherein DSN signifies Deep Space Network)], spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant when generating command sequence products, though these fields can result in the appearance of many changes to the files, particularly when using the UNIX diff command to inspect file differences. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant pieces of information. In contrast, seq diff suite removes the fields containing the irrelevant pieces of information before processing to extract differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and in particular flagging difference in fields relevant to the sequence command generation and review process.
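
    The core idea, normalize away the irrelevant fields and then diff, can be sketched in a few lines of Python (the field pattern and file names below are hypothetical; the real suite knows the exact layout of each product type):

      import difflib
      import re

      def strip_irrelevant(lines):
          # Drop leading line numbers and request IDs (illustrative pattern only).
          return [re.sub(r"^\d+\s+|REQ-\d+\s*", "", ln) for ln in lines]

      with open("old.sasf") as a, open("new.sasf") as b:   # hypothetical file names
          diff = difflib.unified_diff(strip_irrelevant(a.readlines()),
                                      strip_irrelevant(b.readlines()),
                                      fromfile="old.sasf", tofile="new.sasf")
      print("".join(diff))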

  10. Software support for SBGN maps: SBGN-ML and LibSBGN.

    PubMed

    van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk

    2012-08-01

    LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wackerbarth, David

Sandia National Laboratories has developed a computer program to review, reduce and manipulate waveform data. PlotData is designed for post-acquisition waveform data analysis. PlotData is both a post-acquisition and an advanced interactive data analysis environment. PlotData requires unidirectional waveform data with both uniform and discrete time-series measurements. PlotData operates on a National Instruments' LabVIEW™ software platform. Using PlotData, the user can capture waveform data from digitizing oscilloscopes over a GPIB, USB or Ethernet interface from Tektronix, LeCroy or Agilent scopes. PlotData can both import and export several types of binary waveform files including, but not limited to, Tektronix .wmf files, LeCroy .trc files and x-y pair ASCII files. Waveform manipulation includes numerous math functions, integration, differentiation, smoothing, truncation, and other specialized data reduction routines such as VISAR, POV, PVDF (Bauer) piezoelectric gauges, and piezoresistive gauges such as carbon manganin pressure gauges.

  12. VizieR Online Data Catalog: OGLE II SMC eclipsing binaries (Wyrzykowski+, 2004)

    NASA Astrophysics Data System (ADS)

    Wyrzykowski, L.; Udalski, A.; Kubiak, M.; Szymanski, M. K.; Zebrun, K.; Soszinski, I.; Wozniak, P. R.; Pietrzynski, G.; Szewczyk, O.

    2009-03-01

We present a new version of the OGLE-II catalog of eclipsing binary stars detected in the Small Magellanic Cloud, based on the Difference Image Analysis catalog of variable stars in the Magellanic Clouds containing data collected from 1997 to 2000. We found 1351 eclipsing binary stars in the central 2.4 square degree area of the SMC; 455 stars are newly discovered objects, not found in the previous release of the catalog. The eclipsing objects were selected with an automatic search algorithm based on an artificial neural network. The full catalog with individual photometry is accessible from the OGLE INTERNET archive, at ftp://sirius.astrouw.edu.pl/ogle/ogle2/var_stars/smc/ecl . Regular observations of the SMC fields started on June 26, 1997 and covered about 2.4 square degrees of the central parts of the SMC. Reductions of the photometric data collected up to the end of May 2000 were performed with the Difference Image Analysis (DIA) package. (1 data file).

  13. Selected micrometeorological and soil-moisture data at Amargosa Desert Research Site, an arid site near Beatty, Nye County, Nevada, 1998-2000

    USGS Publications Warehouse

    Johnson, Michael J.; Mayers, Charles J.; Andraski, Brian J.

    2002-01-01

Selected micrometeorological and soil-moisture data were collected at the Amargosa Desert Research Site adjacent to a low-level radioactive waste and hazardous chemical waste facility near Beatty, Nev., 1998-2000. Data were collected in support of ongoing research studies to improve the understanding of hydrologic and contaminant-transport processes in arid environments. Micrometeorological data include precipitation, air temperature, solar radiation, net radiation, relative humidity, ambient vapor pressure, wind speed and direction, barometric pressure, soil temperature, and soil-heat flux. All micrometeorological data were collected using a 10-second sampling interval by data loggers that output daily mean, maximum, and minimum values, and hourly mean values. For precipitation, data output consisted of daily, hourly, and 5-minute totals. Soil-moisture data included periodic measurements of soil-water content at nine neutron-probe access tubes with measurable depths ranging from 5.25 to 29.75 meters. The computer data files included in this report contain the complete micrometeorological and soil-moisture data sets. The computer data consists of seven files with about 14 megabytes of information. The seven files are in tabular format: (1) one file lists daily mean, maximum, and minimum micrometeorological data and daily total precipitation; (2) three files list hourly mean micrometeorological data and hourly precipitation for each year (1998-2000); (3) one file lists 5-minute precipitation data; (4) one file lists mean soil-water content by date and depth at four experimental sites; and (5) one file lists soil-water content by date and depth for each neutron-probe access tube. This report highlights selected data contained in the computer data files using figures, tables, and brief discussions. Instrumentation used for data collection also is described. Water-content profiles are shown to demonstrate variability of water content with depth. Time-series data are plotted to illustrate temporal variations in micrometeorological and soil-water content data. Substantial precipitation at the end of an El Niño cycle in early 1998 resulted in measurable water penetration to a depth of 1.25 meters at one of the four experimental soil-monitoring sites.

  14. NREL: SMARTS - About SMARTS

    Science.gov Websites

    To use SMARTS, users construct input text files of 20-30 lines of simple text; the output consists of spreadsheet-compatible American Standard Code for Information Interchange (ASCII) text.

  15. VizieR Online Data Catalog: Chromospherically Active Binaries. Third version (Eker+, 2008)

    NASA Astrophysics Data System (ADS)

    Eker, Z.; Filiz-Ak, N.; Bilir, S.; Dogru, D.; Tuysuz, M.; Soydugan, E.; Bakis, H.; Ugras, B.; Soydugan, F.; Erdem, A.; Demircan, O.

    2008-06-01

    The Chromospherically Active Binaries (CAB) catalogue has been revised and updated. With 203 new identifications, the number of CAB stars increases to 409. The catalogue is available in electronic format, where each system has a varying number of lines (sub-orders) with a unique order number. Columns contain data for a limited number of selected cross references, comments to explain peculiarities and the position of the binary in case it belongs to a multiple system, classical identifications (RS CVn, BY Dra), brightness and colours, photometric and spectroscopic data, descriptions of emission features (Ca II H&K, Hα, UV, IR), X-ray luminosity, radio flux, physical quantities, and orbital information, where each basic entry is referenced so users can go to the original sources. (10 data files).

  16. VizieR Online Data Catalog: The ELM survey. VI. 11 new ELM WD binaries (Gianninas+, 2015)

    NASA Astrophysics Data System (ADS)

    Gianninas, A.; Kilic, M.; Brown, W. R.; Canton, P.; Kenyon, S. J.

    2016-02-01

    We used the 6.5m MMT telescope equipped with the Blue Channel spectrograph, the 200 inch Hale telescope equipped with the Double spectrograph, the Kitt Peak National Observatory 4m telescope equipped with the R-C spectrograph, and more recently with the Kitt Peak Ohio State Multi-Object Spectrograph (KOSMOS), to obtain spectroscopy of our 11 targets in several observing runs. We have also been obtaining radial-velocity measurements for candidates from other sources, including the Large Sky Area Multi-Object Spectroscopy Telescope (LAMOST). These 11 new extremely low-mass white dwarf (ELM WD) binaries bring the total number of ELM WDs identified by the ELM Survey to 73. (4 data files).

  17. BOREAS Soils Data over the SSA in Raster Format and AEAC Projection

    NASA Technical Reports Server (NTRS)

    Knapp, David; Rostad, Harold; Hall, Forrest G. (Editor)

    2000-01-01

    This data set consists of GIS layers that describe the soils of the BOREAS SSA. The original data were submitted as vector layers that were gridded by BOREAS staff to a 30-meter pixel size in the AEAC projection. These data layers include the soil code (which relates to the soil name), modifier (which also relates to the soil name), and extent (indicating the extent that this soil exists within the polygon). There are three sets of these layers representing the primary, secondary, and tertiary soil characteristics. Thus, there is a total of nine layers in this data set along with supporting files. The data are stored in binary, image format files.

  18. BOREAS RSS-8 Snow Maps Derived from Landsat TM Imagery

    NASA Technical Reports Server (NTRS)

    Hall, Dorothy; Chang, Alfred T. C.; Foster, James L.; Chien, Janeet Y. L.; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The Boreal Ecosystem-Atmosphere Study (BOREAS) Remote Sensing Science (RSS)-8 team utilized Landsat Thematic Mapper (TM) images to perform mapping of snow extent over the Southern Study Area (SSA). This data set consists of two Landsat TM images that were used to determine the snow-covered pixels over the BOREAS SSA on 18 Jan 1993 and on 06 Feb 1994. The data are stored in binary image format files. The RSS-08 snow map data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  19. BOREAS RSS-20 POLDER Radiance Images From the NASA C-130

    NASA Technical Reports Server (NTRS)

    Leroy, M.; Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    These Boreal Ecosystem-Atmosphere Study (BOREAS) Remote Sensing Science (RSS)-20 data are a subset of images collected by the Polarization and Directionality of Earth's Reflectance (POLDER) instrument over tower sites in the BOREAS study areas during the intensive field campaigns (IFCs) in 1994. The POLDER images presented here from the NASA ARC C-130 aircraft are made available for illustration purposes only. The data are stored in binary image-format files. The POLDER radiance images are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  20. NASA Tech Briefs, March 2011

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Topics covered include: Optimal Tuner Selection for Kalman-Filter-Based Aircraft Engine Performance Estimation; Airborne Radar Interferometric Repeat-Pass Processing; Plug-and-Play Environmental Monitoring Spacecraft Subsystem; Power-Combined GaN Amplifier with 2.28-W Output Power at 87 GHz; Wallops Ship Surveillance System; Source Lines Counter (SLiC) Version 4.0; Guidance, Navigation, and Control Program; Single-Frame Terrain Mapping Software for Robotic Vehicles; Auto Draw from Excel Input Files; Observation Scheduling System; CFDP for Interplanetary Overlay Network; X-Windows Widget for Image Display; Binary-Signal Recovery; Volumetric 3D Display System with Static Screen; MMIC Replacement for Gunn Diode Oscillators; Feature Acquisition with Imbalanced Training Data; Mount Protects Thin-Walled Glass or Ceramic Tubes from Large Thermal and Vibration Loads; Carbon Nanotube-Based Structural Health Monitoring Sensors; Wireless Inductive Power Device Suppresses Blade Vibrations; Safe, Advanced, Adaptable Isolation System Eliminates the Need for Critical Lifts; Anti-Rotation Device Releasable by Insertion of a Tool; A Magnetically Coupled Cryogenic Pump; Single Piezo-Actuator Rotary-Hammering Drill; Fire-Retardant Polymeric Additives; Catalytic Generation of Lift Gases for Balloons; Ionic Liquids to Replace Hydrazine; Variable Emittance Electrochromics Using Ionic Electrolytes and Low Solar Absorptance Coatings; Spacecraft Radiator Freeze Protection Using a Regenerative Heat Exchanger; Multi-Mission Power Analysis Tool; Correction for Self-Heating When Using Thermometers as Heaters in Precision Control Applications; Gravitational Wave Detection with Single-Laser Atom Interferometers; Titanium Alloy Strong Back for IXO Mirror Segments; Improved Ambient Pressure Pyroelectric Ion Source; Multi-Modal Image Registration and Matching for Localization of a Balloon on Titan; Entanglement in Quantum-Classical Hybrid; Algorithm for Autonomous Landing; Quantum-Classical Hybrid for Information Processing; Small-Scale Dissipation in Binary-Species Transitional Mixing Layers; Superpixel-Augmented Endmember Detection for Hyperspectral Images; Coding for Parallel Links to Maximize the Expected Value of Decodable Messages; and Microwave Tissue Soldering for Immediate Wound Closure.

  1. Computerized Integrated Inventory Control for an Air Force Base-Level Supply System.

    DTIC Science & Technology

    1980-06-01

    Buffers: disc files, input 1.2 K, output 1.2 K (2.4 K total); printer, 2 (double) x 150; tape, input 1.2 K, output 1.2 K (2.4 K total); card reader, 2 (double) x 100.

  2. A program to compute three-dimensional subsonic unsteady aerodynamic characteristics using the doublet lattice method, L216 (DUBFLX). Volume 1: Engineering and usage

    NASA Technical Reports Server (NTRS)

    Richard, M.; Harrison, B. A.

    1979-01-01

    The program input presented consists of configuration geometry, aerodynamic parameters, and modal data; output includes element geometry, pressure difference distributions, integrated aerodynamic coefficients, stability derivatives, generalized aerodynamic forces, and aerodynamic influence coefficient matrices. Optionally, modal data may be input on magnetic file (tape or disk), and certain geometric and aerodynamic output may be saved for subsequent use.

  3. NOSS altimeter algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Forsythe, R. G.; Mcmillan, J. D.

    1982-01-01

    A description of all algorithms required for altimeter processing is given. Each description includes title, description, inputs/outputs, general algebraic sequences, and data volume. All required input/output data files are described, and the computer resources required for the entire altimeter processing system are estimated. The majority of the data processing requirements for any radar altimeter of the Seasat-1 type are scoped. Additions and deletions could be made for the specific altimeter products required by other projects.

  4. GPFA-AB_Phase1ReservoirTask2DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-10-22

    This submission to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin. The files included in this zip file contain all data pertinent to the methods and results of this task's output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in the zip file). Shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.

  5. File formats commonly used in mass spectrometry proteomics.

    PubMed

    Deutsch, Eric W

    2012-12-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instruction for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics.

  6. Delay correlation analysis and representation for vital compliant VHDL models

    DOEpatents

    Rich, Marvin J.; Misra, Ashutosh

    2004-11-09

    A method and system unbind a rise/fall tuple of a VHDL generic variable and create rise time and fall time generics of each generic variable that are independent of each other. Then, according to a predetermined correlation policy, the method and system collect delay values in a VHDL standard delay file, sort the delay values, remove duplicate delay values, group the delay values into correlation sets, and output an analysis file. The correlation policy may include collecting all generic variables in a VHDL standard delay file, selecting each generic variable, and performing reductions on the set of delay values associated with each selected generic variable.

  7. Size reduction techniques for vital compliant VHDL simulation models

    DOEpatents

    Rich, Marvin J.; Misra, Ashutosh

    2006-08-01

    A method and system select delay values from a VHDL standard delay file that correspond to an instance of a logic gate in a logic model. The system then collects all the delay values of the selected instance and builds super generics for the rise time and the fall time of the selected instance. The system repeats this process for every delay value in the standard delay file that corresponds to an instance of a logic gate in the logic model, then outputs a reduced-size standard delay file containing the super generics for every instance of every logic gate in the logic model.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fried, Jack

    The TSPM receives and processes ASIC signals and transmits processed data to the PC over an Ethernet cable. The data is given a location and a time stamp. This is the heart of the device, as it gathers and stamps the timing and location of events on each of the ASICs. The five files for the TSPM are needed to manufacture PET scanners that are based on the RatCAP (Rat Conscious Animal PET). They include a TSPM schematic, a raw data file to build the RatCAP TSPM, an output file that along with the assay file is used by an assembly house to build the RatCAP TSPM, an assay file that provides the part list and XY location for the components that go on the RatCAP TSPM, firmware that includes the source code to program the FPGA, and a realized program on the TSPM based on the firmware.

  9. VizieR Online Data Catalog: SLoWPoKES-II catalog (Dhital+, 2015)

    NASA Astrophysics Data System (ADS)

    Dhital, S.; West, A. A.; Stassun, K. G.; Schluns, K. J.; Massey, A. P.

    2015-11-01

    We have identified the Sloan Low-mass Wide Pairs of Kinematically Equivalent Systems (SLoWPoKES)-II catalog of 105537 wide, low-mass binaries without using proper motions. We extend the SLoWPoKES catalog (Paper I; Dhital et al. 2010, cat. J/AJ/139/2566) by identifying binary systems with angular separations of 1-20'' based entirely on SDSS photometry and astrometry. As in Paper I, we used the Catalog Archive Server query tool (CasJobs; http://skyserver.sdss3.org/CasJobs/) to select the sample of low-mass stars from the SDSS-DR8 star table as having r-i>=0.3 and i-z>=0.2, consistent with spectral types of K5 or later. Following Paper I (Dhital et al. 2010, cat. J/AJ/139/2566), we classified candidate pairs with a probability of chance alignment Pf<=0.05 as real binaries. We note that this limit does not have any physical motivation but was chosen to minimize the number of spurious pairs. This cut results in 105537 M dwarf (dM)+MS (see Table3), 78 white dwarf (WD)+dM (see Table5), and 184 sdM+sdM (see Table6) binary systems with separations of 1-20''. Of the dM+MS binaries, 44 are very low-mass (VLM) binary candidates (see Table4), with colors redder than the median M7 dwarf for both components. This represents a significant increase over the SLoWPoKES catalog of 1342 common proper motion (CPM) binaries that we presented in Paper I (Dhital et al. 2010, cat. J/AJ/139/2566). The SLoWPoKES and SLoWPoKES-II catalogs are available on the Filtergraph portal (http://slowpokes.vanderbilt.edu/). (4 data files).
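
    After the CasJobs download, the photometric selection quoted above reduces to a simple row filter; a sketch in pandas, where the file name and the r, i, z column names are assumptions:

        import pandas as pd

        # Hypothetical CSV export of the SDSS-DR8 star table query.
        stars = pd.read_csv("sdss_dr8_stars.csv")

        # Color cuts from the record: r-i >= 0.3 and i-z >= 0.2 (K5 or later).
        low_mass = stars[(stars["r"] - stars["i"] >= 0.3) &
                         (stars["i"] - stars["z"] >= 0.2)]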

  10. "One-sample concept" micro-combinatory for high throughput TEM of binary films.

    PubMed

    Sáfrán, György

    2018-04-01

    Phases of thin films may differ remarkably from those of bulk. Unlike the comprehensive data files of Binary Phase Diagrams [1] available for bulk, complete phase maps for thin binary layers do not exist. This is due both to the diverse metastable, non-equilibrium, or unstable phases feasible in thin films and to the volume of characterization work required with analytical techniques like TEM, SAED, and EDS. The aim of the present work was to develop a method that remarkably facilitates the TEM study of the diverse binary phases of thin films, or the creation of phase maps. A micro-combinatorial method was worked out that enables both preparation and study of a gradient two-component film within a single TEM specimen. For a demonstration of the technique, thin MnxAl1-x binary samples with concentration evolving from x = 0 to x = 1 have been prepared so that the transition from pure Mn to pure Al covers a 1.5 mm long track within the 3 mm diameter TEM grid. The proposed method enables the preparation and study of thin combinatorial samples including all feasible phases as a function of composition or other deposition parameters. Contrary to known "combinatorial chemistry", in which a series of different samples are deposited in one run and investigated one at a time, the present micro-combinatorial method produces a single specimen condensing a complete library of a binary system that can be studied efficiently within a single TEM session. This provides extremely high throughput for TEM characterization of composition-dependent phases, exploration of new materials, or the construction of phase diagrams of binary films.

  11. Dynamic Test Generation for Large Binary Programs

    DTIC Science & Technology

    2009-11-12

    ... the fuzzing@whitestar.linuxbox.org mailing list, including Jared DeMott, Disco Jonny, and Ari Takanen, for discussions on fuzzing tradeoffs. ... as is the case for large applications where exercising all execution paths is virtually hopeless anyway. This point will be further discussed in ... consumes trace files generated by iDNA and virtually re-executes the recorded runs. TruScan offers several features that substantially simplify symbolic ...

  12. Tapir: A web interface for transit/eclipse observability

    NASA Astrophysics Data System (ADS)

    Jensen, Eric

    2013-06-01

    Tapir is a set of tools, written in Perl, that provides a web interface for showing the observability of periodic astronomical events, such as exoplanet transits or eclipsing binaries. The package provides tools for creating finding charts for each target and airmass plots for each event. The code can access target lists that are stored on-line in a Google spreadsheet or in a local text file.

  13. Fallon, Nevada FORGE Distinct Element Reservoir Modeling

    DOE Data Explorer

    Blankenship, Doug; Pettitt, Will; Riahi, Azadeh; Hazzard, Jim; Blanksma, Derrick

    2018-03-12

    Archive containing input/output data for distinct element reservoir modeling for Fallon FORGE. Models created using 3DEC, InSite, and in-house Python algorithms (ITASCA). List of archived files follows; please see 'Modeling Metadata.pdf' (included as a resource below) for additional file descriptions. Data sources include regional geochemical model, well positions and geometry, principal stress field, capability for hydraulic fractures, capability for hydro-shearing, reservoir geomechanical model-stimulation into multiple zones, modeled thermal behavior during circulation, and microseismicity.

  14. Up-to-date state of storage techniques used for large numerical data files

    NASA Technical Reports Server (NTRS)

    Chlouba, V.

    1975-01-01

    Methods for data storage and output in data banks and memory files are discussed along with a survey of equipment available for this. Topics discussed include magnetic tapes, magnetic disks, Terabit magnetic tape memory, Unicon 690 laser memory, IBM 1360 photostore, microfilm recording equipment, holographic recording, film readers, optical character readers, digital data storage techniques, and photographic recording. The individual types of equipment are summarized in tables giving the basic technical parameters.

  15. Navy Occupational Health Information Management System (NOHIMS). Environmental Exposure Module. Users’ Manual

    DTIC Science & Technology

    1987-01-16

    ... menus, controls user and device access to the system, manages the security features associated with menus, devices, and users, provides ... in the files, or the number of files in the system. The EE module contains many menu options which enable the user to obtain needed information from the module. These options can be ...

  16. User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes

    USGS Publications Warehouse

    Klein, Fred W.

    2002-01-01

    Hypoinverse is a computer program that processes files of seismic station data for an earthquake (like P-wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and it supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of Hypoinverse includes all of the previous capabilities, but adds Y2000 formats to those defined earlier. In most cases, the new formats add 2 digits to the year field to accommodate the century. Other fields are sometimes rearranged or expanded to accommodate a better field order. The Y2000 formats are invoked with the “200” command. When the Y2000 flag is turned on, all files are read and written in the new format and there is no mixing of format types in a single run. Some formats without a date field, like station files, have not changed. A separate program called 2000CONV has been written to convert old formats to new. Other new features, like expanded station names, calculating amplitude magnitudes from a variety of digital seismometers, station history files, interactive earthquake processing, and locations from CUSP (Caltech USGS Seismic Processing) binary files have been added. General features. Hypoinverse will locate any number of events in an input file, which can be in one of several different formats. Any or all of printout, summary or archive output may be produced. Hypoinverse is driven by user commands. The various commands define input and output files, set adjustable parameters, and solve for locations of a file of earthquake data using the parameters and files currently set. It is both interactive and "batch" in that commands may be executed either from the keyboard or from a file. You execute the commands in a file by typing @filename at the Hypoinverse prompt.
Users may either supply parameters on the command line, or omit them and be prompted interactively. The current parameter values are displayed and may be taken as defaults by pressing just the RETURN key after the prompt. This makes the program very easy to use, provided you can remember the names of the commands. Combining commands with and without their required parameters into a command file permits a variety of customized procedures such as automatic input of crustal model and station data, but prompting for a different phase file each time. All commands are 3 letters long and most require one or more parameters or file names. If they appear on a line with a command, character strings such as filenames must be enclosed in apostrophes (single quotes). Appendix 1 gives this and other free-format rules for supplying parameters, which are parsed in Fortran. When several parameters are required following a command, any of them may be omitted by replacing them with null fields (see appendix 1). A null field leaves that parameter unchanged from its current or default value. When you start HYPOINVERSE, default values are in effect for all parameters except file names. Hypoinverse is a complicated program with many features and options. Many of these "advanced" or seldom used features are documented here, but are more detailed than a typical user needs to read about when first starting with the program. I have put some of this material in smaller type so that a first-time user can concentrate on the more important information.
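
    As a concrete illustration of the conventions just described (three-letter commands, quoted file names, the "200" Y2000 flag, and @filename execution), here is a sketch that assembles a small command file; the command mnemonics (STA, CRH, PHS, LOC) and all file names are assumptions drawn from common Hypoinverse usage and should be verified against this guide before use.

        # Hypothetical sketch: write a Hypoinverse command file from Python.
        # Every mnemonic, parameter, and file name below is an assumption.
        commands = [
            "200 T 1900 0",        # turn on the Y2000 formats ("200" command)
            "STA 'stations.sta'",  # station list (quoted string parameter)
            "CRH 1 'model.crh'",   # crustal model 1
            "PHS 'phase.phs'",     # input phase data file
            "LOC",                 # locate all events in the phase file
        ]
        with open("hyp.cmd", "w") as f:
            f.write("\n".join(commands) + "\n")
        # At the Hypoinverse prompt, execute the file by typing: @hyp.cmd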

  17. Performance of the Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    As the input/output (I/O) needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. This interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. Initial experiments, reported in this paper, indicate that Galley is capable of providing high-performance I/O to applications that access data in patterns that have been observed to be common.

  18. Sprinting for the Win; Distribution of Power Output in Women's Professional Cycling.

    PubMed

    Peiffer, Jeremiah J; Abbiss, Chris R; Haakonssen, Eric C; Menaspà, Paolo

    2018-04-24

    This study examined the power output distribution and sprint characteristics of professional female road cyclists. 31 race files, representing top-five finishes, were collected from seven professional female cyclists. Files were analysed for sprint characteristics including mean and peak power output, velocity, and duration. The final 20 min before the sprint was analysed to determine the mean maximal power output (MMP) for durations of 5, 15, 30, 60, 240 and 600 s. Throughout the race, the number of efforts for each duration exceeding 80% of its corresponding final 20-min MMP (MMP80) was determined, as was the number of 15 s efforts exceeding 80% of the mean final sprint power output (MSP80). Sprint finishes lasted 21.8 ± 6.7 s with a mean and peak power output of 679 ± 101 W and 886 ± 91 W, respectively. Throughout the race, more 5, 15, and 30 s efforts above MMP80 were completed in the 5th compared with the 1st-4th quintiles of the race. 60 s efforts were more frequent during the 5th compared with the 1st, 2nd, and 4th quintiles, and during the 3rd compared with the 4th quintile. More 240 s efforts were recorded during the 5th compared with the 1st and 4th quintiles. 82% of 15 s efforts above MSP80 were completed in the 2nd, 3rd and 5th quintiles of the race. These data demonstrate the variable nature of women's professional cycling and the physical demands necessary for success, providing information that could enhance in-race decision-making and the development of race-specific training programs.
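
    Computationally, the MMP analysis described here reduces, for each duration, to the maximum of a rolling mean over the power trace; a sketch with pandas, assuming a hypothetical file with a 1-Hz power column named power_w:

        import pandas as pd

        power = pd.read_csv("race.csv")["power_w"]  # hypothetical 1-Hz power trace

        # Mean maximal power for each duration (seconds at 1 Hz = samples).
        durations = [5, 15, 30, 60, 240, 600]
        mmp = {d: power.rolling(window=d).mean().max() for d in durations}

        # Count 15 s efforts exceeding 80% of the reference MMP for that duration.
        threshold = 0.8 * mmp[15]
        exceed = power.rolling(window=15).mean() > threshold
        # Consecutive True runs are individual efforts; count the run starts.
        efforts = (exceed & ~exceed.shift(fill_value=False)).sum()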

  19. Format( )MEDIC( )Input

    NASA Astrophysics Data System (ADS)

    Foster, K.

    1994-09-01

    This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.

  20. Auto Draw from Excel Input Files

    NASA Technical Reports Server (NTRS)

    Strauss, Karl F.; Goullioud, Renaud; Cox, Brian; Grimes, James M.

    2011-01-01

    The design process often involves the use of Excel files during project development. To facilitate communication of the information in the Excel files, drawings are often generated. During the design process, the Excel files are updated often to reflect new input. The problem is that the drawings often lag behind the updates, leading to confusion about the current state of the design. The use of this program allows visualization of complex data in a format that is more easily understandable than pages of numbers. Because the graphical output can be updated automatically, the manual labor of diagram drawing can be eliminated. More frequent updates of system diagrams can reduce confusion and errors, and are likely to uncover systemic problems earlier in the design cycle, thus reducing rework and redesign.

  1. BOREAS TGB-12 Soil Carbon and Flux Data of NSA-MSA in Raster Format

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David E. (Editor); Rapalee, Gloria; Davidson, Eric; Harden, Jennifer W.; Trumbore, Susan E.; Veldhuis, Hugo

    2000-01-01

    The BOREAS TGB-12 team made measurements of soil carbon inventories, carbon concentration in soil gases, and rates of soil respiration at several sites. This data set provides: (1) estimates of soil carbon stocks by horizon based on soil survey data and analyses of data from individual soil profiles; (2) estimates of soil carbon fluxes based on stocks, fire history, drainage, and soil carbon inputs and decomposition constants based on field work using radiocarbon analyses; (3) fire history data estimating age ranges of time since last fire; and (4) a raster image and an associated soils table file from which area-weighted maps of soil carbon and fluxes and fire history may be generated. This data set was created from raster files, soil polygon data files, and detailed lab analysis of soils data that were received from Dr. Hugo Veldhuis, who did the original mapping in the field during 1994. Also used were soils data from Susan Trumbore and Jennifer Harden (BOREAS TGB-12). The binary raster file covers a 733-km2 area within the NSA-MSA.

  2. BOREAS RSS-14 Level-2 GOES-7 Shortwave and Longwave Radiation Images

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Nickeson, Jaime (Editor); Gu, Jiujing; Smith, Eric A.

    2000-01-01

    The BOREAS RSS-14 team collected and processed several GOES-7 and GOES-8 image data sets that covered the BOREAS study region. This data set contains images of shortwave and longwave radiation at the surface and top of the atmosphere derived from collected GOES-7 data. The data cover the time period of 05-Feb-1994 to 20-Sep-1994. The images missing from the temporal series were zero-filled to create a consistent sequence of files. The data are stored in binary image format files. Due to the large size of the images, the level-1a GOES-7 data are not contained on the BOREAS CD-ROM set. An inventory listing file is supplied on the CD-ROM to inform users of what data were collected. The level-1a GOES-7 image data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). See sections 15 and 16 for more information. The data files are available on a CD-ROM (see document number 20010000884).

  3. Nonvisual Route Following with Guidance from a Simple Haptic or Auditory Display

    ERIC Educational Resources Information Center

    Marston, James R.; Loomis, Jack M.; Klatzky, Roberta L.; Golledge, Reginald G.

    2007-01-01

    A path-following experiment, using a global positioning system, was conducted with participants who were legally blind. On- and off-course confirmations were delivered by either a vibrotactile or an audio stimulus. These simple binary cues were sufficient for guidance and point to the need to offer output options for guidance systems for people…

  4. Binary recursive partitioning: background, methods, and application to psychology.

    PubMed

    Merkle, Edgar C; Shaffer, Victoria A

    2011-02-01

    Binary recursive partitioning (BRP) is a computationally intensive statistical method that can be used in situations where linear models are often used. Instead of imposing many assumptions to arrive at a tractable statistical model, BRP simply seeks to accurately predict a response variable based on values of predictor variables. The method outputs a decision tree depicting the predictor variables that were related to the response variable, along with the nature of the variables' relationships. No significance tests are involved, and the tree's 'goodness' is judged based on its predictive accuracy. In this paper, we describe BRP methods in a detailed manner and illustrate their use in psychological research. We also provide R code for carrying out the methods.
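
    The authors provide R code; purely as an illustration of the same idea, a recursive-partitioning tree can be grown and inspected in Python with scikit-learn (the data set and column names here are hypothetical):

        import pandas as pd
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text

        df = pd.read_csv("survey.csv")            # hypothetical psychology data set
        X, y = df[["age", "score_a", "score_b"]], df["diagnosis"]
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # Grow a small tree; no significance tests are involved.
        tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
        print(export_text(tree, feature_names=list(X.columns)))
        print("predictive accuracy:", tree.score(X_te, y_te))  # the tree's 'goodness'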

  5. Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0

    NASA Astrophysics Data System (ADS)

    Cannavò, Flavio; Palano, Mimmo

    2016-03-01

    We describe the main features of the developed software tool, namely PlatE-Motion 2.0 (PEM2), which allows inferring the Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity for any point located on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The software tool is developed in the Matlab® framework and, as the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. New changes in PEM2 include (1) some bugs fixed, (2) improvements in the code, (3) improvements in statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available for the scientific community.
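
    For reference, the direct problem mentioned above is the standard rigid-rotation relation of plate kinematics (textbook form, not code taken from PEM2):

        \mathbf{v} = \boldsymbol{\Omega} \times \mathbf{r}

    where \boldsymbol{\Omega} is the Euler (rotation) vector and \mathbf{r} is the geocentric position vector of the point at which the velocity \mathbf{v} is predicted.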

  6. Recall of patterns using binary and gray-scale autoassociative morphological memories

    NASA Astrophysics Data System (ADS)

    Sussner, Peter

    2005-08-01

    Morphological associative memories (MAM's) belong to a class of artificial neural networks that perform the operations erosion or dilation of mathematical morphology at each node. Therefore we speak of morphological neural networks. Alternatively, the total input effect on a morphological neuron can be expressed in terms of lattice induced matrix operations in the mathematical theory of minimax algebra. Neural models of associative memories are usually concerned with the storage and the retrieval of binary or bipolar patterns. Thus far, the emphasis in research on morphological associative memory systems has been on binary models, although a number of notable features of autoassociative morphological memories (AMM's) such as optimal absolute storage capacity and one-step convergence have been shown to hold in the general, gray-scale setting. In previous papers, we gained valuable insight into the storage and recall phases of AMM's by analyzing their fixed points and basins of attraction. We have shown in particular that the fixed points of binary AMM's correspond to the lattice polynomials in the original patterns. This paper extends these results in the following ways. In the first place, we provide an exact characterization of the fixed points of gray-scale AMM's in terms of combinations of the original patterns. Secondly, we present an exact expression for the fixed point attractor that represents the output of either a binary or a gray-scale AMM upon presentation of a certain input. The results of this paper are confirmed in several experiments using binary patterns and gray-scale images.
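
    The gray-scale recall described above can be sketched in a few lines of numpy following the standard min/max (minimax-algebra) construction for a dilative autoassociative morphological memory; the stored patterns below are made-up toy data, not from the paper:

        import numpy as np

        # Stored gray-scale patterns, one per row (made-up example data).
        X = np.array([[1.0, 4.0, 2.0],
                      [3.0, 0.0, 5.0]])

        # W_XX[i, j] = min over patterns k of (x_k[i] - x_k[j]).
        W = np.min(X[:, :, None] - X[:, None, :], axis=0)

        def recall(x):
            # Max-plus product: y[i] = max_j (W[i, j] + x[j]); one-step convergence.
            return np.max(W + x[None, :], axis=1)

        for x in X:
            assert np.allclose(recall(x), x)  # perfect recall of stored patterns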

  7. Pilot climate data system user's guide

    NASA Technical Reports Server (NTRS)

    Reph, M. G.; Treinish, L. A.; Bloch, L.

    1984-01-01

    Instructions for using the Pilot Climate Data System (PCDS), an interactive, scientific data management system for locating, obtaining, manipulating, and displaying climate-research data, are presented. The PCDS currently provides this support for approximately twenty data sets. Figures that illustrate the terminal displays which a user sees when he/she runs the PCDS and some examples of the output from this system are included. The capabilities which are described in detail allow a user to perform the following: (1) obtain comprehensive descriptions of a number of climate parameter data sets and the associated sensor measurements from which they were derived; (2) obtain detailed information about the temporal coverage and data volume of data sets which are readily accessible via the PCDS; (3) extract portions of a data set using criteria such as time range and geographic location, and output the data to tape, user terminal, system printer, or online disk files in a special data-set-independent format; (4) access and manipulate the data in these data-set-independent files, performing such functions as combining the data, subsetting the data, and averaging the data; and (5) create various graphical representations of the data stored in the data-set-independent files.

  8. Network Visualization Project (NVP)

    DTIC Science & Technology

    2016-07-01

    Keywords: network visualization, network traffic analysis, network forensics. Dshell is a command-line framework used for network forensic analysis. Dshell processes existing pcap files and filters output information based on ...

  9. Addendum I, BIOPLUME III Graphics Conversion to SURFER Format

    EPA Pesticide Factsheets

    This procedure can be used to create a SURFER® compatible grid file from Bioplume III input and output graphics. The input data and results from Bioplume III can be contoured and printed directly from SURFER.

  10. Overview of the ASA24® Researcher Website

    Cancer.gov

    The ASA24 Researcher Website allows researchers, clinicians, and teachers to register to use the ASA24 system for research, clinical practice, or teaching; to manage logistics of data collection; and to obtain analytic output files.

  11. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  12. Translator program converts computer printout into braille language

    NASA Technical Reports Server (NTRS)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.

  13. Statistical evaluation of PACSTAT random number generation capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, G.F.; Toland, M.R.; Harty, H.

    1988-05-01

    This report summarizes the work performed in verifying the general purpose Monte Carlo driver-program PACSTAT. The main objective of the work was to verify the performance of PACSTAT's random number generation capabilities. Secondary objectives were to document (using controlled configuration management procedures) changes made in PACSTAT at Pacific Northwest Laboratory, and to assure that PACSTAT input and output files satisfy quality assurance traceability constraints. Upon receipt of the PRIME version of the PACSTAT code from the Basalt Waste Isolation Project, Pacific Northwest Laboratory staff converted the code to run on Digital Equipment Corporation (DEC) VAXs. The modifications to PACSTAT were implemented using the WITNESS configuration management system, with the modifications themselves intended to make the code as portable as possible. Certain modifications were made to make the PACSTAT input and output files conform to quality assurance traceability constraints. 10 refs., 17 figs., 6 tabs.

  14. The distributed production system of the SuperB project: description and results

    NASA Astrophysics Data System (ADS)

    Brown, D.; Corvo, M.; Di Simone, A.; Fella, A.; Luppi, E.; Paoloni, E.; Stroili, R.; Tomassetti, L.

    2011-12-01

    The SuperB experiment needs large samples of MonteCarlo simulated events in order to finalize the detector design and to estimate the data analysis performances. The requirements are beyond the capabilities of a single computing farm, so a distributed production model capable of exploiting the existing HEP worldwide distributed computing infrastructure is needed. In this paper we describe the set of tools that have been developed to manage the production of the required simulated events. The production of events follows three main phases: distribution of input data files to the remote site Storage Elements (SE); job submission, via SuperB GANGA interface, to all available remote sites; output files transfer to CNAF repository. The job workflow includes procedures for consistency checking, monitoring, data handling and bookkeeping. A replication mechanism allows storing the job output on the local site SE. Results from 2010 official productions are reported.

  15. VizieR Online Data Catalog: Habitable zone code (Valle+, 2014)

    NASA Astrophysics Data System (ADS)

    Valle, G.; Dell'Omodarme, M.; Prada Moroni, P. G.; Degl'Innocenti, S.

    2014-06-01

    A C computation code that provides as output the distance dm (in AU) for which the duration of habitability is longest, the corresponding duration tm (in Gyr), the width W (in AU) of the zone for which the habitability lasts tm/2, and the inner (Ri) and outer (Ro) boundaries of the 4 Gyr continuously habitable zone. The code reads the input file HZ-input.dat, containing in each row the mass of the host star (range: 0.70-1.10M⊙), its metallicity (either Z (range: 0.005-0.004) or [Fe/H]), the helium-to-metal enrichment ratio (range: 1-3, standard value = 2), the equilibrium temperature for habitable zone outer boundary computation (range: 169-203K) and the planet Bond albedo (range: 0.0-1.0, Earth = 0.3). The output is printed on-screen. Compilation: just use your favorite C compiler: gcc hz.c -lm -o HZ (2 data files).
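
    A sketch of preparing one row of HZ-input.dat with the column order listed above; the exact whitespace layout expected by the C code is an assumption, and the values are only illustrative:

        # Columns, per the record: host-star mass (Msun), metallicity Z,
        # helium-to-metal enrichment ratio, equilibrium temperature (K),
        # and planet Bond albedo. Values below are illustrative only.
        rows = [
            (1.00, 0.0134, 2.0, 186.0, 0.30),  # roughly Sun-like star, Earth-like albedo
        ]
        with open("HZ-input.dat", "w") as f:
            for mass, z, y_over_z, teq, albedo in rows:
                f.write(f"{mass:.2f} {z:.4f} {y_over_z:.1f} {teq:.1f} {albedo:.2f}\n")
        # Then compile and run as the record says: gcc hz.c -lm -o HZ && ./HZ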

  16. Mapping RNA-seq Reads with STAR

    PubMed Central

    Dobin, Alexander; Gingeras, Thomas R.

    2015-01-01

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is Open Source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920

  17. Mapping RNA-seq Reads with STAR.

    PubMed

    Dobin, Alexander; Gingeras, Thomas R

    2015-09-03

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.
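
    In practice a basic alignment run reduces to a single STAR invocation; a sketch via Python's subprocess with placeholder paths (the options shown are standard STAR options, but check them against the STAR manual for your version):

        import subprocess

        # Placeholder paths; a genome index must already have been generated.
        subprocess.run([
            "STAR",
            "--runThreadN", "8",
            "--genomeDir", "genome_index/",
            "--readFilesIn", "sample_R1.fastq", "sample_R2.fastq",
            "--outSAMtype", "BAM", "SortedByCoordinate",  # sorted BAM for downstream tools
            "--outFileNamePrefix", "sample_",
        ], check=True)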

  18. Managing Data From Signal-Propagation Experiments

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1989-01-01

    Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.

  19. Validation Results for LEWICE 2.0. [Supplement

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Rutkowski, Adam

    1999-01-01

    Two CD-ROMs contain experimental ice shapes and code prediction used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes for both experiment and for LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.

  20. Inertial Manifolds for Navier-Stokes Equations and Related Dynamical Systems

    DTIC Science & Technology

    1991-05-31

    ... Graphics IRIS (SGI). The RLE files for the animation are loaded to an Abekas and recorded to tape by Betacam. This computational work was done by using the ... scripts and comments, are loaded to the Abekas-A60 digital image storage device, and then recorded to the Betacam BVW-75 analog tape recorder. Static ... interfacing, huge data files are output to the Data Vault in parallel with little cost. In addition to the SGIs, Abekas, Betacam and Solitaire, the ...

  1. User's manual for SYNC: A FORTRAN program for merging and time-synchronizing data

    NASA Technical Reports Server (NTRS)

    Maine, R. E.

    1981-01-01

    The FORTRAN 77 computer program SYNC for merging and time synchronizing data is described. The program SYNC reads one or more input files which contain either synchronous data frames or time-tagged data points, which can be compressed. The program decompresses and time synchronizes the data, correcting for any channel time skews. Interpolation and hold last value synchronization algorithms are available. The output from SYNC is a file of time synchronized data frames at any requested sample rate.
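
    The interpolation mode described above can be sketched with numpy: each channel's time tags are corrected for the channel's time skew, and the samples are interpolated onto a common output time base at the requested sample rate (the arrays and the skew value are hypothetical):

        import numpy as np

        # Hypothetical input: time-tagged samples for one channel, plus its skew.
        t_chan = np.array([0.00, 0.11, 0.19, 0.32, 0.41])   # seconds
        x_chan = np.array([1.0, 1.2, 1.1, 1.4, 1.3])
        skew = 0.01                                          # channel time skew, s

        # Requested output sample rate -> common time base for all channels.
        rate = 10.0
        t_out = np.arange(0.0, 0.4, 1.0 / rate)

        # Interpolation algorithm (a hold-last-value mode is the other option).
        x_out = np.interp(t_out, t_chan - skew, x_chan)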

  2. A Prototype Model for Automating Nursing Diagnosis, Nurse Care Planning and Patient Classification.

    DTIC Science & Technology

    1986-03-01

    Each diagnosis has an assessment level. Assessment levels are defining characteristics observed by the nurse or subjectively stated by the patient. ... characteristics of this order line. Select IV Order (Figure 4.1.1.1e) is the first screen of a series of three. Select IV Order has up to 10 selections ... for inpatient orders. Input files used: IVC.Scr and Procfile.Prg. Output files used: none. Calling routine: IUB.Prg. Routine called: none.

  3. Conversion of the Forces Mobilization Model (FORCEMOB) from FORTRAN to C

    DTIC Science & Technology

    2015-08-01

    ... the C version of FORCEMOB is ready for operational use. FORCEMOB runs without a graphical user interface (GUI): once run, FORCEMOB reads user-created input files, performs mathematical operations upon them, and outputs text ...

  4. Federal Logistics Information System (FLIS) Procedures Manual, Volume 1, Change 1

    DTIC Science & Technology

    1996-07-01

    Document identifier codes referenced include: KAT (Add FLIS Data Base Data, Security Classified Characteristics); KDZ (Delete Logistics Transfer Data); KFA (Match Through Association); KFC (File Data Minus Security Classified Characteristics); KEC (Output Exceeds AUTODIN Limitations).

  5. VizieR Online Data Catalog: EBHIS spectra and HI column density maps (Winkel+, 2016)

    NASA Astrophysics Data System (ADS)

    Winkel, B.; Kerp, J.; Floeer, L.; Kalberla, P. M. W.; Ben Bekhti, N.; Keller, R.; Lenz, D.

    2015-11-01

    The EBHIS 1st data release comprises 21-cm neutral atomic hydrogen data of the Milky Way (-600km/s

  6. Auto- and hetero-associative memory using a 2-D optical logic gate

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin

    1989-01-01

    An optical associative memory system suitable for both auto- and hetero-associative recall is demonstrated. This system utilizes Hamming distance as the similarity measure between a binary input and a memory image with the aid of a two-dimensional optical EXCLUSIVE OR (XOR) gate and a parallel electronics comparator module. Based on the Hamming distance measurement, this optical associative memory performs a nearest neighbor search and the result is displayed in the output plane in real-time. This optical associative memory is fast and noniterative and produces no output spurious states as compared with that of the Hopfield neural network model.

  7. Auto- and hetero-associative memory using a 2-D optical logic gate

    NASA Astrophysics Data System (ADS)

    Chao, Tien-Hsin

    1989-06-01

    An optical associative memory system suitable for both auto- and hetero-associative recall is demonstrated. This system utilizes Hamming distance as the similarity measure between a binary input and a memory image with the aid of a two-dimensional optical EXCLUSIVE OR (XOR) gate and a parallel electronics comparator module. Based on the Hamming distance measurement, this optical associative memory performs a nearest neighbor search and the result is displayed in the output plane in real-time. This optical associative memory is fast and noniterative and produces no output spurious states as compared with that of the Hopfield neural network model.
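
    Stripped of the optics, the recall rule described in this record is a one-shot XOR-and-count nearest-neighbor search over the stored memory; a small numpy sketch with made-up binary patterns:

        import numpy as np

        memory = np.array([[0, 1, 1, 0, 1],    # stored binary images, one per row
                           [1, 1, 0, 0, 0],
                           [0, 0, 1, 1, 1]])
        probe = np.array([0, 1, 1, 1, 1])      # binary input pattern

        # XOR marks the disagreeing bits; summing them gives the Hamming distance.
        distances = np.logical_xor(memory, probe).sum(axis=1)
        best = memory[np.argmin(distances)]    # nearest-neighbor (auto-associative) recall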

  8. NURD: an implementation of a new method to estimate isoform expression from non-uniform RNA-seq data

    PubMed Central

    2013-01-01

    Background: RNA-Seq technology has been used widely in transcriptome study, and one of the most important applications is to estimate the expression level of genes and their alternative splicing isoforms. There have been several algorithms published to estimate the expression based on different models. Recently Wu et al. published a method that can accurately estimate isoform level expression by considering position-related sequencing biases using nonparametric models. The method has advantages in handling different read distributions, but there hasn’t been an efficient program to implement this algorithm. Results: We developed an efficient implementation of the algorithm in the program NURD. It uses a binary interval search algorithm. The program can correct both the global tendency of sequencing bias in the data and local sequencing bias specific to each gene. The correction makes the isoform expression estimation more reliable under various read distributions. The implementation is computationally efficient in both memory cost and running time and can be readily scaled up for huge datasets. Conclusion: NURD is an efficient and reliable tool for estimating the isoform expression level. Given the reads mapping result and gene annotation file, NURD will output the expression estimation result. The package is freely available for academic use at http://bioinfo.au.tsinghua.edu.cn/software/NURD/. PMID:23837734
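
    A binary interval search of the kind named above is, in essence, a bisection lookup that maps a read position to the interval containing it; a generic sketch with Python's bisect module (the intervals are made up, and this is not NURD's actual code):

        import bisect

        # Sorted interval boundaries along a transcript; made-up example where
        # interval i covers [starts[i], ends[i]).
        starts = [0, 120, 480, 900]
        ends = [120, 480, 900, 1500]

        def interval_of(pos):
            """Return the index of the interval containing a read position."""
            i = bisect.bisect_right(starts, pos) - 1
            if i >= 0 and pos < ends[i]:
                return i
            return None

        assert interval_of(500) == 2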

  9. File-Based Data Flow in the CMS Filter Farm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J.M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small “documents” using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These “files” can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
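
    A sketch of the kind of small JSON bookkeeping "document" the redesign describes, written here with Python's json module; all field names are invented for illustration, since the abstract does not give the actual schema:

        import json, time

        # Hypothetical per-lumisection bookkeeping document from an HLT process.
        doc = {
            "run": 251251,
            "lumisection": 42,
            "events_in": 11723,
            "events_accepted": 310,
            "timestamp": time.time(),
        }
        with open("run251251_ls0042_hlt.jsn", "w") as f:
            json.dump(doc, f)   # small; memory-resident or written for aggregation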

  10. VizieR Online Data Catalog: A spectral approach to transit timing variations (Ofir+, 2018)

    NASA Astrophysics Data System (ADS)

    Ofir, A.; Xie, J.-W.; Jiang, C.-F.; Sari, R.; Aharonson, O.

    2018-03-01

    We used Kepler Data Release 24 as the source data and the Kepler Objects of Interest (KOIs) table downloaded from the NExSci archive on 2015 December 25 as the source of the list of candidate signals, and processed 4706 objects not dispositioned as "false positive". We remove eclipsing binaries from the candidates list (see section 4.1 for further details). (1 data file).

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Churchill, R. Michael

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
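
    A sketch of the pipeline this record describes: read binary files into Spark, decode them with numpy, and cluster with k-means. The paths, record layout (flat float64 vectors of equal length), and the choice of k are assumptions, not details from the study:

        import numpy as np
        from pyspark import SparkContext
        from pyspark.mllib.clustering import KMeans

        sc = SparkContext(appName="xgc1-kmeans")

        # Each binary file is assumed to hold one float64 distribution-function vector.
        raw = sc.binaryFiles("hdfs:///xgc1/dist_func/*.bin")
        vectors = raw.map(lambda kv: np.frombuffer(kv[1], dtype=np.float64))

        # Cluster the velocity-space samples; k is an assumption.
        model = KMeans.train(vectors, k=8, maxIterations=20)
        print(model.clusterCenters)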

  12. Development of a Distributed Parallel Computing Framework to Facilitate Regional/Global Gridded Crop Modeling with Various Scenarios

    NASA Astrophysics Data System (ADS)

    Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.

    2017-12-01

    Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. Our local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file for parallel computing, the user-defined number of CPU threads divides the EPIC simulations into jobs. The EPIC input data formatters convert the raw database into formatted EPIC input data, which is passed to the EPIC simulation jobs; the 28 EPIC jobs then run simultaneously, and only the result files of interest are parsed and passed to the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while using the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
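
    A minimal sketch of the job-splitting idea using Python's standard multiprocessing module; run_epic_cell and its placeholder result are hypothetical stand-ins for the framework's input formatting, EPIC execution, and output parsing:

        from multiprocessing import Pool

        def run_epic_cell(cell_id):
            # Format inputs, invoke EPIC, and parse outputs for one grid cell.
            # Here a placeholder result stands in for the real simulation.
            return cell_id, {"yield_t_ha": 1.0}

        if __name__ == "__main__":
            cells = range(406839)             # Iringa test case: 406,839 cells
            with Pool(processes=28) as pool:  # 28 threads on the test desktop
                for cell_id, result in pool.imap_unordered(run_epic_cell,
                                                           cells, chunksize=1000):
                    pass                      # aggregate or write results here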

  13. Alternative Geothermal Power Production Scenarios

    DOE Data Explorer

    Sullivan, John

    2014-03-14

    The information given in this file pertains to Argonne LCAs of the plant cycle stage for a set of ten new geothermal scenario pairs, each comprising a reference case and an improved case. These analyses were conducted to compare environmental performances among the scenarios and cases. The types of plants evaluated are hydrothermal binary and flash plants and Enhanced Geothermal Systems (EGS) binary and flash plants. Each scenario pair was developed by the LCOE group using GETEM as a way to identify plant operational and resource combinations that could reduce geothermal power plant LCOE values. Based on the specified plant and well field characteristics (plant type, capacity, capacity factor and lifetime, and well numbers and depths) for each case of each pair, Argonne generated a corresponding set of material-to-power ratios (MPRs) and greenhouse gas and fossil energy ratios.

  14. VizieR Online Data Catalog: Stellar rotation in h Per (Moraux+, 2013)

    NASA Astrophysics Data System (ADS)

    Moraux, E.; Artemenko, S.; Bouvier, J.; Irwin, J.; Ibrahimov, M.; Magakian, T.; Grankin, K.; Nikogossian, E.; Cardoso, C.; Hodgkin, S.; Aigrain, S.; Movsessian, T. A.

    2013-10-01

    List of periodic objects ordered by periodogram peak power. The binary flag is set to 2 if the object has been identified as a binary in the i'CFHT, i'CFHT-K CMD and to 1 otherwise. The V and IC magnitudes as well as the membership flag (column before last) are from Currie et al. (2010, Cat. J/ApJS/186/191). The near infrared photometry has been obtained with WIRCam at CFHT and is from Cardoso et al. (in prep.). The last column indicates whether the object has been detected in Hα (Currie et al., 2007, Cat. J/ApJ/659/599) and/or in X-rays (Currie et al., 2009, Cat. J/AJ/137/3210; C. Argiroffi, priv. com). (1 data file).

  15. iTOUGH2 Universal Optimization Using the PEST Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) Input is provided on one or more ASCII text input files; (2) Output is returned to one or more ASCII text output files; (3) The model is run using a system command (executable or script/batch file); and (4) The model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
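
    The four requirements amount to a simple wrapper pattern. A hedged sketch in Python: write parameter values into an ASCII input file from a template, run the model as a system command, and read an observation back from an ASCII output file. The template markers, file names, and the PRESSURE observation are hypothetical and do not follow PEST's actual template syntax.

        import subprocess

        def run_model(permeability, porosity):
            # (1) Write parameters into an ASCII input file from a template.
            template = open("model.in.tpl").read()
            text = template.replace("@PERM@", "%.6e" % permeability)
            text = text.replace("@POR@", "%.4f" % porosity)
            with open("model.in", "w") as f:
                f.write(text)
            # (3) Run the model as a system command; (4) it runs unattended.
            subprocess.run(["./model", "model.in"], check=True)
            # (2) Extract an observation back from an ASCII output file.
            for line in open("model.out"):
                if line.startswith("PRESSURE"):
                    return float(line.split()[-1])
            raise RuntimeError("observation not found in model.out")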

  16. SETI-EC: SETI Encryption Code

    NASA Astrophysics Data System (ADS)

    Heller, René

    2018-03-01

    The SETI Encryption code, written in Python, creates a message for use in testing the decryptability of a simulated incoming interstellar message. The code uses images in a portable bit map (PBM) format, then writes the corresponding bits into the message, and finally returns both a PBM image and a text (TXT) file of the entire message. The natural constants (c, G, h) and the wavelength of the message are defined in the first few lines of the code, followed by the reading of the input files and their conversion into 757 strings of 359 bits to give one page. Each header of a page, i.e. the little-endian binary code translation of the tempo-spatial yardstick, is calculated and written on-the-fly for each page.
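
    For illustration, a minimal Python sketch of the PBM-to-bits step, assuming a plain ASCII P1 file without comment lines; this is not the SETI-EC code itself.

        def pbm_to_bitlines(path, width=359):
            tokens = open(path).read().split()
            assert tokens[0] == "P1"               # plain (ASCII) PBM magic
            w, h = int(tokens[1]), int(tokens[2])
            bits = "".join(tokens[3:])[:w * h]     # pixel values are 0/1 digits
            return [bits[i:i + width] for i in range(0, len(bits), width)]

        # Print the first few 359-bit lines of the encoded page.
        for line in pbm_to_bitlines("message.pbm")[:3]:
            print(line)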

  17. BOREAS TE-18 Landsat TM Maximum Likelihood Classification Image of the NSA

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David

    2000-01-01

    The BOREAS TE-18 team focused its efforts on using remotely sensed data to characterize the successional and disturbance dynamics of the boreal forest for use in carbon modeling. The objective of this classification is to provide the BOREAS investigators with a data product that characterizes the land cover of the NSA. A Landsat-5 TM image from 20-Aug-1988 was used to derive this classification. A standard supervised maximum likelihood classification approach was used to produce this classification. The data are provided in a binary image format file. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Activity Archive Center (DAAC).

  18. File Formats Commonly Used in Mass Spectrometry Proteomics*

    PubMed Central

    Deutsch, Eric W.

    2012-01-01

    The application of mass spectrometry (MS) to the analysis of proteomes has enabled the high-throughput identification and abundance measurement of hundreds to thousands of proteins per experiment. However, the formidable informatics challenge associated with analyzing MS data has required a wide variety of data file formats to encode the complex data types associated with MS workflows. These formats encompass the encoding of input instructions for instruments, output products of the instruments, and several levels of information and results used by and produced by the informatics analysis tools. A brief overview of the most common file formats in use today is presented here, along with a discussion of related topics. PMID:22956731

  19. SWITCH user's manual

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The planning program, SWITCH, and its surrounding changed-goal-replanning program, Runaround, are described. The evolution of SWITCH and Runaround from an earlier planner, DEVISER, is recounted. SWITCH's plan representation, and its process of building a plan by backward chaining with strict chronological backtracking, are described. A guide for writing knowledge base files is provided, as are narrative guides for installing the program, running it, and interacting with it while it is running. Some utility functions are documented. For the sake of completeness, a narrative guide to the experimental discrepancy-replanning feature is provided. Appendices contain knowledge base files for a blocksworld domain, and a DRIBBLE file illustrating the output from, and user interaction with, the program in that domain.

  20. Design of High Quality Chemical XOR Gates with Noise Reduction.

    PubMed

    Wood, Mackenna L; Domanskyi, Sergii; Privman, Vladimir

    2017-07-05

    We describe a chemical XOR gate design that realizes a gate-response function with filtering properties. This gate-response function is flat (has small gradients) at and in the vicinity of all four binary-input logic points, resulting in analog noise suppression. The gate functioning involves cross-reaction of the inputs, represented by pairs of chemicals, to produce a practically zero output when both are present and nearly equal. This cross-reaction processing step is also designed to result in filtering at low output intensities by canceling out the inputs if one of the latter has low intensity compared with the other. The remaining inputs, which were not reacted away, are processed to produce the output XOR signal by chemical steps that result in filtering at large output signal intensities. We analyze the tradeoff resulting from filtering, which involves loss of signal intensity. We also discuss practical aspects of realizations of such XOR gates. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Distributed Data Collection for the ATLAS EventIndex

    NASA Astrophysics Data System (ADS)

    Sánchez, J.; Fernández Casaní, A.; González de la Hoz, S.

    2015-12-01

    The ATLAS EventIndex contains records of all events processed by ATLAS, in all processing stages. These records include the references to the files containing each event (the GUID of the file) and the internal pointer to each event in the file. This information is collected by all jobs that run at Tier-0 or on the Grid and process ATLAS events. Each job produces a snippet of information for each permanent output file. This information is packed and transferred to a central broker at CERN using an ActiveMQ messaging system, and then is unpacked, sorted and reformatted in order to be stored and catalogued into a central Hadoop server. This contribution describes in detail the Producer/Consumer architecture to convey this information from the running jobs through the messaging system to the Hadoop server.
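
    A hedged sketch of the producer side using the stomp.py library: pack one snippet of per-file event metadata as JSON and send it to an ActiveMQ broker over STOMP. The broker host, queue name, and payload fields are illustrative, not the actual EventIndex configuration.

        import json
        import stomp

        payload = {
            "guid": "8E2C9A5B-0000-0000-0000-000000000000",  # file GUID (made up)
            "events": [{"run": 1234, "event": 56789, "offset": 0}],
        }
        conn = stomp.Connection([("broker.example.cern.ch", 61613)])
        conn.connect("user", "password", wait=True)
        conn.send(destination="/queue/eventindex", body=json.dumps(payload))
        conn.disconnect()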

  2. Simulating storage part of application with Simgrid

    NASA Astrophysics Data System (ADS)

    Wang, Cong

    2017-10-01

    We describe the design of a file system simulation and visualization system that uses the Simgrid API and visualization techniques to help users understand and improve the file system portion of their applications. The core of the simulator is the API provided by Simgrid; cluefs tracks and captures the application's I/O operations. Running the simulator on an application generates an output visualization file, which shows the proportion of I/O actions and their time series. Users can change parameters of the storage system, such as read and write bandwidth, in the configuration file, adjust the storage strategy, and test the performance, making it much easier to optimize the storage system. We have tested all aspects of the simulator, and the results suggest that its performance estimates are believable.

  3. The Convergence of High Performance Computing and Large Scale Data Analytics

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
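
    As a toy illustration of such a spatiotemporal index (the schema and names are invented here, not the NCCS implementation), a small relational table can map query windows to data locations:

        import sqlite3

        con = sqlite3.connect("index.db")
        con.execute("""CREATE TABLE IF NOT EXISTS chunks (
            path TEXT, variable TEXT, t0 REAL, t1 REAL,
            lat0 REAL, lat1 REAL, lon0 REAL, lon1 REAL)""")
        # One row per data block: its file and its space/time bounding box.
        con.execute("INSERT INTO chunks VALUES (?,?,?,?,?,?,?,?)",
                    ("merra/1980/01.nc4", "T2M", 0.0, 31.0, -90, 90, -180, 180))
        con.commit()

        # Find only the files whose blocks overlap the query window.
        rows = con.execute("""SELECT path FROM chunks
            WHERE variable=? AND t1>=? AND t0<=? AND lat1>=? AND lat0<=?""",
            ("T2M", 5.0, 6.0, 30.0, 45.0)).fetchall()
        print(rows)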

  4. Aberration-free superresolution imaging via binary speckle pattern encoding and processing

    NASA Astrophysics Data System (ADS)

    Ben-Eliezer, Eyal; Marom, Emanuel

    2007-04-01

    We present an approach that provides superresolution beyond the classical limit as well as image restoration in the presence of aberrations; in particular, the ability to obtain superresolution while extending the depth of field (DOF) simultaneously is tested experimentally. It is based on an approach, recently proposed, shown to increase the resolution significantly for in-focus images by speckle encoding and decoding. In our approach, an object multiplied by a fine binary speckle pattern may be located anywhere along an extended DOF region. Since the exact magnification is not known in the presence of defocus aberration, the acquired low-resolution image is electronically processed via a parallel-branch decoding scheme, where in each branch the image is multiplied by the same high-resolution synchronized time-varying binary speckle but with different magnification. Finally, a hard-decision algorithm chooses the branch that provides the highest-resolution output image, thus achieving insensitivity to aberrations as well as DOF variations. Simulation as well as experimental results are presented, exhibiting significant resolution improvement factors.

  5. METALLURGICAL PROGRAMS: CALCULATION OF MASS FROM VOLUME, DENSITY OF MIXTURES, AND CONVERSION OF ATOMIC TO WEIGHT PERCENT

    NASA Technical Reports Server (NTRS)

    Degroh, H.

    1994-01-01

    The Metallurgical Programs include three simple programs which calculate solutions to problems common to metallurgical engineers and persons making metal castings. The first program calculates the mass of an ideal binary mixture (alloy) given the weight fractions and densities of the pure components and the total volume. The second program calculates the density of an ideal binary mixture. The third program converts the atomic percentages of a binary mixture to weight percentages. The programs use simple equations to assist the materials staff with routine calculations. The Metallurgical Programs are written in Microsoft QuickBASIC for interactive execution and have been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. All instructions needed by the user appear as prompts as the software is used. Data is input using the keyboard only and output is via the monitor. The Metallurgical programs were written in 1987.
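
    The underlying relations are standard. A short Python transcription of the three calculations (the original QuickBASIC source is not reproduced here), using the ideal-mixture density 1/rho = w1/rho1 + w2/rho2 and the usual atomic-to-weight conversion:

        def mixture_density(w1, rho1, rho2):
            """Ideal binary mixture density from weight fraction w1 of component 1."""
            return 1.0 / (w1 / rho1 + (1.0 - w1) / rho2)

        def mass_from_volume(volume, w1, rho1, rho2):
            """Mass of an ideal binary alloy occupying the given total volume."""
            return volume * mixture_density(w1, rho1, rho2)

        def atomic_to_weight_percent(at1, m1, m2):
            """Convert atomic percent of component 1 (molar masses m1, m2)."""
            return 100.0 * at1 * m1 / (at1 * m1 + (100.0 - at1) * m2)

        # Example: a 10 at.% Cu aluminum alloy -> about 20.7 wt.% Cu.
        print(atomic_to_weight_percent(10.0, 63.55, 26.98))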

  6. Polar codes for achieving the classical capacity of a quantum channel

    NASA Astrophysics Data System (ADS)

    Guha, Saikat; Wilde, Mark

    2012-02-01

    We construct the first near-explicit, linear polar codes that achieve the capacity for classical communication over quantum channels. The codes exploit the channel polarization phenomenon observed by Arikan for classical channels. Channel polarization is an effect in which one can synthesize a set of channels, by "channel combining" and "channel splitting," such that a fraction of the synthesized channels is perfect for data transmission while the remaining fraction is completely useless for data transmission, with the good fraction equal to the capacity of the channel. Our main technical contributions are threefold. First, we demonstrate that the channel polarization effect occurs for channels with classical inputs and quantum outputs. We then construct linear polar codes based on this effect, and the encoding complexity is O(N log N), where N is the blocklength of the code. We also demonstrate that a quantum successive cancellation decoder works well, i.e., the word error rate decays exponentially with the blocklength of the code. For a quantum channel with binary pure-state outputs, such as a binary-phase-shift-keyed coherent-state optical communication alphabet, the symmetric Holevo information rate is in fact the ultimate channel capacity, which is achieved by our polar code.

  7. MMTF-An efficient file format for the transmission, visualization, and analysis of macromolecular structures.

    PubMed

    Bradley, Anthony R; Rose, Alexander S; Pavelka, Antonín; Valasatava, Yana; Duarte, Jose M; Prlić, Andreas; Rose, Peter W

    2017-06-01

    Recent advances in experimental techniques have led to a rapid growth in the complexity, size, and number of macromolecular structures that are made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format, MMTF, as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, keep the whole PDB archive in memory, or parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in MMTF file format through web services and data that are updated on a weekly basis.
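
    A hedged sketch, assuming the mmtf-python package and its fetch helper; the attribute names follow that package's decoder and may differ in other implementations:

        from mmtf import fetch

        structure = fetch("4HHB")           # hemoglobin, fetched as MMTF
        print(structure.num_atoms)          # total atom count
        print(structure.x_coord_list[:5])   # first few x coordinates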

  8. MMTF—An efficient file format for the transmission, visualization, and analysis of macromolecular structures

    PubMed Central

    Pavelka, Antonín; Valasatava, Yana; Prlić, Andreas

    2017-01-01

    Recent advances in experimental techniques have led to a rapid growth in the complexity, size, and number of macromolecular structures that are made available through the Protein Data Bank. This creates a challenge for macromolecular visualization and analysis. Macromolecular structure files, such as PDB or PDBx/mmCIF files, can be slow to transfer and parse, and hard to incorporate into third-party software tools. Here, we present a new binary and compressed data representation, the MacroMolecular Transmission Format, MMTF, as well as software implementations in several languages that have been developed around it, which address these issues. We describe the new format and its APIs and demonstrate that it is several times faster to parse, and about a quarter of the file size of the current standard format, PDBx/mmCIF. As a consequence of the new data representation, it is now possible to visualize structures with millions of atoms in a web browser, keep the whole PDB archive in memory, or parse it within a few minutes on average computers, which opens up a new way of thinking about how to design and implement efficient algorithms in structural bioinformatics. The PDB archive is available in MMTF file format through web services and data that are updated on a weekly basis. PMID:28574982

  9. Calculation Package for the Analysis of Performance of Cells 1-6, with Underdrain, of the Environmental Management Waste Management Facility Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzales D.

    2010-03-30

    This calculation package presents the results of an assessment of the performance of the 6-cell design of the Environmental Management Waste Management Facility (EMWMF). The calculations show that the new cell 6 design at the EMWMF meets the current WAC requirement. QA/QC steps were taken to verify the input/output data for the risk model and the data transfer from modeling output files to tables and calculations.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  11. AFT Program Description Navigation/Strike Tasks. Phase II,

    DTIC Science & Technology

    1972-09-01

    Table-of-contents fragments: 2-23 Data Input/Output - PMSG: 1 Subroutine; 2-24 Data Input/Output - LPMSG: 1 Subroutine. Variable glossary fragments: GOFLAG, exercise start flag; PAD, roll rate (degrees/second); PHIS, bank angle (degrees); PMSG, KBP messages ("INPUT STUDENT FILE DATA", "CRASH", "DEPRESS THE RESET-TO-ZERO CONSOLE BUTTON"); PSI, F-4 heading (degrees); PSIAFT, desired AFT heading; RCIS, average rate-of...

  12. VizieR Online Data Catalog: Photometric study of fourteen low-mass binaries (Korda+, 2017)

    NASA Astrophysics Data System (ADS)

    Korda, D.; Zasche, P.; Wolf, M.; Kucakova, H.; Honkova, K.; Vrastil, J.

    2018-05-01

    All new photometric observations of 14 binaries were carried out at the Ondrejov Observatory in the Czech Republic with the 0.65 m reflecting telescope and the G2-3200 CCD camera. Observations were collected from 2015 February to 2016 November in the I, R, and V filters (Bessell 1990PASP..102.1181B). Some of the older observations, obtained only in the R filter, were used for refining the individual orbital periods. The stars were primarily chosen from the catalog of Hoffman et al. (2008, J/AJ/136/1067). For the selection of suitable stars, we used several criteria. Each binary's classification as a low-mass binary was performed using the photometric indices J-H and H-K from the 2MASS survey (Cutri et al. 2003, Cat. II/246), requiring J-H>0.25 and H-K>0.07 (Pecaut & Mamajek 2013, J/ApJS/208/9; www.pas.rochester.edu/~emamajek/EEMdwarfUBVIJHKcolorsTeff.txt). Furthermore, we selected binary systems that have short orbital periods (P<1.5 days), and we chose the declination to be higher than +30°. The last criterion was that these systems had not been analyzed in detail before. We chose 11 systems from Hoffman's catalog (2008, J/AJ/136/1067), 2 more were found in the measured field (one of them is on the edge of the criteria), and 1 star was added later. (6 data files).

  13. Face Alignment via Regressing Local Binary Features.

    PubMed

    Ren, Shaoqing; Cao, Xudong; Wei, Yichen; Sun, Jian

    2016-03-01

    This paper presents a highly efficient and accurate regression approach for face alignment. Our approach has two novel components: 1) a set of local binary features and 2) a locality principle for learning those features. The locality principle guides us to learn a set of highly discriminative local binary features for each facial landmark independently. The obtained local binary features are used to jointly learn a linear regression for the final output. This approach achieves state-of-the-art results when tested on the most challenging benchmarks to date. Furthermore, because extracting and regressing local binary features is computationally very cheap, our system is much faster than previous methods. It achieves over 3000 frames per second (FPS) on a desktop or 300 FPS on a mobile phone for locating a few dozen landmarks. We also study a key issue that is important but has received little attention in previous research: the face detector used to initialize alignment. We investigate several face detectors and perform a quantitative evaluation of how they affect alignment accuracy. We find that an alignment-friendly detector can further greatly boost the accuracy of our alignment method, reducing the error by up to 16% in relative terms. To facilitate practical usage of face detection/alignment methods, we also propose a convenient metric to measure how good a detector is for alignment initialization.

  14. DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.

    PubMed

    Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques

    2008-09-08

    Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.

  15. VizieR Online Data Catalog: WDMS from LAMOST DR1 (Ren+, 2014)

    NASA Astrophysics Data System (ADS)

    Ren, J. J.; Rebassa-Mansergas, A.; Luo, A. L.; Zhao, Y. H.; Xiang, M. S.; Liu, X. W.; Zhao, G.; Jin, G.; Zhang, Y.

    2014-08-01

    The ascii data of all LAMOST DR1 DA/M binary spectra are presented. The complete table of stellar parameters, magnitudes, and radial velocities of the LAMOST DA/M binaries is also provided. The stellar parameters table includes the white dwarf stellar parameters (effective temperature, surface gravity and mass), the spectral type of the companions, and the distance when available; however, only those with an S/N higher than 12 (second column) are considered in the analysis of the paper. Spectral types of -1 imply that no values are available. For completeness, the table also includes 181 systems that are not considered by us as DA/M binaries but that show blue and red components in their spectra. These are flagged as 1 in the last column. The magnitudes table includes the SDSS or Xuyi magnitudes (when available) and coordinates. The radial velocities table includes the NaI 8183.27, 8194.81 absorption doublet and Halpha emission radial velocities and errors, as well as the Heliocentric Julian dates and the telescope used for obtaining the spectra (either LAMOST or SDSS). (4 data files).

  16. VizieR Online Data Catalog: Minima of 41 binaries from entire Kepler mission (Gies+, 2015)

    NASA Astrophysics Data System (ADS)

    Gies, D. R.; Matson, R. A.; Guo, Z.; Lester, K. V.; Orosz, J. A.; Peters, G. J.

    2016-06-01

    We embarked on a search for eclipse timing variations among a subset of 41 eclipsing binaries that were identified prior to the start of Kepler observations (see our first paper, Gies et al. 2012, cat. J/AJ/143/137). Our first paper documented the eclipse times in observations made over quarters Q0-Q9 (2009.3-2011.5). Now with the Kepler mission complete with observations through Q17 (ending 2013.4), we present here the eclipse timings for our sample of 41 binaries over the entire duration of the mission. The associated times given in our first paper were based upon UTC (Coordinated Universal Time) while the current set uses TDB (Barycentric Dynamical Time), and here we report the times in reduced Barycentric Julian Date (BJD-2400000 days). We used the Simple Aperture Photometry (SAP) flux except in the case of KIC04678873. The list of targets appears in Table1. The eclipse timing measurements were made in almost the same way as described in our first paper. Our measurements appear in Table2. (2 data files).

  17. VizieR Online Data Catalog: UBV light curves of DQ Tau and UZ Tau E (Ardila+, 2015)

    NASA Astrophysics Data System (ADS)

    Ardila, D. R.; Johns-Krull, C.; Herczeg, G. J.; Mathieu, R. D.; Quijano-Vodniza, A.

    2016-01-01

    DQ Tau was observed with HST/COS four times per binary orbit, during three consecutive binary orbits, at phases ~0, ~0.2, ~0.5, and ~0.7 (in 2011 Feb, Mar). The original experimental design called for observations of UZ Tau E with the same cadence. However, the NUV observations at phase ~0.7 in the second orbit and both the FUV and NUV observations at phase ~0 in the third orbit failed. They were replaced by observations at phases ~0 and ~0.5 in a fourth binary orbit (in 2011 Feb, Mar, Apr). We obtained contemporaneous ground-based UBV photometry with the 14" telescope from the University of Narino Observatory, optical spectroscopy with the Sandiford Echelle Spectrometer on the 2.1m Otto Struve Telescope at McDonald Observatory, near-infrared spectroscopy with the CSHELL spectrograph on the NASA Infrared Telescope Facility, and near-infrared spectroscopy with the GNIRS instrument on Gemini North. In this paper we focus on the U-band photometry only. UBV observations were obtained before and during the HST campaign. See table 3. (1 data file).

  18. "AFacet": a geometry based format and visualizer to support SAR and multisensor signature generation

    NASA Astrophysics Data System (ADS)

    Rosencrantz, Stephen; Nehrbass, John; Zelnio, Ed; Sudkamp, Beth

    2018-04-01

    When simulating multisensor signature data (including SAR, LIDAR, EO, IR, etc.), geometry data are required that accurately represent the target. Most vehicular targets can, in real life, exist in many possible configurations. Examples of these configurations might include a rotated turret, an open door, a missing roof rack, or a seat made of metal or wood. Previously we have used the Modelman (.mmp) format and tool to represent and manipulate our articulable models. Unfortunately, Modelman is now an unsupported tool and an undocumented binary format. Some work has been done to reverse engineer a reader in Matlab so that the format could continue to be useful. This work was tedious and resulted in an incomplete conversion. In addition, the resulting articulable models could not be altered and re-saved in the Modelman format. The AFacet (.afacet) articulable facet file format is a replacement for the binary Modelman (.mmp) file format. There is a one-time, straightforward path for conversion from Modelman to the AFacet format. It is a simple ASCII, comma-separated, self-documenting format that is easily readable (and in many cases usefully editable) by a human with any text editor, preventing future obsolescence. In addition, because the format is simple, it is relatively easy for even the most novice programmer to create a program to read and write AFacet files in any language without any special libraries. This paper presents the AFacet format, as well as a suite of tools for creating, articulating, manipulating, viewing, and converting the 370+ models (at the time this paper was written) that have been converted to the AFacet format.
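
    Reading such a file needs nothing beyond the standard library. A hedged Python sketch; the column layout (nine numbers per facet, i.e. three x/y/z vertices) and the '#' comment convention are guesses for illustration, not the actual AFacet specification:

        import csv

        def read_facets(path):
            facets = []
            with open(path) as f:
                for row in csv.reader(f):
                    if not row or row[0].startswith("#"):   # skip comments/blanks
                        continue
                    nums = [float(v) for v in row]
                    facets.append([nums[0:3], nums[3:6], nums[6:9]])
            return facets

        # Print the first two triangles of a (hypothetical) model file.
        for tri in read_facets("vehicle.afacet")[:2]:
            print(tri)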

  19. Integrated data systems : a summary report.

    DOT National Transportation Integrated Search

    1975-01-01

    The purpose of this report is to provide a general outline of the automated data systems work under way at the Research Council. Included is a discussion of file contents, automated procedures, and outputs provided. In addition, a time schedule for i...

  20. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  1. Could Blobs Fuel Storage-Based Convergence between HPC and Big Data?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matri, Pierre; Alforov, Yevhen; Brandon, Alvaro

    The ever-growing data sets processed on HPC platforms raise major challenges for the underlying storage layer. A promising alternative to POSIX-IO-compliant file systems is simpler blobs (binary large objects), or object storage systems. Such systems offer lower overhead and better performance at the cost of largely unused features such as file hierarchies or permissions. Similarly, blobs are increasingly considered for replacing distributed file systems for big data analytics or as a base for storage abstractions such as key-value stores or time-series databases. This growing interest in such object storage on HPC and big data platforms raises the question: Are blobs the right level of abstraction to enable storage-based convergence between HPC and Big Data? In this paper we study the impact of blob-based storage for real-world applications on HPC and cloud environments. The results show that blob-based storage convergence is possible, leading to a significant performance improvement on both platforms.

  2. bwtool: a tool for bigWig files

    PubMed Central

    Pohl, Andy; Beato, Miguel

    2014-01-01

    BigWig is a compressed, indexed, binary file format for genome-wide signal data from calculations (e.g. GC percent) or experiments (e.g. ChIP-seq/RNA-seq read depth). bwtool is a tool designed to read bigWig files rapidly and efficiently, providing functionality for extracting data and summarizing it in several ways, globally or at specific regions. Additionally, the tool enables the conversion of the positions of signal data from one genome assembly to another, also known as ‘lifting’. We believe bwtool can be useful for the analyst frequently working with bigWig data, which is becoming a standard format to represent functional signals along genomes. The article includes supplementary examples of running the software. Availability and implementation: The C source code is freely available under the GNU public license v3 at http://cromatina.crg.eu/bwtool. Contact: andrew.pohl@crg.eu, andypohl@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24489365

  3. HDF-EOS 5 Validator

    NASA Technical Reports Server (NTRS)

    Ullman, Richard; Bane, Bob; Yang, Jingli

    2008-01-01

    A computer program partly automates the task of determining whether an HDF-EOS 5 file is valid in that it conforms to specifications for such characteristics as attribute names, dimensionality of data products, and ranges of legal data values. ["HDF-EOS" and variants thereof are defined in "Converting EOS Data From HDF-EOS to netCDF" (GSC-15007-1), which is the first of several preceding articles in this issue of NASA Tech Briefs.] Previously, validity of a file was determined in a tedious and error-prone process in which a person examined human-readable dumps of data-file-format information. The present software helps a user to encode the specifications for an HDFEOS 5 file, and then inspects the file for conformity with the specifications: First, the user writes the specifications in Extensible Markup Language (XML) by use of a document type definition (DTD) that is part of the program. Next, the portion of the program (denoted the validator) that performs the inspection is executed, using, as inputs, the specifications in XML and the HDF-EOS 5 file to be validated. Finally, the user examines the output of the validator.

  4. Data Processing Aspects of MEDLARS

    PubMed Central

    Austin, Charles J.

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287

  5. DATA PROCESSING ASPECTS OF MEDLARS.

    PubMed

    AUSTIN, C J

    1964-01-01

    The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files.

  6. Long wavelength propagation capacity, version 1.1 (computer diskette)

    NASA Astrophysics Data System (ADS)

    1994-05-01

    File Characteristics: software and data files (72 files); ASCII character set. Physical Description: 2 computer diskettes; 3 1/2 in.; high density; 1.44 MB. System Requirements: PC compatible; Digital Equipment Corp. VMS; PKZIP (included on diskette). This report describes a revision of the Naval Command, Control and Ocean Surveillance Center RDT&E Division's Long Wavelength Propagation Capability (LWPC). The first version of this capability was a collection of separate FORTRAN programs linked together in operation by a command procedure written in an operating system unique to the Digital Equipment Corporation (Ferguson & Snyder, 1989a, b). A FORTRAN computer program named Long Wavelength Propagation Model (LWPM) was developed to replace the VMS control system (Ferguson & Snyder, 1990; Ferguson, 1990). This was designated version 1 (LWPC-1). This program implemented all the features of the original VMS version plus a number of auxiliary programs that provided summaries of the files and graphical displays of the output files. This report describes a revision of the LWPC, designated version 1.1 (LWPC-1.1).

  7. VizieR Online Data Catalog: Light curves of Algol-type binaries. VI. FR Ori (Yang+, 2014)

    NASA Astrophysics Data System (ADS)

    Yang, Y.-G.; Wei, J.-Y.; Li, H.-L.

    2014-09-01

    New photometry of FR Ori was performed in 2012 November (Nov 7-8 and Nov 16-20) and December (Dec 5-9), using the 60cm telescope at the Xinglong Station (XLs) of the National Astronomical Observatories of China (NAOC). A Princeton Instrument 1024*1024 CCD camera was mounted at this telescope. The standard Johnson/Cousins set of BVIR filters was applied. (3 data files).

  8. VizieR Online Data Catalog: BVRI light curves of GR Boo (Wang+, 2017)

    NASA Astrophysics Data System (ADS)

    Wang, D.; Zhang, L.; Han, X. L.; Lu, H.

    2017-11-01

    We observed the eclipsing binary GR Boo on May 12, 22 and 24 in 2015 using the SARA 90-cm telescope located at Kitt Peak National Observatory, Arizona, USA. This telescope was equipped with an ARC CCD camera with a resolution of 2048x2048pixels but used at 2x2 binning, resulting in 1024x1024pixels. We used the Bessel BVRI filters. (1 data file).

  9. Transportable Maps Software. Volume I.

    DTIC Science & Technology

    1982-07-01

    ...being collected at the beginning or end of the routine. This allows the interaction to be followed sequentially through its steps by anyone reading the code... Control flow is either simple sequential, simple conditional (the equivalent of 'if-then-else'), simple iteration ('DO-loop'), or non-linear recursion... Input raster images are required to be in the form of sequential binary files with a SEGMENTED record type. The advantage of this form is that large logical records...

  10. UFO - The Universal FEYNRULES Output

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Duhr, Claude; Fuks, Benjamin; Grellscheid, David; Mattelaer, Olivier; Reiter, Thomas

    2012-06-01

    We present a new model format for automatized matrix-element generators, the so-called Universal FEYNRULES Output (UFO). The format is universal in the sense that it features compatibility with more than one single generator and is designed to be flexible, modular and agnostic of any assumption such as the number of particles or the color and Lorentz structures appearing in the interaction vertices. Unlike other model formats where text files need to be parsed, the information on the model is encoded into a PYTHON module that can easily be linked to other computer codes. We then describe an interface for the MATHEMATICA package FEYNRULES that allows for an automatic output of models in the UFO format.
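
    Because a UFO model is a PYTHON package, it can be imported directly rather than parsed. A minimal sketch; the model directory name is hypothetical, while the all_particles and all_vertices lists follow the UFO convention:

        import sys
        import importlib

        sys.path.insert(0, "/path/to/models")            # directory holding the model
        model = importlib.import_module("MyModel_UFO")   # hypothetical model package

        print(len(model.all_particles), "particles")
        print(len(model.all_vertices), "vertices")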

  11. cljam: a library for handling DNA sequence alignment/map (SAM) with parallel processing.

    PubMed

    Takeuchi, Toshiki; Yamada, Atsuo; Aoki, Takashi; Nishimura, Kunihiro

    2016-01-01

    Next-generation sequencing can determine DNA bases, and the results of sequence alignments are generally stored in files in the Sequence Alignment/Map (SAM) format or its compressed binary version (BAM). SAMtools is a typical tool for dealing with files in the SAM/BAM format. SAMtools has various functions, including detection of variants, visualization of alignments, indexing, extraction of parts of the data and loci, and conversion of file formats. It is written in C and executes quickly. However, running SAMtools in parallel requires additional implementation work, for example with OpenMP (Open Multi-Processing) libraries. Given the accumulation of next-generation sequencing data, a simple parallelization program that can support cloud and PC cluster environments is required. We have developed cljam using the Clojure programming language, which simplifies parallel programming, to handle SAM/BAM data. Cljam can run in a Java runtime environment (e.g., Windows, Linux, Mac OS X) with Clojure. Cljam can process and analyze SAM/BAM files in parallel and at high speed. The execution time with cljam is almost the same as with SAMtools. The cljam code is written in Clojure and has fewer lines than other similar tools.

  12. KungFQ: a simple and powerful approach to compress fastq files.

    PubMed

    Grassi, Elena; Di Gregorio, Federico; Molineris, Ivan

    2012-01-01

    Nowadays, storing data derived from deep sequencing experiments has become pivotal, and standard compression algorithms do not exploit their structure in a satisfying manner. A number of reference-based compression algorithms have been developed, but they are less adequate when approaching new species without fully sequenced genomes or non-genomic data. We developed a tool that takes advantage of fastq characteristics and encodes them in a binary format optimized to be further compressed with standard tools (such as gzip or lzma). The algorithm is straightforward and does not need any external reference file; it scans the fastq only once and has a constant memory requirement. Moreover, we added the possibility to perform lossy compression, losing some of the original information (IDs and/or qualities) but resulting in smaller files; it is also possible to define a quality cutoff under which corresponding base calls are converted to N. We achieve 2.82 to 7.77 compression ratios on various fastq files without losing information, and 5.37 to 8.77 losing IDs, which are often not used in common analysis pipelines. In this paper, we compare the algorithm performance with known tools, usually obtaining higher compression levels.
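
    A rough Python sketch of the general idea only (not KungFQ's actual codec): split the fastq into its three streams, apply a quality cutoff that masks low-quality calls to N, and compress each stream separately with gzip.

        import gzip

        def compress_fastq(path, q_cutoff=2, phred_offset=33):
            ids, seqs, quals = [], [], []
            with open(path) as f:
                while True:
                    head = f.readline().rstrip()
                    if not head:
                        break
                    seq = f.readline().rstrip()
                    f.readline()                  # '+' separator line
                    qual = f.readline().rstrip()
                    # Mask base calls whose quality falls under the cutoff.
                    seq = "".join("N" if ord(q) - phred_offset < q_cutoff else b
                                  for b, q in zip(seq, qual))
                    ids.append(head); seqs.append(seq); quals.append(qual)
            # Compress each stream on its own so similar data sits together.
            for name, data in (("ids", ids), ("seqs", seqs), ("quals", quals)):
                with gzip.open(path + "." + name + ".gz", "wt") as out:
                    out.write("\n".join(data))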

  13. AN ADA NAMELIST PACKAGE

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the non-generic opening portion. The opening portion declares a variety of user-accessible constants, variables, and subprograms. The subprograms include procedures for initializing namelists for reading, and for reading and writing strings, as well as functions for analyzing the content of the current dataset and diagnosing errors. Two nested generic packages follow the opening portion. The first generic package contains procedures that read and write objects of scalar type. The second contains subprograms that read and write one- and two-dimensional arrays whose components are of scalar type and whose indices are of either of the two discrete types (integer or enumeration). Subprograms in the second package also read and write vector and matrix slices. The Ada Namelist ASCII text files are available on a 360k 5.25" floppy disk written on an IBM PC/AT running under the PC DOS operating system. The largest subprogram in the package requires 150k of memory. The package was developed using VAX Ada v. 1.5 under DEC VMS v. 4.5. It should be portable to any validated Ada compiler. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
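
    The namelist concept itself is easy to demonstrate outside Ada. A minimal Python sketch handling a simplified "name = value" syntax, one assignment per line, which is far less than full FORTRAN namelist syntax:

        import ast

        def read_namelist(path):
            values = {}
            for line in open(path):
                line = line.strip()
                if not line or line.startswith("!"):   # skip blanks and comments
                    continue
                name, _, rhs = line.partition("=")
                values[name.strip()] = ast.literal_eval(rhs.strip())
            return values

        # A file containing:   dt = 0.5
        #                      steps = 100
        # yields {'dt': 0.5, 'steps': 100}, regardless of statement order.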

  14. User's guide to revised method-of-characteristics solute-transport model (MOC--version 31)

    USGS Publications Warehouse

    Konikow, Leonard F.; Granato, G.E.; Hornberger, G.Z.

    1994-01-01

    The U.S. Geological Survey computer model to simulate two-dimensional solute transport and dispersion in ground water (Konikow and Bredehoeft, 1978; Goode and Konikow, 1989) has been modified to improve management of input and output data and to provide progressive run-time information. All opening and closing of files are now done automatically by the program. Names of input data files are entered either interactively or using a batch-mode script file. Names of output files, created automatically by the program, are based on the name of the input file. In the interactive mode, messages are written to the screen during execution to allow the user to monitor the status and progress of the simulation and to anticipate total running time. Information reported and updated during a simulation includes the current pumping period and time step, number of particle moves, and percentage completion of the current time step. The batch mode enables a user to run a series of simulations consecutively, without additional control. A report of the model's activity in the batch mode is written to a separate output file, allowing later review. The user has several options for creating separate output files for different types of data. The formats are compatible with many commercially available applications, which facilitates graphical postprocessing of model results.

    Geohydrology and Evaluation of Stream-Aquifer Relations in the Apalachicola-Chattahoochee-Flint River Basin, Southeastern Alabama, Northwestern Florida, and Southwestern Georgia

    By Lynn J. Torak, Gary S. Davis, George A. Strain, and Jennifer G. Herndon

    Abstract: The lower Apalachicola-Chattahoochee-Flint River Basin is underlain by Coastal Plain sediments of pre-Cretaceous to Quaternary age consisting of alternating units of sand, clay, sandstone, dolomite, and limestone that gradually thicken and dip gently to the southeast. The stream-aquifer system consists of carbonate (limestone and dolomite) and clastic sediments, which define the Upper Floridan aquifer and Intermediate system, in hydraulic connection with the principal rivers of the basin and other surface-water features, natural and man-made. Separate digital models of the Upper Floridan aquifer and Intermediate system were constructed by using the U.S. Geological Survey's MODular Finite-Element model of two-dimensional ground-water flow, based on conceptualizations of the stream-aquifer system, and calibrated to drought conditions of October 1986. Sensitivity analyses performed on the models indicated that aquifer hydraulic conductivity, lateral and vertical boundary flows, and pumpage have a strong influence on ground-water levels. Simulated pumpage increases in the Upper Floridan aquifer, primarily in the Dougherty Plain physiographic district of Georgia, caused significant reductions in aquifer discharge to streams that eventually flow to Lake Seminole and the Apalachicola River and Bay. Simulated pumpage increases greater than 3 times the October 1986 rates caused drying of some stream reaches and parts of the Upper Floridan aquifer in Georgia. Water budgets prepared from simulation results indicate that ground-water discharge to streams and recharge by horizontal and vertical flow are the principal mechanisms for moving water through the flow system. The potential for changes in ground-water quality is high in areas where chemical constituents can be mobilized by these mechanisms. Less than 2 percent of ground-water discharge to streams comes from the Intermediate system; thus, it plays a minor role in the hydrodynamics of the stream-aquifer system.
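
    The exact naming rules are not given in the first abstract of this record; as a minimal sketch of the input/output management it describes for the solute-transport model (output-file names derived from the input-file name, plus a batch-mode script listing inputs), with hypothetical ".out" and ".rpt" extensions:

        # Sketch of the I/O management described above. The extensions and the
        # batch-script layout (one input-file name per line) are assumptions,
        # not the program's documented behavior.
        import os

        def derive_output_names(input_name):
            base, _ = os.path.splitext(input_name)
            return {"listing": base + ".out",       # main simulation output
                    "batch_report": base + ".rpt"}  # activity report in batch mode

        # Batch mode: run a series of simulations consecutively from a script file.
        with open("batch.run") as script:
            for line in script:
                print(derive_output_names(line.strip()))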

  15. File format for normalizing radiological concentration exposure rate and dose rate data for the effects of radioactive decay and weathering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Terrence D.

    2017-04-01

    This report specifies the electronic file format that was agreed upon for normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable file format for the normalized radiological data output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from other sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), Monitoring and Sampling]. The CSV file format also is suitable because the normalized data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by NA-84's Consequence Management Program.
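
    The abstract names three data types (activity concentration, exposure rate, dose rate) and several sources; as a minimal sketch of what such a CSV layout could look like, with column names, units, and values that are illustrative assumptions rather than the project's actual schema:

        # Hypothetical CSV layout for normalized radiological data; the columns,
        # units, and values below are assumptions, not the SNL17-CM-635 format.
        import csv

        rows = [
            ("AMS",      "exposure_rate",          35.0844, -106.6504, 1.2e-2, "R/h"),
            ("RAMS",     "dose_rate",              35.0850, -106.6510, 9.5e-3, "rem/h"),
            ("Sampling", "activity_concentration", 35.0861, -106.6521, 4.7e+2, "Bq/m^2"),
        ]

        with open("normalized.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["source", "data_type", "latitude", "longitude", "value", "units"])
            writer.writerows(rows)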

  16. IDSP- INTERACTIVE DIGITAL SIGNAL PROCESSOR

    NASA Technical Reports Server (NTRS)

    Mish, W. H.

    1994-01-01

    The Interactive Digital Signal Processor, IDSP, consists of a set of time series analysis "operators" based on the various algorithms commonly used for digital signal analysis work. The processing of a digital time series to extract information is usually achieved by the application of a number of fairly standard operations. However, it is often desirable to "experiment" with various operations and combinations of operations to explore their effect on the results. IDSP is designed to provide an interactive and easy-to-use system for this type of digital time series analysis. The IDSP operators can be applied in any sensible order (even recursively), and can be applied to single or simultaneous time series. IDSP is being used extensively to process data obtained from scientific instruments onboard spacecraft. It is also an excellent teaching tool for demonstrating the application of time series operators to artificially generated signals. IDSP currently includes over 43 standard operators. Processing operators provide for Fourier transform operations, design and application of digital filters, and eigenvalue analysis. Additional support operators provide for data editing, display of information, graphical output, and batch operation. User-developed operators can be easily interfaced with the system to provide for expansion and experimentation. Each operator application generates one or more output files from an input file; the processing of a file can involve many operators in a complex application. IDSP maintains historical information as an integral part of each file so that the user can display the operator history of the file at any time during an interactive analysis. IDSP is written in VAX FORTRAN 77 for interactive or batch execution and has been implemented on a DEC VAX-11/780 operating under VMS. The IDSP system generates graphics output for a variety of graphics systems. The program requires the use of Versaplot and Template plotting routines and IMSL Math/Library routines; these software packages are not included in IDSP. The virtual memory requirement for the program is approximately 2.36 MB. The IDSP system was developed in 1982 and was last updated in 1986. Versaplot is a registered trademark of Versatec Inc. Template is a registered trademark of Template Graphics Software Inc. IMSL Math/Library is a registered trademark of IMSL Inc.
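
    IDSP itself is VAX FORTRAN 77 and its operator set is not reproduced here; the operator-chaining idea, in which every operation produces a new series carrying its own processing history, can be sketched as follows (the operator names and history format are assumptions, not IDSP's):

        # Sketch of operator chaining with per-file history, as described above.
        import numpy as np

        class Series:
            def __init__(self, data, history=()):
                self.data = np.asarray(data, dtype=float)
                self.history = tuple(history)  # one entry per operator applied

            def apply(self, name, func):
                # Each application yields a new series with an extended history.
                return Series(func(self.data), self.history + (name,))

        s = Series(np.sin(np.linspace(0.0, 20.0, 512)))
        s = s.apply("detrend", lambda d: d - d.mean())
        s = s.apply("fft_magnitude", lambda d: np.abs(np.fft.rfft(d)))
        print(s.history)  # ('detrend', 'fft_magnitude')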

  17. Tracking state deployments of commercial vehicle information systems and networks : 1998 Washington State report

    DOT National Transportation Integrated Search

    1999-12-01

    Volume III of the Logical Architecture contract deliverable documents the Data Dictionary. This formatted version of the Teamwork model data dictionary is mechanically produced from the Teamwork CDIF (CASE Data Interchange Format) output file. It is ...

  18. Spec2Harv: Converting Spectrum output to HARVEST input

    Treesearch

    Eric J. Gustafson; Luke V. Rasmussen; Larry A. Leefers

    2003-01-01

    Spec2Harv was developed to automate the conversion of harvest schedules generated by the Spectrum model into script files that can be used by the HARVEST simulation model to simulate the implementation of the Spectrum schedules in a spatially explicit way.

  19. Optical flip-flops in a polarization-encoded optical shadow-casting scheme.

    PubMed

    Rizvi, R A; Zubairy, M S

    1994-06-10

    We propose a novel scheme that optically implements various types of binary sequential logic elements. This is based on a polarization-encoded optical shadow-casting system. The proposed system architecture is capable of implementing synchronous as well as asynchronous sequential circuits owing to the inherent structural flexibility of optical shadow casting. By employing the proposed system, we present the design and implementation schemes of a J-K flip-flop and clocked R-S and D latches. The main feature of these flip-flops is that the propagation of the signal from the input plane to the output (i.e., processing) and from the output plane to the source plane (i.e., feedback) is entirely optical; consequently, these elements operate at higher speed. The only electronic operations in the system are the detection of the outputs and the switching of the source plane.
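
    Setting the optical implementation aside, the binary behavior of the J-K flip-flop named above can be stated compactly; a minimal truth-table sketch:

        # Next-state logic of a clocked J-K flip-flop: hold, set, reset, toggle.
        # This models only the binary behavior, not the shadow-casting optics.
        def jk_next(q, j, k):
            return (j and not q) or (not k and q)

        q = False
        for j, k in [(1, 0), (0, 0), (1, 1), (0, 1)]:
            q = jk_next(q, j, k)
            print(f"J={j} K={k} -> Q={int(q)}")  # set, hold, toggle, reset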

  20. Highly parallel reconfigurable computer architecture for robotic computation having plural processor cells each having right and left ensembles of plural processors

    NASA Technical Reports Server (NTRS)

    Fijany, Amir (Inventor); Bejczy, Antal K. (Inventor)

    1994-01-01

    In a computer having a large number of single-instruction multiple data (SIMD) processors, each of the SIMD processors has two sets of three individual processor elements controlled by a master control unit and interconnected among a plurality of register file units where data is stored. The register files input and output data in synchronism with a minor cycle clock under control of two slave control units controlling the register file units connected to respective ones of the two sets of processor elements. Depending upon which ones of the register file units are enabled to store or transmit data during a particular minor clock cycle, the processor elements within an SIMD processor are connected in rings or in pipeline arrays, and may exchange data with the internal bus or with neighboring SIMD processors through interface units controlled by respective ones of the two slave control units.
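
    As an illustration of the reconfiguration idea (enabling different register file units in a given minor clock cycle connects the processor elements either in a ring or in a pipeline), a small sketch under assumed names and sizes:

        # Sketch of topology selection per minor cycle: the set of enabled
        # register files decides whether data moves around a ring or down a
        # pipeline. N and the function names are assumptions for illustration.
        N = 3  # processor elements per ensemble

        def next_element(i, topology):
            if topology == "ring":
                return (i + 1) % N                   # last element feeds the first
            if topology == "pipeline":
                return i + 1 if i + 1 < N else None  # data exits at the end
            raise ValueError(topology)

        for topo in ("ring", "pipeline"):
            print(topo, [next_element(i, topo) for i in range(N)])
        # ring [1, 2, 0]
        # pipeline [1, 2, None]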
